
ChatGPT: A Threat to Privacy?

Despite being a powerful and innovative AI chatbot that has quickly drawn widespread attention, ChatGPT has some serious pitfalls hidden behind its impressive features.

Ask it almost any question and it will produce an answer that reads as though a human wrote it. It can do this because it was trained on massive amounts of text from across the net, giving it the knowledge and writing ability needed to generate human-sounding responses.

There is no denying that time is money, and chatbots such as ChatGPT and Bing Chat have become invaluable tools. They can write code, summarise long emails, and even find patterns in large datasets with thousands of fields.

This chatbot has astonished users with its exciting features and is one of OpenAI's most notable creations. To use ChatGPT, you first create an account on the OpenAI website. In addition to being presented as a safe and reliable tool, it is also extremely easy to use.

However, many users have questions about who can access the data they share with the chatbot. OpenAI saves ChatGPT conversations for future analysis, and according to the company's FAQ page, its employees can selectively review chats to check their safety.

You should not assume that anything you say to ChatGPT will remain confidential or private. In fact, OpenAI recently discovered a critical bug that caused a serious security issue.

OpenAI CEO Sam Altman stated that, in a small percentage of cases, some users could see the titles of other users' conversations. Altman said the bug (now fixed) was in an open-source library the service relies on, and that the company would release a detailed report later, as it "feels terribly about this."

The outage tracker Downdetector shows that the platform suffered a brief outage before the company disabled chat history. According to Downdetector's outage map, some users could not access the AI-powered chatbot at midnight on March 23.

ChatGPT is built on a large language model designed to synthesize natural-sounding human language. Using it feels like having a conversation with a person: it responds to what you say and can correct itself when it gets something wrong, much as a person would.

After a short period, ChatGPT automatically deletes the session logs it saves.

When you create an account with ChatGPT, the service collects personal information such as your name, email address, telephone number, and payment details.

Whenever a user registers with ChatGPT, the data associated with that account is saved. The company encrypts this data to keep it safe and retains it only as long as it is needed to meet business or legal requirements.
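
To make the idea of encrypting stored account data concrete, here is a minimal sketch in Python using the widely available cryptography package (Fernet, an authenticated symmetric scheme). It is a generic illustration with made-up field values, not a description of OpenAI's actual storage or key-management setup, which is not publicly documented.

```python
# Illustrative only: a generic example of encrypting account data at rest,
# NOT OpenAI's actual implementation, which is not publicly documented.
import json
from cryptography.fernet import Fernet

# In practice the key would live in a key-management service, not in code.
key = Fernet.generate_key()
cipher = Fernet(key)

account = {"name": "Jane Doe", "email": "jane@example.com", "phone": "+1-555-0100"}

# Encrypt before writing to storage...
token = cipher.encrypt(json.dumps(account).encode("utf-8"))

# ...and decrypt only when the data is legitimately needed.
restored = json.loads(cipher.decrypt(token).decode("utf-8"))
assert restored == account
```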

The ChatGPT privacy policy does note, however, that encryption methods are not always completely secure. Users should keep this in mind when sharing personal information with a service like this.

OpenAI's FAQ advises users not to "share any sensitive information in your conversations," because OpenAI cannot delete specific prompts from your conversation history. Additionally, ChatGPT cannot access the Internet directly, so its answers may sometimes be incorrect or out of date.

ChatGPT has come a long way since its launch last year, growing rapidly into one of the fastest-growing platforms around.

Reports claim that ChatGPT had 13.2 million users in January. These gains are attributed to the service's impressive performance, simple interface, and free access; those who want improved performance can subscribe for a monthly fee.

If you clear your ChatGPT data and delete your conversations, OpenAI will remove all of your ChatGPT data permanently from its servers.

This process usually takes between one and two weeks, though it can take longer. If you would rather not log in or visit the help section of the website, you can also request account deletion by emailing deletion@openai.com.

Post-quantum Cryptography Achieves Standardization Milestone

The first four post-quantum cryptography algorithms selected for standardisation have been announced, providing the foundation for the creation of "future-proof" apps and web services.

Last Monday, the US federal government's National Institute of Standards and Technology (NIST) announced a quartet of recommended algorithms as part of its ongoing standardisation process. The chosen algorithms will be included in NIST's post-quantum cryptography standard, which is scheduled to be completed within the next two years.

Four more algorithms are currently under consideration for inclusion in the standard. According to NIST, two primary algorithms should cover most use cases: CRYSTALS-Kyber (key establishment) and CRYSTALS-Dilithium (digital signatures).
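
To make the two roles concrete, the sketch below shows how a key-encapsulation mechanism such as Kyber and a signature scheme such as Dilithium are typically driven from code. It assumes the Open Quantum Safe project's liboqs-python bindings (the oqs package) and its Kyber512 and Dilithium2 parameter-set names; it is purely illustrative and not part of the NIST announcement.

```python
# A minimal sketch, assuming the Open Quantum Safe liboqs-python bindings ("oqs")
# are installed and expose the Kyber512 and Dilithium2 parameter sets.
import oqs

# Key establishment with a KEM: the sender encapsulates a secret against the
# receiver's public key; both sides end up with the same shared secret.
with oqs.KeyEncapsulation("Kyber512") as receiver, \
     oqs.KeyEncapsulation("Kyber512") as sender:
    public_key = receiver.generate_keypair()
    ciphertext, secret_sender = sender.encap_secret(public_key)
    secret_receiver = receiver.decap_secret(ciphertext)
    assert secret_sender == secret_receiver  # both sides now share a key

# Digital signatures: sign with the private key, verify with the public key.
message = b"post-quantum hello"
with oqs.Signature("Dilithium2") as signer, \
     oqs.Signature("Dilithium2") as verifier:
    sig_public_key = signer.generate_keypair()
    signature = signer.sign(message)
    assert verifier.verify(message, signature, sig_public_key)
```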

More than one algorithm is being sought for each use case as a backup, in case any of the chosen approaches proves insecure. NIST recommends CRYSTALS-Dilithium as the principal method for digital signatures, with FALCON for applications that require smaller signatures than Dilithium can offer. A third algorithm, SPHINCS+, is slower than the other two but was approved because it is based on a different mathematical approach and therefore adds diversity. Dustin Moody of NIST discussed why another round of selection was required.

“Of the four algorithms we selected, one is for encryption and three are for digital signatures,” Moody told The Daily Swig. 

“Of the four algorithms that we will continue to study in the fourth round, all four are encryption algorithms. The primary motivation for this is to find a non-lattice-based signature scheme which is suitable for general purpose use to be a backup for our lattice-based signature algorithms we are standardizing (Dilithium and Falcon),” Moody added. 

He continued: “Our current NIST public-key standards cover encryption and signatures. So that is what our standardization process was targeted for – to replace the vulnerable cryptosystems in those standards. Other functionalities may be considered in the future.” 

The ongoing quest for next-generation cryptographic systems is necessary because current public-key cryptosystems, such as RSA, depend on mathematical problems (for RSA, factoring very large numbers) that are intractable for even the most powerful conventional computers. Sufficiently powerful quantum computers, which operate on a fundamentally different paradigm from today's PCs and servers, may be capable of cracking today's public-key encryption techniques. Increasing the key length alone will not be enough to counter this potential danger, hence the need for post-quantum cryptography methods.

Store now, decrypt later

Although the present generation of quantum computers is mostly experimental and hampered by engineering hurdles, attackers may already be preparing for their future availability with "store-now-decrypt-later" attacks. If such attacks are effective, a growing volume of currently encrypted financial, government, commercial, and health-related data will become readable to anyone with a suitably powerful quantum computer.

Quantum computers handle computational tasks by exploiting properties of quantum states, such as superposition, interference, and entanglement, rather than the binary states (0 or 1) of traditional computers. Paired with quantum algorithms, the technology could solve certain mathematical problems, such as integer factorization, quickly enough to matter, posing a danger to current encryption systems that rely on the intractability of those problems. Quantum-resistant algorithms are instead based on mathematical problems that both traditional and quantum computers should struggle to solve.
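
The dependence on factoring is easy to see with a toy example. The Python sketch below builds a deliberately tiny RSA key and shows that anyone who can factor the public modulus, which a sufficiently large quantum computer running Shor's algorithm is expected to do efficiently, can immediately recompute the private key. The numbers are illustrative only, nowhere near real key sizes.

```python
# Toy illustration (not real key sizes): RSA's security rests entirely on the
# difficulty of factoring n. Once n is factored, the private key falls out.

# A deliberately tiny "RSA" modulus; real moduli are 2048+ bits.
p, q = 61, 53
n = p * q                  # 3233, the public modulus
e = 17                     # public exponent
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)        # private exponent (modular inverse, Python 3.8+)

message = 42
ciphertext = pow(message, e, n)

# An attacker who can factor n (trivial here, feasible for a large quantum
# computer running Shor's algorithm) recomputes d and decrypts.
def factor(n):
    f = next(i for i in range(2, n) if n % i == 0)
    return f, n // f

p2, q2 = factor(n)
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(ciphertext, d2, n) == message  # plaintext recovered
```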

Is making hacking unprofitable the key to cyber-security?

Billions are being lost to cyber-crime each year, and the problem seems to be getting worse. So could we ever create unhackable computers beyond the reach of criminals and spies? Israeli researchers are coming up with some interesting solutions.

The key to stopping the hackers, explains Neatsun Ziv, vice president of cyber-security products at Tel Aviv-based Check Point Security Technologies, is to make hacking unprofitable.

"We're currently tracking 150 hacking groups a week, and they're making $100,000 a week each," he tells the BBC.

"If we raise the bar, they lose money. They don't want to lose money."

This means making it difficult enough for hackers to break in that they choose easier targets.

And this has been the main principle governing the cyber-security industry ever since it was invented - surrounding businesses with enough armour plating to make it too time-consuming for hackers to drill through. The rhinoceros approach, you might call it.

But some think the industry needs to be less rhinoceros and more chameleon, camouflaging itself against attack.

"We need to bring prevention back into the game," says Yuval Danieli, vice president of customer services at Israeli cyber-security firm Morphisec.

"Most of the world is busy with detection and remediation - threat hunting - instead of preventing the cyber-attack before it occurs."

Morphisec - born out of research done at Ben-Gurion University - has developed what it calls "moving target security". It's a way of scrambling the names, locations and references of each file and software application in a computer's memory to make it harder for malware to sink its teeth into your system.

The mutation occurs each time the computer is turned on so the system is never configured the same way twice. The firm's tech is used to protect the London Stock Exchange and Japanese industrial robotics firm Yaskawa, as well as bank and hotel chains.
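
The general idea behind moving-target security can be illustrated with a short, entirely hypothetical Python sketch: the externally visible name of each protected resource is re-randomised on every start-up, so legitimate code that resolves names through a trusted table keeps working, while malware relying on a name harvested from a previous run comes up empty. This is a toy model of the concept, not Morphisec's actual implementation.

```python
# A toy illustration of the "moving target" idea, not Morphisec's product:
# the location/name of a sensitive resource is randomised at every start-up,
# so an attack relying on a hard-coded name no longer finds its target.
import secrets

class MovingTargetTable:
    """Maps stable internal handles to names randomised on each run."""

    def __init__(self, resources):
        # New random aliases every run -> the layout never repeats.
        self._alias = {name: secrets.token_hex(8) for name in resources}
        self._store = {self._alias[n]: f"<{n} contents>" for n in resources}

    def lookup(self, handle):
        # Legitimate code resolves through the trusted alias table.
        return self._store[self._alias[handle]]

    def raw_access(self, guessed_name):
        # Malware using a name harvested from a previous run misses.
        return self._store.get(guessed_name)

table = MovingTargetTable(["credential_cache", "session_keys"])
print(table.lookup("session_keys"))      # legitimate access works
print(table.raw_access("session_keys"))  # hard-coded guess fails -> None
```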