This Cryptocurrency Tracking Firm is Employing AI to Identify Attackers

By using OpenAI’s ChatGPT chatbot, the company says it will be able to organize data faster and in larger volumes.

 

Elliptic, a cryptocurrency analytics firm, is incorporating artificial intelligence into its toolkit for analyzing blockchain transactions and identifying risk. The company says that by using OpenAI's ChatGPT chatbot it will be able to organize data faster and in larger quantities. The firm has, however, placed restrictions on how the tool is used and does not employ ChatGPT plug-ins.

"As an organization trusted by the world’s largest banks, regulators, financial institutions, governments, and law enforcers, it’s important to keep our intelligence and data secure," an Elliptic spokesperson told Decrypt. "That’s why we don’t use ChatGPT to create or modify data, search for intelligence, or monitor transactions.”

Founded in 2013, Elliptic provides blockchain analytics to institutions and law enforcement, supporting the tracking of cybercriminals and regulatory compliance around Bitcoin and other cryptocurrencies. In May, for example, Elliptic reported that some Chinese shops selling the ingredients used to produce fentanyl accepted cryptocurrencies such as Bitcoin. U.S. Senator Elizabeth Warren cited the report to renew her calls for stronger cryptocurrency regulation.

According to the company, Elliptic will use ChatGPT to supplement its human-led data collection and organization processes, improving both accuracy and scalability, with large language models (LLMs) handling the organization of that data at scale.
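Elliptic has not published details of this pipeline, but the general pattern it describes, an LLM organizing analyst-gathered data under human review, can be illustrated with a minimal sketch. The model name, prompt, and category labels below are assumptions chosen for the example, not Elliptic's actual configuration; the sketch simply uses the OpenAI Python client to sort free-text intelligence notes into predefined buckets.

```python
# Illustrative sketch only: the model, prompt, and categories are assumptions,
# not Elliptic's real pipeline. It shows the general pattern of using an LLM
# to organize free-text intelligence notes into fixed categories for human review.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CATEGORIES = ["darknet market", "ransomware", "sanctioned entity", "exchange", "unknown"]

def categorize_note(note: str) -> str:
    """Map one analyst note to a single category label."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice for the example
        messages=[
            {"role": "system",
             "content": "Classify the note into exactly one of: "
                        + ", ".join(CATEGORIES)
                        + ". Reply with the category name only."},
            {"role": "user", "content": note},
        ],
        temperature=0,  # deterministic output is easier for analysts to verify
    )
    label = response.choices[0].message.content.strip().lower()
    # Fall back to "unknown" rather than accepting a label outside the fixed set
    return label if label in CATEGORIES else "unknown"

if __name__ == "__main__":
    print(categorize_note("Wallet cluster linked to a vendor on a Tor marketplace."))
```

Constraining the model to a fixed label set and rejecting anything outside it keeps the LLM in an organizing role rather than a generating one, which is consistent with the company's stated policy of not using ChatGPT to create or modify data.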

"Our employees leverage ChatGPT to enhance our datasets and insights," the spokesperson said. "We follow and adhere to an AI usage policy and have a robust model validation framework."

Because it does not use ChatGPT to generate information, Elliptic says it is not concerned about AI "hallucinations," instances in which a model produces unexpected or false output that is not supported by real-world facts.

AI chatbots such as ChatGPT have come under fire for confidently presenting false information about people, places, and events. OpenAI has stepped up its efforts to address these so-called hallucinations, including by changing how it trains its models on mathematical problem-solving, calling the work a vital step toward building aligned artificial general intelligence (AGI).

"Our customers come to us to know exactly their risk exposure," Elliptic CTO Jackson Hull said in a statement. "Integrating ChatGPT allows us to scale up our intelligence, giving our customers a view on risk they can't get anywhere else."

