Author: Vince Quill, CoinTelegraph; Translator: Deng Tong, Golden Finance
OpenAI co-founder Ilya Sutskever recently spoke at the Neural Information Processing Systems (NeurIPS) 2024 conference in Vancouver, Canada.
Sutskever said that the growth of computing power through better hardware, software and machine learning algorithms is outpacing the total amount of data available for AI model training. The AI researcher likened data to fossil fuels, which will eventually run out.
"Data is not growing because we only have one internet. You could even say that data is the fossil fuel of AI," Sutskever said.
"It was created in a certain way, and now we use it, and we have reached peak data, and there will be no more data." "- We have to deal with the data we have."
The OpenAI co-founder predicts that agentic AI, synthetic data, and inference-time compute will drive the next evolution of AI, eventually giving rise to AI superintelligence.
Chart comparing computing power and dataset size for AI pre-training. Source: TheAIGRID, Ilya Sutskever
AI Agents Are Taking the Crypto World by Storm
AI agents are expected to surpass current chatbot models by making decisions without human input, and they have become a popular narrative in the crypto space with the rise of AI memecoins and large language models (LLMs) such as Truth Terminal.
Truth Terminal quickly gained popularity after the LLM began promoting a memecoin called Goatseus Maximus (GOAT), which eventually reached a $1 billion market capitalization and attracted the attention of retail investors and venture capitalists.
GOAT token market information. Source: CoinMarketCap
Google’s DeepMind artificial intelligence lab has launched Gemini 2.0, an AI model that will power artificial intelligence agents.
According to Google, agents built on the Gemini 2.0 framework will be able to assist with complex tasks such as coordinating across websites and performing logical reasoning.
Advances in AI agents that can act and reason independently could lay the groundwork for AI to overcome hallucinations.
AI hallucinations arise from flawed datasets and from pre-training's growing reliance on older LLMs to generate training data for new ones, a feedback loop that degrades performance over time.
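This feedback loop, often called "model collapse" in the research literature, can be shown in miniature. The sketch below is a hypothetical illustration, not anything from the article or from any real training pipeline: a categorical "vocabulary" distribution stands in for an LLM, and each generation is retrained only on samples from the previous generation's model.

```python
# Toy illustration of "model collapse": repeatedly re-estimating a
# distribution from data sampled out of the previous estimate.
# Hypothetical sketch; a categorical "vocabulary" stands in for an LLM.
import numpy as np

rng = np.random.default_rng(42)

VOCAB = 1000     # number of distinct "tokens"
SAMPLES = 5000   # size of the synthetic training set drawn each generation

# Generation 0: a long-tailed, Zipf-like "true" distribution.
probs = 1.0 / np.arange(1, VOCAB + 1)
probs /= probs.sum()

for generation in range(15):
    # "Generate" a training set by sampling from the current model...
    sample = rng.choice(VOCAB, size=SAMPLES, p=probs)
    # ...then "train" the next model by re-estimating token frequencies.
    counts = np.bincount(sample, minlength=VOCAB)
    probs = counts / counts.sum()
    # Tokens that were never sampled get probability zero and can never
    # reappear: the tail of the distribution is lost for good.
    alive = int((probs > 0).sum())
    print(f"gen {generation}: {alive}/{VOCAB} tokens still generatable")
```

Running this, the count of still-generatable tokens shrinks with each generation, because a token's estimated probability can hit zero but never recover; it is a toy analogue of new models inheriting and amplifying the blind spots of the older models that produced their training data.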