It is remarkable that @hyperfy_io, the new standard framework for AI-agent 3D modeling endorsed by @shawmakesmagic, was pumped to a market cap of nearly $200M shortly after launch. This is the opening lineup for its honorable induction into the #ai16z family. Why? The logic is actually quite simple:
1) Hyperfy wants to build a "MetaAIverse," an AI-agent metaverse, and the vision is ambitious. If the metaverse that emerged from the last NFT craze was too conceptual, an AI-enabled metaverse may leave far more room for imagination.
That is because the old metaverse was held back by the difficulty of content production: all 3D modeling relied on third-party platforms, development costs were high (turning a creative idea into 3D models, materials, animations, scene composition, and so on), and the experience was poor enough that projects easily collapsed.
The logic of a metaverse in the AI-agent era is completely different: AI automatically generates 3D content, and users can issue commands in real time to drive intelligent scene interactions, which could produce a fully personalized, free-form virtual world without preset, static NPCs.
From a narrative perspective the project genuinely has room to grow, but unfortunately it was pumped to such a high market cap at launch that the price has outrun any judgment based on technical fundamentals, so I cannot offer investment advice. DYOR.
2) In terms of business logic, however, Hyperfy belongs in the "frameworks and standards" category, squaring off directly against $GAME in the #Virtual ecosystem. After all, Virtual attracted a large developer base through its GAME framework. Clearly, Shaw's move to bring Hyperfy into the family is meant to incubate a framework that can rival or even surpass GAME, further deepening ai16z's moat: a strategic play.
Hyperfy's technical architecture is not hard to understand. It builds on Three.js and uses WebGL for GPU-accelerated 3D rendering. AI can generate 3D scenes from text descriptions, adjust scene layout and detail in real time, and even have NPCs perceive user behavior and respond in personalized ways. Most importantly, the entire pipeline renders in real time in the browser, delivering an experience that is complex on the back end but has an extremely low barrier on the front end.
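To make the "text command drives the scene" idea concrete, here is a minimal sketch in plain JavaScript. It is not Hyperfy's actual API (the command grammar, `makeNode`, and `applyCommand` are all hypothetical): it only models how an agent's text command might be parsed into a scene-graph mutation, with the node shape loosely mirroring Three.js's `Object3D` (a `position` plus `children`). Real rendering would hand such nodes to a WebGL renderer.

```javascript
// A minimal scene-graph node, loosely mirroring Three.js's Object3D shape.
function makeNode(type, position) {
  return { type, position, children: [] };
}

// Parse a simple agent command like "add cube at 1 0 2" into a scene mutation.
// The grammar here is invented for illustration only.
function applyCommand(scene, command) {
  const m = command.match(/^add (\w+) at (-?\d+) (-?\d+) (-?\d+)$/);
  if (!m) throw new Error(`unrecognized command: ${command}`);
  const node = makeNode(m[1], { x: +m[2], y: +m[3], z: +m[4] });
  scene.children.push(node);
  return node;
}

const scene = makeNode("scene", { x: 0, y: 0, z: 0 });
applyCommand(scene, "add cube at 1 0 2");
applyCommand(scene, "add light at 0 5 0");
console.log(scene.children.map(n => n.type)); // [ 'cube', 'light' ]
```

The point of the sketch is the division of labor: the AI layer only has to emit structured edits, while the browser-side renderer (Three.js/WebGL in Hyperfy's case) handles the heavy lifting on the GPU.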
To gauge the actual technical maturity, visit Hyperfy's official website and try it yourself. I don't think the current experience is anything special yet, but the room for imagination is enormous.
Shaw has spared no effort promoting it, aiming squarely at Virtual's GAME framework, which directly benefits ai16z's broader commercial IP. (If we anchor against GAME's valuation, it is not surprising the price ran up to this level.)
3) Imagine a future where the AI-agent-driven metaverse matures; the application scenarios are many. For example: users could feed multimodal prompts such as text, sketches, and voice to AI agents and get back 3D models and animations that respond immediately, while computer vision could automatically recognize gestures and even facial expressions to trigger corresponding reactions...
On that basis, AI agents could be used to simulate 3D educational environments, overhaul interactive 3D gaming, build smart exhibition halls and AI virtual conference rooms, and more.
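The multimodal scenario above can be sketched as a simple dispatch problem. This is a hypothetical illustration, not any real Hyperfy or agent API: each modality maps to a handler, and each handler here just returns a stub description of the 3D asset an agent might produce.

```javascript
// Hypothetical modality handlers; the handler names and return shapes are
// invented for illustration. A real pipeline would call generative models.
const handlers = {
  text:   (p) => ({ kind: "model", from: "text", spec: p }),
  sketch: (p) => ({ kind: "model", from: "sketch", spec: `traced:${p}` }),
  voice:  (p) => ({ kind: "animation", from: "voice", spec: `heard:${p}` }),
};

// Dispatch a prompt by modality; unknown modalities are rejected explicitly.
function generateAsset(modality, payload) {
  const handler = handlers[modality];
  if (!handler) throw new Error(`unsupported modality: ${modality}`);
  return handler(payload);
}

console.log(generateAsset("text", "a red pagoda on a hill").kind); // model
console.log(generateAsset("voice", "wave hello").kind);            // animation
```

The design choice worth noting is that every modality converges on one asset format, so the downstream scene renderer never needs to know whether an object came from a sketch or a voice command.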
You see, those beautiful scenarios the market fantasized about during the metaverse wave are finally getting a second life in the AI-agent era, with greater room for imagination and greater feasibility.
How can that not be exciting?