Apple’s Ambitious Plans to Integrate Cameras Into Future Apple Watches
Apple is reportedly working on a significant update for its Apple Watch line, with plans to integrate cameras into both its standard Series and Ultra models.
The company reportedly aims to deliver these upgrades by 2027, ushering in new possibilities for artificial intelligence (AI) features that could transform the functionality of the popular wearable.
Cameras to Revolutionise the Apple Watch's Capabilities
The addition of cameras would allow the Apple Watch to "see" the world around it, opening the door for new AI-driven features.
According to Bloomberg’s Mark Gurman, the camera for the Series version will be embedded within the display itself, similar to the front-facing lens seen on iPhones.
Meanwhile, the Apple Watch Ultra is expected to have a slightly different camera placement, positioned on the side of the device, near the digital crown and button.
Apple Watch Ultra 2 (left) and Series 10 (right). (Source: hwz)
Apple Intelligence and the Role of AI
This move towards camera-equipped devices ties into Apple’s broader push to enhance its AI capabilities.
With the introduction of Apple Intelligence in the iOS 18.1 update in October, the company began rolling out a visual search tool that uses AI to identify objects and locations and surface relevant information for users.
Cameras on the Apple Watch would enable similar features, making the device more aware of its surroundings and more useful in context.
AI Models and In-House Development
While Apple currently utilises AI models from other companies to power these features, Gurman notes that the company is working towards developing its own in-house models by 2027.
This shift will coincide with the expected release of the new Apple Watches and AirPods, which will feature similar AI-powered camera technology.
Camera-Equipped AirPods Are Also in the Works
Apple isn’t stopping with the Apple Watch.
Gurman reported that the company is exploring ways to incorporate cameras into its AirPods as well.
The addition of infrared cameras, expected by 2026, could enable features such as in-air hand gesture detection and improved spatial audio when paired with devices such as the Apple Vision Pro.
These advancements are part of Apple’s long-term strategy to make its products more interconnected, intuitive, and intelligent.
Leadership Behind the AI Push
Mike Rockwell, the former head of Apple’s Vision Pro team, is now leading the development of these AI-driven features, including the much-anticipated Siri LLM upgrade.
Rockwell, who has also overseen visionOS development, is key to Apple’s efforts in making its wearables smarter and more responsive to the user’s environment.
Apple’s move to incorporate cameras and AI features into its wearables shows how serious the company is about pushing the boundaries of what smart devices can do.
As the company works to enhance the intelligence of its products, users can expect a more immersive and functional experience from their Apple Watches and AirPods in the coming years.