
AirPods and Apple Watch cameras can make Apple Intelligence much smarter
A report earlier this month said that Apple is working on embedding cameras into AirPods, followed by another yesterday suggesting that the company also plans an Apple Watch camera.
Apple has been exploring the idea of adding a camera to the Apple Watch for many years, but the latest reports point to a very specific use…
AirPods and Apple Watch Cameras
There's a potential case for adding cameras to the Apple Watch for conventional applications. For example, some runners prefer to leave their iPhone at home and rely on a cellular watch while out on a run, and a camera-equipped watch could be used for impromptu photo ops.
The reports also suggest that Apple Watch cameras could be useful for Jetsons-style FaceTime calls.
But Bloomberg’s Mark Gurman explains that the cameras would instead be used to view the wearer’s surroundings:
Apple is exploring the idea of adding cameras and visual intelligence capabilities to its smartwatches, pushing the company into the AI wearables market (…) The current idea is to place the camera inside the display on the standard Series models, like the iPhone’s front-facing lens. The Ultra takes a different approach, with the camera lens sitting on the side of the watch, near the crown and button.
This ties in with previous reports of AirPods with cameras:
Apple is working on a new version of the AirPods Pro that uses external cameras and artificial intelligence to understand the outside world and provide information to the user. This is essentially a smart glasses play, without the actual glasses.
Visual Intelligence
The first and most obvious application is Visual Intelligence. Currently, you need to pull your iPhone out of your pocket, long-press the Camera Control button, and “show” Apple Intelligence what you’re looking at. With a forward-facing camera embedded in AirPods, it could one day be as simple as a “Hey, Siri” command.
The Apple Watch version might not be quite as seamless, especially if the camera is embedded in the display, but it would still sometimes be more convenient than reaching for an iPhone.
But it could also provide ambient awareness
We’ve already seen the launch of some frankly ridiculous AI hardware, and billion-dollar bets are reportedly being made on further attempts. But while the idea of buying and wearing dedicated AI hardware seems clearly pointless, there’s one element that does make sense: devices we already wear being constantly aware of where we are and what’s in our environment.
This would mean not just activating the camera when we ask a question, but also capturing periodic photos and short video clips to provide Apple Intelligence with context about where we are and what we’re doing.
For example, if my Apple devices can see me stepping into the video games section of the Science Museum, that provides useful context for understanding the questions I’m likely to ask while I’m there. Knowing where I am and what’s around me could quickly make Siri a great deal smarter.
More broadly, Apple Intelligence could learn about my world in general just by grabbing snippets of my daily life.
Of course, always-on cameras raise privacy concerns. It would obviously be unacceptable for Apple devices to grab photos or video clips without our knowledge. However, strong privacy protections could make this viable. For example:
- Ambient awareness would be a strictly opt-in feature
- Processing would happen entirely on-device, so ambient visual data never leaves it
- Photos and video clips would be permanently deleted immediately after processing
- It would be geofenced, disabled at home and optionally at work (see the sketch after this list)
- It would likewise deactivate whenever the device detects we’re approaching a residential building
- A Control Center toggle would allow instant manual disabling
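To give a rough sense of how the geofencing item might work, here’s a minimal Swift sketch using Apple’s real CoreLocation region monitoring. The `AmbientCapture` type and the `ambientOptIn` setting are hypothetical stand-ins for capture machinery Apple hasn’t announced; only the geofence plumbing reflects an actual API.

```swift
import CoreLocation
import Foundation

// Hypothetical stand-in for an ambient-capture pipeline.
// No such API exists publicly; this just logs the state change.
enum AmbientCapture {
    static func setEnabled(_ on: Bool) {
        print("Ambient capture enabled: \(on)")
    }
}

// Monitors a user-defined "private zone" (here, home) and disables
// ambient capture whenever the wearer is inside it.
final class PrivacyZoneMonitor: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()

        // Example 100 m geofence around the user's home address.
        let home = CLCircularRegion(
            center: CLLocationCoordinate2D(latitude: 51.5007, longitude: -0.1246),
            radius: 100,
            identifier: "home"
        )
        home.notifyOnEntry = true
        home.notifyOnExit = true
        manager.startMonitoring(for: home)
    }

    // Entering the private zone: stop ambient capture immediately.
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        AmbientCapture.setEnabled(false)
    }

    // Leaving the zone: re-enable only if the user has opted in.
    func locationManager(_ manager: CLLocationManager, didExitRegion region: CLRegion) {
        let optedIn = UserDefaults.standard.bool(forKey: "ambientOptIn")
        AmbientCapture.setEnabled(optedIn)
    }
}
```

The same pattern would extend naturally to the other bullets: the opt-in check is just a stored preference consulted before any capture, and a work geofence would be one more monitored region.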
Given privacy controls you considered acceptable, would you want this option on your devices? Share your thoughts in the comments.