At the recent Apple Event 2024, Apple announced a new AI-powered feature called “Visual Intelligence,” set to debut on its latest iPhone models, the iPhone 16 and 16 Plus. The feature is part of Apple’s broader suite of AI capabilities, known as Apple Intelligence, and aims to enhance how users interact with their devices through a combination of reverse image search and text recognition.

“Visual Intelligence with the iPhone 16 Camera Control lets you use the camera to learn about things around you. There’s even a Click to Google feature. Okay this is sickkkkkkk #appleevent pic.twitter.com/J1A4cGbjkd” — Ray Wong (@raywongy), September 9, 2024

Visual Intelligence can be accessed through a newly introduced button on the iPhone called Camera Control, located on the right side of the device. Pressing this button activates the feature and lets users perform various tasks based on what the camera captures. For example, if the camera is pointed at a restaurant, the feature will display essential details such as opening hours and customer ratings, along with options to view the menu or make a reservation. Similarly, if a user scans a flyer, the tool can instantly extract relevant details such as the event’s title, time, date, and location, and add them to the calendar.

“Wow this is interesting” — The sage (@ibrahimramsey8), September 10, 2024

A key aspect of the new feature is Apple’s commitment to user privacy. Apple has made it clear that “Apple services will never store your images,” with all data processing occurring directly on the device itself. This approach aligns with Apple’s broader philosophy of prioritizing user data protection and minimizing the need to store data on external servers.

“Google Lens had this functionality for many years already” — Cork King (@0korkle0), September 9, 2024

In collaboration with OpenAI, Apple has further enhanced the functionality of the Camera Control button. Users can now send queries directly to ChatGPT, an advanced language model developed by OpenAI, which can help with tasks ranging from solving homework problems to finding quick answers to complex questions. The integration gives users a more flexible tool for a variety of day-to-day needs.

The rollout of Visual Intelligence is scheduled to begin in October 2024, starting with U.S. English users in a beta phase. Apple plans to expand availability to other countries in December 2024 and early 2025. By combining image recognition with AI-driven text capabilities, Apple aims to make everyday tasks simpler and more accessible for its users. The introduction of Visual Intelligence marks an important step in Apple’s continued push to integrate advanced AI tools into its devices while maintaining a strong focus on user privacy and data security.

More iPhone Updates:

10-September-2024: Apple’s New Intelligence Upgrade Promises a Better Siri Experience!
09-September-2024: ‘Glowtime’ 2024: Apple to Launch iPhone 16, Apple Intelligence, and More!
09-September-2024: iPhone Gets a Major AI Upgrade with Arm’s Chip Tech, FT Reports!
09-September-2024: Apple Glowtime Event Live: iPhone 16, Apple Watch 10, AirPods 4, and More Launching Today!
iPhone Users Get AI-Powered Visual Search with Latest Update!

For more news and trends, visit AI News on our website.