iPhone Users Get AI-Powered Visual Search with Latest Update!

  • Editor
  • September 10, 2024 (Updated)

Key Takeaways:

  • Apple has introduced a new feature called “Visual Intelligence” on its iPhone 16 models, combining reverse image search with text recognition capabilities.
  • The feature emphasizes user privacy, with all data processing conducted directly on the device, ensuring that images or data are not stored on Apple servers.
  • In collaboration with OpenAI, Apple has enhanced Visual Intelligence to allow users to send queries to ChatGPT, making the feature versatile for tasks such as solving homework problems or gathering information.
  • The feature will be available in beta starting October 2024 for U.S. English language users, with a global rollout expected by early 2025.

At the recent Apple Event 2024, Apple announced a new AI-powered feature called “Visual Intelligence,” set to debut on its latest iPhone models, the iPhone 16 and 16 Plus.

This feature is part of Apple’s broader suite of AI capabilities known as Apple Intelligence and aims to enhance how users interact with their devices using a combination of reverse image search and text recognition.


Visual Intelligence can be accessed through a newly introduced button on the iPhone called Camera Control, located on the right side of the device.

Users can activate Visual Intelligence by pressing this button, allowing them to perform various tasks based on what the camera captures.

For example, if the camera is pointed at a restaurant, the feature will display essential details such as opening hours and customer ratings, along with options to view the menu or make a reservation.


Similarly, if a user scans a flyer, the tool can instantly extract and add relevant details like the event’s title, time, date, and location to their calendar.

A key aspect of this new feature is Apple’s commitment to user privacy. Apple has made it clear that “Apple services will never store your images,” ensuring that all data processing occurs directly on the device itself.

This approach aligns with Apple’s broader philosophy of prioritizing user data protection and minimizing the need for data storage on external servers.


In collaboration with OpenAI, Apple has further enhanced the functionality of the Camera Control button.

Users can now send queries directly to ChatGPT, an advanced language model developed by OpenAI, which can help with tasks ranging from solving homework problems to finding quick answers to complex questions.

This integration with ChatGPT offers users a more flexible tool for various day-to-day needs.


The rollout of Visual Intelligence is scheduled to begin in October 2024, starting with U.S. English language users in a beta phase. Apple plans to expand availability to other countries beginning in December 2024, with a broader rollout continuing into early 2025.

By combining image recognition and AI-driven text capabilities, Apple aims to make everyday tasks simpler and more accessible for its users.


The introduction of Visual Intelligence marks an important step in Apple’s continued push to integrate advanced AI tools into its devices while maintaining a strong focus on user privacy and data security.


For more news and trends, visit AI News on our website.


Dave Andre

Editor

Digital marketing enthusiast by day, nature wanderer by dusk. Dave Andre blends two decades of AI and SaaS expertise into impactful strategies for SMEs. His weekends? Lost in books on tech trends and rejuvenating on scenic trails.
