Apple’s Google Lens-Like Visual Intelligence Rolls Out to iPhone 15 Pro



Apple is set to expand its Visual Intelligence feature to the iPhone 15 Pro and iPhone 15 Pro Max with the upcoming iOS 18.4 update. Previously exclusive to the iPhone 16 lineup, this AI-powered tool functions similarly to Google Lens, allowing users to analyze objects, text, and scenes through their camera for real-time information.

According to 9to5Mac, Apple had previously confirmed to Daring Fireball that the feature would be coming to the iPhone 15 Pro models but had not specified when. Now, with iOS 18.4 in developer beta, it is clear that the rollout is set for April, barring any last-minute changes.

Originally introduced with the iPhone 16 lineup in September 2024, Visual Intelligence was accessible through the Camera Control button. However, since the iPhone 15 Pro and Pro Max lack this button, users will instead activate the feature using the Action Button or via Control Center—similar to how the functionality works on the newly launched iPhone 16e.

The latest iOS 18.4 developer beta also extends Action Button and Control Center access for Visual Intelligence to the entire iPhone 16 lineup, giving those users multiple ways to activate the feature as well.

This update marks Apple’s continued investment in AI-powered functionality within iOS, providing iPhone users with a more intuitive way to interact with their environment. With the official rollout of iOS 18.4 expected in April, iPhone 15 Pro owners will soon gain access to this powerful feature without needing to upgrade to a newer device.

Source: 9to5Mac
