Apple and Google's search partnership takes a new turn: the iPhone 16 camera gets direct access to Google search
At present, Google's parent company Alphabet pays Apple about $20 billion a year to keep Google as the default search engine in the Safari browser. Analysts say Apple's new "Visual Intelligence" feature introduces a new mode of interacting with software and services: users can communicate with an AI assistant through a new interface, or by sending text, rather than relying on App Store apps to complete tasks.
With Apple unveiling the "Visual Intelligence" feature at Monday's event, analysts believe the search partnership between Apple and Google is entering a new phase.
Currently, Alphabet, Google's parent company, pays Apple about $20 billion annually to make Google the default search engine in its Safari browser. However, a U.S. court has ruled that this arrangement violates antitrust law, in the largest U.S. antitrust case in more than two decades, and Google's decades-long dominance of the search market could be upended. For Apple, as Wall Street News previously reported, losing this revenue would cut its pre-tax profit by about 15%.
Now, however, the newly announced feature appears to let Apple "upgrade" its cooperation with Google: iPhone 16 users can reach Google's search engine and its visual search function by pressing the device's new Camera Control button.
Apple explains that the Camera Control button not only lets users quickly take photos or record videos, but also lets them adjust settings such as zoom, exposure, or depth of field by sliding along the button.
The button also serves as the entry point to Apple's new "Visual Intelligence" search feature, which is central to the cooperation with Google.
At first, the Camera Control button seemed like little more than a new name for a shutter button, but as the event went on, Apple explained more of what the hardware can do. Through Visual Intelligence, users can not only get information about whatever the camera sees, but also reach third-party services without launching separate apps.
Some analysts see the feature as essentially similar to Google Lens or Pinterest Lens; Apple describes Visual Intelligence as a way to instantly learn about what you see. In its examples, Apple showed how a user can press the Camera Control button to pull up the opening hours of a restaurant spotted in the city, or identify the breed of a dog seen on a walk. The feature can also turn an event poster on a wall into a calendar entry, complete with all the details.
Apple's Senior Vice President of Software Engineering, Craig Federighi, then noted that the feature can also be used to reach Google search. In the demonstration, when a user points the iPhone at a bicycle and presses the Camera Control button, a window with purchase options pops up on the screen. A "More Google Results" button indicates that tapping again continues the search on Google.

![](https://wpimg-wscn.awtmt.com/2caf9b9c-5f5b-42a0-9de2-213a2e4e6e96.jpeg)
However, Apple did not explain when or how a press of the Camera Control button would hand off to a third-party partner rather than Apple's built-in services to provide answers, nor did it detail how users can control or configure the feature. Federighi said only, vaguely: "Of course, you can always control when to use third-party tools."
A Google spokesperson said there is no further information to share at this time, but the feature is believed to be part of the two companies' existing partnership and unrelated to Google's Gemini AI.
Analysts suggest that what makes the feature interesting is that it introduces a new way of interacting with software and services, one that goes beyond what ships on the iPhone. As artificial intelligence advances, users can interact with AI assistants through new interfaces, or by sending text, rather than relying on App Store apps to complete tasks. By partnering with third-party services, Apple stays close to emerging technologies while avoiding direct competition with AI tools like ChatGPT.
At the event, OpenAI's ChatGPT was also showcased as a third-party partner, accessible through Siri. In the demonstration, a user only needed to point the phone at class notes and press a button to get help with concepts or problems.