I/O 2025: Google Search now talks to you and does everything for you
Google keeps improving its AI to change the way its users search the internet
Google is determined to completely change the way we search the internet. It made that clear during Google I/O 2025, where the company officially presented AI Overviews and the so-called AI Mode, a feature that transforms the classic list of blue links into something much closer to a conversation with an intelligent assistant.
This change marks a turning point for the world's most-used search engine. Instead of showing you ten blue links, Google will talk to you, explain things to you and, if all goes well, resolve your doubts in real time, as if you were chatting.
AI Mode: this is Google's conversational search, and it's here to stay
The star of the conference was AI Mode, a conversational system that is already rolling out in the United States and will later reach other countries. This new mode doesn't completely replace the traditional search engine, but it does become the default search experience whenever the system detects that you're asking a complex or very specific question.
For example, if you ask something like "What is the best destination for a family vacation with small children on a tight budget?", instead of receiving a list of links to blogs and travel agencies, Google will answer with a direct, personalized explanation that takes multiple contextual factors into account.
The interesting part is that it doesn't just give you written answers: the system will guide you with follow-up questions and suggestions, and can even carry out concrete actions such as searching for flights, checking your calendar or finding relevant information in your Gmail inbox.
Deep Search and Search Live: new tools to go even further
Beyond the conversational mode, Google also presented other tools designed to further improve the search experience. One of them is Deep Search, a feature aimed at those who need to dig deep into a topic. Imagine you're researching the effects of climate change on coastal cities: Deep Search can help you generate a structured report, with citations and relevant links, all based on verified sources.
Then there's Search Live, a tool that relies on your phone's camera and augmented reality to give you real-time information about whatever you're looking at. Whether it's a plant you don't recognize, a sign in another language or a garment you'd like to buy, just point the camera and let the AI do its job. It's something we had already seen in Google Lens, but now powered by Gemini 1.5 Flash, the new lightweight, fast AI model that debuted at I/O 2025.
Gemini, Imagen and Veo: the event's other big announcements
Although AI Mode drew most of the attention, Google used the stage to show how its AI ecosystem is expanding in multiple directions. Gemini 1.5 Pro, for example, can now read long documents of up to 1.5 million tokens, which lets it analyze entire PDFs, lengthy email threads or even complex stretches of code.
We also met Imagen 3, the new version of its image-generation model, which can now produce sharper, more coherent and more realistic images. And for those who work with video, Veo arrives: a system capable of generating high-resolution clips with greater creative control, ideal for content creators and advertisers.
Google also confirmed that its Gemini assistant will be much more deeply integrated into Android 15, enabling complex actions such as editing videos, writing emails or planning trips directly from the device, without switching apps.
All of this makes it clear that Google no longer wants us merely to search the internet: it wants us to interact with the web as if we were talking to a real person. The bet is ambitious, and it could redefine not only how we use search engines, but also how we access knowledge.

