According to Google, more than 250 million cars on the road today support Android Auto, and over 50 models now ship with Google built in to their infotainment systems. Those vehicles are about to get smarter as Gemini comes to Android Auto.
Gemini on Android Auto will work much as it does elsewhere, letting you hold natural conversations and ask for insights, suggestions, and more.
Here’s a look at a few of the ways you can use Gemini while driving with Android Auto.
1. Discover the ideal dining spot
When hunger strikes on the road, Gemini can help you find what you need. Just ask Gemini to “locate excellent burger restaurants along the route,” and it will suggest places known for their burgers. You can ask for insights from reviews or typical questions, such as operating hours. Once you’ve made your choice, Gemini can connect with Google Maps to help you get there.
2. Get ready for an important discussion
If an important conversation or call is coming up, you can use Gemini’s conversational abilities to prepare without taking your hands off the wheel. Just say, “Hey Google, let’s talk,” to get started. Ask Gemini something like, “I need to discuss a promotion with my boss. What approach should I take?” and it will offer strategic advice on what to say.
This isn’t limited to work; you can seek Gemini’s help for any type of conversation, whether it’s how to handle a gentle breakup, resolve a disagreement with a friend, or discuss financial matters with a partner.
3. Translate messages
Messaging someone who speaks a different language can be tricky, especially while driving. You can instruct Gemini to always send messages to a specific contact in a chosen language (for instance, “always message John in Spanish”). The next time you text that person, Gemini will compose the message in that language, translating for meaning rather than word for word so the context comes through clearly.
4. Search through Gmail
Can’t recall where your child’s soccer game is this week? You can ask Gemini to locate that information for you. If the details are buried in a pile of emails, just tell Gemini to find it, and it will retrieve the pertinent information.
5. Check your calendar
Need to walk into work well-informed? Ask Gemini to share what’s scheduled on your Google Calendar.
If Gemini isn’t your preference, Android Auto is also getting several updates of its own. Google says the app catalog is expanding to new categories such as games and video, and the digital car key feature is rolling out to more brands, including Audi, Volvo, and Polestar.
Gemini will roll out to cars that support Android Auto over the coming months, and to cars with Google built in, such as the Lincoln Nautilus, Renault R5, and Honda Passport, later this year.
Google I/O, the company’s annual developer event where it showcases its newest features, is just a few weeks away. Google is providing a sneak peek of I/O announcements with the launch of its latest AI model.
Gemini 2.5 Pro Preview (I/O edition)
On Tuesday, Google unveiled early access to Gemini 2.5 Pro Preview (I/O edition), a version that significantly enhances coding capabilities, particularly in developing interactive web applications, according to the announcement. This update also aims to improve the model’s performance in various coding tasks, such as code transformation, editing, and creating agentic workflows.
Originally slated for Google I/O, the model update was released early on the strength of positive feedback so developers could start using it right away.
These updates put Gemini 2.5 Pro at the top of the WebDev Arena Leaderboard, with an Elo score 147 points higher than its predecessor’s. It also scored 84.8% on the VideoMME video understanding benchmark.
Users can access the updated Gemini 2.5 Pro through the Gemini API for developers in Google AI Studio and Vertex AI. Members of the general public can also access the updated model in the Gemini app, where it enhances features like Canvas.
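For developers curious what that access looks like in practice, here is a minimal sketch of calling the updated model through the Gemini API with the google-generativeai Python SDK. The model identifier and prompt below are assumptions for illustration; check Google AI Studio or Vertex AI for the exact preview model name available to your account.

```python
# Minimal sketch: querying the updated Gemini 2.5 Pro preview via the Gemini API.
# The model ID below is an assumption -- confirm the current preview name
# in Google AI Studio before running.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # API key generated in Google AI Studio

model = genai.GenerativeModel("gemini-2.5-pro-preview-05-06")  # assumed preview ID
response = model.generate_content(
    "Generate a single-file HTML/JS page with a draggable, sortable to-do list."
)
print(response.text)  # the generated web app code comes back as plain text
```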
Google I/O is scheduled for May 20 to May 21 at the Shoreline Amphitheatre in Mountain View, California. The keynote session, featuring the majority of announcements regarding Google’s latest hardware and software, will occur on Day One at 10 a.m. PT / 1 p.m. ET.
Developers can sign up for the event’s free digital experience on the Google I/O landing page, and anyone can watch the livestream at no charge. ZDNET will also cover the event live, posting updates as news unfolds.
Today, Google enhanced AI Mode, making it more suitable for planning your summer getaway, shopping, or exploring various topics. Additionally, if you’re looking to try out the tool, it has become much more accessible.
Google’s updated AI Mode merges Search’s real-time information access with advanced generative AI features – including longer natural queries and tailored responses – to deliver an AI-driven search engine experience. This is particularly useful for complex questions involving multiple criteria, such as shopping and travel planning.
“We are truly empowering you to shop and discover local businesses, as well as to organize your travel plans more effectively,” Soufi Esmaeilzadeh, director of Search product management, told ZDNET. “We’re enhancing AI Mode by tying it to the rich content within our shopping graph and our extensive database of local businesses.”
AI Mode could already be used to learn about topics such as products and travel destinations. The latest update adds visual place and product cards that you can click for more details, making it better suited to shopping and trip planning.
Unlike AI agents, which can book a trip for you directly, AI Mode acts as a research companion, helping you gather the information you need to finalize your plans. The same goes for shopping.
When searching for a specific location like a restaurant, you will receive details such as ratings, reviews, and operating hours, while product searches reveal available shipping options along with current prices, images, local stock, and more. All this information is pulled fresh from the web, ensuring you get the latest details.
“If you pose a query such as, ‘I’m searching for a midcentury modern furniture store where I want to buy a dresser with these specific features,’ the model can consider that and provide responses, identify the best businesses for you, and then give you access to the wealth of information we’ve spent years compiling about those businesses,” Esmaeilzadeh explained.
AI Mode will also introduce a new left-side panel displaying your previous searches. This functionality enables you to refer back to these searches later or resume the conversation with follow-up inquiries.
“In AI Mode, users genuinely engage in follow-ups, often approaching it as part of longer research journeys, so we want to make it convenient for them to refer back to their previous work and continue from where they left off,” Esmaeilzadeh noted.
That may seem like a small feature, but imagine never being able to see your Google Search history. Being able to revisit past AI Mode conversations should prove a similarly valuable productivity tool.
Until now, trying AI Mode meant visiting Google Labs and joining a waitlist, but that has changed: US users can now get immediate access. Just head to Google Labs to get started.
User feedback on AI Mode has been favorable, with many reporting it to be beneficial, according to Google. Consequently, the company is broadening accessibility through a limited test outside of Labs. A small percentage of US Google Search users will notice the AI Mode tab on the Search page. Google plans to continue utilizing user feedback to improve the model.
Google adopted a similar phased rollout strategy when it released AI Overviews, the feature that adds an AI-generated summary at the top of your search results page. This feature eventually transitioned out of Labs and became a standard component of users’ search results.