Hey Google, What Happened at I/O’19?

Merve Başak · Published in Hey Google! · 5 min read · May 13, 2019


Google I/O’19 took place on May 7–9 this year. I couldn’t attend, but I was very excited about Google Assistant’s new announcements. If you would like to learn more about Google I/O’19, you can check out the official website. In this article, I will cover Google Assistant’s new announcements.

Google I/O’19 Keynote

Google Duplex on the Web

Last year, Google announced Google Duplex, an automated assistant that makes phone calls to complete tasks on your behalf, such as booking restaurant reservations and hair salon appointments.

This year, Google announced that it’s bringing Google Duplex to the web and extending it beyond voice. While Duplex on the phone handles phone calls on your behalf, Duplex on the web will handle web browsing and form-filling. Google Duplex uses your Google profile, Gmail, Calendar, and other sources of information to complete online transactions for you, and Duplex on the web runs on top of any site without requiring any action from your developers or site admins. For now, Google Duplex is only active in 43 US states.

Next Generation Assistant

Google Assistant gets rapid response:

Google has dramatically increased the speed of the AI that powers the Assistant and moved it on-device rather than keeping it in the cloud. Models that used to take 100GB of space in the cloud have been shrunk down to 0.5GB, small enough to fit on a phone, reducing network latency to the point where the response to a query is up to 10x faster.

Continued conversation:

It isn’t very pleasant to have to keep alerting the Assistant with wake words like ‘Hey Google’ or ‘OK Google’ every time you interact with it.

‘Continued Conversation’ is the ability to make several requests at a time, or in succession, without having to say ‘Hey Google’ each time.

Multitask across apps:

Now you will be able to hop from your messages into your photos, filter your photos, and send one to a friend, all with three voice commands.

Complex Task:

We will be able to compose and send a complete email without touching the screen. From now on, Google Assistant can perform complex tasks like this.

Interactive Canvas

Google introduced a developer preview of Interactive Canvas, which lets you create full-screen experiences that combine the power of voice, visuals, and touch. Canvas works across Smart Displays and Android phones, and it uses open web technologies you’re likely already familiar with, like HTML, CSS, and JavaScript.
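If you’re curious what the web side of a Canvas experience looks like, here is a minimal sketch based on my reading of the developer preview. It assumes the page has already loaded Google’s Interactive Canvas client library (which exposes a global interactiveCanvas object) and contains a div with id "message"; the payload fields are hypothetical.

```javascript
// Sketch of the web app side of an Interactive Canvas Action.
// Assumes the Interactive Canvas client library is loaded via a <script>
// tag and exposes the global `interactiveCanvas` object.

const callbacks = {
  // Called whenever your Action's fulfillment pushes new data to the canvas.
  onUpdate(data) {
    // `data` is whatever payload your fulfillment sends;
    // the `text` field here is just a hypothetical example.
    document.getElementById('message').textContent =
      (data && data.text) || JSON.stringify(data);
  },
};

// Tell the Assistant the web app is ready to receive updates.
interactiveCanvas.ready(callbacks);

// The page can also talk back to the conversation, for example
// when the user taps something on a Smart Display.
document.body.addEventListener('click', () => {
  interactiveCanvas.sendTextQuery('next');
});
```

On the fulfillment side, your conversational webhook returns a response that points at the URL of this web app and carries the data that gets passed to onUpdate.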

Personalized Help

Picks for you:

The Assistant now knows you better. For example, if you ask for a dinner recommendation, the Assistant can now make suggestions based on your tastes.

Personal Reference:

With Personal References, we will be able to ask for personal things like ‘directions to mum’s house’, and the Assistant will know that you’re talking about your actual mother, as opposed to the cafe down the street with the same name. It uses something called ‘reference resolution’ to understand the difference between the two.

Smart Home

Google introduced a preview of the Local Home SDK, which enables you to run your smart home code locally on Google Home speakers and Nest displays and use their radios to communicate locally with your smart devices. Google has been working with some amazing partners, including Philips, Wemo, TP-Link, and LIFX, on testing this SDK and plans to open it up to all developers next month.
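To give a rough idea of what this looks like for developers, here is a minimal sketch of a local fulfillment app written against my understanding of the Local Home SDK preview. The smarthome namespace is provided by the SDK runtime on the Google Home / Nest device, and the exact handler and response shapes below should be treated as assumptions rather than the final API.

```javascript
// Minimal sketch of a Local Home SDK fulfillment app.
// The `smarthome` namespace is provided by the SDK runtime on the
// Google Home / Nest device; device ids and response shapes below
// are simplified, hypothetical examples.

// IDENTIFY: match a device discovered by local scanning (mDNS, UDP
// broadcast, etc.) to a device the user already has in their HomeGraph.
const identifyHandler = (request) => ({
  requestId: request.requestId,
  payload: {
    device: {
      id: 'local-device-1',             // hypothetical device id
      verificationId: 'local-device-1', // must match what your cloud SYNC reports
    },
  },
});

// EXECUTE: fulfill commands such as OnOff locally, over the speaker's
// or display's own radios, instead of routing them through the cloud.
const executeHandler = (request) => {
  // ...send the vendor-specific command over the local network here...
  return {
    requestId: request.requestId,
    payload: {
      commands: [{
        ids: ['local-device-1'],
        status: 'SUCCESS',
        states: { on: true, online: true },
      }],
    },
  };
};

// Register the handlers and start listening for local intents.
const app = new smarthome.App('1.0.0');
app.onIdentify(identifyHandler)
   .onExecute(executeHandler)
   .listen()
   .then(() => console.log('Local home app is ready'));
```

The appeal of running this locally is latency: commands go straight from the speaker or display to the bulb or plug on the same network, with the cloud path kept as a fallback.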

Stop Alarms without Saying ‘Hey Google’

From now on, you can stop your alarms and timers just by saying ‘Stop’, no ‘Hey Google’ needed.

App Actions

If I wanted to track my run with Nike Run Club, I could just say “Hey Google, start my run in Nike Run Club” and the app would automatically start tracking my run. Or, let’s say I just finished dinner with my friend Chad and we’re splitting the check: I can say “Hey Google, send $15 to Chad on PayPal” and the Assistant takes me right into PayPal, I log in, and all of my information is filled in; all I need to do is hit send.

Driving Mode

Last year, Google brought the Assistant to Android Auto and included it in Maps. Now, Assistant will have a ‘driving mode’, activated by saying ‘Hey Google, let’s drive’ on Android.

Driving mode lets you do the kinds of things you’d expect, such as getting directions and playing music, but it also has some nice touches, such as letting you continue playing podcasts from where you left off before you got into the car.

In addition to this, a new codelab is ready; you can check it out 👇

There were lots of sessions about Actions on Google at Google I/O’19. If you would like to watch them, you can visit these links:

I would like to share two of my favorite sessions:

Thanks for reading!

Also, I have a GitHub repository for my Google Assistant series. I will share code, links, and posts there. 👇
