This was a good keynote and other tech companies should study it and copy it. Goodbye!

Thanks for reading, everybody! Much more to come!

Alright, well, with that, we out!

It’s about “building products for everyone, products that matter.” Product focus is interesting! It’s not quite the usual thing Google focuses on.

"We know that to truly build for everybody, we need lots of perspectives in the mix," Fitzpatrick says.

We’re wrapping up shortly - with some talk about Google’s values.

Jen Fitzpatrick is coming back onstage to close out the portion on AI.

They’re testing their car in snow now, arguing that machine learning lets them see through all the sensor noise.

One good example of this is driving in bad weather. Doing so in the real world could be dangerous, but doing so in simulation can help the cars learn in a safe environment.

Waymo is running a constant simulation with 25k cars in it. At the end of this keynote, we will learn that we are all living in a simulation, as well.

Waymo uses simulation to train its self-driving cars, and the systems have driven more than 5 billion miles in simulation.

Vanessa Carlton's "A Thousand Miles" is suddenly so much less romantic if those miles are self-driven in the future.

It would be hilarious if they showed an Uber self-driving car run a red light.

Waymo's cars have driven more than 6 million miles on public roads to test and train its systems.

And people don't always act the same in every traffic scenario. That's apparently where AI can really help, by building an understanding of human behaviors.

All the new features coming to Google Assistant, including manners

That helps with people detection. But Waymo says it needs to predict human beings, both pedestrians and drivers.

Waymo cars can detect pedestrians in dinosaur costumes. The future has arrived!

We're seeing virtual representations of what Waymo cars see in real time, thanks to LIDAR systems and sensors.

I think this keynote is going to last Waymo minutes.

We're now getting a dive into two important self-driving car features: perception and prediction.

Google Lens is moving into the camera app and gaining new features

Deep learning and other techniques are what will help Waymo achieve Level 5 autonomy, which is the far-off dream of self-driving advocates. It means the car can drive itself in every situation.

Krafcik now talking about deep learning, a subset of machine learning, which he says helped reduce Waymo's error rate for detecting pedestrians by 100x.

Points for matching shirts.

Waymo intends to partner with a lot of different companies, and not just one car company.

"We're not just building a better car. We're building a better driver," Krafcik says.

Nobody in the driver’s seat, either. Huh.

Phoenix is launching Waymo later this year.

"It's not about science fiction. When we talk about building self-driving technology, these are the people we're building it for," Krafcik says.

This cheerful video contrasts sharply with some of the other news still coming out of Uber's accident a few months ago.

That was a cute video. Now show the footage of a human driver spinning out into a Waymo vehicle.

This is part of the Waymo Early Rider program, which is basically aimed at making human beings more comfortable with self-driving cars.

Everyone in this audience is praying they're about to get a free self-driving car.

We're seeing a lot of people on their phones and even yawning. Some are taking selfies while the car drives itself.

We're getting a video now featuring Waymo self-driving demos with real riders.

(BTW: pretty much everybody onstage has been wearing Android Wear watches. I’m sure that they totally do that when they’re not onstage.)

Oh wow, Krafcik name dropped Alphabet at I/O. That's a... first maybe?

Krafcik is waxing nostalgic about the company's first self-driving tests in the parking lots right outside Shoreline, which is where the main Googleplex offices are located.

This is a GOOGLE event. Who is letting those scrubs from WAYMO here? They’re obviously completely different companies and nobody is confused at all about the relationship between Google and other Alphabet companies.

Do NOT point Google Lens at a poster of Charlie Puth. It will begin playing music by Charlie Puth.

Ooo now we're onto self-driving cars with Waymo CEO John Krafcik.

“Over time” they want to overlay live results on the image. “Over time” is a specialized technical term in tech. It means “Lol don’t hold your breath.” The rest is coming “in the next few weeks.”

Some of these features are coming to Google Lens in the next few weeks.

I bought Swing Time but didn't read it.

Google wants to overlay the live results directly on top of items like street signs and concert posters automatically, so you can get a relevant YouTube video to pop up when you scan it.

I get the feeling I'll overlay Lens on my "cooking" and it'll just say "Golly, IDK what that is"

There's been a nice, subtle mix today of features that Google can make money from, in addition to services that people will just enjoy for free. Buying tickets from search results, and searching for similar shoppable items using Lens, are two good examples.

Lens working in real time is impressive.

Now we're talking about real-time results for Lens, so you don't have to take the photo. Lens will automatically populate the viewfinder with information.

This is basically what Pinterest's entire future business model revolves around.

Golly, Lens will help you buy stuff. Somehow I doubt it’ll work with Amazon, though.

Aparna now talking about Style Match, which will let Google Lens tell you about similar styles of clothing and furniture.

Good, usable OCR is really tough. I hope it works here!

Google News is getting an overhaul and customized news feeds

Google Lens will also bring up photos of food dishes when you pan the phone camera over a restaurant menu.

Oh man, copy and paste from Lens is a big deal for me. Imagine using this as a student.

This will let you copy and paste from an image of text on a book into a message. That is... really wild.

New features for Lens now. Smart text selection will help the phone camera understand words.

That's big for Lens, which, prior to this, was stuck in the Assistant app.

Sorry, Bixby.

Wow, a ton of manufacturers are putting Lens in their camera apps. Like all the majors except Samsung.

Starting next week, Google Lens will be integrated right into the camera app of Google Pixel devices, LG G7, and other Android phones.

Google Lens using image and object recognition to identify stuff in photos: landmarks, food, etc.

This Lens + Maps thing would save my mom. I literally have to take Street View photos of landmarks for her every time she needs directions!

Okay, now we're onto Google Lens, which is the technology underlying the AR overlays in Street View.

I would like to publicly apologize to Natt for draining her spare battery.

This is the walking map feature I have wanted my entire life. Brilliant.

Street View will now call out locations with on-screen overlays. Google is even working on including a digital fox to help guide you.

It combines with Street View to label stuff on the street. This has been an AI fantasy for a while. Is it real now or is this just a demo?

Okay, it’s not Lens. It’s cooler. (Applause.) The camera shows you what direction you’re looking.

Google will now combine AI with Street View on Google Maps.

This is really where all of Google's products come together: AI, Maps, Google Photos — all of it basically comes together to help you find stuff through the smartphone camera.

What if you punch in your work address on Google Maps and the score comes back as a 31 and Google suggests you go to grad school?

Will you be able to promote your Google business location for a sponsored "match"? Scandal!!

Camera time - bet it’s going to be Google Lens.

Now Aparna Chennapragada is onstage.

It will be interesting to see whether Google Maps actually suggests that you not go to places — algorithmic scores and promoting local businesses can be in tension. Does this tide lift all boats, or does it just lengthen brunch lines at four restaurants in San Francisco?

Soon, you'll be able to get updates from small businesses and book appointments and make orders from within Google Maps' For You tab.

Dieter keeps making me pick where we're eating dinner. Not sure this will help because he will just say EHHEHEH.

Fitzpatrick now talking about helping local businesses. She says Google Maps enables more than 9 billion connections to businesses, 1 billion phone calls, and 3 billion direction requests.

Okay, voting on places in Maps with friends is DOPE. If Google can actually solve “I dunno what do you want to do,” it will take over the damn world.

These features are all coming to iOS and Android versions of Google Maps this summer, Fitzpatrick says.

The shortlist feature will also feature real-time voting.

Some other new features here. You can long-press on a place to add it to a shortlist, then share that list with your friends so they can weigh in on where to eat.

Thanks, Google. I’m fat now.

So, Google is going to recommend places sort of like how Spotify recommends music, based on preferences. But, like, I will score 100 percent for ice cream shops, so... Google is trying to make me fat.

Basically my dream dating app.

This is OkCupid for cheeseburgers. And I am here for it.

Android P launches today in public beta

Now, Google Maps will have a new score called Your Match that is powered by AI to find you places that you might like based on your preferences.
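
Google hasn't said what goes into the Your Match number, so here's a toy sketch of the general idea: preference similarity, scaled to 0-100. All the tags and weights below are hypothetical, and this is emphatically not Google's actual model:

```python
# Toy "match score": cosine similarity between a user's tastes and a
# venue's tags, scaled to 0-100. Hypothetical tags and weights; not
# Google's actual model.
import math

def match_score(user_prefs, venue_tags):
    keys = set(user_prefs) | set(venue_tags)
    u = [user_prefs.get(k, 0.0) for k in keys]
    v = [venue_tags.get(k, 0.0) for k in keys]
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return round(100 * dot / norm) if norm else 0

print(match_score({"ice cream": 1.0, "pizza": 0.5}, {"ice cream": 1.0}))  # 89
```

The nice property of a similarity score like this is that an ice cream obsessive and a pizza purist get different numbers for the same shop, which matches how Google is pitching the feature.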

So both Google News and Google Maps have a tab called “For You.” I look forward to seeing that tab in every other Google app next year.

A new feature called For You will feature recommendations, trending listings, and other interesting locations.

So Google has been working on a new version of Maps designed specifically to help people discover new things.

"We want to make it easy for you to explore and experience more of what the world has to offer," she says.

Okay, you can type complex searches into Maps, like “restaurants open now near me that serve pizza.”

Fitzpatrick says people are turning to Maps for discoverability and finding new stuff.

Motorbike routes separate from car routes is clever. I wonder how that works.

A few "Woos" from the crowd at the mention of Lagos, Nigeria!

Google is now able to add new addresses, businesses, and other locations directly to the map using AI and satellite imagery. This is big for developing countries.

Fitzpatrick says Google has mapped over 220 countries and territories. And given "more than 1 billion people the chance to travel the world."

Jen Fitzpatrick is onstage now talking Google Maps.

CASEY

Casey.

I can't believe they didn't play the P tape.

If you've really gotta P, you can do it today.

That’s a big deal. It might be a good sign that updates are actually going to happen for more phones more often (No Samsung or LG on that list though. Of course.)

We're now moving to Google Maps with a dramatic intro video.

HEY! The beta is coming to seven devices TODAY!

Samat says Android P Beta, a new developer preview of the OS, is available today for Google Pixel and seven more devices.

Unfortunate to see Google appropriating "greyscale" from Game of Thrones. It's disrespectful to Shireen Baratheon.

"Digital well-being is going to be a long-term theme for us," Samat says.

"All the colors return in the morning when you wake up," Samat says.

(Wind down lets you go to grayscale when you’re supposed to go to bed.)

Samat now talking about Wind Down mode, which turns the whole phone grayscale at night.

Grayscale is the new black.

CODENAME: SHUSH

Setting up contacts that can break through Do Not Disturb is smart, but I wish it were smart enough to allow more than phone calls. Some texts, e.g. from Signal, are important.

You can, of course, set what are called Starred Contacts to make sure you don't miss a critical call from an important person.

There's now a new gesture called Shush that puts the phone in Do Not Disturb mode when you flip it over on a table.

I would passive aggressively put DND on some of my family and friends' phones during dinner, but... they don't use Androids. Welp.

For Do Not Disturb, Samat says Google is making improvements to silence not just phone calls and texts, but also visual interruptions.

What is going to happen when we all find out how much YouTube we watch?

The app icon will also be grayed out to remind you of the limit you set.

I am setting a 15-minute limit on Twitter ASAP. (JK not JK)

Android P will also give you hard controls to manage your time on your phone. You'll be able to set per-app time limits, for instance.
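
Conceptually, an app timer is just a usage counter compared against the limit you set. A toy sketch, with hypothetical names and strings (Android's real enforcement lives in the OS, not app code):

```python
# Toy per-app timer: once usage hits the limit you set, the app is
# "grayed out." Hypothetical function and labels, for illustration only.
def app_state(used_minutes, limit_minutes):
    if used_minutes >= limit_minutes:
        return "grayed out"
    return f"{limit_minutes - used_minutes} min left"

print(app_state(10, 15))  # 5 min left
print(app_state(15, 15))  # grayed out
```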

You'll get time watched metrics for YouTube, for instance.

Now, developers can link to more detailed breakdowns of app usage through the Dashboard.

"It's like watching TV. Catching up on your favorite shows after a long day can feel pretty good. But watching an infomercial can make you feel something else," Samat says.

You can see what time of day you use each app. You can see how many notifications they send.

This dashboard also stresses me out, tbh. Big data is still hella data. Making sense of it, and committing to taking action on it, is still kind of on you.

This dashboard is wild. It is so much data. It’s the quantified self but for your phone use.

We're talking about the new Android P Dashboard, which gives you info on app usage, notifications, and so on.
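
For the curious: the Dashboard rollup is conceptually just aggregation over usage events. A toy sketch with hypothetical data, nothing from Google's implementation:

```python
# Toy Dashboard rollup: aggregate raw (app, minutes, notifications)
# events into per-app totals, the kind of summary Android P surfaces.
from collections import defaultdict

def summarize(events):
    totals = defaultdict(lambda: {"minutes": 0, "notifications": 0})
    for app, minutes, notifications in events:
        totals[app]["minutes"] += minutes
        totals[app]["notifications"] += notifications
    return dict(totals)

day = [("YouTube", 42, 3), ("Twitter", 15, 28), ("YouTube", 20, 1)]
print(summarize(day))
```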

I am going with well-being because I respect our copy editors! Unless I have that wrong, in which case I am sorry, Kara.

We are all very distracted by “wellbeing” as one word. (It is really bad. —Kara)

"Helping people with their digital well-being is more important to us than ever," Samat says.

I am not used to seeing "wellbeing" as a single, non-hyphenated word. (It is bad. —Kara)

He was actually able to disconnect and have a good vacation, he says.

Samat is telling a story about locking his phone in a hotel safe on vacation. "I was shocked. I was kind of angry. After a few hours, something pretty cool happened," he says.

The most important update, though, isn’t all this gesture stuff. It’s what Google calls “Digital wellbeing.”

We're moving onto digital well-being. Burke handing the stage off to Sameer Samat for this part.

I was just whining to Dieter earlier today btw that most smartphones are too large for me to reach the top of the device to drag down and around. The volume use case is neat, though I agree that it doesn't seem to accomplish what the side buttons already do well.

Also, you can annotate screenshots natively now. Hooray!

That's huge. I hate opening the control center on iOS to do it.

The new “only when you need it” rotate button is cool, though.

New rotation button also appears right in the corner of the screen when you physically rotate your Android phone.

I hate the new volume controls. The hardware buttons now adjust media volume by default - there was applause for that, which I think is wrong. The problem is that you can't use the volume buttons to quickly jump into vibrate or silent mode. You have to tap the screen.

The key difference is that the slider "now adjusts the media volume" by default. People are clapping for this!

Android P’s ‘actions’ and ‘slices’ are a whole new way to use mobile apps

Burke says small changes can make a big difference. Talking about volume control now.

It's a simple thing, but it feels so obvious that the search bar should be at the bottom rather than the top of your giant smartphone.

(And, don’t worry, the back button still exists, it just hides when it’s not needed.)

App content is "now glanceable," so you can see what's happening live in the app from the multitasking viewer.

Sliding the home button over lets you quickly scroll through recent apps, which is fast and smart.

It looks really clean and simple. Swiping up past multitasking brings you to the app drawer.

I am sort of excited about having one-swipe access to a search bar from anywhere in the OS (like Just Type on webOS heyyyy).

Burke showing off a live demo of this new button alongside iPhone X-like gestures for swiping up to access multitasking.

The single home button looks like a little lozenge. Overview has recent apps, predicted apps, and a Google search button.

We're talking about the new Android P home button.

Dave Burke uses one of those hold-your-phone sticker things.

"With Android P, we put a special emphasis on simplicity by addressing many pain points where the experience was more complicated than you thought it would be," Burke says.

I spy "smart reply" in ML Kit! Google released Reply as a standalone app earlier this year to add the feature to other messaging apps outside of Google. https://www.theverge.com/2018/2/21/17036126/google-smart-reply-app-slack-facebook-messenger-whatsapp-hangouts-hands-on

Android is now using AI to help manage your battery life

If you like gestures, I have some good news for you.

DO YOU LIKE GESTURES?

We're moving on from intelligence to simplicity, another key goal of Android P.

ML Kit can work with Google’s cloud or just directly on the device.

It's cross-platform, too, so it runs on iOS and Android.

Being able to order a Lyft from the mobile search bar is another great touch. This is the best set of new Android features I have ever seen. Google is leapfrogging iOS in truly useful new ways.

You can access "on-demand APIs to image labeling and text recognition and more," he says.

Burke says AI expertise is hard to come by, and it's intimidating to developers. So Google is launching a new set of AI APIs called ML Kit.

An early developer program for Actions/Slices will open next month, Burke says.

Slices are like a subset of actions. They let apps put their buttons elsewhere in the UI. Starting with search. Android is going as far as it can to get away from the “go home, launch app, do thing in app, go home, launch another app,” thing. It wants actions and slices to “decompose” your apps out into the OS.

This works well with an app like Lyft, which Burke just showed on-screen.

Basically, you get a "slice" of an app surfacing in search and other parts of Android to render the app naturally.

Burke now talking about Slices, which combine with Actions to help surface app design.

(It’s sort of like Just Type back in the day for webOS. There is your first Palm mention of the day. There will be more.)

Actions are really cool. They let you get into app stuff without digging through the app’s UI.

This will work with third-party apps, too, and Burke says devs just have to add a bit of code to make it work on Android.

For example, connecting headphones suggests resuming listening to your music.

Now, Google is trying to predict not just apps, but mobile actions.

Burke says search on Android has a 60 percent success rate predicting what app you want to launch.

Adaptive Brightness. IDK that brightness really needs AI, tbh, but okay. One of the themes here is that Google is getting better at knowing what AI can and can’t do on a phone. It can help with predicting what apps you want to launch and with brightness, but it’s not that great yet at the stuff they were trying to do with Google Now a few years ago.

Google is taking on the iPhone X with new gesture navigation in Android P

It's led to fewer manual brightness adjustments than any past version of Android.

This would be useful today, where I am Instagram blogging more than usual.

Adaptive Brightness, a new Android P feature, will do it for you by learning your preferences, Burke says.

Burke now talking about battery life and how most people misuse it and cause battery problems.

So, Google “worked with DeepMind” to add AI to fixing battery life on Android. The division between Google AI and Alphabet AI is weirdly fascinating to me. Anyway, yes, please more battery life.

The phone will use AI to manage battery life and expend it only on "apps you care about," Burke says.

Google partnered with its AI subsidiary DeepMind to work on what it calls Adaptive Battery.
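
Google says a DeepMind model predicts which apps you'll actually use; the sketch below fakes that prediction with simple launch counts, purely to illustrate the "keep favorites active, restrict the long tail" idea. All names here are made up:

```python
# Toy take on Adaptive Battery: treat an app's share of recent launches
# as its "likelihood of use" and restrict background work for the long
# tail. The real system uses a DeepMind model; this only shows the idea.
from collections import Counter

def background_policy(recent_launches, keep_top=2):
    counts = Counter(recent_launches)
    favored = {app for app, _ in counts.most_common(keep_top)}
    return {app: "active" if app in favored else "restricted" for app in counts}

launches = ["Maps", "Gmail", "Maps", "Camera", "Maps", "Gmail"]
print(background_policy(launches))
```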

There's going to be a video about this new version of Android. A P tape, if you will.

Ha - battery as the foundation of Maslow’s pyramid.

Burke talking about battery life now as an area where AI can help with Android.

"We think smartphones should be smarter. They should learn from you and adapt to you."

And Android P is a big pillar for Google to merge mobile and AI, he says.

Android P: an exclusive first look at Google's most ambitious update in years

It's helped "fuel the shift of computing from desktop to mobile," he adds. "AI is going to profoundly change industries."

"Through this journey, we've seen Android become more than just a smartphone OS," Burke says.

So uh. If you don’t want to wait: https://www.theverge.com/2018/5/8/17327302/android-p-update-new-features-changes-video-google-io-2018

Burke talking about the beginning of Android with the T-Mobile G1.

Dave Burke is taking the stage after the 10-year anniversary video.

Google just gave a stunning demo of Assistant making an actual phone call

We’re starting with video. It’s close to the 10-year anniversary for Android. So, you know, MONTAGE TIME!

At last! We are moving on to the main attraction.

Now we're shifting gears to Android P.

It will come to everybody by next week, he adds.

Upstill says Google News is rolling out on iOS, Android, and the web in 127 countries starting today.

Google built it with 60 publishers from around the world and it's rolling out in a few weeks.

“Subscribe with Google” — Okay cool, easier news subscriptions is good. 60 publishers on board. I wonder if Google is taking a cut, though?

It's a new feature called Subscribe with Google.

"No more forms, credit card numbers, or new passwords," Upstill says of subscribing to publishers via Newsstand.

1,000 magazines included.

Now we're onto how Newsstand integrates into the new Google News.

“Everyone has access to the same information.” Interesting! Also quite the Facebook dig. Also quite the level of “responsibility” Google is taking on here.

This news product is hugely ambitious, and I want to spend a lot of time with it. It's hard not to worry, though, given how YouTube has pushed people so aggressively toward fringe ideologies.

"It's an unfiltered view of events from a range of trusted news sources," he says of Full Coverage.

"It's a true 360-degree view that goes well beyond what I get from just scanning a few headlines," Upstill says.

Sister site Vox.com at the top of full coverage. Explain the News, y’all. There’s also a timeline of “key moments.” This seems in-depth, but also a little... yikes? In political news, how many opportunities will there be for partisans to get angry at what’s in these timelines?

Casey likes to use the full terminology.

It's designed around a timeline template, which is a neat idea.

TEMPORAL CO-LOCALITY

Upstill using the power outage in Puerto Rico as an example of how Google News uses the Full Coverage feature to make sense of an event.

(Here you go, Dieter.)

Upstill says getting "full coverage" is a big focus, and it's assembled using a special technique to gather and present information.

(That is quite the Android Wear watch he’s wearing on stage, BTW.)

It looks a little bit like an Instagram Story format with the tappable cards.

Now we're onto understanding the full story, which will undoubtedly talk about fake news and getting both sides of a story.

The tabs are “For You, Headlines, Favorites, Newsstand,” BTW. Feel free to tweet your feelings about Google Reader anytime.

Google’s Smart Displays are going on sale in July

He's also introducing a new media format called Newscasts exclusive to Google News.

Upstill is now talking about the design of the app, which is using Google's new Material Theme design philosophy.

It integrates YouTube, too. I’m sure that won’t go haywire, given how trustworthy the YouTube algorithm has been lately? I wonder how different the News algo will be from YouTube’s own algo.

We're now seeing a deep dive into how the app uses GIFs and short videos.

He says Google News will use reinforcement learning, an AI training technique, to better understand your behaviors.
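
Reinforcement learning here means learning from feedback rather than labels. A toy sketch, mine and not Google's: keep a running value estimate per topic from click rewards and rank stories by it:

```python
# Toy sketch of reinforcement-style learning for story ranking: keep a
# running value estimate per topic from click feedback, then rank by it.
def update(values, counts, topic, reward):
    counts[topic] = counts.get(topic, 0) + 1
    old = values.get(topic, 0.0)
    # incremental mean of the rewards seen so far for this topic
    values[topic] = old + (reward - old) / counts[topic]

values, counts = {}, {}
# pretend the reader clicks tech stories (reward 1) and skips politics (reward 0)
for topic, reward in [("tech", 1.0), ("politics", 0.0), ("tech", 1.0), ("politics", 0.0)]:
    update(values, counts, topic, reward)

ranking = sorted(values, key=values.get, reverse=True)
print(ranking)  # ['tech', 'politics']
```

A real system would also need exploration (occasionally surfacing topics you haven't clicked) so the feed doesn't collapse into a filter bubble, which is exactly the worry raised above.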

Heyyyy tho, sister site SBNation is up on the slide. Hi, SBNation!

"I didn't have to tell the app I followed politics, love to bike, and like to follow news about the Bay Area," Upstill says.

The Google Feed is ALREADY one of the biggest untold influences on the web, yet barely anybody talks about it. Now that they’re unveiling a revamped news app with algo-chosen top stories, that is about to change.

Here are the six new voices coming to Google Assistant, including John Legend

The app will feature big headlines, local news, YT videos, and more.

New Google News will give you a top 5 stories right at the top when you open the app.

He outlines three focuses: keeping up with news, understanding the full story, enjoying and supporting sources you love.

It's a real Upstill battle.

Okay, a new Google News. I’m sure publishers will love Google being involved more in news just like they have Facebook. ..... ... ... wait.

Trystan Upstill is now onstage to talk Google News redesign.

"We want to make sure we're giving them deeper insight and the fuller perspective," Pichai says.

He's talking about a new AI-powered Google News product.

"I think there's more great journalism being produced today than ever before," Pichai says. "It's also true that people turn to Google in times of need and we have a responsibility to provide that information."

”I think there is more great journalism produced today than ever before.” Sundar Pichai is a Verge reader.

He's talking about Google News now.

(He keeps using that word. I do not think it means what he thinks it means.)

"We want to work with organizations and journalists to help develop innovative products and programs that help the industry."

Pichai recalls a story from his childhood about getting the newspaper in India and how news was digested back then.

Another instance where Google feels a “responsibility.” Again, what does “responsibility” mean?

Pichai now talking about news and supporting quality journalism. "It's foundational to how democracies work."

Be Best or Be Internet Awesome? YOU PICK.

Kids are awesome by default, Sundar.

All of these tools are launching with the company's digital well-being site later today.

Pichai now talking about helping kids become "safe explorers of the digital world," with a new initiative called Be Internet Awesome.

Thank god — I like some YT notifications, but I’ve been turning them off more and more lately.

Another new idea: combining notifications into a daily digest. YouTube will roll out all these features this week, he says.

There’s a dashboard that will show you how much you use your phone and tell you to stop.

Pichai talking about break reminders for apps like YouTube.

We're basically transitioning to Android P talk here and how it will help users manage time spent on apps.

(that was a joke)

Pichai outlines four ways to help users with digital well-being: understanding habits, focusing on what matters, switching off and winding down, and finding balance with your family.

“JOMO, the actual joy of missing out.” There’s also IOMO, the indignation of missing out.

Pichai drops a FOMO reference. "We think there's a chance for us to do better," Pichai says.

We're onto "digital well-being," the increasing social pressure to respond to anything right away and tech anxiety.

I keep thinking back to how Facebook's solution for this same problem is for every business to build a complicated messaging tree inside a bot built exclusively for Facebook Messenger. Meanwhile, Google is just like WE CAN FAKE ENTIRE HUMAN BEINGS AT SCALE.

Hm, “Digital well-being.”

"We are working hard to get users back time," Pichai says. This is the big ethos from this year's I/O.

Good lord this is also wild. Google will call, get an answer, and then update all Google properties with the new information.

Can "John Legend" call and make me a fancy dinner reservation? Under "John," obvi.

Google can do this automatically by phoning those businesses using AI.

Pichai says most people want to contact businesses to know opening hours, especially on holidays.

"We want it to work in case you're busy and your kid is sick and you need to call for a doctor's appointment," Pichai says.

Can this really work? “We’re still developing this technology,” Pichai says.

Pichai says the Assistant will ask for things like wait times and "handle the interaction gracefully."

Now it’s going wrong. The person on the line isn’t getting it right, speaking with an accent. Lots of follow-up questions. This is just completely wild. The Assistant is just sort of figuring it out a little?

We're watching Google Assistant basically trick real people onstage right now. Using phrases like "Ok, gotcha!"

Missed opportunity not to end the call with "These violent delights have violent ends."

Yeah, this is truly next-level AI stuff, the kind of real-world usefulness we haven't really seen quite like this before.

WHAT IS REAL ANYMORE

This is incredible.

"That was a real call you just heard," Pichai says of the demo. The Assistant can understand the nuances of a conversation.

”Google Duplex”

But it’s amazing, too? Pichai says it was “a real call.”

This all feels super creepy! The woman at the salon apparently can’t tell she’s talking to a robot.

Okay, this is borderline dystopian! How will you know when you're talking to a robot??

This is super wild. Facebook did a version of this with something called M inside of Messenger. But they had to rely on human beings to do it, and it was so expensive they shut it down after only rolling it out to a few thousand people.

It even says “mmm hmm.” It sounds way more natural and realistic than the usual voice.

This is like what Facebook tried to do with M, but instead, it used human contractors in Menlo Park. This is wild.

We're getting a recorded demo of this. It's... creepy!

OMG.

Google Assistant is behaving like a legit personal assistant in this hair appointment use case.

Pichai says Google Assistant will now make phone calls on your behalf. Woah!

"We want to connect users to businesses in a good way," he says. 60 percent of businesses don't have an online booking system set up, he adds.

"A big part of getting things is done is making a phone call” — Maybe if you’re a telemarketing scammer, heyooo.

"A big part of getting things done is making a phone call," Pichai says.

That's it for new Assistant features. Sundar Pichai is back onstage.

Er, correction: in Google Maps, not Android Auto.

Whoa - Android Auto playing music from YouTube (no video ofc). Coming this summer.

Sharing ETA with voice commands and playing music. Assistant coming to Google Maps this summer.

Rincon now describing new Google Maps flow for Assistant.

Exactly, Nick!

This feels like a reworked Google Now built into Assistant. The new screen will launch on Android this summer and iOS later this year.

Ha wow. When is Google going to just kill Google Now? It’s built into the Assistant now, populated by default when you invoke it.

Rincon shows off a new swipe-up gesture for Assistant from the Android home screen.

Google has been working with Starbucks, DoorDash, Domino's, and other restaurants for Assistant, Rincon says.

She is setting her living room temperature to 66 degrees. That is too cold for me. The point here is that you can mix visual and voice interactions, but I can’t stop thinking about that chilly room.

She's also using those follow-up queries that Huffman described. Works really well, it seems.

Rincon showing off smart home requests and Google search queries.

I would just like to remind everyone that these smart displays look good, but we still don’t know the full UI here. But it will be more voice than touch. Don’t imagine an Android tablet on your counter.

We're getting into design changes to Assistant on phones now. A new live demo is up.

Rincon now talking about video calling, Google Maps, and other apps on smart displays.

Rincon talking cooking videos accessed via voice, which is a nice feature.

Also, how sad is Amazon about the Echo not having YouTube? And by “sad” I mean “probably livid.”

I bet Amazon is feeling that dispute it had with Google last year over the Echo Show.

Rincon says YouTube and YouTube TV will be available on its smart displays.

Gmail’s Smart Compose will write emails for you

But it’s great that proper YouTube TV is coming to smart displays. That makes them an actual kitchen TV.

Rincon asked to play Jimmy Kimmel Live, and it took quite a bit to load.

This Lenovo smart display looks better from the back (which is wood) than the front, which has a speaker grille I don’t really love.

We're getting a live demo of these smart displays now.

Ha, okay then: they are coming in July.

I really hope Google is actually going to give us release dates for these smart displays. We saw them at CES.

Rincon talking about new visuals for Assistant.

Huffman handing off the stage to Lilian Rincon now to talk more AI.

The Assistant now understands these niceties. Pretty Please is coming later this year, Huffman says.

I mean, I dunno, maybe it’s okay for kids to learn they can boss a computer around.

Oh man Chaim totally called this! Say Please to your assistants! Maybe they'll spare you when the robots eventually take over.

Google is looking into this, and it's calling the changes to its system "Pretty Please," to help teach kids to ask nicely.

Huffman says there's a concern that kids are learning to be bossy and demanding when talking to AI.

Huffman moving on to games, stories, and other family-focused Assistant features.

Google Photos will soon be able to colorize old photos

This was a popular trick to test AI a few years ago, and almost every system failed it. Being able to nest questions and requests into one another is big.

Huffman is now giving us a linguistics lesson. Today, it’s “coordination reduction.” Tomorrow, it will be etymology.

He's talking about the difficulty of using software to parse language, and how to look for different requests in the same sentence.
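
As a toy illustration of that parsing problem (nothing here reflects Google's actual implementation; the verb list is invented), a naive splitter might expand a coordinated request like this:

```python
def split_requests(utterance, verbs=("turn on", "turn off", "play", "set")):
    """Naively split a coordinated request ("coordination reduction")
    into separate commands, reusing the last verb phrase for fragments
    that lack one. A real assistant uses a learned parser, not this."""
    commands, last_verb = [], None
    for part in (p.strip() for p in utterance.split(" and ")):
        verb = next((v for v in verbs if part.startswith(v)), None)
        if verb:
            last_verb = verb
            commands.append(part)
        elif last_verb:
            # Fragment with no verb: borrow the previous verb phrase
            commands.append(f"{last_verb} {part}")
        else:
            commands.append(part)
    return commands
```

So "turn on the lights and the coffee maker" becomes two commands: "turn on the lights" and "turn on the coffee maker".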

Who connects a popcorn maker to a Google Home? I mean, that’s a fire hazard waiting to happen. I’m not saying, I’m just saying.

Huffman points out how Assistant can now handle threaded requests, what Google calls Multiple Actions.

Alexa got this feature a few weeks back.

The feature is called Continued Conversation, and it's coming in the next few weeks.

”Continued conversation.”

"I was able to have a natural, back-and-forth conversation," Huffman says of new improvements to Assistant.

Okay, so this might just be for follow-up questions. Which, cool, but... I still hate the original hotword.

Huffman is basically talking about follow-up queries, which is pretty challenging for voice-based AI.
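
For the curious, the basic idea behind dropping the hotword for follow-ups can be sketched as a listening window. To be clear, everything here (class names, the window length) is my invention, not Google's actual implementation:

```python
class AssistantSession:
    """Toy sketch of a 'Continued Conversation'-style session: after any
    response, the mic stays open for `window` seconds, so follow-up
    queries don't need the hotword."""

    def __init__(self, window=8.0):
        self.window = window
        self.open_until = 0.0  # timestamp until which follow-ups are accepted

    def hears(self, utterance, now, hotword=False):
        """Respond if the hotword was said or the follow-up window is
        still open; otherwise ignore the utterance."""
        if hotword or now < self.open_until:
            self.open_until = now + self.window  # each turn extends the window
            return f"responding to: {utterance!r}"
        return None
```

Each accepted turn extends the window, so a natural back-and-forth keeps the session alive without repeating "Hey Google."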

"It shouldn't be so hard," Huffman says. Now, you won't have to say 'Hey, Google' every time to access the Assistant.

Huffman playing a viral video of a grandma using a Google Home Mini, with a little trouble.

OH MY GOD. “It gets a little annoying to say 'Hey Google.'” IS THIS HAPPENING?

"It gets a little annoying to say 'Hey, Google' every time you want to get its attention," Huffman says.

Huffman now discussing naturally conversational AI, and how to make software understand the social dynamics of conversations.

30 languages, 80 countries: that’s soon going to be a huge competitive advantage for the Assistant.

Huffman now talking about Assistant in other countries. In 30 languages and in 80 countries later this year, he says.

Assistant is in cars from over 40 brands and on devices from over 500 manufacturers.

Chaim Gartenberg once wrote about saying Please and Thank You to your robot assistants. I have been and will def do it all the time for my robo Celeb Assistants. https://www.theverge.com/circuitbreaker/2017/12/10/16751232/smart-assistants-please-thank-you-politeness-manners-alexa-siri-google-cortana

Huffman says Assistant is now on over 500 million devices.

If this is the part where they tell me that I can say something other than “Hey Google” I will rush the stage and hug the presenter.

Pichai has handed the stage off to Scott Huffman.

This is a cute “Hey Google” video, you’ve seen some of these on TV already. Lots of celebrities.

And remember, if you take a photo of John Legend, Google Photos can now turn him into a PDF.

John Legend's voice is coming to Google Assistant later this year, Pichai says.

(Back in the day, I used to have an Ozzy Osbourne voice for my GPS unit. It was hilarious because he could never get through the directions before I missed my turn.)

Wow, robot John Legend is... something else. I can live my dream of pretending to be Chrissy Teigen now.

Pichai says WaveNet helps Google capture John Legend's voice with real clarity and without the robotic monotone.

Wait. His voice is ACTUALLY coming to the Assistant “in certain contexts.”

"You can find me on all kinds of devices — phones, Google Homes, and if I'm lucky in your heart," Legend says in a video on-screen.

UM, YES, JOHN LEGEND VOICE!

I hope he sings about what happened to Holly.

Ha! John Legend once again doing work for Google. He is practically a brand ambassador. I wonder if he’ll become a “product manager” soon.

He'll sing, too!

John Legend will now talk to you through Google products.

But now, he's talking about featuring a famous voice. Perhaps a celebrity... oh it's John Legend.

Pichai says it's important to get the right accents and dialects represented.

Wow, six new voices is wild. They sound fairly natural, and each has a completely different character.

Pichai says WaveNet will now power six new voices for Google Assistant.

Huh. The main voice for Google Assistant was originally nicknamed “Holly.” The company deserves credit for calling its voices by generic names, not gendered names.

Pichai now recalls WaveNet, a breakthrough from DeepMind that helps create a more natural AI voice.

Google wants its Assistant to be controlled primarily by voice, Pichai says.

Pichai moving fast. Now onto Google Assistant.

So many of these AI features are neat, but kind of hard to discover. I think either it’ll be so pervasive that you are using AI in your UI all the time, or you’ll ignore it. Either way, I don’t think you’re going to think about the AI in your phone that much. Hard to market it!

He says they are eight times more powerful than last year's chips.

"These chips are so powerful that for the first time we had to introduce liquid cooling to our data centers."

Pichai just announced TPU 3.0, the next-gen chips.

Pichai now talking about Google's Tensor Processing Units, the special custom-designed machine learning chips Google uses.


Before & after

Coloring old photos is big, and it's a really interesting application of image recognition.

Google Photos can now use AI to separate subjects in photos and pop the color, or re-create a grayscale photo in color.
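
The "color pop" half of that is easy to sketch once a subject mask exists. The hard part, which Google's models handle, is producing the mask; this toy assumes it's given:

```python
def color_pop(pixels, subject_mask):
    """Keep subject pixels in color and desaturate everything else.
    pixels: list of (r, g, b) tuples in 0..1; subject_mask: list of
    bools from some segmentation model (assumed, not implemented here)."""
    out = []
    for (r, g, b), is_subject in zip(pixels, subject_mask):
        if is_subject:
            out.append((r, g, b))
        else:
            # Rec. 709 luma gives a perceptual grayscale value
            gray = 0.2126 * r + 0.7152 * g + 0.0722 * b
            out.append((gray, gray, gray))
    return out
```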

Finally - Photos will recognize documents. Will auto-convert them to PDF. This has been a great thing on the iPhone, and I'm glad it’s here, too.

The last one, the document converting, is huge. That will be immensely helpful.

If a photo is underexposed, the AI in Google Photos will offer a suggestion to fix brightness or recognize a document and convert it right to PDF.
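
An underexposure check like that could be as simple as this sketch. The threshold and gain heuristic are made up for illustration; Photos presumably does something far smarter:

```python
def suggest_brightness_fix(pixels, target=0.5):
    """Flag an underexposed image and suggest a gain multiplier.
    pixels: list of (r, g, b) tuples in 0..1. Returns a gain to apply,
    or None if the exposure already looks fine."""
    lumas = [0.2126 * r + 0.7152 * g + 0.0722 * b for r, g, b in pixels]
    mean = sum(lumas) / len(lumas)
    if mean < target * 0.6:  # noticeably darker than the target level
        return round(target / mean, 2)
    return None
```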

So, a new AI-powered feature called Smart Actions in Google Photos will understand who's in the picture and offer to share the photo with those people.

Pichai says every single day there are over 5 billion photos viewed in Google Photos.

Now we're onto Google Photos.

But also, how often have you typed on your phone by just hitting the autocomplete suggestions to make a whole sentence? Not as a joke I mean, but in actual usage?

Smart Compose coming to all Gmail users later this month.

"It takes care of mundane things like addresses," Pichai says. "I've been sending a lot more emails to the company. Not sure what the company thinks of it."

This is straight out of that autocomplete meme where you write "I love" and see how much machine owns you.

Wow, this Gmail feature is rad.

The new feature is called Smart Compose. Google uses machine learning to suggest phrases for you as you type. All you have to do is hit Tab.
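
Under the hood this is a neural language model conditioned on the whole email, but the interaction can be mimicked with a toy phrase table. The phrases below are invented examples, not Google's data:

```python
# Toy stand-in for Smart Compose: suggest a completion for the phrase
# being typed via a lookup table of common continuations. The real
# feature uses a learned model; this just matches hand-written prefixes.
PHRASES = {
    "looking forward": " to seeing you",
    "let me know": " if you have any questions",
    "thanks for": " your help",
}

def suggest(text):
    """Return a suggested completion for `text`, or '' if none matches.
    In the real UI the user accepts a suggestion by hitting Tab."""
    lowered = text.lower()
    for prefix, completion in PHRASES.items():
        if lowered.endswith(prefix):
            return completion
    return ""
```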

Pichai switching gears, still in the AI theme, with Gmail.

Morse code for Gboard will be available in beta later today.

(I know I shouldn’t be thinking about this, but I’m just going to say Pichai is wearing a pretty good jacket.)

"Tania and Ken are actually developers. They really worked with the team," Pichai says.

Google in Morse code.

Tania, the woman in the video who helped Google implement this, is here live at the keynote.

Lots of cheers for that video.

Gboard now supports Morse code as an input method, turning dots and dashes into text, with predictions and suggestions on top.
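
Decoding Morse itself is the simple part, as this sketch shows; the Gboard work is really about making dot/dash entry accessible and layering predictions on top:

```python
# International Morse code for letters (digits and punctuation omitted).
MORSE = {
    ".-": "A",   "-...": "B", "-.-.": "C", "-..": "D",  ".": "E",
    "..-.": "F", "--.": "G",  "....": "H", "..": "I",   ".---": "J",
    "-.-": "K",  ".-..": "L", "--": "M",   "-.": "N",   "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R",  "...": "S",  "-": "T",
    "..-": "U",  "...-": "V", ".--": "W",  "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

def decode_morse(signal):
    """Decode Morse: letters separated by spaces, words by ' / '."""
    words = signal.split(" / ")
    return " ".join(
        "".join(MORSE.get(letter, "?") for letter in word.split())
        for word in words
    )
```

For instance, `decode_morse("--. --- --- --. .-.. .")` gives `"GOOGLE"`.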

The video features someone with a disability who requires a Morse code device to communicate. Google says these devices can now be powered by Gboard, the company's keyboard app.

Pichai now talking about applying machine learning to Morse code. Google brought a video to showcase this.

I wish Google could solve the problem of people yelling at each other on TV, though.

AI can help Google, via YouTube for instance, discern who is talking and when, even in a shouting match, and automatically generate captions.
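
Conceptually that's speaker diarization. A crude one-dimensional version, using, say, average pitch per caption segment (all numbers invented), looks like this:

```python
def diarize(segment_features, threshold=10.0):
    """Greedy toy diarization: assign each caption segment a speaker
    label by comparing its voice feature (e.g. average pitch) to each
    known speaker's running mean; start a new speaker if none is within
    `threshold`. Real systems use learned speaker embeddings instead."""
    speakers = []  # per speaker: [feature_sum, segment_count]
    labels = []
    for feature in segment_features:
        best, best_dist = None, threshold
        for i, (total, count) in enumerate(speakers):
            dist = abs(feature - total / count)
            if dist < best_dist:
                best, best_dist = i, dist
        if best is None:
            speakers.append([feature, 1])
            labels.append(len(speakers) - 1)
        else:
            speakers[best][0] += feature
            speakers[best][1] += 1
            labels.append(best)
    return labels
```

Two interleaved voices come out as alternating labels, which is exactly what overlapping-speech captions need.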

I'm stressed.

Closed captioning sucks when people talk over each other on TV. Google is trying to solve that captioning problem with machine learning.

Honestly, this is good and all but maybe just don't watch people yell. Yelling is bad.

He uses a talking head shouting match on the evening news as an example.

Pichai now talking about accessibility and developing thoughtful technology for everybody.

"We can actually qualitatively predict the chance of readmission 48 hours ahead of time," Pichai says. Google is publishing a paper on this later today, he says.

I admire any keynote that begins with seven disembodied eyeballs projected onto the stage.

Pichai needs to be super careful when talking about AI in medicine. It’s definitely a feel-good story if handled well, but terrifying if not.

Pichai now says AI can help doctors predict so-called medical events, which are when a patient gets very sick and preventative measures can save lives.

(As always, connectivity at these events is sketchy, despite the Ethernet and dongles, btw.)

He says AI can help predict cardiovascular risk, and detect it non-invasively.

"Our AI systems offered more insights" than humans did, Pichai says.

He's talking about field trials to diagnose diabetic retinopathy in developing countries.

"AI is going to impact many many fields," Pichai says. Now talking health care.

Both Nadella and Pichai feel a “responsibility.” But, I dunno, when I am given a “responsibility,” I am held accountable for it if I screw up. Who will hold Google and Microsoft accountable? I sometimes feel saying “responsibility” is a little empty, is all.

Sundar now talking about Google's mission and how it ties into AI.

"We know the path ahead needs to be navigated carefully and deliberately. We feel a deep sense of responsibility to get this right."

Okay, good. Pichai is talking about privacy and AI ethics a bit.

"It's clear technology can be a positive force. But it's equally clear we can't just be wide-eyed about the changes technology creates."

Sundar is basically talking about the need for digital skills to help people around the world, part of Google's worldwide training programs.

"Expectations for technology vary greatly depending on where you are in the world," Sundar says.

(Weirdly, joking about a “serious bug” and then going to emoji feels like maybe a Facebook dig?)

Sundar now getting down to business.

Google foam floats above the beer because [insert cloud computing joke here]

Now Sundar is talking about the beer emoji, explaining why the foam was floating above the beer.

Pichai: “The irony of the whole thing is I’m a vegetarian in the first place.”

"I never knew so many cared about where the cheese is," Sundar says. Surprise: he's a vegetarian.

Sundar now talking about a major bug in one of Google's core products... he's talking about emoji.

Pichai says “we have a lot to cover.” I bet.

Sundar says over 7,000 people are here at I/O today.

Now, Google CEO Sundar Pichai is onstage.

“Make good things together” is apparently the new “Be together, not the same.”

The little blocks revived their friend! Zooming out to a message reading, "Make good things together." A new motto, perhaps?

We're all just a bunch of squares, apparently.

This is all very adorable.

This feels almost like a Pixar short, in line with the last few Google I/O keynote openers. Though instead of cute animals, it's anthropomorphic blocks.

The traditional I/O countdown is with weird little cube guys.

Here we go, folks. The keynote is starting, with a very loud animated countdown.

We have started.

Almost ready. The crowd is cheering randomly.

I am looking very hard for the back of my head, but, unfortunately, it is insignificant in this crowd.

Hey, Nick, Natt, Vjeran: while I was away, Sundar Pichai took your photo.

Whew just got back online. Looks like Ethernet just kicked the bucket. Thankfully, everyone at The Verge travels with 87 backup connectivity options.

10 mins until kickoff!

World Draw is a new AI experiment that lets you draw from your phone and... animate the world? It certainly looks cool. Not sure how AI fits in.

They’re doing a cool thing with some “World Draw” app. http://g.co/worlddraw


Spotted: Rick Osterloh, Google's hardware boss.

Android P should be about privacy

Google has a closed captioning screen under the projector, which has just said "[music]" for the past 15 minutes.

This ambient music fest has been going on for about 10 minutes now, and I feel like the audience is having a collective existential crisis. Or just me? Maybe just me.

Dani Deahl wrote about it back in March for those who want to learn more: https://www.theverge.com/circuitbreaker/2018/3/13/17114760/google-nsynth-super-ai-touchscreen-synth

It kind of looks like a visualizer mixed with an arpeggiator. It's got four giant knobs and a colorful screen in the middle.

Now, someone is onstage with the NSynth Super, as the device is called, demoing it live.

The first video onstage here at the keynote is about using neural networks to help musicians build new hardware. It looks... very cool!

BTW there is a poll on our IG Story that you should take part in! Android Porg is pretty good, fits the 'tasty treats' theme.

Vjeran.

Ooh, Star Wars theme, Dieter? Android P...org?

Dieter was just pulled away to go take selfies with fans. He is a celebrity.

If you have been watching closely, you’ve seen me post and delete Motion Still gifs several times. But they’re not moving. I would love it if everybody would just adopt a modern moving image standard that works across all platforms. That would be nice. Oh well. We can’t have nice things. This the stuff you think about when you’re waiting for a keynote to begin. (Another 45 minutes). Soon, I will be discussing XMPP. Apologies in advance.

I am here, and I am prepared to fry in the hot Mountain View sun for all things Google.

To all the haters out there making fun of me for taking a photo with my iPad, I have only one thing to say....

I AM NOT SORRY

So this is happening and I cannot stop it.

I regret to inform you that it is already warm at 8:17 in the morning. This means that it will be hot enough to cause several tech bloggers and developers to collapse before the end of the keynote. If I go silent today, please inform my family that I'm in the hands of an AI EMT robot.

Google may launch a new set of Android controls to help you manage phone use

Google says Android Things is finally ready for smart devices

Google partners with JBL for an Android TV-powered soundbar

AI is so important to Google it’s rebranding its research division

You're here a little early! The Google I/O keynote will start at 10AM PT / 1PM ET, or you can find more timezone math here. Meanwhile, here are eight things to expect today, plus Vlad Savov on why he thinks Android P should stand for privacy.

Event Details

Google is headed back to the Shoreline Amphitheater in Mountain View, California this week for Google I/O, which these days revolves around the future of Android and Google’s artificial intelligence efforts. In addition to those two pillars of Google’s business, the company is expected to make news announcements regarding Search, YouTube, and its new rebranded wearable tech platform, Wear OS. There will also surely be new updates to Google Assistant to discuss and more on the company’s smart home competition against Amazon.
Start time:
10:00 AM PDT, 5/08/2018

Liveblog Tips

  • When new posts are available, a button appears on top of the page
  • Scroll to the bottom of the page to load older posts
  • Click on a timestamp to link to a specific update