These days, you expect a lot from a smartphone. You want a premium camera that can take vivid, share-worthy photos wherever you go. You need a tool that connects you to the world with all your favorite apps and also helps out during the day. And you want a phone with a battery that’s reliable for long stretches, while it stays secure and up to date with the latest software. You also don’t want it to break the bank. The new Pixel 3a and Pixel 3a XL are all of those things and more, for half the price of premium phones.

Pixel 3a is designed to fit nicely in your hand, and includes an OLED display for crisp images and bright colors. It comes in three colors—Just Black, Clearly White and Purple-ish—and two sizes, with prices in the U.S. starting at $399 for the 5.6-inch display and $479 for the 6-inch model.

High-end features: camera, Google Assistant, battery life and security

Google Pixel 3a delivers what you’d expect from a premium device. Starting with the award-winning camera, Pixel 3a lets you take stunning photos using Google’s HDR+ technology, with features like Portrait Mode, Super Res Zoom, and Night Sight to capture clear shots in low light. Google Photos is built in, so you can save all your high-quality photos and videos with free, unlimited storage. And it comes with an 18-watt charger, so you get up to seven hours of battery life on a 15-minute charge and up to 30 hours on a full charge.1

Squeeze Pixel 3a for the Google Assistant to send texts, get directions and set reminders—simply using your voice. Plus, the Google Assistant’s Call Screen feature (available in English in the U.S. and Canada) gives you information about the caller before you pick up, and shields you from those annoying robocalls.

We’ll make sure your Pixel 3a is protected against new threats by providing three years of security and operating system updates. In a recent industry report, Pixel was rated the highest for built-in security among all smartphones.
It also comes with the custom-built Titan M chip to help protect your most sensitive data.

New features at a more accessible price

Pixel makes it easy to use Google apps like YouTube, Google Photos and Gmail. And you’ll get access to new features first. Pixel 3a and the entire Pixel portfolio will get a preview of AR in Google Maps—the next time you’re getting around town, you can see walking directions overlaid on the world itself, rather than looking at a blue dot on a map. This helps you know precisely where you are, and exactly which way to start walking (in areas covered on Street View where there’s a good data connection and good lighting).

Time lapse is coming to Google Pixel 3a, so you can capture an entire sunset in just a few seconds of video—great for posting on social media or messaging to your friends.

Buy it from more places, use it on more networks

Pixel 3a and Pixel 3 are now available through more carriers, including Verizon, T-Mobile, Sprint, US Cellular, Spectrum Mobile (Charter), C Spire and Google Fi, as well as being supported on AT&T. If you’re new to Pixel, you can transfer photos, music and media quickly with the included Quick Switch Adapter. If you need a little extra help, 24/7 support from Google is just a tap away via the tips and support link in the settings menu. You can even share your screen for guided assistance.

Look for Pixel 3a in the Google Store in countries where Pixel is sold beginning today, and through our partners beginning tomorrow.

1 Approximate battery life based on a mix of talk, data, standby, mobile hotspot and use of other features, with always-on display off. An active display or data usage will decrease battery life. Charging rates are based upon use of the included charger. Charging time performance statistics are approximate. Actual results may vary.
Whether it’s delivering search results in the correct language or recommending the quickest route home, data can make Google products more helpful to you. And you should be able to understand and manage your data—and make privacy choices that are right for you. That’s why easy-to-use privacy features and controls have always been built into our products. At I/O, we announced a number of additional privacy and security tools across our products and platforms:

Making it easier to control your data

One-tap access to your Google Account from all our major products

Privacy controls should be easy to find and use. A few years ago, we introduced Google Account to provide a comprehensive view of the information you’ve shared and saved with Google, and one place to access your privacy and security settings. Simple on/off controls let you decide which activity you want to save to your account to make Google products more helpful. You can also choose which activities or categories of information you want to delete.

As the number of Google products has grown, we’re making it even easier to find these controls. Today you’ll see your Google Account profile picture appear in the top right corner across products like Gmail, Drive, Contacts and Pay. To quickly access your privacy controls, just tap on your picture and follow the link to your Google Account. The prominent placement of your profile picture also makes it easier to know when you’re signed into your Google Account. We’re bringing this one-tap access to more products this month, including Search, Maps, YouTube, Chrome, the Assistant and News.

Easily manage your data in Search, Maps and the Assistant

Last year, we made it easier for you to make decisions about your data directly within Search. Without leaving Search, you can review and delete your recent Search activity, get quick access to the most relevant privacy controls in your Google Account, and learn more about how Search works with your data.
Now we’re making it easier to manage your data in Maps, the Assistant and YouTube (coming soon). For example, you’ll be able to review and delete your location activity data directly in Google Maps, and then quickly get back to your directions.

Auto-delete now available for Web & App Activity, coming soon to Location History

Last week we announced a new control that lets you choose a time limit—3 or 18 months—for how long your Location History and Web & App Activity data will be saved. Any data older than that will be automatically and continuously deleted from your account if you choose. This new control is available today for Web & App Activity and coming next month to Location History.

Bringing Incognito mode to Google apps

Since launching more than a decade ago, Incognito mode in Chrome has given you the choice to browse the internet without your activity being saved to your browser or device. As our phones become the primary way we access the internet, we thought it was important to build Incognito mode for our most popular apps. It’s available in YouTube and coming soon to Maps and Search. Tap your profile picture to easily turn it on or off. When you turn on Incognito mode in Maps, your activity—like the places you search or get directions to—won’t be saved to your Google Account.

Building stronger privacy controls into our platforms

We also made announcements today about privacy across our platforms and products: Android Q is bringing privacy to the forefront of Settings and creating more transparency and control around location. Chrome announced plans to more aggressively restrict fingerprinting across the web and improve cookie controls.
Finally, we announced plans to give users more visibility into the data used to personalize ads, and into the companies involved in the process, for the ads that Google shows on our own properties and those of our publishing partners.

Doing more for users with less data

Federated learning makes products more helpful while keeping data on your device

Advances in machine learning are making our privacy protections stronger. One example is federated learning, a new approach to machine learning. It allows developers to train AI models and make products smarter—for you and everyone else—without your data ever leaving your device. These new AI techniques allow us to do more with less data. Gboard, Google’s keyboard, now uses federated learning to improve predictive typing as well as emoji prediction across tens of millions of devices. Previously, Gboard would learn to suggest new words for you, like “zoodles” or “Targaryen,” only if you typed them several times. Now, with federated learning, Gboard can also learn new words after thousands of people start using them, without Google ever seeing what you’re typing.

We’ve also invested in differential privacy protections, which enable us to train machine learning models without memorizing information that could reveal specific details about a user. We published early research on this topic in 2014, and since then we’ve used it in Chrome, in Gmail with Smart Compose, and in Google Maps to show you how busy a restaurant is. And with the release of the TensorFlow Privacy open-source project, ML developers can now more easily use differential privacy technology.

The strongest security across our products and platforms

Your data is not private if it’s not secure. We’ve always invested in systems to keep our users safe—from Safe Browsing, which protects nearly 4 billion devices every day, to blocking more than 100 million spam and phishing attempts in Gmail every day.
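The core idea behind federated learning can be illustrated with a toy sketch of federated averaging, the training scheme this approach is built on. Note this is a minimal illustration, not Gboard’s actual implementation: the one-parameter linear model and the `local_update` routine here are assumptions chosen to keep the example self-contained. Each device improves its own copy of the model on private data, and only the resulting model weights—never the data—are sent back and averaged.

```python
# Toy sketch of federated averaging: each client trains on its own
# private data, and only model weights (not data) are averaged centrally.
# The one-parameter model y = w * x is an illustrative assumption.

def local_update(w, data, lr=0.1, steps=20):
    """Run a few gradient steps on one device's private data."""
    for _ in range(steps):
        # Gradient of mean squared error for y = w * x
        grad = sum(2 * x * (w * x - y) for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(global_w, client_datasets):
    """One round: clients train locally, the server averages the weights."""
    local_weights = [local_update(global_w, d) for d in client_datasets]
    return sum(local_weights) / len(local_weights)

# Each client's data stays "on-device"; all follow the true relation y = 3x.
clients = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0)], [(0.5, 1.5), (4.0, 12.0)]]
w = 0.0
for _ in range(10):
    w = federated_round(w, clients)
print(round(w, 2))  # converges toward 3.0
```

In a real deployment the server would also secure the aggregation step and subsample devices per round; the point of the sketch is only that the update, not the typing history, is what leaves the device.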
Security keys provide the strongest form of 2-Step Verification against phishing attacks, and now they’re built into phones running Android 7.0 and above, making this protection available on over one billion compatible devices.

And beginning this summer, anyone with a Nest Account will have the option to migrate it to a Google Account, which comes with the added benefits of tools and automatic security protections, like 2-Step Verification, notifications that proactively alert you about unusual account activity, and access to Security Checkup.

We strongly believe that privacy and security are for everyone. We’ll continue to ensure our products are safe, invest in technologies that allow us to do more for users with less data, and empower everyone with clear, meaningful choices around their data.
Today we’re bringing the Home products under the Nest brand. It’s a natural next step, since our products work together to help you stay informed, feel more comfortable and safe, keep an eye on home when you’re away, and connect to friends and family. Now we’re taking our first step on a journey to create a more helpful home.

We’re introducing Nest Hub Max, the first product from our newly formed team. Nest Hub Max has all the things you love about Nest Hub (formerly Google Home Hub). It has a digital photo frame powered by Google Photos and the home view dashboard, which gives you full control of your connected devices. With our new display, you’ll get a bigger 10-inch HD screen and a smart camera that helps you keep an eye on your home and keep in touch with family and friends. Nest Hub Max is specifically designed for those shared places in the home where your family and friends gather.

The new kitchen TV

The big screen makes Nest Hub Max the kitchen TV you’ve always wanted. With a subscription, Hub Max can stream your favorite live shows and sports on YouTube TV. Tell it what you want to watch, or if you need help deciding, just ask the Assistant. But unlike your kitchen TV, it can also teach you how to cook, play your music, and show you who’s at the front door. And you’re getting full stereo sound, with a powerful rear-facing woofer.

Smart camera

Nest Hub Max has a Nest Cam to help you keep an eye on things at home: you can turn it on when you’re away and check in right from the Nest app on your phone. Just like with your Nest Cam, it’s easy to see your event history, enable Home/Away Assist, and get a notification if the camera detects motion or doesn’t recognize someone in your home.

The camera on Hub Max also helps you stay connected to your family and friends, and video calling is easy with Google Duo. The camera has a wide-angle lens, and it automatically adjusts to keep you centered in the frame.
You can chat with loved ones on any iOS or Android device, or on a web browser. You can also use Duo to leave video messages for other members of your household.

And now when the volume’s up, instead of yelling to turn it down or pause the game, you can use Quick Gestures. Just look at the device and raise your hand, and Nest Hub Max will pause your media, thanks to the camera’s on-device gesture recognition technology.

Help just for you

Hub Max is designed to be used by multiple people in your home, and to provide everyone with the help they need in a personalized way. With Nest Hub, we offered you the option to enable Voice Match, so the Assistant can recognize your voice and respond specifically to you. Today with Nest Hub Max, we’re extending your options for personalized help with a feature called Face Match. For each person in your family who chooses to turn it on, the Assistant guides you through the process of creating a face model, which is encrypted and stored on the device. Face Match’s facial recognition is processed locally with on-device machine learning, so the camera data never leaves the device.

Whenever you walk in front of the camera, Nest Hub Max recognizes you and shows just your information, not anyone else’s. So in the morning, when you walk into the kitchen, the Assistant knows to greet you with your calendar, commuting details, the weather, and other information you need to start your day. And when you get home from work, Hub Max welcomes you home with reminders and messages that have been waiting for you. The Assistant offers personalized recommendations for music and TV shows, and you can even see who left you a video message.

Per our privacy commitments, there’s a green light on the front of Hub Max that indicates when the camera is streaming, and nothing is streamed or recorded unless you explicitly enable it.
In addition, you have multiple controls to disable camera features like Nest Cam, including a hardware switch that lets you physically disable the microphone and camera.

When, where, and how much

Later this summer, Nest Hub Max will be available in the U.S. for $229 on the Google Store and at Best Buy, Target, Home Depot and more. It’ll also be available in the UK for £219 and in Australia for AUS$349.

We’re also bringing Nest Hub to 12 new countries—Canada, Denmark, France, Germany, India, Italy, Japan, the Netherlands, Norway, Singapore, Spain and Sweden. And Nest Hub will now be available in the U.S. for $129. Finally, we’ve updated pricing for our speakers: starting today, Google Home is $99 and Google Home Max is $299.

We’re excited to make the helpful home more real for more people.
Since Nest joined Google’s hardware team last year, we’ve been working to make the smart home a little less complicated, and, well, more helpful. It’s a home where products work together to help you feel comfortable and safe, keep an eye on things when you’re away, and connect you to friends and family. Today, we’re committing to that goal by bringing the Home products under the Nest brand. Our first step as Google Nest is to go beyond the idea of a “smart home,” and to focus instead on creating a “helpful home.”

As part of that, we’re sharing updates across our lineup of Nest devices, our privacy commitments, accounts and platforms.

Our commitment to privacy in the home

To give you a better understanding of how our connected home devices and services will work in your home, we’ve outlined a set of privacy commitments that apply when these devices and services are used with Google Accounts:

We’ll explain our sensors and how they work. The technical specifications for our connected home devices will list all audio, video, and environmental and activity sensors—whether enabled or not. And you can find the types of data these sensors collect, and how that data is used in various features, on our dedicated help center page.

We’ll explain how your video footage, audio recordings, and home environment sensor readings are used to offer helpful features and services, and our commitment to keeping this data separate from advertising and ad personalization.

We’ll explain how you can control and manage your data, such as providing you with the ability to access, review, and delete audio and video stored with your Google Account at any time.

Our goal is simple: earn and keep your trust by clearly explaining how our products work and how we’ll uphold our commitment to respect your privacy.
To learn more, please check out our commitment to privacy in the home.

One secure account

Another step we’re taking to help keep your Nest devices secure is making Google Accounts available to anyone using existing Nest devices and services. Google has a long history of providing billions of people with industry-leading automatic security protections when accessing products like Gmail, Photos, Maps and Drive with their Google Account. Beginning this summer, you’ll have the option to migrate your Nest Account to a Google Account, which comes with the added benefits of tools and automatic security protections:

Suspicious activity detection sends you notifications whenever we detect unusual or potentially dangerous activity, such as suspicious sign-ins to your account.

Security Checkup provides personalized guidance to help you secure your account and manage your online security.

2-Step Verification strengthens your account security by adding an advanced second verification step whenever you sign in, including additional features like a prompt from a trusted device or the use of a physical security key.

If you already have a Google Account for Gmail and other Google products, you can migrate your Nest Account to that account. New Nest users will automatically start with a Google Account at the same time that existing users are invited to migrate to Google Accounts. You’ll be able to use your Google Account across both the Nest app and the Google Home app, and with all of Google’s other products. Having a single account will also enable our devices and services to work better together—for example, Nest Hub shows the live video from Nest Hello, so you can see who’s at the front door without any additional setup.

One home developer platform

And finally, we’re unifying our efforts around third-party connected home devices under a single platform for developers to build features and apps for a more helpful home.
To accomplish this, we’re winding down the Works with Nest developer program on August 31, 2019, and delivering a single consumer and developer experience through the Works with Google Assistant program, which works with more than 30,000 devices from over 3,500 home automation brands. For more details about all the changes, check out the What’s Happening with Google Nest FAQ.

As technology becomes a bigger part of our lives—especially when we’re at home—privacy and security are more important than ever. We recognize that we’re a guest in your home, and we respect and appreciate that invitation—and these updates are among the many ways we hope to continue to earn your trust.
This year, Android is reaching version 10 and operating on over 2.5 billion active devices. A lot has changed since version 1.0, back when smartphones were just an early idea. Now, they’re an integral tool in our lives—helping us stay in touch, organize our days or find a restaurant in a new place.

Looking ahead, we’re continuing to focus on working with partners to shape the future of mobile and make smartphones even more helpful. As people carry their phones constantly and trust them with lots of personal information, we want to make sure they’re always in control of their data and how it’s shared. And as people spend more time on their devices, building tools to help them find balance with technology continues to be our priority. That’s why we’re focusing on three key areas for our next release, Android Q: innovation, security and privacy, and digital wellbeing.

New mobile experiences

Together with over 180 device makers, Android has been at the forefront of new mobile technologies. Many of them—like the first OLED displays, predictive typing, and high-density, large screens with edge-to-edge glass—have come to Android first. This year, new industry trends like foldable displays and 5G are pushing the boundaries of what smartphones can do. Android Q is designed to support the potential of foldable devices—from multi-tasking to adapting to different screen dimensions as you unfold the phone. And as the first operating system to support 5G, Android Q offers app developers tools to build for faster connectivity, enhancing experiences like gaming and augmented reality.

See how Asphalt 9 adapts screen dimensions as you unfold the phone, a feature Android Q was built to support.

We’re also seeing many firsts in software driven by on-device machine learning. One of these features is Live Caption. For 466 million deaf and hard-of-hearing people around the world, captions are more than a convenience—they make content more accessible.
We worked closely with the Deaf community to develop a feature that would improve access to digital media. With a single tap, Live Caption will automatically caption media that’s playing audio on your phone. Live Caption works with videos, podcasts and audio messages, across any app—even stuff you record yourself. As soon as speech is detected, captions will appear, without ever needing Wi-Fi or cellular data, and without any audio or captions leaving your phone.

A video that gives background on the history of captioning: in 2009, Google added automatic captions to videos on YouTube, taking a step toward making videos universally accessible. With Live Caption, we’re bringing these captions to media on phones.

On-device machine learning also powers Smart Reply, which is now built into the notification system in Android, allowing any messaging app to suggest replies in notifications. Smart Reply will now also intelligently predict your next action—for example, if someone sends you an address, you can just tap to open that address in Maps.

Security and privacy as a central focus

Over the years, Android has built out many industry-first security and privacy protections, like file-based encryption, SSL by default and work profiles. Android has the most widely deployed security and anti-malware service of any operating system today thanks to Google Play Protect, which scans over 50 billion apps every day. We’re doing even more in Android Q, with almost 50 new features and changes focused on security and privacy. For example, we created a dedicated Privacy section under Settings, where you’ll find important controls in one place. Under Settings, you’ll also find a new Location section that gives you more transparency and granular control over the location data you share with apps. You can now choose to share location data with apps only while they’re in use.
Plus, you’ll receive reminders when an app has access to your location in the background, so you can decide whether or not to continue sharing. Android Q also provides protections for other sensitive device information, like serial numbers.

Finally, we’re introducing a way for you to get the latest security and privacy updates, faster. With Android Q, we’ll update important OS components in the background, similar to the way we update apps. This means that you can get the latest security fixes, privacy enhancements and consistency improvements as soon as they’re available, without having to reboot your phone.

Helping you find balance

Since creating our set of Digital Wellbeing tools last year, we’ve heard that they’ve helped you take better control of your phone usage. In fact, app timers helped people stick to their goals over 90 percent of the time, and people who use Wind Down had a 27 percent drop in nightly phone usage.

This year, we’re going even further with new features like Focus mode, which is designed to help you focus without distraction. You can select the apps you find distracting—such as email or the news—and silence them until you come out of Focus mode. And to help children and families find a better balance with technology, we’re making Family Link part of every device that has Digital Wellbeing (starting with Android Q), plus adding top-requested features like bonus time and the ability to set app-specific time limits.

Available in Beta today

Android Q brings many more new features to your smartphone, from new gesture-based navigation to Dark Theme (you asked, we listened!) to streaming media to hearing aids over Bluetooth LE. You can find some of these features today in Android Q Beta, and thanks to Project Treble and our partners’ commitment to enabling faster platform updates, the Beta is available on 21 devices from 13 brands, including all Pixel phones.
For the past three years, the Google Assistant has been helping people around the world get things done. The Assistant is now on over one billion devices, available in over 30 languages across 80 countries, and works with over 30,000 unique connected devices for the home from more than 3,500 brands globally. We’ve been working to make your Assistant the fastest, most natural way to get things done, and today at Google I/O we’re sharing our vision for the future.

The next generation Assistant

To power the Google Assistant, we rely on the full computing power of our data centers to support speech transcription and language understanding models. We challenged ourselves to reinvent these models, making them light enough to run on a phone. Today, we’ve reached a new milestone. Building upon advancements in recurrent neural networks, we developed completely new speech recognition and language understanding models, bringing 100GB of models in the cloud down to less than half a gigabyte. With these new models, the AI that powers the Assistant can now run locally on your phone. This breakthrough enabled us to create a next generation Assistant that processes speech on-device at nearly zero latency, with transcription that happens in real time, even when you have no network connection.

Running on-device, the next generation Assistant can process and understand your requests as you make them, and deliver answers up to 10 times faster. You can multitask across apps—so creating a calendar invite, finding and sharing a photo with your friends, or dictating an email is faster than ever before. And with Continued Conversation, you can make several requests in a row without having to say “Hey Google” each time. The next generation Assistant is coming to new Pixel phones later this year, and we can’t wait for you to try it out.
Running on-device, the next generation Assistant can process and understand your requests as you make them, and deliver answers up to 10 times faster.

Bringing Duplex to the web

Last year, we showed you how the Assistant can book restaurant reservations over the phone using Duplex technology. Since then, we’ve brought this feature to the Assistant on Android and iOS devices in the U.S., and we’re hearing positive feedback from both the people who’ve used it and local businesses.

Today we’re extending Duplex to the web, previewing how the Assistant can also help you complete a task online. Often when you book things online, you have to navigate a number of pages, pinching and zooming to fill out all the forms. With the Assistant powered by Duplex on the web, you can complete these tasks much faster since it fills out complex forms for you. Just ask the Assistant, “Book a car with National for my next trip,” and it will figure out the rest. The Assistant will navigate the site and input your information, like trip details saved in your Gmail or payment information saved in Chrome. Duplex on the web will be available later this year in English in the U.S. and U.K. on Android phones with the Assistant, for rental car bookings and movie tickets.

Book a car with National for an upcoming trip: with the Assistant powered by Duplex on the web, you can complete online tasks much faster.

A more personal Assistant

For a digital assistant to be helpful, it needs to understand the people, places and events that are important to you. In the coming months, the Assistant will be able to better understand references to all of these through Personal References.
You’ll be able to ask for things more naturally, like, “What’s the weather like at mom’s house this weekend?” or, “Remind me to order flowers a week before my sister’s birthday.” You always have control over your personal information, and can add, edit or remove details from the “You” tab in Assistant settings at any time.

As the Assistant understands you better, it can also offer more useful suggestions. Later this summer on Smart Displays, a new feature called “Picks for you” will provide personalized suggestions, starting with recipes, events and podcasts. So if you’ve searched for Mediterranean recipes in the past, the Assistant may show you Mediterranean dishes when you ask for dinner recommendations. The Assistant also takes contextual cues, like the time of day, into account when you’re asking for help, giving you breakfast recipes in the morning and dinner recipes at night.

“Picks for you” will provide personal suggestions starting with recipes, events and podcasts.

Introducing driving mode

In the car, the Assistant offers a hands-free way to get things done while you’re on the road. Earlier this year we brought the Assistant to navigation in Google Maps, and in the next few weeks, you’ll be able to get help from the Assistant using your voice when you’re driving with Waze.

Today we’re previewing the next evolution of our mobile driving experience with the Assistant’s new driving mode. We want to make sure drivers are able to do everything they need with just voice, so we’ve designed a voice-forward dashboard that brings your most relevant activities—like navigation, messaging, calling and media—front and center. It includes suggestions tailored to you, so if you have a dinner reservation on your calendar, you’ll see directions to the restaurant. Or if you started a podcast at home, you can resume right where you left off from your car.
If a call comes in, the Assistant will tell you who’s calling and ask if you want to answer, so you can pick up or decline with just your voice. The Assistant’s driving mode will launch automatically when your phone is connected to your car’s Bluetooth, or you can just say, “Hey Google, let’s drive,” to get started. Driving mode will be available this summer on Android phones with the Google Assistant.

The Assistant’s new driving mode features a voice-forward dashboard that brings your most relevant activities—like navigation, messaging, calling and media—front and center.

Streamline your drive with remote vehicle controls

We’re also making it easier to use the Assistant to control your car remotely, so you can adjust your car’s temperature before you leave the house, check your fuel level or make sure your doors are locked. Now the Assistant can do these things with just one or two commands—for example, “Hey Google, turn on the car A/C to 70 degrees.” You can also incorporate these vehicle controls into your morning routine to kickstart your commute. This new experience will be available in the coming months on existing car models that work with Hyundai’s “Blue Link” and Mercedes-Benz’s “Mercedes me connect.”

Just say “stop” to turn off your timer or alarm

Sometimes, you want help from your Assistant without having to say “Hey Google” every time. Starting today, you can turn off a timer or alarm just by saying, “Stop.” This feature runs completely on-device and is activated by the word “stop” when an alarm or timer is going off. This has been one of our top feature requests, and it’s available on Google Home speakers and all Smart Displays in English-speaking countries globally.

With faster responses using new on-device processing, a better understanding of you and your world, and more help in the car, the Assistant is continuing to get better at helping you get things done.
And today, we announced two new devices where you get help from the Assistant: Google Nest Hub Max and our new line of Pixel phones.
Last year, I read a social media post from a young woman in Israel. She shared a story about the man she was in a relationship with, who is deaf, and his struggle to fix the internet connection at their home. The internet service provider’s tech support had no way to communicate with him via text, email or chat, even though they knew he was deaf. She wrote about how important it was for him to feel independent and be empowered.

This got me thinking: How can we help people make and receive phone calls without having to speak or hear? This led to the creation of our research project, Live Relay.

Live Relay uses on-device speech recognition and text-to-speech conversion to allow the phone to listen and speak on the user’s behalf while they type. By offering instant responses and predictive writing suggestions, Smart Reply and Smart Compose help make typing fast enough to hold a synchronous phone call.

Live Relay runs entirely on the device, keeping calls private. Because Live Relay interacts with the other side via a regular phone call (no data required), the other side can even be a landline.

Of course, Live Relay would be helpful to anyone who can’t speak or hear during a call, and it may be particularly helpful to deaf and hard-of-hearing users, complementing existing solutions. In the U.S., for example, there are relay and real-time text (RTT) services available for the deaf and hard-of-hearing. These offer advantages in some situations, and our goal isn’t to replace these systems. Rather, we mean to complement them with Live Relay as an additional option for the contexts where it can help most, like handling an incoming call, or when the user prefers a fully automated system for privacy considerations.

We’re even more excited for Live Relay in the long term because we believe it can help all of our users. How many times have you gotten an important call but been unable to step out and chat?
With Live Relay, you would be able to take that call anywhere, anytime, with the option to type instead of talk. We are also exploring the integration of real-time translation, so that you could potentially call anyone in the world and communicate regardless of language barriers. This is the power of designing for accessibility first.

Live Relay is still in the research phase, but we look forward to the day it can give our users more and better ways to communicate—especially those who may be underserved by the options available today.

Follow @googleaccess for continued updates, and contact the Disability Support team (g.co/disabilitysupport) with any feedback.
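The listen-and-speak loop Live Relay describes can be sketched in a few lines. This is purely an illustrative stand-in, not Google’s implementation: the SpeechRecognizer and Synthesizer classes below are hypothetical stubs showing how on-device speech recognition and text-to-speech could bridge a typed conversation onto an ordinary voice call.

```python
class SpeechRecognizer:
    """Hypothetical on-device ASR: turns the caller's audio into text.

    The stub just decodes bytes; a real recognizer would run a speech
    model over the incoming call audio.
    """
    def transcribe(self, audio_chunk: bytes) -> str:
        return audio_chunk.decode("utf-8", errors="ignore")


class Synthesizer:
    """Hypothetical on-device TTS: turns typed text into call audio.

    The stub just encodes text; a real synthesizer would produce a
    waveform to play into the phone line.
    """
    def speak(self, text: str) -> bytes:
        return text.encode("utf-8")


def relay_turn(recognizer, synthesizer, caller_audio, typed_reply):
    """One turn of the call: show the caller's words on screen, and
    voice the user's typed reply back over the line."""
    shown_text = recognizer.transcribe(caller_audio)
    outgoing_audio = synthesizer.speak(typed_reply)
    return shown_text, outgoing_audio


shown, audio = relay_turn(SpeechRecognizer(), Synthesizer(),
                          b"Hi, this is your internet provider.",
                          "Hello! My connection is down.")
print(shown)
```

Because both directions stay on the device, nothing about the conversation needs to leave the phone, which is what keeps the call private and lets the far end be a plain landline.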
Most aspects of life involve communicating with others—and being understood by those people as well. Many of us take this understanding for granted, but you can imagine the extreme difficulty and frustration you’d feel if people couldn’t easily understand the way you talk or express yourself. That’s the reality for millions of people living with speech impairments caused by neurologic conditions such as stroke, ALS, multiple sclerosis, traumatic brain injuries and Parkinson’s.

To help solve this problem, the Project Euphonia team—part of our AI for Social Good program—is using AI to improve computers’ ability to understand diverse speech patterns, such as impaired speech. We’ve partnered with the non-profit organizations ALS Therapy Development Institute (ALS TDI) and ALS Residence Initiative (ALSRI) to record the voices of people who have ALS, a neurodegenerative condition that can result in the inability to speak and move. We collaborated closely with these groups to learn about the communication needs of people with ALS, and worked toward optimizing AI-based algorithms so that mobile phones and computers can more reliably transcribe words spoken by people with these kinds of speech difficulties. To learn more about how our partnership with ALS TDI started, read this article from Senior Director of Clinical Operations Maeve McNally and ALS TDI Chief Scientific Officer Fernando Vieira.

Example of phrases that we ask participants to read

To do this, Google software turns the recorded voice samples into a spectrogram, or a visual representation of the sound. The computer then uses commonly transcribed spectrograms to “train” the system to better recognize this less common type of speech.
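The spectrogram step described above can be illustrated with standard open-source tools. This is a minimal sketch using SciPy on a synthetic tone, not Google’s actual training pipeline; the parameters (16 kHz sample rate, 256-sample windows) are assumptions chosen only for the example.

```python
import numpy as np
from scipy.signal import spectrogram

# Synthesize one second of 16 kHz audio: a 440 Hz tone standing in
# for a recorded voice sample.
fs = 16000
t = np.arange(fs) / fs
audio = np.sin(2 * np.pi * 440 * t)

# Compute the spectrogram: frequency on one axis, time on the other,
# with each cell holding the signal's energy at that time/frequency.
freqs, times, sxx = spectrogram(audio, fs=fs, nperseg=256, noverlap=128)

# Log-scale the energies, a common step before feeding a speech model.
log_sxx = np.log(sxx + 1e-10)
print(log_sxx.shape)  # (frequency bins, time frames)
```

A speech model trained on many such images, paired with their transcriptions, learns which visual patterns correspond to which words—which is why collecting more voice samples directly improves recognition.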
Our AI algorithms currently aim to accommodate individuals who speak English and have impairments typically associated with ALS, but we believe that our research can be applied to larger groups of people and to different speech impairments.

In addition to improving speech recognition, we are also training personalized AI algorithms to detect sounds or gestures, and then take actions such as generating spoken commands to Google Home or sending text messages. This may be particularly helpful to people who are severely disabled and cannot speak.

The video below features Dimitri Kanevsky, a speech researcher at Google who learned English after he became deaf as a young child in Russia. Dimitri is using Live Transcribe with a customized model trained uniquely to recognize his voice. The video also features collaborators who have ALS, like Steve Saling—diagnosed with ALS 13 years ago—who use non-speech sounds to trigger smart home devices and facial gestures to cheer during a sports game.

Project Euphonia: Helping everyone be better understood

We’re excited to see where this can take us, and we need your help. These improvements to speech recognition are only possible if we have many speech samples to train the system. If you have slurred or hard-to-understand speech, fill out this short form to volunteer and record a set of phrases. Anyone can also donate to or volunteer with our partners, ALS TDI and the ALS Residence Initiative. The more speech samples our system hears, the more potential we have to make progress and apply these tools to better support everyone, no matter how they communicate.
Sometimes, the easiest way to wrap your head around new information is to see it. Today at I/O, we announced features in Google Search and Google Lens that use the camera, computer vision and augmented reality (AR) to overlay information and content onto your physical surroundings, to help you get things done throughout your day.

AR in Google Search

With new AR features in Search rolling out later this month, you can view and interact with 3D objects right from Search and place them directly into your own space, giving you a sense of scale and detail. For example, it’s one thing to read that a great white shark can be 18 feet long. It’s another to see it up close in relation to the things around you. So when you search for select animals, you’ll get an option right in the Knowledge Panel to view them in 3D and AR.

Bring the great white shark from Search to your own surroundings.

We’re also working with partners like NASA, New Balance, Samsung, Target, Visible Body, Volvo, Wayfair and more to surface their own content in Search. So whether you’re studying human anatomy in school or shopping for a pair of sneakers, you’ll be able to interact with 3D models and put them into the real world, right from Search.

Search for “muscle flexion” and see an animated model from Visible Body.

New features in Google Lens

People have already asked Google Lens more than a billion questions about things they see. Lens taps into machine learning (ML), computer vision and tens of billions of facts in the Knowledge Graph to answer these questions. Now, we’re evolving Lens to provide more visual answers to visual questions.

Say you’re at a restaurant, figuring out what to order. Lens can automatically highlight which dishes are popular, right on the physical menu. When you tap on a dish, you can see what it actually looks like and what people are saying about it, thanks to photos and reviews from Google Maps.
Google Lens helps you decide what to order.

To pull this off, Lens first has to identify all the dishes on the menu, looking for things like the font, style, size and color to differentiate dishes from descriptions. Next, it matches the dish names with the relevant photos and reviews for that restaurant in Google Maps.

Lens can be particularly helpful when you’re in an unfamiliar place and you don’t know the language. Now, you can point your camera at text and Lens will automatically detect the language and overlay the translation right on top of the original words, in more than 100 languages.

Google Lens translates the text and puts it right on top of the original words.

We’re also working on other ways to connect helpful digital information to things in the physical world. For example, beginning next month at the de Young Museum in San Francisco, you can use Lens to see hidden stories about the paintings, directly from the museum’s curators. Or if you see a dish you’d like to cook in an upcoming issue of Bon Appetit magazine, you’ll be able to point your camera at a recipe and have the page come to life and show you exactly how to make it.

See a recipe in Bon Appetit come to life with Google Lens.

Bringing Lens to Google Go

More than 800 million adults worldwide struggle to read things like bus schedules or bank forms. So we asked ourselves: “What if we used the camera to help people who struggle with reading?”

When you point your camera at text, Lens can now read it out loud to you. It highlights the words as they are spoken, so you can follow along and understand the full context of what you see. You can also tap on a specific word to search for it and learn its definition. This feature is launching first in Google Go, our Search app for first-time smartphone users.
Lens in Google Go is just over 100KB and works on phones that cost less than $50.

All these features in Google Search and Google Lens provide visual information to help you explore the world and get things done throughout your day, by putting information and answers where they are most helpful—right on the world in front of you.
It’s been two years since we built Family Link to help parents introduce their kids to technology. And while the devices we use every day open the door for families to explore, learn and play together online, they can also bring a new set of worries for parents. Over the past two years, we’ve helped families across the globe set digital ground rules with the Family Link app: a tool that offers parents a way to create and supervise Google Accounts for their kids, manage the content they see online and the amount of time they spend on their devices.

Available on every Android device

Today at Google I/O, we announced we’ll be making Family Link part of every Android device, starting with Android Q. This means that Family Link will be accessible from device settings, making setup even smoother for families. Look for it under the setting “Digital Wellbeing and parental controls” in Android Q devices rolling out later this summer.

App-specific time limits and bonus screen time

Since not all screen time is created equal, parents will soon be able to set app-specific time limits to help kids make better choices about how they’re spending time on their device. And while parents love that they can set a bedtime or daily screen time limit, sometimes kids just need a few more minutes to finish up what they’re doing on their devices. Soon, parents will be able to give kids bonus screen time directly from their own device.

A screenshot showing where you can access Family Link under “Digital Wellbeing and parental controls” in Settings.
A screenshot showing a phone’s list of apps with certain corresponding time limits.
A screenshot showing how a parent can add Bonus Time to their child’s device.

A foundation for healthy digital habits

Since 2017, we’ve heard your feedback loud and clear: 67 percent of parents are worried about the amount of time their kids are spending on devices.
In addition to today’s updates, we’ve been focused on making sure that the time your family spends on technology is the best it can possibly be. Here are a few ways we’ve done that:

While Family Link was originally designed for kids under 13, we heard from parents that the app was still useful as their kids became teenagers. Last year, we rolled out the ability for parents around the world to use Family Link to supervise their teen’s existing Google Account.

Beyond mobile phones, we added better Chromebook support so parents and children can use Family Link across different Google platforms.

When apps come with a stamp of approval from a teacher, parents can feel more at ease knowing their children are engaging with healthy, educational content online. That’s why we have teacher recommendations: a collection of educational Google Play apps, recommended by teachers, that are a good fit for children of specific ages.

Families can learn, play and imagine together with the Assistant on Google Home, other smart speakers and eligible phones, with over 50 games, activities and stories designed for families with kids.

Be sure to check out the latest updates, and if you want to share your ideas with us, just open the Family Link app, tap the menu in the top left corner and tap “Help and feedback.”
To kick off this year’s Google I/O, we hosted our fourth annual Google Play Award ceremony to recognize the most innovative developers behind the top apps and games on Google Play over the past year. These apps and games had stiff competition across nine categories, including new additions like Most Inventive, Best Living Room Experience and Most Beautiful Game. We’re sharing the winners that rose to the top for providing the best experiences for fans, making an impact on their communities and raising the bar for quality content on Google Play.

Standout Well-Being App
Apps empowering people to live the best version of their lives, while demonstrating responsible design and engagement strategies.
Winner: Woebot: Your Self-Care Expert by Woebot Labs

Best Accessibility Experience
Apps and games enabling device interaction in an innovative way that serves people with disabilities or special needs.
Winner: Envision AI by Envision Technologies BV

Best Social Impact
Apps and games that create a positive impact in communities around the world (focusing on health, education, crisis response, refugees and literacy).
Winner: Wisdo by Wisdo LTD.

Most Beautiful Game
Games that exemplify artistry or unique visual effects, either through creative imagery or by utilizing advanced graphics API features.
Winner: SHADOWGUN LEGENDS by MADFINGER Games

Best Living Room Experience
Apps that create, enhance or enable a great living room experience that brings people together.
Winner: Neverthink: Handpicked videos by Neverthink

Most Inventive
Apps and games that display a groundbreaking new use case, such as utilizing new technologies, catering to a unique audience or demonstrating an innovative application of mobile technology.
Winner: Tick Tock: A Tale of Two by Other Tales Interactive

Standout Build for Billions Experience
Apps and games with optimized performance, localization and culturalization for emerging markets.
Winner: Canva: Graphic Design & Logo, Flyer, Poster maker by Canva

Best Breakthrough App
New apps with excellent overall design, user experience, engagement, retention and strong growth.
Winner: SLOWLY by Slowly Communications Ltd.

Best Breakthrough Game
New games with excellent overall design, user experience, engagement, retention and strong growth.
Winner: MARVEL Strike Force by FoxNext Games

To check out this year’s winners, head over to play.google.com/gpa2019.
When you’re in the driver’s seat, there’s a lot to think about: from getting directions to your next destination to staying connected on the go. That’s why we created Android Auto—to help make your driving experience easier and safer. Since we started five years ago, Android Auto has expanded to support more than 500 car models from 50 different brands, and we aren’t pumping the brakes there!

Today, we’re introducing a new design that will roll out to all Android Auto compatible cars later this summer. The new interface is built to help you get on the road faster, show more useful information at a glance and simplify common tasks while driving. Buckle up, as we walk you through Android Auto’s new look.

See turn-by-turn directions while controlling other apps on the same screen.
The new launcher introduces a familiar way to easily discover and start apps compatible with Android Auto.
A new dark theme blends in with modern automotive interiors, while incorporating the best of Google’s Material Design.
Notifications for incoming phone calls and messages make it easy to stay connected.
The new notification center shows recent messages and calls in a place that’s familiar and easy to access.

Get on the road faster: As soon as you start your car, Android Auto will continue playing your media and show your navigation app of choice. Simply tap on a suggested location or say “Hey Google” to navigate to a new place.

Stay on top of your apps: With the new navigation bar, you’ll be able to see your turn-by-turn directions and control your apps and phone on the same screen.

Do more with fewer taps: You’ll also be able to easily control your apps with one tap. Get turn-by-turn directions, rewind your podcast or take an incoming call, all on the same screen.

Easily manage communications: The new notification center shows recent calls, messages and alerts, so you can choose to view, listen and respond at a time that’s convenient and safe for you.

A color palette that’s easy on the eyes: We’re evolving Android Auto’s design to fit in better with your car’s interior. A dark theme, coupled with colorful accents and easier-to-read fonts, also helps improve visibility.

A screen fit for more cars: If you have a car with a wider screen, Android Auto now maximizes your display to show you more information, like next-turn directions, playback controls and ongoing calls.

Get ready to hit the road later this summer with all these new features! If you’re joining us at I/O this week, check out these updates at the Android for Cars Sandbox. We’ll also be sharing details at the “What’s New with Android for Cars” session on May 7 from 4 to 5 p.m. PST.
From May 7 through May 9, more than 7,000 developers will head to Shoreline Amphitheatre in Mountain View for I/O, Google’s annual conference—and take part in talks and events in an area that’s usually a parking lot. In charge of turning that blank space into a festival-like atmosphere is Amanda Matuk, who has been part of the team running the conference for the past 10 years.

Amanda, who is the event’s executive producer, has been in charge of I/O for the past four years. Planning takes six to nine months every year, and ends with three hectic days on site. For this installment of The She Word, I asked Amanda exactly how she gets it done—and which songs she blasts in her car to get pumped up for the big day.

How do you describe your job at a dinner party?

I build things: teams, processes and ideas. My role at Google is split. As the Head of Hardware Experiences, I manage all our hardware activities that take place in real life, from press moments to consumer installations where folks can get hands-on with our products. As the internal executive producer of I/O, I look after an 80+ person team, taking I/O from an idea on paper in November to a three-day live experience in May.

Attendees at Shoreline Amphitheatre in 2018.

You were on the team that moved I/O from San Francisco to Mountain View. How did that change the event?

The change of location was a pivotal moment for the company. It was late 2015 when we decided to make the move, as Sundar Pichai had just stepped up as CEO. We wanted to reconnect with the developer community based in Silicon Valley. We physically returned to our roots, and celebrated that community in a venue typically reserved for concerts.
In doing so, we challenged the standard conference format, and also put developers—our core users on many of our platforms—at the center of the conversation.

Sundar Pichai delivers last year’s keynote at I/O.

Your schedule must be jam-packed, especially the week of I/O. How do you stay calm throughout the madness?

I operate under the principle that if you can do something now, do it now. Procrastination is a really natural thing I think we all do, but especially on site, when there are a thousand tiny micro-decisions that come up in a given day, it’s important to do what you can in the moment.

Also, it’s super cheesy, but I make a playlist that I listen to on the drive in on I/O days. Last year’s playlist included “Unstoppable” by Sia, “Run the World (Girls)” by Beyoncé, and “I’m Every Woman” by Whitney Houston. There’s nothing like starting the day with a bit of musical female empowerment. (Told you I’m cheesy!)

What’s your schedule like the week of I/O?

Once we get to the week of I/O, my job is to support the team. Nobody builds a conference of this scale and level of creative detail alone. My only true solo moment is on the first day. I like to arrive at 6 a.m. and walk the grounds before we open the gates. I started this ritual at the first I/O at Shoreline to remind myself that what was once a parking lot is now effectively a city layout, ready for thousands of developers to occupy for the following three days.

A typical day is spent checking in with teammates, managing the various production teams who operate on a rolling schedule, and monitoring potential challenges like the ever-present lunch rush. I average 28,000 steps a day during I/O.

After Dark, our nighttime setup, at I/O 2018.

What’s one moment you’ll remember from your years on the team?

Something I’ll remember for years to come is the opening moment in 2016.
To have Sundar, a former product manager, stand on the stage as CEO and open what felt like a rock concert of a conference was something really special. We had our new leader speaking to the developer world, making them feel celebrated in a very real and genuine way, and we ushered in a new style of conference.

Did you always want to run big events like this? What advice would you have for women starting out in their careers?

I started my career thinking I was going to be a lawyer. I was working in a law firm, studying for the LSAT, but I wasn’t energized by the work. I took a hard left turn and got into tech, starting in sales and eventually moving into marketing on the events and experiences team. My main advice is something I have to remind myself every day: the path’s not linear. Just because you’re on a certain path now does not mean it is “the path.” When you’re starting out in your career, keep your eyes open to possibility, really listen to your intuition, and if an opportunity speaks to you, it’s probably worth pursuing. Your career is not necessarily going to be a straight line, but it can absolutely be a fun journey.
Editor’s note: Happy Teacher Appreciation Week! We’re honored to have the 2019 National Teacher of the Year, Rodney Robinson, as today’s guest author (and Doodler), who shares more about his journey and all the ways we’re celebrating teachers this week and beyond.

I went into teaching to honor my first teacher: my mother, Sylvia Robinson. Growing up in rural Virginia, she dreamed of becoming an educator but was denied the chance due to poverty and segregation; instead, she ran an in-home daycare center for all the neighborhood children, where she made each of us feel like we were the most important person on earth.

My mother always said, “every child deserves the proper amount of love to get what they need to be successful in life.” My sister, who had cerebral palsy, often needed more of my mother’s love and care than me and my other siblings did. Through her parenting, I learned what it meant to create a culture of equity—where every person gets the right amount of support they need to be successful—which has proven critical in my own teaching journey.

Today I teach social studies in a juvenile detention facility in Virginia, where I work to create a positive school culture and empower my students to become civically minded social advocates. When I was selected as Virginia’s Teacher of the Year, and then National Teacher of the Year, I was elated—mostly for my students. Their stories don’t often fit into the typical educational story in America, but they represent the power and possibility of second chances. They deserve a great education to take advantage of that second chance, and I’m eager to advocate for what they—along with other students from underprivileged backgrounds—need to be successful.
That’s also why I’m so happy that Google is showing up this Teacher Appreciation Week, including with a new $5 million grant to DonorsChoose.org, to make it easier for us to build classrooms that reflect the diversity of our students.

Today’s Doodle was co-designed by the 57 2019 Teachers of the Year, representing each U.S. state, extra-state territories, the District of Columbia and the Department of Defense Education Activity.

Google’s homepage today is a tribute to teachers, and I feel proud to see the contribution I made—alongside my 56 fellow State Teachers of the Year—up there for everyone to see. Since Google is a sponsor of the Council of Chief State School Officers’ (CCSSO) National Teacher of the Year program, we had the opportunity to spend a few days at Google’s Bay Area headquarters, where I learned a lot about technology and about using storytelling, advocacy and leadership in my practice. I am glad to see companies like Google have teachers’ backs.

While at Google, I got to engage in meaningful discussions with my fellow 2019 Teachers of the Year about how together we can advocate for solutions to some of the biggest issues in education.

A $5 million investment to bring teachers’ ideas to life

Today Google is making one of its largest teacher-focused grants to date: a $5 million Google.org grant that will unlock over $10 million for teachers through DonorsChoose.org, a nonprofit organization that helps teachers get funding for classroom resources and projects. For every dollar you donate to a teacher’s classroom on DonorsChoose.org today, Google will put in an extra fifty cents, from 8:00 AM EST on Monday, May 6 until 3:00 AM EST on Tuesday, May 7, up to $1.5 million total.

Later this month, the remaining $3.5 million of this grant will go toward supporting underrepresented teachers and those looking to create more inclusive classrooms.
Representation means so much to my students, which is why it’s so important to have teachers who value their cultures and look like them.

Free resources and trainings for educators, by educators

Google is also launching free online and in-person resources and trainings. In the Teacher Center, you’ll find a new section with teacher guides and lesson plans—created for teachers, by teachers—made to help create classrooms that best reflect our students. And throughout the week, you can attend free in-person trainings for educators in the new Google Learning Center in New York City, led by teachers like me(!) and 2015 National Teacher of the Year Shanna Peeples, as well as teacher-focused organizations like TED-Ed. I’ll also be doing an Education On Air session later this week, so you can even tune in virtually.

Making it easier for teachers to learn from one another

As teachers, we often learn from each other. That’s why all of the 2019 State Teachers of the Year have recorded words of insight and encouragement to share with our fellow educators as part of CCSSO and Google’s “Lessons from Teachers of the Year” YouTube series.

As part of our work with Google, we also received early access to TED Masterclass, a new TED-Ed professional learning program, sponsored by Google, that supports educators in sharing their ideas in the form of TED-style talks. You can now check out several of my fellow educators’ TED Talks on the newly launched TED-Ed Educator Talks YouTube channel. More than 5,000 educators, including Google Certified Innovative Educators, are busy developing their Talks.

I hope you’ll join us in celebrating teachers everywhere who go the extra mile to help every student succeed. You can start exploring classroom projects eligible for today’s match on DonorsChoose.org, and of course, remember to #thankateacher—because we deserve it.
During the tragic events of September 11, 2001, people struggled to find timely, trustworthy news and information in Google Search. When they looked for information about what was going on in New York, our algorithms showed results about the city’s history or recommendations for travelers.

Soon after, in 2002, we launched Google News to solve this problem. We built Google News’ homepage to help users discover diverse perspectives from multiple news outlets about the news of the day, prompting them to dive deeper into individual articles and making it easier to compare different views.

Over the past 17 years, we have integrated that thinking into the news products and features we have built for Google Search, YouTube, the Assistant, Discover and more. During this same time, the online news ecosystem has become richer, more diverse and more complex. The modern news industry creates opportunities for everyone to explore more of the world than we ever could before, and to be exposed to perspectives we may not have encountered otherwise. That said, it can also make it difficult to stay informed and to understand which sources to trust.

In response to these changes, we continue to evolve our news experiences in Google products. While we’ve already done a lot to explain How Google Search Works, people often ask us how we go about building news experiences in Google Search, Google News, Discover, YouTube or the Assistant. So today, we are launching a How News Works website to do just that. It outlines the objectives of our work, the principles we follow and the approaches we take in the design of news experiences in Google products.

Supporting the news ecosystem, and its readers

Google aims to help everyone better understand the world by connecting them with high-quality news from a variety of perspectives. We do this in real time for Google News editions around the world.
The algorithms used for our news experiences analyze hundreds of different factors to identify and organize the stories journalists are covering, in order to elevate diverse, trustworthy information.

Google does not make editorial decisions about which stories to show, except in the infrequent case of designated topical experiences. In these cases, we may want to make sure that there is a dedicated topic in Google News for a significant event, such as the Oscars or the World Cup. We make it clear to users when these topical experiences take place.

News experiences rely on the sustainability of high-quality journalism, so we strive to help journalism flourish by bringing new audiences to publishers. Google’s news products and features send web traffic to news sources all around the world, helping them expand their reach. In addition, we develop tools to help publishers turn their readers into subscribers, and the Google News Initiative offers programs to help address industry-wide challenges and fuel innovation in journalism.

How we build news experiences

Everyone has different expectations and preferences when it comes to exploring news. Over the course of one day, we might want to know the stories at the top of the day’s agenda, get the latest on topics we personally care about, or get more context about a story we want to explore further. That’s why Google provides three distinct but interconnected ways to discover news across our products and devices:

Top News, for everyone, with features like Headlines in Google News or Breaking News on YouTube. These showcase the important stories being covered at a given point in time, and are not personalized.

News personalized for you, with products like Discover or features like For You in Google News, or the Latest tab of the YouTube app on TVs, that help you stay informed about subjects that matter to you.

Deep context and diverse perspectives, featuring unpersonalized news from a broad range of sources, within Top Stories in Search, Top News search results on YouTube or Full Coverage in Google News.

New features need to pass a rigorous evaluation process that involves both live tests and thousands of trained external Search quality raters around the world. We also seek user feedback before and after product launches to understand how to further improve the services we provide.

You will find more information about these topics on our How News Works website, including some of the signals our ranking systems look at and more details about the news experiences currently available on Google.
Spring was blooming as we welcomed thousands of people to our big annual event: Google Cloud Next ’19. We heard great stories about how customers are growing their businesses with cloud, and we planted the seeds for lots of new ideas and connections. Read on to catch up on Next and what else was new last month with Google Cloud.

We introduced our new cloud service.

Our big news in April, introduced by Sundar on stage at Next ’19, was Anthos, our open cloud platform. Anthos cuts out the extra time application developers currently spend rewriting applications for different environments. For example, a developer might write code for an app running in their data center, and then have to rewrite it to work on another cloud, or in Google Cloud. Anthos lets developers package their code once, then run it across different types of clouds, which saves a ton of time.

There’s a new way to stash data in the cloud.

If you’ve ever had to track down an old email containing important information, you know how suddenly necessary that email becomes, even if you received it years ago. That’s the idea behind archive storage, which lets companies store data they rarely access for a long time, more cheaply than the data they need regularly. At Next ’19, we announced a new type of cloud storage that lets companies access that archived data right away if and when they need it. It’s like a digital storage unit, and could be particularly useful for companies that have to store data for legal or compliance reasons.

There are some new ways to crunch the numbers.

One popular way to use Google Cloud is for analyzing data. BigQuery is our cloud-based data warehouse: a tool that can quickly store and analyze huge amounts of data.
Companies can learn from the data and get new information that would be slow or difficult to get from traditional, non-cloud data warehouses, helping them make decisions and work more efficiently. These tools are popular and growing—the volume of data analyzed in BigQuery grew by more than 300 percent in the past year. We recently added new features that make it easier to get a company’s data into the cloud, and then to clean it up (by taking out duplicate information, for example) and organize it.

We wished Gmail a happy birthday.

Now that Gmail is 15, it lets you schedule when you send emails and reply to a Google Docs comment directly within an email thread (no more flipping back and forth). Plus, Smart Compose in Gmail is now available in Spanish, French, Italian and Portuguese.

122 other things happened at Next.

So many new products, features, and customer stories came out of Next ’19 that we made a list of all 122 announcements from the show. The list starts with new regions coming online—those are new data centers that can help users get faster cloud access—and runs down all kinds of techie news. Scroll down to see some cool customer stories, such as UPS using GCP to help efficiently deliver millions of packages per day.

That’s a wrap for April! We’ll see you next month.
To help people better understand the election ads they see online and support the integrity of elections, earlier this year we implemented a new process to verify advertisers for the EU Parliamentary election. We also require that these verified election ads include a clear “paid for by” disclosure.

Today, we are expanding our portfolio of transparency reports to include an EU Political Advertising on Google Transparency Report, which shows voters who is purchasing election ads on Google in the EU and how much money is being spent. The report includes a searchable election Ad Library with data such as which ads had the highest impressions, which election ads are currently running on Google, and how the ads are targeted by age, gender and location. Anyone can access this information and easily sort through the data. The report will be updated weekly to capture verified election ads that have received one or more impressions.

As with our U.S. and India election advertising Transparency Reports, the data from the EU election advertising Transparency Report and Ad Library will soon be available to anyone with an interest in transparency on Google Cloud’s BigQuery. Using the BigQuery API, researchers, political watchdogs and private citizens can write code and run their own queries on this data set to develop charts, graphs, tables or other visualizations of election advertising on Google platforms.

Supporting elections around the world is hugely important to us. We’ll continue to work in partnership with the EU through its Code of Practice on Disinformation, including by publishing regular reports about our work to prevent abuse, as well as with governments, law enforcement, others in our industry and the NGO community to strengthen protections around elections, protect users, and help combat disinformation.
For more than 20 years, generations of fans have delved into the fantastical Pokémon universe on a mission to meet them all. Starting May 10th, you can experience the adventure in theaters in the first Pokémon live-action movie, “POKÉMON Detective Pikachu.”

But you don’t have to wait for the movie to come out to see Pokémon in the wild. We’re launching the POKÉMON Detective Pikachu Playmoji pack today in Playground, a creative mode in your smartphone camera. Now you can partner up with Detective Pikachu, Charizard, Jigglypuff and Mr. Mime to create action-packed scenes in the real world. All you have to do is point your camera and drop one of the Playmoji (or all four) into a scene to bring them to life in your photos and videos.

[Image captions: Detective Pikachu, a yellow Pokémon with an electric tail, standing on a table next to a cup of coffee. Mr. Mime, a mime Pokémon with blue hair, standing in front of a garage. Jigglypuff, a pink Pokémon with large blue eyes, in a park. Charizard, an orange dragon Pokémon, standing on a beach.]

The pack features Pokémon from the movie, fully animated and sounding just like their film counterparts. And thanks to ARCore’s motion tracking, light estimation and ability to understand the real world, they feel like they’re really there with you. You can even take a selfie with Detective Pikachu and share a smile as he reacts to your facial expressions in real time via machine learning.

So whether you’re singing alongside Jigglypuff or breathing fire with Charizard, partner up with our #PikaPlaymoji and start sharing your scenes with #DetectivePikachu on social today. Download the POKÉMON Detective Pikachu Playmoji pack now on Pixel and find it on select Motorola and LG devices.

Welcome to the world of Pokémon.
Whether you still live with mom or are a road trip away, Google Maps can help you get things done so you can make the most of Mother’s Day (less than two weeks away!). Here are a few helpful features in Maps to try out to make that second Sunday in May a real mom-pleaser.

Mother’s Day brunch, without the wait

Mother’s Day brunch has always been a thing. And lately, brunch every weekend has become an even bigger thing. Standing in line for hours for those delicious eggs benny you promised your mom could really put a damper on the day. With Google Maps, you can add yourself to a participating restaurant’s waitlist and get a text when your table’s ready. Instead of waiting in line, use the time to stroll around a nearby park and get mom caught up on your new hobbies (or maybe even your love life). If you’re more of a planner, you can always book a reservation in advance from a restaurant’s Business Profile in Maps and Search, too.

Bring out her inner kid

If you were a lucky kid, your mom probably spent a good chunk of her free time planning fun and educational outings to help your little mind expand. This Mother’s Day, show your mom how thankful you are for all of those excursions by bringing her back to the places she used to take you. Just find the aquarium, zoo, or any other place that brings back fond memories and tap the “buy tickets” button to make it happen.

Take a walk down memory lane

If mom is the nostalgic type, you can create a list in Maps highlighting all the favorite places you’ve visited together over the years. Create a custom scavenger hunt to guide her to each place, or just visit a few of them during the day to remember all the fun you had and make new memories.
Even if you don't get to all of the spots on Mother's Day, you can share the list with her directly from Maps, so you can pick up where you left off next time you're together.

And even though Mother’s Day comes only once a year, don’t forget to add a label to mom’s address in Maps, so you can easily get a real-time ETA and directions every time you head home for a visit.