Collaborating to protect nearly anonymous animals

According to WWF, wildlife populations have dwindled by 60 percent in less than five decades. And with nearly 50 species threatened with extinction today, technology has a role to play in preventing endangerment.

With artificial intelligence (AI), advanced analytics and apps that speed up collaboration, Google is helping organizations like WWF in their work to save our precious planet's species. Here are some of the ways.

  • Curating wildlife data quickly. A big part of increasing conservation efforts is having access to reliable data about the animals that are threatened. To help, WWF and Google have joined a number of other partners to create the Wildlife Insights platform, a way for people to share wildlife camera trap images. Using AI, the species are automatically identified, so that conservationists can act more quickly to help recover global wildlife populations.
  • Predicting wildlife trade trends. Using Google search queries and known web page content, Google can help organizations like WWF predict wildlife trade trends, similar to how we can help see flu outbreaks coming. This way, we can help prevent a wildlife trafficking crisis more quickly.
  • Collaborating globally with people who can help. Using G Suite, which includes productivity and collaboration apps like Docs and Slides, Google Cloud, WWF and Netflix partnered to draft materials and share information quickly to help raise awareness for Endangered Species Day (not to mention, cut back on paper).
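The trend-prediction idea above can be illustrated with a toy model. This is a hedged sketch, not Google's actual method: the numbers, the linear relationship, and the `fit_line` helper are all invented for illustration. It fits a least-squares line from weekly search-query volume to an observed trade indicator, then extrapolates for a week with unusually high query volume.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (toy helper, not a real API)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return a, my - a * mx

# Hypothetical historical data: weekly query volume vs. recorded incidents.
queries = [120, 150, 170, 200, 230]
incidents = [12, 15, 18, 21, 23]
a, b = fit_line(queries, incidents)

# A spike in this week's queries predicts a corresponding spike in incidents.
predicted = a * 300 + b
```

A real system would use many query terms, seasonality adjustments, and far more data, but the core idea is the same: learn a mapping from search behavior to an outcome, then watch the inputs for early warning.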

What you can do to help
Conservation can seem like a big, hairy problem that's best left to the experts to solve. But there are small changes we can make right now in our everyday lives. When we all work together to make these changes, they can make a big difference.

Check out this Slides presentation to find out more about how together, we can help our friends. You can also take direct action to help protect our planet on the “Our Planet” website.

Make your smart home more accessible with new tutorials

I’m legally blind, so from the moment I pop out of bed each morning, I use technology to help me go about my day. When I wake up, I ask my Google Assistant for my custom-made morning Routine, which turns on my lights, reads my calendar and plays the news. I use other products as well, like screen readers and a refreshable braille display, to help me be as productive as possible.

I bring my understanding of what it’s like to have a disability to work with me, where I lead accessibility for Google Search, Google News and the Google Assistant. I work with cross-functional teams to help fulfill Google’s mission of building products for everyone—including those of us in the disabled community.

The Assistant can be particularly useful for helping people with disabilities get things done. So today, Global Accessibility Awareness Day, we’re releasing a series of how-to videos with visual and audible directions, designed to help the accessibility community set up and get the most out of their Assistant-enabled smart devices.

You can find step-by-step tutorials to learn how to interact with your Assistant, from setting up your Assistant-enabled device to using your voice to control your home appliances, at our YouTube playlist which we’ll continue to update throughout the year.

Three new machine learning courses

Many years ago, I took a dance lesson in Budapest to learn the csárdás, a Hungarian folk dance. The instructor shouted directions to me in enthusiastic Hungarian, a language I didn’t understand, yet I still learned the dance by mimicking the instructor and the expert students. Now, I do love clear directions in a lesson—I am a technical writer, after all—but it’s remarkable what a person can learn by emulating the experts.  

In fact, you can learn a lot about machine learning by emulating the experts. That’s why we’ve teamed with ML experts to create online courses to help researchers, developers, and students. Here are three new courses:

  • Clustering: Introduces clustering techniques, which help find patterns and related groups in complex data. This course focuses on k-means, which is the most popular clustering algorithm. Although k-means is relatively easy to understand, defining similarity measures for k-means is challenging and fascinating.
  • Recommendation Systems: Teaches you how to create ML models that suggest relevant content to users, leveraging the experiences of Google’s recommendation system experts. You’ll discover both content-based and collaborative filtering, and uncover the mathematical alchemy of matrix factorization. To get the most out of this course, you’ll need at least a little background in linear algebra.
  • Testing and Debugging: Explains the tricks that Google’s ML experts use to test and debug ML models. Google’s ML experts have spent thousands of hours deciphering the signals that faulty ML models emit. Learn from their mistakes.    
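The k-means loop at the heart of the Clustering course is compact enough to sketch. This is a minimal, illustrative pure-Python version (not course material), using squared Euclidean distance as the similarity measure and alternating an assignment step with an update step:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means: assign each point to its nearest centroid, then
    move each centroid to the mean of its assigned points."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(
                range(k),
                key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centroids[j])),
            )
            clusters[i].append(p)
        # Update step: each non-empty cluster's centroid moves to its mean.
        for i, c in enumerate(clusters):
            if c:
                centroids[i] = tuple(sum(dim) / len(c) for dim in zip(*c))
    return centroids, clusters

# Two well-separated groups of 2-D points.
data = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2), (9.0, 9.1), (9.2, 9.0), (9.1, 9.2)]
centroids, clusters = kmeans(data, k=2)
```

The interesting part, as the course description hints, is not this loop but choosing the similarity measure: squared Euclidean distance works for simple numeric features, and much of the course is about what to do when it doesn't.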
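The "mathematical alchemy of matrix factorization" from the Recommendation Systems course can also be sketched in miniature. This is an assumption-laden toy, not the course's code: it learns a low-rank factorization of a tiny ratings matrix by stochastic gradient descent, so that the dot product of a user vector and an item vector approximates each observed rating.

```python
import random

# Observed (user, item) -> rating entries of a sparse 3x3 ratings matrix.
ratings = {(0, 0): 5.0, (0, 1): 3.0, (1, 0): 4.0,
           (1, 2): 1.0, (2, 1): 4.0, (2, 2): 5.0}
n_users, n_items, k = 3, 3, 2  # k latent factors per user/item

rng = random.Random(0)
U = [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_users)]
V = [[rng.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_items)]

lr = 0.05
for _ in range(3000):
    for (u, i), r in ratings.items():
        # Prediction is the dot product of user and item factor vectors.
        pred = sum(U[u][f] * V[i][f] for f in range(k))
        err = r - pred
        # Gradient step on squared error for both factor vectors.
        for f in range(k):
            U[u][f], V[i][f] = (U[u][f] + lr * err * V[i][f],
                                V[i][f] + lr * err * U[u][f])

# Unobserved cells of U @ V^T now serve as recommendation scores.
max_err = max(abs(r - sum(U[u][f] * V[i][f] for f in range(k)))
              for (u, i), r in ratings.items())
```

Predicting the blanks of the matrix from its learned factors is exactly the collaborative-filtering idea the course builds on; content-based filtering instead scores items from their own features.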

These new courses are engaging, practical, and helpful. They build on a series of courses we released last year, starting with Machine Learning Crash Course (MLCC), which teaches the fundamentals of ML. If you enjoyed MLCC, you’re ready for these new courses. They will push you to think differently about the way you approach your work. Take these courses to copy the moves of the world’s best ML experts.

Building for all learners with new apps, tools, and resources

Assessing with accessibility in mind

Teachers use locked mode, available only on managed Chromebooks, to eliminate distractions when giving Quizzes in Google Forms. Locked mode is now used millions of times per month, and many students use additional apps for accommodations when taking quizzes. We’ve been working with many developers to make sure their tools work with locked mode. One of those developers is our partner Texthelp®. Coming soon, when you enable locked mode in Quizzes in Google Forms, your students will be able to access the Read&Write for Google Chrome and EquatIO® for Google tools they rely on daily.

Another partner, Don Johnston, supports students with apps including Co:Writer (word prediction, translation, and speech recognition) and Snap&Read (read aloud, highlighting, and note-taking). Students signed into these extensions can use them on the quiz—even in locked mode. This integration will be rolling out over the next couple of weeks.

Learn more about the accessibility features available in locked mode, including ChromeVox, select-to-speak, and visual aids including high contrast mode and magnifiers.

Affirming the identities of teachers and students in the classroom

Editor’s note: We’re thrilled to have Kristina Joye Lyles from DonorsChoose.org as a guest author, sharing about teaming up with Google.org to launch the #ISeeMe campaign.

I joined DonorsChoose.org in 2013 and have long been working with organizations like Google.org who share our belief in the power of teachers. To date, Google.org has provided over $25 million to support classrooms on DonorsChoose.org, and last week, they committed an additional $5 million to teachers, with a focus on supporting diverse and inclusive classrooms. Together, we’re kicking off #ISeeMe, a new effort to enable teachers and students across the country to celebrate their identities in their classrooms.

As a military brat, I attended many public schools across the U.S. but only had two teachers of color from kindergarten through twelfth grade. My teachers and professors of color had a particularly strong impact on me as mentors and role models; I was encouraged to see them as leaders in our school community, and their presence alone showed me that diversity and representation matter.

My story is like those of so many others. Research shows that students benefit from seeing themselves in their teachers and learning resources. For example, black students who have just one black teacher between third and fifth grade are 33 percent more likely to stay in school. Girls who attend high schools with a higher proportion of female STEM teachers are 19 percent more likely to graduate from college with a science or math major.

With this support from Google.org, teachers who are underrepresented in today’s public school classrooms, like teachers of color and female math and science teachers, as well as all teachers looking to create more inclusive classrooms, will get the support they need and deserve. Teachers from all backgrounds can take steps toward creating classrooms that reflect their students, whether they’re selecting novels with diverse characters to discuss or taking trainings to learn more about meeting the needs of students from culturally diverse backgrounds. And we’re eager to help them bring their ideas to life so that more students can see themselves reflected in their classrooms.

I’m thrilled that many teachers on DonorsChoose.org are already coming up with inspiring ways to foster classroom environments where every student can feel important and included. Mr. Yung sees the power of food to bring his students together across different cultural backgrounds. Ms. McLeod is determined to bring her students from Lumberton, North Carolina, to the National Museum of African-American History and Culture in Washington, D.C. Mrs. Toro-Mays aspires to bring her bilingual students books with culturally relevant heroes and heroines.

We hope you’ll join us and the philanthropists of various backgrounds who have lit the torch for #ISeeMe today. If you are a public school teacher, you can set up an #ISeeMe classroom project right now at DonorsChoose.org. You can also access free inclusive classroom resources and ideas created for educators, by educators at any time in Google’s Teacher Center. And for those of you who have been inspired by a teacher, we invite you to explore classroom projects that are eligible for Google.org’s #ISeeMe donation matching—we would love to have your support for these teachers and classrooms.

We hear you: updates to Works with Nest

Last week we announced that we would stop supporting the Works with Nest (WWN) program on August 31, 2019 and transition to the Works with Google Assistant platform (WWGA). The decision to retire WWN was made to unify our efforts around third-party connected home devices under a single platform for developers to build features for a more helpful home. The goal is to simplify the experience for developers and to give you more control over how your data is shared. Since the announcement, we’ve received a lot of questions about this transition. Today we wanted to share our updated plan and clarify our approach.

First, we’re committed to supporting the integrations you value and minimizing disruptions during this transition, so here’s our updated plan for retiring WWN:

  • Your existing devices and integrations will continue working with your Nest Account; however, you won’t have access to new features that will be available with a Google Account. If we make changes to the existing WWN connections available to you with your Nest Account, we will make sure to keep you informed.

  • We’ll stop accepting new WWN connections on August 31, 2019. Once your WWN functionality is available on the WWGA platform you can migrate with minimal disruption from a Nest Account to a Google Account.

Second, we want to clarify how this migration will work for you. Moving forward, we’ll deliver a single consumer and developer experience through the Google Assistant. WWGA already works with over 3,500 partners and 30,000 devices, and integrates seamlessly with Assistant Routines. Routines allow anyone to quickly customize how their smart devices work together based on simple triggers—whether you’re leaving home or going to bed.

One of the most popular WWN features is to automatically trigger routines based on Home/Away status. Later this year, we’ll bring that same functionality to the Google Assistant and provide more device options for you to choose from. For example, you’ll be able to have your smart light bulbs automatically turn off when you leave your home. Routines can be created from the Google Home or Assistant apps, and can be created using the hardware you already own. Plus we’re making lots of improvements to setup and managing Routines to make them even easier to use.

We recognize you may want your Nest devices to work with other connected ecosystems. We’re working with Amazon to migrate the Nest skill that lets you control your Nest thermostat and view your Nest camera livestream via Amazon Alexa. Additionally, we’re working with other partners to offer connected experiences that deliver more custom integrations.

For these custom integrations, partners will undergo security audits and we’ll control what data is shared and how it can be used. You’ll also have more control over which devices these partners will see by choosing the specific devices you want to share. For example, you’ll be able to share your outdoor cameras, but not the camera in your nursery, with a security partner.

We know we can’t build a one-size-fits-all solution, so we’re moving quickly to work with our most popular developers to create and support helpful interactions that give you the best of Google Nest. Our goal remains to give you the tools you need to make your home, and those of other Nest users, helpful in the ways that matter most to you.

New features to make audio more accessible on your phone

Smartphones are key to helping all of us get through our days, from getting directions to translating a word. But for people with disabilities, phones have the potential to do even more to connect people to information and help them perform everyday tasks. We want Android to work for all users, no matter their abilities. And on Global Accessibility Awareness Day, we’re taking another step toward this aim with updates to Live Transcribe, coming next month.

Available on 1.8 billion Android devices, Live Transcribe helps bridge the connection between the deaf and the hearing via real-time, real-world transcriptions for everyday conversations. With this update, we’re building on our machine learning and speech recognition technology to add new capabilities.

First, Live Transcribe will now show you sound events in addition to transcribing speech. You can see, for example, when a dog is barking or when someone is knocking on your door. Seeing sound events allows you to be more immersed in the non-conversation realm of audio and helps you understand what is happening in the world. This is important to those who may not be able to hear non-speech audio cues such as clapping, laughter, music, applause, or the sound of a speeding vehicle whizzing by.

Second, you’ll now be able to copy and save transcripts, stored locally on your device for three days. This is useful not only for those with deafness or hearing loss—it also helps those who might be using real-time transcriptions in other ways, such as those learning a language for the first time or even, secondarily, journalists capturing interviews or students taking lecture notes. We’ve also made the audio visualization indicator bigger, so that users can more easily see the background audio around them.

Street View cars measure Amsterdam’s air quality

Project Air View

Building on efforts in London and Copenhagen, Google and the municipality of Amsterdam are now working together to gain insight into the city’s air quality at the street level. Amsterdam already measures air quality at several points around the city. Information from two of our Street View cars in Project Air View will augment the measurements from these fixed locations, to yield a more detailed street-by-street picture of the city’s air quality.

To take the measurements, the Street View cars will be equipped with air sensors that measure nitric oxide, nitrogen dioxide, ultra-fine dust and soot (extremely small particles that are hardly ever measured). Scientists from Utrecht University are installing the air sensors in the vehicles, and working with the municipality and Google to plan driving routes and to lead the data validation and analysis. Once the data validation and analysis are complete, we’ll share helpful insights with the public, so that everyone—citizens, scientists, authorities and organizations—can make more informed decisions.

This research can spread awareness about air pollution and help people take action. For example, if the research shows differences in air quality between certain areas in the city, people could adjust their bike route or choose another time to exercise. Our hope is that small changes like this can help improve overall quality of life. For more information about Project Air View, visit g.co/earth/airquality.

Sharing Hawaiian food and tradition with generations to come

Highway Inn is an Oahu-based restaurant founded by Hawaii-born Japanese-American Seiichi Toguchi. At the start of World War II, Seiichi was taken from his home to an internment camp in California and assigned to work in the mess halls. There, Japanese-American chefs from around the country taught him how to cook, eventually inspiring him to open Highway Inn to share the foods he loved growing up. Seiichi passed the restaurant down to his son Bobby Toguchi, who has since passed it to his daughter, Monica Toguchi Ryan. Their family has been proudly serving authentic Hawaiian food for over 70 years.

As the third-generation owner, Monica was determined not just to honor her family traditions and legacy, but also to share with younger generations the kinds of food that keep them connected to Hawaiian and local food culture. When her grandfather started the restaurant, he relied on word of mouth to reach new customers. Now, Monica uses Google Ads and the restaurant’s Business Profile on Google to connect with customers, helping the business grow from one location to three across Oahu. She and her family hope to continue preserving the beauty and tradition of Hawaiian food for generations to come.

This Asian American and Pacific Islander Heritage Month, we’re telling this story and others, like that of Kruti Dance Academy in Atlanta, Georgia. These are two of the many Asian American and Pacific Islander-owned small businesses having an impact on their local communities.

The importance of influence in design

Human behavior has always intrigued me—that’s the reason I studied psychology as an undergraduate. At the time, I wondered how those learnings could one day apply to life in the “real world.” As it turns out, an understanding of people and human behavior is an invaluable asset when it comes to cultivating influence—especially when it comes to design.

In my role as VP of User Experience (UX) Design at Google, I’m constantly tasked with influencing others. I lead a team of designers, researchers, writers and engineers who are behind products like Google’s Shopping, Trips, Payment and Ads. To create great experiences for people, we must first convince people building these products that design is elemental to delivering not just user value, but also business value. Over the years I’ve seen how the ability to build influence is essential to designing the best experiences.

User empathy is a fast track to influence

As UX professionals (designers, writers, researchers and front-end engineers), it’s our job to fully grasp the needs of people using our products and be the spokesperson for them. It’s easy to fall into the trap of believing that we understand our users without witnessing them actually using our products. Or to believe that our personal experiences reflect those of people everywhere. Yet every time I go out into the real world and spend time with people actually using our products, I come back with an unexpected insight that changes how I initially thought about a problem.

In 2017, I took a trip to Jakarta to research the challenges of using smartphones in a region where service is relatively expensive and bandwidth is not readily available. It wasn’t until I was on the ground that I realized how degraded the experience was from what I’d pictured. Similarly, during a recent trip to Tel Aviv, I learned how difficult it is to get funding and grow a business. Developing this kind of understanding, which can only come from experience, helps motivate you to fix a problem from a different angle.

Ideally, we’d bring all of our team members into the field to have these first-hand experiences, but that approach doesn’t scale. What does scale is empathy. We can share our personal experiences, research and user stories to build greater understanding. Once we’ve built a foundation of shared understanding, we can have better influence over decisions that affect users.