How Google continues to improve human-computer interaction with AI

At its Google I/O conference, the American group unveiled multiple advances in artificial intelligence across its flagship applications.

During the event, held on May 11 and 12, the Mountain View company unveiled its Pixel 6a and Pixel Buds Pro. The giant also took the opportunity to preview its upcoming Pixel Watch, Pixel 7, and Pixel Tablet, as well as a prototype of augmented reality glasses designed to make communicating with others easier.

Initially dedicated to developers, Google I/O was an opportunity for Sundar Pichai's company to showcase a whole series of new features coming soon to its services. The giant emphasized its advances in artificial intelligence (AI), particularly in translation. Over the past few years, its tools have evolved significantly: automatic, real-time translation of YouTube content, transcripts of dialogue, and, above all, support for a growing number of languages in Google Translate. AI-driven improvements to the voice assistant and online search were also at the center of the annual conference, with a focus on more effective collaborative work.

Automated summaries in Google Docs and Google Chat

Emails and instant conversations are an integral part of professionals' daily lives. With meetings multiplying and schedules filling up, it can sometimes be difficult to find the time (and the courage!) to dive into a long report or meeting minutes. Google Docs should soon help users save valuable time while keeping track of the information they need. Powered by artificial intelligence, Google Docs will soon offer automated summaries: a quick way to grasp the key points covered in a document before going into the details. This limits the risk of missing information, a major problem for many companies today.

This feature will also be available in Google Chat and the Spaces app. A quick and effective way to catch up on the key ideas exchanged, without having to re-read dozens or even hundreds of lines of discussion.

Talk to Google Assistant more naturally

Look and Talk: gone are the days of having to say "OK Google"; a single glance is enough to trigger the voice assistant. (Credit: Google)

Voice assistants are everywhere. They make it possible to control smartphones, as well as a large number of connected objects in the home. In the long run, having to systematically say "OK Google" before each new request can become tedious. To streamline interactions with the voice assistant and make the human-machine interface less obtrusive, Google announced the "Look and Talk" feature for its Nest Hub Max smart displays.

Thanks to this new feature, all you have to do is look at the device and then make your request ("What's the weather like today?"). To do this, the device's camera and microphone use facial and/or voice recognition to verify that it is actually an authorized user. The sensors then analyze around a hundred parameters, such as head tilt and lip movement, to determine whether the person intends to issue a voice command.

Quick Phrases for frequent commands

Still with the goal of offering an alternative to the "OK Google" catchphrase, while allowing the machine to better understand the nuances of human language, the voice assistant on the Pixel 6 and Nest Hub Max will integrate "Quick Phrases". As the American company explained, these are pre-programmed short sentences for common requests such as "turn on the hallway light" or "set a timer for 10 minutes".

Initially, only the United States will benefit from this feature. The company has not yet announced a date for an international rollout.

The future of visual search takes shape

In addition to these two practical everyday features, AI will also power visual search through Google Lens. It will be possible to find a nearby restaurant serving a dish seen in a photograph, or to locate a shop stocking a specific item. All you have to do is add "near me" to the query for the feature to find the items available in stock around you, or the nearest restaurant where you can taste the dish.

Finally, Google is becoming more inclusive by supporting a wide range of skin tones, hair colors and hair textures in its search engine. Queries are thus refined, and each specification is better taken into account for more relevant results.
