Technology constantly offers people new ways to make their lives easier, and language is no exception. Machine translation services such as Google Translate, along with other programs, help people communicate and understand one another across languages. These tools make foreign languages easier to understand, but Google has decided not to stop at interpretation and translation: it is now testing technology that reads body language.
In a video, Google shared information about a technology called Soli radar, which aims to reduce friction with devices by reading subtle human signals. The technology already powers the Motion Sense feature found in the Pixel 4 phone.
Google has released more details about the Soli radar technology. For example, nonverbal communication with a device can involve a wave of the hand or a turn of the head. The technology, which the company is still developing, combines motion sensors with machine learning algorithms.
In the video, Google explains how the sensors and algorithms work and describes scenarios in which they could be used in the future. The company notes that the advantages of the technology include reducing interruptions from devices and making them more efficient and less intrusive.
Movements such as approaching and leaving are detected with deep learning algorithms that determine whether someone is inside the device's "personal space". According to Google, this personal space is a good indicator of whether a person intends to interact with the device or is simply passing by.
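As a rough illustration of the idea, not Google's actual implementation, a "personal space" check could combine a person's distance from the device with whether they are moving toward it. The function name, thresholds, and inputs below are all hypothetical:

```python
# Hypothetical sketch of a "personal space" check using radar-style
# readings: distance to the person and radial velocity (negative means
# the person is moving closer). All thresholds are invented.

PERSONAL_SPACE_M = 1.0  # assumed radius of the device's personal space

def in_personal_space(distance_m: float, radial_velocity_mps: float) -> bool:
    """True if the person is inside the personal space, or just outside
    it but approaching quickly enough to count as entering."""
    if distance_m <= PERSONAL_SPACE_M:
        return True
    # Slightly outside the radius, but closing in fast: treat as entering.
    return distance_m <= 1.5 * PERSONAL_SPACE_M and radial_velocity_mps < -0.5

print(in_personal_space(0.8, 0.0))   # inside the radius
print(in_personal_space(1.3, -0.8))  # outside, but approaching fast
print(in_personal_space(2.5, -0.8))  # too far away
```

In practice a system like this would smooth readings over time rather than react to a single frame, but the distance-plus-approach idea is the core of the "is this person entering my space?" decision described above.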
Movements such as turning toward or away and glancing are recognized by machine learning algorithms that can interpret more subtle body language. For example, the technology can measure how far a person has turned their head, which lets it predict how likely they are to engage with the device.
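One simple way to turn a measured head angle into an engagement estimate is a smooth falloff from "facing the device" to "facing away". The cosine mapping below is purely illustrative and not Google's model:

```python
import math

def engagement_score(head_yaw_deg: float) -> float:
    """Map head yaw (0 = facing the device, 90+ = turned fully away)
    to a 0..1 engagement likelihood with a cosine falloff.
    Hypothetical sketch, not an actual Soli algorithm."""
    yaw = min(abs(head_yaw_deg), 90.0)  # clamp: beyond 90 degrees is "away"
    return math.cos(math.radians(yaw))

print(round(engagement_score(0), 2))    # facing the device
print(round(engagement_score(60), 2))   # partly turned away
print(round(engagement_score(120), 2))  # facing away
```

A real system would combine head angle with distance, gaze, and motion history, but even this toy mapping shows how a continuous angle can become a prediction of intent.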
The Soli radar sensor was unveiled in 2015 and has been used in several Google devices. Google uses the technology in the Pixel 4's Motion Sense feature to detect hand gestures, allowing users to pause music or silence an alarm without touching the phone. The sensor is also used in the second-generation Nest Hub smart display for sleep sensing, where it tracks sleep quality by monitoring the user's breathing and movement patterns.