Google wants future gadgets to understand our social intentions and react to our actions depending on context. One conceivable example: systems that automatically pause a film when we move away from the screen and resume it when we come back.
The influence of technology on our everyday lives will only grow in the coming years. And why not? Algorithms and machines already make our daily lives easier. So far, however, computers still struggle with one thing: context.
While it is relatively easy for humans to recognize the context of an action, computers still come up short. If someone pauses a Netflix film and heads to the bathroom, we can easily connect the dots and conclude that the film was interrupted for a bathroom break.
Algorithms cannot yet draw that conclusion so easily. But that could change soon: according to the tech magazine Fast Company, Google wants to develop computers able to recognize precisely such intentions and actions.
The algorithms are meant to learn everyday social norms and react proactively to people. You might get up from the couch, and an intelligent assistant automatically pauses the film that is running; when you return to the couch, playback resumes on its own. Google's Soli radar system plays a significant role in this.
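To make the idea concrete, here is a minimal sketch of what such presence-aware playback logic could look like. The `MediaPlayer` stub and the presence events are hypothetical placeholders for illustration only, not Google's actual Soli or media APIs.

```python
# Illustrative sketch only: MediaPlayer and the presence events are
# hypothetical stand-ins for a radar-based presence signal and a media app.

class MediaPlayer:
    """Minimal stand-in for a media app with pause/resume control."""
    def __init__(self):
        self.playing = True

    def is_playing(self) -> bool:
        return self.playing

    def pause(self) -> None:
        self.playing = False
        print("Playback paused")

    def play(self) -> None:
        self.playing = True
        print("Playback resumed")


class PresenceAwarePlayback:
    """Pauses when the viewer leaves and resumes when they return."""
    def __init__(self, player: MediaPlayer):
        self.player = player
        self.paused_by_absence = False

    def on_presence_changed(self, person_present: bool) -> None:
        if not person_present and self.player.is_playing():
            # Viewer walked away: pause and remember that we caused it.
            self.player.pause()
            self.paused_by_absence = True
        elif person_present and self.paused_by_absence:
            # Viewer is back: resume only if the absence triggered the pause.
            self.player.play()
            self.paused_by_absence = False


if __name__ == "__main__":
    playback = PresenceAwarePlayback(MediaPlayer())
    playback.on_presence_changed(person_present=False)  # -> "Playback paused"
    playback.on_presence_changed(person_present=True)   # -> "Playback resumed"
```

The key design point is that the assistant only resumes a film it paused itself, so it does not override a pause the viewer chose deliberately.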
Soli was first used in Pixel smartphones, for example to control media hands-free. It was followed by the Google Nest Hub, which tracks movement to analyse sleep and sleep phases. Other devices could follow, such as Nest thermostats, which are already widespread, at least in American households.
The technology behind Soli is also making great strides. While the system could initially cover only a small area, today even small rooms can be fully monitored, and by linking several Google devices, coverage of an entire household is already possible.
Google imagines the future like this: users glance at a Nest Hub from across the room and see the time displayed in large type. As soon as a person approaches the device, the view switches to incoming emails.
If the screen is ignored, on the other hand, it switches to an energy-saving mode. Automatic pausing and resuming of films based on our movements could also work across various devices. So it is to be expected that voice assistants like Alexa, Google Assistant, and Siri will one day interact with us even more intelligently.
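A rough sketch of how such a proximity- and attention-based display mode could be chosen is shown below. The distance threshold, the attention flag, and the `choose_display_mode` function are illustrative assumptions, not Google's actual Nest Hub logic.

```python
# Illustrative sketch only: the inputs stand in for what a radar sensor
# might report; this is not Google's actual Nest Hub behaviour.

def choose_display_mode(distance_m: float, looking_at_screen: bool) -> str:
    """Pick what an ambient display could show for a given context."""
    if not looking_at_screen:
        # Nobody is paying attention: save energy.
        return "power_save"
    if distance_m > 2.0:
        # Viewer is far away: show a large, glanceable clock.
        return "large_clock"
    # Viewer has walked up close: show detail such as incoming emails.
    return "email_preview"


if __name__ == "__main__":
    print(choose_display_mode(distance_m=4.0, looking_at_screen=True))   # large_clock
    print(choose_display_mode(distance_m=0.8, looking_at_screen=True))   # email_preview
    print(choose_display_mode(distance_m=4.0, looking_at_screen=False))  # power_save
```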