Tech giant Google is working on expanding wearable tech to respond directly to gestures made on the skin. The company has already been issued a patent for the technology, suggesting it could come to a product in the future.
Titled “Skin interface for Wearables: Sensor fusion to improve signal quality,” the patent shows how a user could control a wearable device through their skin. While the patent doesn’t tie the technology to a specific product, it could become part of existing Google hardware like the Pixel Watch or Pixel Buds.
So, how does this skin input work? Well, the user would draw on their own skin rather than on the device itself. This may seem pointless if the hardware has a screen: why not just touch the screen instead? For screenless devices like earbuds, though, the tech could be interesting.
We already have an idea of how this would work. For example, the Pixel Buds already allow users to control playback and volume through gestures. The difference here is that, instead of performing those gestures on the small surface area of the device, you could do them on your skin. This could allow for more control and greater ease of use.
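To make the idea concrete, here is a minimal sketch, in Python, of how a skin gesture might be mapped to the same playback controls the Pixel Buds already expose. Everything here is hypothetical: the gesture names, the sensor readings, and the fusion weights are mine, not Google’s, and only illustrate the “sensor fusion” idea named in the patent title.

```python
# Hypothetical sketch: fuse two noisy sensor readings to decide whether a skin
# gesture really happened, then map the recognized gesture to a media action.
# None of these names come from Google's patent; they are illustrative only.

from dataclasses import dataclass

# Assumed gesture-to-action mapping, mirroring existing Pixel Buds touch controls.
GESTURE_ACTIONS = {
    "single_tap": "play_pause",
    "double_tap": "next_track",
    "swipe_forward": "volume_up",
    "swipe_back": "volume_down",
}


@dataclass
class SensorReading:
    """Confidence scores (0..1) from two hypothetical on-device sensors."""
    accelerometer: float     # vibration picked up when the skin is tapped
    skin_capacitance: float  # change in capacitance near the earbud


def fuse(reading: SensorReading, w_accel: float = 0.6, w_cap: float = 0.4) -> float:
    """Weighted average of the two signals: a toy stand-in for 'sensor fusion'."""
    return w_accel * reading.accelerometer + w_cap * reading.skin_capacitance


def handle_gesture(gesture: str, reading: SensorReading, threshold: float = 0.7):
    """Return a media action if the fused signal is confident enough, else None."""
    if fuse(reading) < threshold:
        return None  # probably noise (a brush of hair or clothing, for example)
    return GESTURE_ACTIONS.get(gesture)


if __name__ == "__main__":
    # A firm tap on the skin: both sensors agree, so the gesture is accepted.
    print(handle_gesture("single_tap", SensorReading(0.9, 0.8)))   # play_pause
    # A weak, ambiguous signal: rejected as noise.
    print(handle_gesture("double_tap", SensorReading(0.3, 0.2)))   # None
```

The fusion step is the point of the sketch: combining more than one signal is one plausible way to tell a deliberate skin gesture from everyday contact, which is presumably why “signal quality” appears in the patent’s title.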
Existing Tech
You may notice a lot of “could” in this article. That’s because this is merely at the patent phase. As always, there are no guarantees Google will put this technology into a real product. Perhaps the company is just covering its bases.
However, this is a technology I think will eventually come to Pixel devices, simply because Sony already offers something similar. Sony’s LinkBuds earbuds have an open design that allows skin controls for playing and pausing audio, while gestures on the skin can also control volume, track skipping, and the always-on listening feature.