Sunday, March 13, 2016

Next Gen Radar

Google Invents Next-Gen Radar-Based In-Air Gesturing System

(Cover graphic: Google radar field patent application)

This week the U.S. Patent & Trademark Office published a pair of patent applications from Google that reveal a new in-air gesturing system built on a next-gen radar system. The system could be integrated into devices like a PC or television, but also a smartphone, Google Glass, a smart bracelet or a smartwatch. In a smartwatch, Google shows us that a beam of radar could be directed toward the hand to automatically act upon specific gestures. The radar system can transmit its beam through any kind of clothing so that transmission isn't interfered with. Google sees this as a crucial means of communicating with tomorrow's advanced devices beyond touch surfaces like a display. Such a system could also work with Google's future VR headset that is reportedly underway in Google's labs.

Google's Invention Background

Use of gestures to interact with computing devices has become increasingly common. Gesture recognition techniques have successfully enabled gesture interaction with devices when these gestures are made to device surfaces, such as touch screens for phones and tablets and touch pads for desktop computers. Users, however, more and more often desire to interact with their devices through gestures not made to a surface, such as a person waving an arm to control a video game. These in-the-air gestures are difficult for current gesture-recognition techniques to accurately recognize.

Google's Solution: Radar-Based Gesture Recognition

Google's invention covers techniques and devices for radar-based gesture recognition. These techniques and devices can accurately recognize gestures that are made in three dimensions, such as in-the-air gestures. These in-the-air gestures can be made from varying distances, such as from a person sitting on a couch to control a television, a person standing in a kitchen to control an oven or refrigerator, or millimeters from a desktop computer's display.

Furthermore, Google's invention describes techniques that may use a radar field to sense gestures, which can improve accuracy by differentiating between clothing and skin, penetrating objects that obscure gestures, and identifying different actors.

Google's invention introduces simplified concepts concerning radar-based gesture recognition. These concepts, techniques and devices enable a great breadth of gestures and uses for those gestures, such as gestures to use, control, and interact with various devices, from desktops to televisions to refrigerators. Later Google expands on this to include a wider range of devices as follows:

"Tablets, laptops, refrigerators, microwaves, home automation and control systems, entertainment systems, audio systems, other home appliances, security systems, netbooks, smartphones, and e-readers."

In Google's patent FIG. 1 noted below we're able to see the radar-based gesture recognition system in use with a TV and a PC.

(Google's patent FIG. 1: radar-based gesturing)

The techniques and devices of this system are capable of providing a radar field that can sense gestures from multiple actors (as in FIG. 4 below) at one time and through obstructions, thereby improving gesture breadth and accuracy over many conventional techniques.

(Google's patent FIGS. 3 & 4)
With the new radar-based gesture system, selections can be made more simply and easily than with a flat touchscreen surface, a TV remote control or a gaming pad using conventional control mechanisms.

In a second related Google patent, they provide us with an example, noted below, of a radar transmission and a gesture interacting with that transmission as it is emitted by the radar system of a wearable computing device. In this particular example, the wearable computing device is illustrated as wearable computing bracelet #204, though any suitable computing device, wearable or otherwise, may implement the techniques. Radar transmission #202 is interacted with by a person's finger, which causes a reflection (not shown) in the radar transmission. This reflection can be received and processed to provide data from which a gesture is determined.
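The patent describes the reflection being "received and processed" into a gesture but publishes no algorithm. As a purely illustrative sketch (all names and thresholds here are hypothetical, not from the patent), one could imagine tracking the estimated position of the reflecting fingertip over successive frames and reducing that track to a simple swipe decision:

```python
# Illustrative sketch only: the patent gives no processing algorithm.
# A per-frame horizontal position estimate of the reflecting fingertip
# is reduced to a swipe-left / swipe-right / no-gesture decision.

def detect_swipe(x_positions, min_travel=20.0):
    """x_positions: per-frame horizontal position (mm) of the reflecting
    fingertip. min_travel: minimum net travel (mm) to count as a swipe.
    Returns 'swipe-right', 'swipe-left', or None."""
    if len(x_positions) < 2:
        return None
    travel = x_positions[-1] - x_positions[0]  # net horizontal movement
    if travel >= min_travel:
        return "swipe-right"
    if travel <= -min_travel:
        return "swipe-left"
    return None

print(detect_swipe([0.0, 8.0, 17.0, 26.0]))  # swipe-right (26 mm of travel)
print(detect_swipe([30.0, 18.0, 7.0]))       # swipe-left (-23 mm of travel)
```

A real system would classify far richer motion signatures, but the sketch shows the basic shape of turning a reflection track into a discrete gesture.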

(Google's patent figure: radar transmission from a wearable bracelet)

In another example noted below, Google tells us that a user may perform complex or simple gestures with a hand or fingers (or a device like a stylus) that interrupt the radar transmission. Example gestures include the many gestures usable with current touch-sensitive displays, such as swipes, two-finger pinch and spread, tap, and so forth.

Other gestures are enabled that are complex, or simple but three-dimensional; examples include many sign-language gestures, e.g., those of American Sign Language (ASL) and other sign languages worldwide. A few of these include an up-and-down fist, which in ASL means "Yes"; an open index and middle finger moving to connect to an open thumb, which means "No"; a flat hand moving up a step, which means "Advance"; a flat and angled hand moving up and down, which means "Afternoon"; clenched fingers and open thumb moving to open fingers and an open thumb, which means "taxicab"; an index finger moving up in a roughly vertical direction, which means "up"; and so forth. These are but a few of the many gestures that can be sensed by a radar system.

(Google's patent FIG. 4: radar-based gesture sensing and data transmission)

Google's new radar-based gesture sensing and data transmission system, as noted above in FIG. 4, could apply to any wearable or mobile device such as Google Glass, a smartwatch or bracelet (emitters top and bottom), or a smartphone.

(Google's patent FIG. 7: flowchart)
Google's patent FIG. 7 illustrates another example method enabling radar-based gesture sensing and data transmission performed at a receiving device.

Materials Don't Interfere with Radar Transmissions

Google notes that the radio element can be configured to emit continuously modulated radiation, ultra-wideband radiation, and/or sub-millimeter-frequency radiation. The radio element in some cases is configured to form radiation into beams, the beams aiding a receiving device and/or a radar antenna and signal processor in determining which of the beams are interrupted, and thus the locations of interactions within a field having the radar transmission.
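The patent only states that interrupted beams reveal interaction locations, without giving a method. One plausible way to picture it (everything below is an invented illustration, not Google's implementation) is a grid of narrow beams, each with a known aim point; the interaction location is then estimated from the set of interrupted beams, e.g. as their centroid:

```python
# Invented illustration: a grid of beams with known aim points, where
# the interaction location is estimated as the centroid of the beams
# reported as interrupted. The patent publishes no such algorithm.

def interaction_location(beam_grid, interrupted_ids):
    """beam_grid: dict mapping beam id -> (x, y) aim point in mm.
    interrupted_ids: ids of beams whose transmission was interrupted.
    Returns the centroid of the interrupted beams' aim points, or None."""
    points = [beam_grid[i] for i in interrupted_ids]
    if not points:
        return None
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (sum(xs) / len(points), sum(ys) / len(points))

# A 3x3 grid of beams spaced 10 mm apart:
grid = {(r, c): (c * 10.0, r * 10.0) for r in range(3) for c in range(3)}
# A fingertip interrupts the two rightmost beams of the middle row:
print(interaction_location(grid, [(1, 1), (1, 2)]))  # (15.0, 10.0)
```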

In some cases, the radio element is configured to transmit radar that penetrates fabric or other obstructions and reflects off human tissue. These fabrics or obstructions can include wood, glass, plastic, cotton, wool, nylon and similar fibers, and so forth. By reflecting from human tissue, such as a person's hand, gesture recognition can potentially be improved, as clothing or other obstructions can be overcome.

In more detail, the radio element can be configured to emit microwave radiation in a 1 GHz to 300 GHz range, a 3 GHz to 100 GHz range, or narrower bands, such as 57 GHz to 63 GHz. This frequency range affects the radar antenna's ability to receive interactions, such as tracking the locations of two or more targets to a resolution of about two to about 25 millimeters. The radio element can also be configured, along with other entities of the radar system, to have a relatively fast update rate, which can aid in resolving the interactions.
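The resolution figures quoted above are consistent with the standard radar range-resolution formula, resolution ≈ c / (2B), where B is the swept bandwidth. This formula is basic radar theory, not text from the patent; applying it to the 57-63 GHz band the patent names yields the 25 mm figure at the coarse end of the quoted range:

```python
# Standard radar range-resolution formula: c / (2 * bandwidth).
# Used here only to sanity-check the figures quoted in the patent.
C = 3.0e8  # speed of light, m/s

def range_resolution_mm(f_low_hz: float, f_high_hz: float) -> float:
    """Approximate range resolution in millimeters for a radar
    sweeping the band [f_low_hz, f_high_hz]."""
    bandwidth = f_high_hz - f_low_hz
    return C / (2.0 * bandwidth) * 1000.0  # meters -> millimeters

# The 57-63 GHz band gives 6 GHz of bandwidth:
print(range_resolution_mm(57e9, 63e9))  # 25.0 mm
```

Hitting the finer ~2 mm end of the range would require wider bandwidth or additional processing (e.g. the fast update rate the patent mentions).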

By selecting particular frequencies, the radar system can operate to substantially penetrate clothing while not substantially penetrating human tissue. Further, the radar antenna or signal processor can be configured to differentiate interactions in the radar field caused by clothing from those caused by human tissue. So the gestures of a person wearing gloves or a long-sleeve shirt, which could interfere with sensing under some conventional techniques, can still be sensed by the radar system.
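The patent doesn't say how clothing and tissue are told apart. One common-sense sketch (entirely hypothetical, including the threshold value) rests on the fact that skin, with its high water content, reflects millimeter-wave radar far more strongly than most fabrics, so a simple amplitude threshold can illustrate the idea:

```python
# Hypothetical illustration: the patent states the system differentiates
# clothing from skin but gives no algorithm. Skin reflects mm-wave radar
# much more strongly than fabric, so a normalized-amplitude threshold
# (the 0.5 value is invented for illustration) can stand in for the idea.
SKIN_THRESHOLD = 0.5  # normalized reflection amplitude (hypothetical)

def classify_reflection(amplitude: float) -> str:
    """Label a normalized reflection amplitude as 'tissue' or 'clothing'."""
    return "tissue" if amplitude >= SKIN_THRESHOLD else "clothing"

print(classify_reflection(0.8))  # strong reflection -> tissue
print(classify_reflection(0.2))  # weak reflection -> clothing
```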

Google's patent application, filed back in October 2014, was published by the USPTO this week. Considering that this is a patent application, the timing of such a product coming to market is unknown at this time.

A Note for Tech Sites covering our Report: We ask tech sites covering our report to kindly limit the use of our graphics to one image. Thanking you in advance for your cooperation. 
