Gesture Recognition Examples

1. In-Store Retail

Gesture recognition has the power to deliver an exciting, seamless in-store experience.

This example uses Kinect to create an engaging retail experience by immersing the shopper in relevant content, helping customers try on products, and offering a game that lets the shopper earn a discount incentive.

2. The Operating Room

Companies such as Microsoft and Siemens are working together to redefine the way everyone from motorists to surgeons accomplishes highly sensitive tasks.

These companies have been refining gesture recognition technology for fine motor manipulation of images, enabling a surgeon to virtually grasp and move an object on a monitor.

Researchers are creating a system that uses depth-sensing cameras and specialized algorithms to recognize hand gestures as commands to manipulate MRI images on a large display. Recent research to develop the algorithms has been led by doctoral student Mithun George Jacob.

To validate the system, the researchers worked with veterinary surgeons to collect a set of gestures that feel natural to clinicians and surgeons. The surgeons were asked to specify the functions they perform with MRI images in typical surgeries and to suggest gestures for those commands.

Ten gestures were chosen.
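
Conceptually, the recognizer's output is just a gesture label; a separate layer translates each label into an image-manipulation command. The Python sketch below illustrates that mapping layer, assuming a classifier that emits string labels. The gesture names and viewer commands are illustrative stand-ins, not the ten gestures chosen in the study.

```python
# A minimal sketch of mapping recognized hand gestures to MRI-viewer
# commands, assuming a gesture classifier that emits string labels.
# Gesture names and commands here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class ViewerState:
    slice_index: int = 0   # which MRI slice is shown
    zoom: float = 1.0      # magnification factor
    rotation: float = 0.0  # rotation in degrees

def apply_gesture(state: ViewerState, gesture: str) -> ViewerState:
    """Translate a recognized gesture label into a viewer command."""
    if gesture == "swipe_up":
        state.slice_index += 1                       # next slice
    elif gesture == "swipe_down":
        state.slice_index = max(0, state.slice_index - 1)
    elif gesture == "pinch_out":
        state.zoom *= 1.25                           # zoom in
    elif gesture == "pinch_in":
        state.zoom /= 1.25                           # zoom out
    elif gesture == "rotate_cw":
        state.rotation = (state.rotation + 15) % 360
    return state

# Example: a stream of labels as a classifier might emit them.
state = ViewerState()
for g in ["swipe_up", "pinch_out", "rotate_cw"]:
    state = apply_gesture(state, g)
print(state)  # ViewerState(slice_index=1, zoom=1.25, rotation=15.0)
```

Keeping this dispatch table separate from the recognizer makes it easy to swap in whichever gestures the clinicians actually choose.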

3. Smart Phone

Every swipe we use to scroll through or switch between apps is a “gesture” that doesn’t require pressing a tactile button or a virtual menu button.

After studying the way people were using the Back button on phones—as much as 50 percent more than the Home button—Google designed two core gestures to coincide with the most reachable/comfortable areas and movement for thumbs.

On Samsung devices, you can now use a stylus pen, instead of your fingers, to change music tracks or frame and capture a photo or video. (This works on both the Galaxy Note10 smartphone and the company’s new Galaxy Tab S6 tablet.)

The feature is enabled by the six-axis sensor in the S Pen, which consists of an accelerometer and a gyroscope. The data from the S Pen's movement is shared with the phone wirelessly over Bluetooth, and the phone responds.

https://youtu.be/eI6uZag3Wwo
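
To make the six-axis idea concrete, here is a rough Python sketch of how a stream of accelerometer and gyroscope samples might be classified into pen flicks. Samsung's actual S Pen gesture pipeline is not public, so the thresholds, axis conventions, and gesture names below are assumptions for illustration only.

```python
# A rough sketch of turning a six-axis (accelerometer + gyroscope)
# stream into discrete pen gestures. Illustrative only; not Samsung's
# actual algorithm. Thresholds and gesture names are assumptions.

GYRO_THRESHOLD = 3.0  # rad/s; angular speed that counts as a flick (assumed)

def classify_sample(accel, gyro):
    """Classify one (accel, gyro) sample pair as a gesture label or None.

    accel: (ax, ay, az) in m/s^2, gyro: (gx, gy, gz) in rad/s.
    """
    gx, gy, gz = gyro
    if abs(gx) > GYRO_THRESHOLD:
        # Fast rotation about the pen's x-axis: treat as an up/down flick.
        return "flick_up" if gx > 0 else "flick_down"
    if abs(gz) > GYRO_THRESHOLD:
        # Fast twist about the pen's long axis: treat as left/right.
        return "flick_right" if gz > 0 else "flick_left"
    return None  # no decisive motion in this sample

# Example: samples as they might arrive over Bluetooth.
samples = [
    ((0.1, 9.8, 0.0), (0.2, 0.1, 0.0)),  # pen held still
    ((0.3, 9.5, 1.2), (4.1, 0.0, 0.3)),  # sharp upward flick
]
for accel, gyro in samples:
    label = classify_sample(accel, gyro)
    if label:
        print("gesture:", label)  # -> gesture: flick_up
```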

4. Automotive

Gesture control applications can help reduce driver distraction and increase safety.

The detection zone for gestures is now focused on the steering wheel. This is possible thanks to a time-of-flight sensor integrated into the instrument cluster. It detects the motion of the hand and converts it into actions. The driver can navigate through the menus by swiping up and down and confirm a selection with a brief tapping motion.

Touch-free operation is also possible for other functions. For example, if the driver moves his fingers up and down in a uniform motion while keeping his hands on the steering wheel, he can accept or reject calls.

How does it work? The time-of-flight sensor comprises a 3D camera system with an integrated 3D image sensor, and it converts the infrared signal it detects into a 3D image. The driver's hand positions and gestures are thereby detected with millimeter precision and converted into actions.

The system can currently detect four different gestures: setting the navigation, browsing through apps and starting music, answering calls, and controlling the on-board computer.
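
The division of labor described above suggests a simple architecture: the time-of-flight pipeline emits discrete gesture events, and a thin layer maps them onto menu actions. The Python sketch below shows such a layer; the event names and menu entries are assumptions for illustration, not the production system's API.

```python
# A simplified sketch of the gesture-to-action layer: the time-of-flight
# pipeline emits discrete gesture events, and this layer maps them onto
# menu navigation. Event names and menu entries are assumed.

MENU = ["Navigation", "Apps", "Music", "Phone", "On-board computer"]

class GestureMenu:
    def __init__(self):
        self.index = 0  # currently highlighted menu entry

    def on_gesture(self, event: str) -> str:
        if event == "swipe_up":
            self.index = (self.index - 1) % len(MENU)
            return f"highlight {MENU[self.index]}"
        if event == "swipe_down":
            self.index = (self.index + 1) % len(MENU)
            return f"highlight {MENU[self.index]}"
        if event == "tap":                # brief tap confirms selection
            return f"open {MENU[self.index]}"
        if event == "finger_up":          # accept incoming call
            return "accept call"
        if event == "finger_down":        # reject incoming call
            return "reject call"
        return "ignored"

menu = GestureMenu()
for e in ["swipe_down", "swipe_down", "tap", "finger_up"]:
    print(menu.on_gesture(e))
# highlight Apps / highlight Music / open Music / accept call
```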

5. Drone

Gesture recognition technology lets you communicate with or control other devices via hand gestures. With this technology, you can control a drone simply by moving your hands.

You can now control a drone with gestures alone; there is no need to hold a transmitter to fly it.

An accelerometer fixed to the hand detects its movement and sends the signal to a microcontroller, which relays it to the drone over a telemetry link.
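
Here is a minimal Python sketch of the hand-controller side of such a build: derive roll and pitch from the accelerometer's gravity reading, then pack them into a small telemetry frame. The frame layout and scaling are assumptions; a real build would typically use the radio module's own protocol (for example, MAVLink).

```python
# A minimal sketch of the hand-controller side: read accelerometer tilt,
# convert it to roll/pitch commands, and pack them into a telemetry
# frame. Scaling and frame layout are assumptions for illustration.

import math
import struct

def tilt_angles(ax, ay, az):
    """Derive roll and pitch (degrees) from the gravity direction."""
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch

def to_i16(deg):
    """Clamp to +/-90 degrees and scale to centidegrees for an int16."""
    return max(-90, min(90, int(deg))) * 100

def make_frame(roll, pitch):
    """Pack commands into a little-endian frame: header, 2x int16, footer."""
    return struct.pack("<BhhB", 0xAA, to_i16(roll), to_i16(pitch), 0x55)

# Example: hand tilted forward and slightly to the right.
ax, ay, az = 2.0, 1.5, 9.4  # m/s^2 from the accelerometer
roll, pitch = tilt_angles(ax, ay, az)
frame = make_frame(roll, pitch)
print(round(roll, 1), round(pitch, 1), frame.hex())
# In a real build this frame would be written to the telemetry radio's
# serial port, e.g. serial.Serial("/dev/ttyUSB0", 57600).write(frame)
```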
