
Losing touch

Parizad Mangi

Touchscreens could become things of the past if we’re able to control devices by gestures, opening up potential for consumer products and robotics



There was a time when touchscreens elicited excitement. No longer would we need buttons and controls occupying space on our devices to command them. All that surface could be used for bigger, brighter high-resolution graphics for a more optimised user experience, and the ultramodern visuals were a bonus. But that’s old news now. Soon, we may not need to touch our devices at all.

With every leap in technological advancement, we seem to be getting closer to sci-fi fantasies. Engineers are developing human-machine interface (HMI) technologies where devices are commanded by our gestures. One slashing motion with your hand could turn off the car radio, while a flick of your finger could turn up the volume.

Touchless technology is already present in our everyday world in the form of automated sliding doors where the HMI is motion sensors, or operating systems in our personal devices where the HMI is voice recognition. However, the technologies now being developed by certain companies and institutes will be specifically designed to recognise precise gestures that correspond to certain commands on devices. The systems could be used in applications ranging from mobile phones and car dashboards to industrial robots on an assembly line.

Why would we need such technology on a wider basis, other than for a futuristic aesthetic? Those working on it argue that the advances could make our lives safer.

Ultrahaptics, a Bristol start-up company that has developed a technology that enables gesture-based interaction with devices using ultrasound, is looking to move its hands-free technology into the automotive sector. The firm believes its systems will enable drivers to control their dashboard screens while keeping their eyes on the road, thereby minimising the risk of accidents.

Professor Sriram Subramanian, a member of the Ultrahaptics research team, says: “Right now, you need to touch the dashboard screens to give instructions, and when you do you have to take your eyes off the road and look at the screen. There are already devices that track hand movements.

“With our technology, you don’t have to take your eyes off the road to give instructions. It makes driving more functional, and safer to some extent.”

Subramanian likens the sensations of the Ultrahaptics technology to the bass we feel humming through our bodies at a music concert. Instead of feeling the hum in our chest, we’ll feel it in our fingertips.

“We used ultrasound instead of audible sound,” he says. “The advantage is that the wavelength of ultrasound is much shorter than that of audible sound at the frequency of use. So it’s targeted precisely at your fingertips, not at your whole chest.”

The Ultrahaptics team uses a collection of 100-150 small speakers that are controlled individually. The speakers create an interference pattern that allows you to specify where you want the sound to be located. Once someone puts their hand in that location, they will feel a humming sensation that tells them that the technology is ready to be used on their dashboard.
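The focusing principle described above can be sketched as a phased array: each speaker is driven with a phase offset that compensates for its distance to the focal point, so the waves all arrive in phase and reinforce there. The snippet below is a minimal illustration of that idea, not Ultrahaptics' implementation – the array geometry, frequency and function names are assumptions.

```python
import math

SPEED_OF_SOUND = 343.0               # m/s in air
FREQ = 40_000.0                      # 40 kHz, a common ultrasonic-haptics frequency (assumption)
WAVELENGTH = SPEED_OF_SOUND / FREQ   # roughly 8.6 mm

def phase_delays(speakers, focus):
    """Phase offset (radians) for each speaker so that all waves
    arrive in phase at the focal point, creating a pressure peak."""
    delays = []
    for pos in speakers:
        dist = math.dist(pos, focus)
        # Compensate for the extra travel distance, wrapped to one cycle
        delays.append((2 * math.pi * dist / WAVELENGTH) % (2 * math.pi))
    return delays

# A small 4x4 grid of speakers at 1 cm pitch, focusing 15 cm above the centre
speakers = [(i * 0.01, j * 0.01, 0.0) for i in range(4) for j in range(4)]
focus = (0.015, 0.015, 0.15)
delays = phase_delays(speakers, focus)
```

Moving the focal point – and with it the humming sensation on the user's fingertips – is then just a matter of recomputing the delays for a new target position.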

Safety won’t be the only benefit. Engineers are already reaching into the industrial sector with gesture HMI technologies aimed at production efficiency. Companies could implement the technology in robots, allowing workers to operate them without needing to be trained on the machinery, as the robot would simply mimic their gestures.

This is the technology that Fanuc UK is working on, to incorporate into its autonomous factory to increase productivity. Its hand-guided CR-35iA robot has been designed to learn individual handling operations and complete movements without any programming: it mimics the gestures of the factory workers and then repeats those exact movements. The workers don’t require training to operate the equipment, as the robot is self-sufficient.
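At its core, this mimic-and-repeat workflow amounts to recording the arm's joint positions while a worker physically guides it, then replaying them. A minimal sketch of that record-and-replay loop is shown below – the class and method names are hypothetical, not Fanuc's API.

```python
import time

class GestureTeachRobot:
    """Minimal sketch of teach-by-demonstration: record guided joint
    positions, then replay them. Illustrative names only."""

    def __init__(self):
        self.recorded = []   # list of (timestamp, joint_angles)

    def record(self, joint_angles):
        # Called repeatedly while a worker physically guides the arm
        self.recorded.append((time.monotonic(), tuple(joint_angles)))

    def replay(self, move):
        # `move` is a callback that drives the joints (hypothetical)
        for _, angles in self.recorded:
            move(angles)

robot = GestureTeachRobot()
for pose in [(0, 10, 20), (5, 15, 25), (10, 20, 30)]:
    robot.record(pose)

trace = []                 # collect the replayed poses for inspection
robot.replay(trace.append)
```

A real controller would also interpolate between poses and respect the sensor-driven safety limits mentioned below, but the demonstration-then-repetition structure is the same.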

The robot can carry a load of up to 35kg, as well as assemble pieces together. The first application areas include metalworking and packaging.

Keeping in mind Asimov’s first law of robotics, the robot is designed with human safety in mind. Intelligent sensors alert it to human contact and control its movements, while still allowing it to get close to other equipment.

However, as is so often the case when new automation arrives in an industrial sector, there is the question of whether the move will render manual labour obsolete. Once the robots are fully trained to carry out the desired actions, with more strength and durability than a person, job losses could follow.

As for convenience and efficiency on a more domestic level, researchers are working on bringing the technology to our personal devices. A team at the University of Science and Technology of China is developing a gesture-based technology that will rely on wi-fi, dubbed WiFinger. The researchers claim that this is “the first solution using ubiquitous wireless signals to achieve number text input in wi-fi devices”.

Inspired by existing technology that senses movements using wi-fi, the researchers are bringing it into the field of human-computer interaction to recognise number or text input without using a keyboard of any sort. So far the device has achieved recognition of nine signs in American Sign Language.

The function is made possible by extracting channel state information (CSI) through a modified network interface card driver. When the user’s fingers move in a particular formation, they distort the CSI and create a unique waveform – much as a hand disturbs the ultrasound field used by Ultrahaptics. That distinctive waveform allows the gesture to be recognised. The device uses no scanners or cameras, so it avoids violating the user’s privacy in their work environment.
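One simple way to turn a distorted CSI trace into a recognised gesture is template matching: compare the observed waveform against stored examples and pick the best match. The sketch below uses normalised cross-correlation as a stand-in for the researchers' actual classifier; the labels and sample data are invented for illustration.

```python
import math

def normalise(sig):
    """Zero-mean, unit-norm version of a signal, so correlation
    scores are comparable across traces of different amplitude."""
    mean = sum(sig) / len(sig)
    centred = [s - mean for s in sig]
    norm = math.sqrt(sum(c * c for c in centred)) or 1.0
    return [c / norm for c in centred]

def classify(trace, templates):
    """Return the gesture label whose stored CSI waveform
    correlates best with the observed trace (equal lengths assumed)."""
    scores = {}
    for label, tmpl in templates.items():
        a, b = normalise(trace), normalise(tmpl)
        scores[label] = sum(x * y for x, y in zip(a, b))
    return max(scores, key=scores.get)

# Invented CSI amplitude templates for two finger gestures
templates = {
    "one": [0, 1, 2, 1, 0, 0],
    "two": [0, 0, 1, 2, 2, 1],
}
observed = [0.1, 1.1, 1.9, 1.0, 0.2, 0.1]   # a noisy "one"
```

The sensitivity of this kind of matching to the surrounding environment also hints at the limitations discussed next: any change to the radio path reshapes the waveforms.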

Although WiFinger achieves 90.4% recognition accuracy on these signs, according to the researchers, several obstacles remain.

Firstly, the technology has yet to recognise the whole of American Sign Language, and extending its scope might be time-consuming. More significantly, the device can be used only if the user’s body remains completely motionless, apart from the fingers. The researchers say their system needs interference-free surroundings – an issue that must be resolved before they can even think of commercialising the product.

And the system’s recognition accuracy is affected by the orientation and position of the wi-fi device, so the device would have to be retrained if it were moved.

Finally, WiFinger requires high CSI sampling rates – about 2,000 CSI packets/s – to achieve high recognition accuracies. So you won’t be able to gesture text messages to your friends anytime soon.

For HMI devices in general, another issue that will need to be tackled is contamination of the devices over time. Dust and other particles in the atmosphere can interfere with the controls, making the usage unreliable. In industrial applications, regular cleaning of HMI robots to ensure smooth operation could incur extra costs.

Although the most tangible use of gesture-based HMI technology now is Fanuc’s industrial robot, the technology might become available to some extent to the public in the foreseeable future. Ultrahaptics claims there will be products on the market using its technology within the next six months. The company is also looking to expand its services to virtual reality – a market that several conglomerates are tapping into to fashion the next fad in technology.

It’s easy to imagine industries amalgamating this technology into their manufacturing processes to boost productivity. But it remains to be seen if companies would invest in the technology for the wider domestic market.

Ultimately, gesture-based HMI technology has the potential to touch every aspect of life, and every piece of machinery that we rely on.


Professional Engineering magazine
