Ultrasound technology that enables mobiles and tablets to be controlled by gesture could go into production as early as next year.
Norwegian start-up Elliptic Labs is in talks with Asian handset manufacturers to get the chip embedded in devices. The technology works via an ultrasound chip that uses sound waves to interpret hand movements.
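The article does not describe Elliptic Labs' actual algorithm, but the echolocation principle it refers to can be sketched roughly: emit an ultrasonic ping, time the returning echo, and convert the delay into a distance. The Python snippet below is a minimal illustration of that idea only, using made-up echo delays and a hypothetical two-receiver swipe detector; it is not Elliptic's implementation.

```python
# Illustrative sketch only: Elliptic Labs has not published its algorithm.
# It shows the basic echolocation idea -- emit an ultrasonic ping, time the
# echo, and infer hand position from the time of flight at two receivers.

SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20C

def distance_from_echo(time_of_flight_s: float) -> float:
    """Convert a round-trip echo delay into a one-way distance in metres."""
    return SPEED_OF_SOUND * time_of_flight_s / 2.0

def detect_swipe(distances_left: list[float], distances_right: list[float]) -> str:
    """Guess a swipe direction by comparing when the hand passes closest
    to each of two hypothetical receivers (closest-approach ordering)."""
    t_left = min(range(len(distances_left)), key=distances_left.__getitem__)
    t_right = min(range(len(distances_right)), key=distances_right.__getitem__)
    if t_left < t_right:
        return "swipe left-to-right"
    if t_right < t_left:
        return "swipe right-to-left"
    return "no swipe"

# Example: echo delays (seconds) sampled as a hand sweeps across the device.
left_echo_s = [5.0e-3, 3.0e-3, 1.5e-3, 2.5e-3, 4.0e-3]
right_echo_s = [5.5e-3, 4.5e-3, 3.0e-3, 1.5e-3, 2.5e-3]
left_m = [distance_from_echo(t) for t in left_echo_s]
right_m = [distance_from_echo(t) for t in right_echo_s]
print(detect_swipe(left_m, right_m))  # prints: swipe left-to-right
```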
The move towards gesture control has gathered pace and there are now many such products on the market.
Big gestures
What sets Elliptic's gesture-control system apart from others is its wide field of use, up to a metre away from the phone. It means it can identify mid-air gestures accurately.
Because it uses sound rather than sight, the sensor can recognise gestures from a 180-degree field. It also consumes less power and works in the dark.
By contrast, Samsung's Galaxy S4 uses an infrared sensor that can interpret hand movements only within a very small zone.
"The user needs to learn the exact spot to gesture to instead of having a large interactive space around the device," said Erik Forsstrom, the user interface designer for Elliptic Labs.
Allowing users more freedom in how they gesture is vital if such products are to become mainstream, he thinks.
"With a small screen such as a phone or a tablet, the normal body language is not that precise. You need a large zone in which to gesture."
If consumers can quickly see the effects their gestures have on screen, he thinks, "it is quite likely that this is the next step within mobile".
The technology was recently shown off at Japanese tech show Ceatec.
In the demonstration, an Android smartphone was housed in a case containing the ultrasound transmitters.
But Elliptic Labs said it had formed partnerships with a number of Asian handset manufacturers that are looking at building the ultrasound chip into devices as early as next year.