Tuesday 1 March 2016



Meet the Man Who Made Virtual Reality 'Feel' More Real
Tom Carter's ultrasound technology lets you touch and manipulate virtual objects—attracting interest from Jaguar Land Rover, Harman and dozens of others.


Source: Ultrahaptics Ltd.
A virtual slider control for a DJ's mixing desk—just one potential application for Ultrahaptics technology
As a student at the University of Bristol, Tom Carter became obsessed with a seemingly impossible notion: letting people feel and manipulate virtual objects. A professor turned him on to an esoteric concept—using ultrasound to simulate tactile sensations—that was floated and abandoned back in the 1970s.
Today, Carter, 27, is co-founder and chief technology officer of Ultrahaptics, which uses clever algorithms and an array of ultrasound emitters to simulate a range of feelings: tiny bubbles bursting on your fingertips, a stream of liquid passing over your hand, the outlines of three-dimensional shapes. Carter says dozens of companies making everything from computer games to cars and appliances are testing the technology, an example of what computer scientists call haptic feedback (haptic in Ancient Greek basically meant "coming into contact with something").
While other tech companies are working on their own versions of haptic feedback, Ultrahaptics says it's the only one that lets people feel and manipulate virtual objects in the air. There are many potential applications—using an invisible slider to pump up the bass on a home stereo; adjusting the car air conditioning with the twist of a virtual dial—but Carter said the technology's greatest promise may lie in making virtual reality "feel" more real.
"Touch is a really essential sense to make compelling virtual reality."

"Touch is a really essential sense to make compelling virtual reality," Carter said. "If you go and get the best possible virtual-reality goggles and the best possible surround-sound headphones, it is going to be very cool, but it is going to be very difficult if you don’t have the sense of touch to interact with the things that are there."
Owing to non-disclosure agreements, Carter can't say which companies are testing the technology or what they're working on. But there are tantalizing hints. Last month, high-end stereo maker Harman demonstrated a prototype audio speaker system featuring a gesture-control system powered by Ultrahaptics technology at the Consumer Electronics Show. Jaguar Land Rover has announced plans to integrate Ultrahaptics’ technology into a gesture-control system for its cars. Steve Cliffe, Ultrahaptics’ chief executive officer, says the first computer game using Ultrahaptics will be launched this year but declined to provide more details.
The technology has its limitations: while ultrasound can simulate the sensation of touching the outline of an object, it cannot create the illusion of solidity. You'll always be able to push your fingers or hands through the area of vibration. Users can also hear a distracting buzz as the ultrasound waves reverberate off the skin; the company says it is working to minimize the noise. And while humans can't hear the ultrasound waves themselves, dogs and cats can. Ultrahaptics says the sound hasn't triggered a canine reaction in tests, and pledges that pets won't be affected by whatever products come to market.
Patrick Baudisch, a professor specializing in human-computer interaction at the University of Potsdam's Hasso Plattner Institute in Germany, admires the research team behind Ultrahaptics. "They are one of the most innovative groups working on this in Europe," he said. However, the inability to simulate grasping a solid object, or the resistance one encounters when bumping up against things in the real world, might limit the technology's use in virtual reality, he said. Instead, Baudisch sees greater applications in so-called augmented reality, where a user needs to interact with real-world and virtual objects simultaneously.

Steve Cliffe, chief executive officer of Ultrahaptics, left, and Tom Carter, co-founder and chief technology officer.
Photographer: Adrian Sherrat/Ultrahaptics
Ultrahaptics' offices are in The Engine Shed–a sleek, post-modern incubator housed inside what was once a train shed for the Bristol railway station built by Isambard Kingdom Brunel, a 19th-century disruptor known for his pioneering work on trains, steamships and bridges. Sitting in a conference room, Carter frankly explains that the hardware—an array of ultrasound emitters hooked up to an off-the-shelf gesture-control platform—is nothing special.  
"The clever bit is in our software,'' said Carter, who is lanky and boyish, with a mop of brown hair that wouldn't look out of place on George Harrison circa 1964. "It is actually in the algorithms of how you drive the emitters to create the sensations.”
The software can be programmed to find your hand and direct sound to it, projecting sensations onto a spot as small as a fingertip from as much as six feet away. It can also create multiple sensations simultaneously–allowing different hands, for instance, to “touch” different things.
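Ultrahaptics doesn't publish its algorithms, but the basic idea behind steering ultrasound to a point in mid-air is the classic phased-array principle: fire each emitter with a slight time offset so that every wavefront arrives at the target in phase and the pressure reinforces there. A minimal sketch of that delay calculation (the emitter layout, frequency, and focal point below are illustrative assumptions, not Ultrahaptics' actual parameters):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C
FREQ = 40_000.0         # 40 kHz, a frequency commonly used for airborne ultrasound

def focus_delays(emitters, focal_point, c=SPEED_OF_SOUND):
    """Per-emitter firing delays (seconds) so all wavefronts reach
    focal_point at the same instant, creating a pressure focus there."""
    dists = [math.dist(e, focal_point) for e in emitters]
    farthest = max(dists)
    # The farthest emitter fires first (zero delay); nearer ones wait
    # just long enough for their shorter travel time to cancel out.
    return [(farthest - d) / c for d in dists]

# A hypothetical strip of 4 emitters spaced 1 cm apart along the x-axis,
# focusing on a point 20 cm above the strip's centre.
emitters = [(x * 0.01, 0.0, 0.0) for x in range(4)]
delays = focus_delays(emitters, (0.015, 0.0, 0.20))
```

A real system goes well beyond this sketch: it must track the hand, modulate the focus to frequencies the skin can feel, and solve for many focal points at once, which is presumably where the "clever" algorithms come in.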
Founded in 2013, Ultrahaptics grew out of research Carter began as a student at the University of Bristol. There he worked under the supervision of computer-science professor Sriram Subramanian, who ran a lab devoted to improving human-computer interaction. Subramanian, who has since moved to the University of Sussex, said the idea of using ultrasound to simulate touch was floated in the 1970s, but no one could figure out the programming. “I knew this could be an interesting idea,” he said. “But before Tom came along, I didn’t really imagine anyone would want to take this on.”



Demonstrating a virtual control knob on an induction cooktop.
Source: Ultrahaptics Ltd.
Five years of experimentation—and "a lot of really hard maths"—later, Carter put his Ph.D. on hold to start Ultrahaptics. Subramanian and another researcher from the same lab, Benjamin Long, joined as co-founders. (Both still serve as scientific advisers to the company.) Going from the lab to a commercially viable product required overcoming big technical challenges: an early prototype took almost 20 minutes to render a single image of a hand gesture and produce a corresponding sensation. Now Ultrahaptics software can render 100,000 frames per second with a fraction of the original computing power, making it viable for home appliances and cars.
In 2014, Carter and his partners secured 600,000 pounds ($865,000) in seed money from IP Group Plc, a publicly listed British venture capital firm, and went looking for a chief executive. In December of that year they hired Cliffe, who had held a variety of executive roles in the semiconductor industry. Cliffe said he immediately grasped the applications of Ultrahaptics' product. And he liked Carter. "Tom and I hit it off straight away," Cliffe said. He was impressed that Carter seemed to know what he didn't know, and was willing to listen to a more seasoned hand.
Last year, Ultrahaptics secured a 10.1-million-pound Series A investment round, led by Woodford Investment Management, a large U.K. money management firm run by Neil Woodford, a former top Invesco portfolio manager. IP Group also pumped in more money. Since its founding, the company has sold more than 50 kits, which cost $20,000 apiece including technical support, that companies can use to evaluate the technology. It has more than 20 employees and plans to double in size this year.

With funding lined up and a U.S. office opening later this year, the firm is racing to help customers get the first products using Ultrahaptics to market. The company has received a 1.5 million euro ($1.64 million) grant from the European Union to put the technology on a single chip for small businesses keen to use it, for instance by bringing gesture control to household lamps.
There are many more applications besides virtual reality and home appliances. For example, Ultrahaptics technology could enable the blind and deaf to navigate more safely.
And then there’s the tractor beam.
Last year, Long and Subramanian were part of a team, along with other researchers from the University of Bristol and Spain's University of Navarre, that used Ultrahaptics gear to levitate small polystyrene beads, grabbing them and manipulating them through the air. At the Bristol Interaction and Graphics lab, the same university lab that gave birth to Ultrahaptics, a Ph.D. student is already applying the tractor-beam technology to create three-dimensional data visualizations: the levitating beads making up a chart rearrange themselves in mid-air as the data changes. And Bruce Drinkwater, a professor who specializes in ultrasound at Bristol and participated in the tractor-beam project, said there are potential medical uses for the technology, including using the beam to grab and help remove kidney stone fragments.
(Carter said Ultrahaptics has no immediate plans to enter the tractor beam business.)



ALSO SEE:


Global Warming Crushes Records. Again.
What we're seeing has no precedent.


If weather were measured in peppers, last month would be a habanero.
Here we go again. 
For the surface of planet Earth, 2015 was the hottest year on record by a stunning margin. But already, 2016 is on track to beat it. 
Last month was the hottest January in 137 years of record keeping, according to data released Wednesday by the National Oceanic and Atmospheric Administration. It's the ninth consecutive month to set a new record.
To be sure, some of the recent extremes are the result of a monster El Niño weather pattern that still lingers in the Pacific Ocean. But the broader trend is clear: We live in a world that's warming rapidly, with no end in sight. Since 1980, the world has set a new annual temperature record roughly every three years. Fifteen of the 16 hottest years ever measured have occurred in the 21st century. The chart below shows Earth's warming climate, measured from land and sea dating back to 1880. If the rest of 2016 is as hot as January, it will shatter the records set in 2014 and 2015.
Results from the world’s top monitoring agencies vary slightly, but NASA, NOAA, and the Japan Meteorological Agency all agree that January was unprecedented.
The El Niño weather pattern that started last year produced some of the hottest temperatures ever witnessed across great swaths of the equatorial Pacific. By some measures, this may now be considered the most extreme El Niño on record. It has triggered powerful typhoons, spoiled harvests in Africa, and contributed to vast fires in Indonesia. In California, residents are bracing for more floods over the coming months.
The heat in January was experienced differently around the world. The map below shows a few purple spots of cooler-than-average temperatures and plenty of record-breaking red. The blob of crimson in the Pacific Ocean is the footprint of El Niño. Some of the most unusual warmth swept the Arctic, where ice levels fell to the lowest on record for this time of year.
Source: NOAA
While El Niño conditions appear to have peaked, they may continue to a lesser extent through late spring or early summer, according to the U.S. Climate Prediction Center. Then it's pretty much a coin toss whether the Pacific returns to more neutral temperatures or even a cooler La Niña pattern. The heat that's dispersed into the atmosphere during an El Niño can linger, which means there's a decent chance 2016 will turn out to be the third straight year to set a new temperature record. That's never happened before. 
A Brief History of Global Warming

