February 20, 2007
5:20 PM -- It takes a fair amount to really rouse the audience at the annual invite-only TED conference, but Jeff Han, a consulting research scientist for NYU's Department of Computer Science, succeeded with his demo of a multi-touch sensor interface.
Essentially, Han and his colleagues have developed a touchable computer screen that is sensitive not just to a single point of pressure, like your ATM screen or the soft keyboard on your smartphone, but to multiple points at once. In other words, you can use all 10 fingers to manipulate the images or keys onscreen, an advance that, as Han put it, "will really change the way we interact with machines."
What's interesting is that Han's presentation took place at last year's TED, in February 2006. A little less than a year later, Steve Jobs presented the iPhone, featuring something called "the Multi-Touch user interface," which Jobs described as a revolution on the order of the computer mouse. (See Apple Makes iPhone Call.)
Coincidence? You decide.
— Richard Martin, Senior Editor, Unstrung