By Sterling “Chip” Camden
Contributing Writer, [GAS]
At this week’s User Interface Software and Technology conference, Microsoft is presenting a new idea for a touchscreen interface that doesn’t involve touching the screen. It’s called SideSight, and it uses infrared reflection to detect finger movement in the proximity of the device — up to 10 centimeters away. This makes a “touch” interface, where different finger gestures can have different meanings, workable for devices that are too small to have any significant gesture space on the screen itself.
Microsoft has already built a prototype using a mobile phone that has sensors along only two edges, although a future production model could have sensors all around its periphery — or perhaps even over its entire surface. So far, researchers have been “pleasantly surprised” by the device’s ability to accurately distinguish the infrared signals from ambient light. The only problem they noted was that where one finger goes, four more follow — so rather than being misled by any old finger that shows up on the radar (or, more accurately, lidar), they elected to follow the dictates of only a single finger.
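The single-finger logic described above could be sketched roughly like this. To be clear, this is a hypothetical illustration, not SideSight’s actual algorithm: the sensor layout, the `AMBIENT_BASELINE` and `DETECT_THRESHOLD` values, and the reading format are all assumptions made for the sake of the example.

```python
# Hypothetical sketch: pick out one finger from a strip of IR proximity
# sensors. All constants and the data format are assumptions, not
# details of Microsoft's SideSight prototype.

AMBIENT_BASELINE = 40   # assumed per-sensor reading with no finger present
DETECT_THRESHOLD = 25   # assumed minimum reflection above baseline to count

def detect_fingers(readings):
    """Return (sensor_index, strength) for each sensor whose IR
    reflection rises far enough above the ambient-light baseline."""
    candidates = []
    for i, value in enumerate(readings):
        strength = value - AMBIENT_BASELINE
        if strength >= DETECT_THRESHOLD:
            candidates.append((i, strength))
    return candidates

def track_single_finger(readings):
    """Follow the dictates of only one finger: the strongest
    reflection wins, and any trailing fingers are ignored."""
    candidates = detect_fingers(readings)
    if not candidates:
        return None  # no finger in range
    return max(candidates, key=lambda c: c[1])[0]

# One finger close to sensor 3, with weaker companions alongside it.
readings = [41, 44, 70, 120, 95, 82, 43, 40]
print(track_single_finger(readings))  # prints 3, the dominant finger
```

Keying on the strongest reflection is just one plausible way to ignore the four tag-along fingers; the researchers don’t say exactly how their prototype decides which finger to obey.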
I wonder… if I give it my usual single finger gesture, will it result in a BSOD?