Not content with turning all manner of surfaces into computers, Microsoft’s researchers are working to turn an entire room into one giant computing surface.
Andy Wilson and his team had already turned a tabletop, a globe-sized sphere, and a walk-in dome into surface computers. Microsoft also has its Surface, a tabletop computer that it sells for use in places like hotels and restaurants. But with LightSpace, the team's latest research project, Wilson has turned an entire 10-foot-by-8-foot room into a surface computer. The floor, a table, and a wall are all interactive, and users can do things like move an object from one surface to another.
To achieve this feat, LightSpace taps a combination of projectors and depth-sensing cameras.
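Microsoft hasn't published LightSpace's code, but the depth-camera half of such a system can be sketched very roughly. Given a depth image of the room and a stored background model of the empty surfaces, anything rising above a surface by more than a threshold (a hand, a plate) can be segmented, and its centroid tells a projector where to aim. The function name, thresholds, and toy data below are illustrative assumptions, not details from the LightSpace paper:

```python
import numpy as np

def find_carrier(depth_frame, background_depth, min_height_mm=40, max_height_mm=400):
    """Locate something (a hand, a plate) held above a known surface.

    Both arguments are 2-D arrays of distances in millimetres from a
    ceiling-mounted depth camera; smaller values are closer to the camera."""
    # The empty surface is farther from the camera than anything on it,
    # so subtracting gives height above the surface.
    height = background_depth - depth_frame
    mask = (height > min_height_mm) & (height < max_height_mm)
    if not mask.any():
        return None  # nothing is being carried above this surface
    # Centroid of the detected blob -- where a projector would beam the image.
    ys, xs = np.nonzero(mask)
    return (int(xs.mean()), int(ys.mean()))

# Toy example: a flat "table" 2000 mm from the camera, with one object
# held 100 mm above it.
background = np.full((240, 320), 2000, dtype=np.int32)
frame = background.copy()
frame[100:120, 150:170] = 1900
print(find_carrier(frame, background))  # -> (159, 109)
```

A real system would also need to calibrate each camera and projector into a shared 3-D coordinate space so the projected image lands on the tracked blob, which is the harder part of the engineering.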
Among the things LightSpace lets a user do is take an object from, say, a table and sweep it into a hand, onto a plate, or onto some other carrier. A projection of the object then stays on the hand or whatever else is carrying it.
“You see it in your hand,” Wilson said in an interview today. “That’s a very different interaction than just (a) surface.”
Microsoft’s LightSpace research project turns an entire room into a surface computer. Here a person is sweeping an object off the table and into their hands. (Credit: Microsoft)
Wilson and his team, including researcher Hrvoje Benko, are presenting a paper on the LightSpace work at an industry conference this week. LightSpace was shown internally to Microsoft workers at this year’s TechFest science fair.
As is common with Wilson’s research projects, it’s not clear when, or if, LightSpace might make it into a commercially available product. But, as usual, Wilson has all kinds of ideas on where to go next.
One option, and one with some commercial appeal, is using LightSpace to power some kind of next-generation conference room. Imagine, Wilson says, a conference room that knows who each participant is and beams their documents down onto the table. To share with the group, a participant might sweep a document into their hand and then point to a wall.
Also, with LightSpace, a user can carry an object and it will stay beamed onto their hand or carrying device, thanks to the cameras' tracking. However, the object is currently two-dimensional and creates no tactile sensation in the user's hand.
“When the thing is in your hand you are very aware of it and your behavior changes,” Wilson said.
Down the road, Wilson says he sees a lot of potential in moving to three-dimensional objects, though that would certainly add cost and might require users to wear special glasses.
LightSpace builds on Wilson’s past projects, which include several variations on surface computing. Among them is SecondLight, which essentially let a surface computer show one image on its surface while displaying a second image on an object held above the device. Last year, Wilson’s team showed a demo at TechFest in which people could enter a cardboard dome and see and interact with Microsoft’s Worldwide Telescope.
Bill Gates has talked about moving toward a world in which every surface can be a computer.
“This is yet another step and a little bit more complete step in making that vision a reality,” Wilson said.
However, Wilson knows that such “smart rooms” have a better history of making it into sci-fi movies than into the lives of everyday workers and consumers.
“It’s unclear how some of that stuff gets transferred to the real world,” he said.
Wilson acknowledges that he tends to focus more on the computers and less on how they could be used.
“It would be fun to take six months off and just write applications,” Wilson said. “One of the nice things about having interns is we can get some of that to happen…We don’t do a lot of work with applications.”
Since it’s always better to see these things than to just read about them, below is a video that Microsoft has posted showing LightSpace in action.