Dartmouth College
Researchers at Dartmouth College have invented a way for screens to secretly communicate with a smartphone through its camera, opening up a whole new world of human-computer interaction.
When looking at a HiLight-enabled screen, a smartphone can see more than a human can. For the first time, a team of researchers at Dartmouth College has made it possible for displays and cameras to talk to each other without the user knowing it. The video below shows how this works.
The value of HiLight would clearly be even greater with augmented reality glasses. Google should take note of this invention and build support for the system into the next generation of glasses it is working on.
HiLight can enable a new type of context-aware applications for smart devices. Such applications include smart glasses communicating with screens to realize augmented reality or acquire personalized information without affecting the content that users are currently viewing. The system also provides far-reaching implications for new security and graphics applications.
The Dartmouth team studied how to enable screens and cameras to communicate without needing to show any coded images such as QR codes, the barcodes readable by mobile phones. In the HiLight system, screens display content as they normally do, and the content can change as users interact with the screens. At the same time, the screens transmit dynamic data instantaneously to any camera-equipped devices behind the scenes, unobtrusively and in real time.
HiLight leverages the alpha channel, a well-known concept in computer graphics, to encode bits into changes in pixel translucency. It overcomes the key bottleneck of existing designs by removing the need to directly modify pixel color values, decoupling the communication layer from the screen content layer.
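To make the idea concrete, here is a minimal Python sketch of what encoding bits into translucency changes might look like. It is not the Dartmouth implementation; the ALPHA_DELTA value, the frame shape, and the simple brightness-based decoder are illustrative assumptions only.

```python
# A minimal sketch of the alpha-channel idea, not HiLight's actual code.
# A hypothetical overlay's translucency is nudged by a tiny amount
# (ALPHA_DELTA) to signal a '1' bit in a frame, leaving the on-screen
# content itself untouched.
import numpy as np

ALPHA_DELTA = 0.02          # assumed translucency change, small enough to be unobtrusive
FRAME_SHAPE = (720, 1280, 3)

def encode_frame(content: np.ndarray, bit: int) -> np.ndarray:
    """Composite a black overlay over the content; its alpha carries one bit."""
    alpha = ALPHA_DELTA if bit else 0.0
    # Standard alpha compositing against a black overlay:
    # result = (1 - alpha) * content + alpha * 0
    return ((1.0 - alpha) * content).astype(np.uint8)

def decode_bit(reference: np.ndarray, frame: np.ndarray) -> int:
    """Recover the bit from the slight brightness dip relative to a reference frame."""
    dimming = reference.astype(float).mean() - frame.astype(float).mean()
    return 1 if dimming > 0.5 * ALPHA_DELTA * reference.mean() else 0

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    content = rng.integers(60, 200, FRAME_SHAPE, dtype=np.uint8)  # stand-in screen content
    bits = [1, 0, 1, 1, 0]

    reference = encode_frame(content, 0)               # unmodulated frame as baseline
    frames = [encode_frame(content, b) for b in bits]  # one bit per displayed frame
    decoded = [decode_bit(reference, f) for f in frames]
    print("sent:   ", bits)
    print("decoded:", decoded)
```

In this toy version the content is static and the camera-side decoder simply compares average brightness between frames; the real system has to cope with changing screen content, camera noise, and perspective, which is where the decoupling of the communication and content layers matters.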
“Our work provides an additional way for devices to communicate with one another without sacrificing their original functionality,” says senior author Xia Zhou, an assistant professor of computer science and co-director of the DartNets (Dartmouth Networking and Ubiquitous Systems) Lab. “It works on off-the-shelf smart devices. Existing screen-camera work either requires showing coded images obtrusively or cannot support arbitrary screen content that can be generated on the fly. Our work advances the state-of-the-art by pushing screen-camera communication to the maximal flexibility.”
The findings will be presented at the ACM MobiSys'15 conference.