ReFlex is a full-color, high-resolution flexible smartphone that combines multi-touch with bend input.
Researchers at Queen’s University have developed ReFlex, which they describe as the first full-color, high-resolution flexible smartphone to combine multi-touch with bend input. Just as current software enables devices to respond to various combinations of finger movements, ReFlex and its apps react to different bend gestures instead.
“When this smartphone is bent down on the right, pages flip through the fingers from right to left, just like they would in a book,” explains Roel Vertegaal, a computer scientist and director of Queen’s Human Media Lab. “More extreme bends speed up the page flips. Users can feel the sensation of the page moving through their fingertips via a detailed vibration of the phone. This allows eyes-free navigation, making it easier for users to keep track of where they are in a document.”
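The team hasn’t published the exact mapping ReFlex uses, but a minimal sketch of the idea, with invented sensor values and thresholds, might look like this: the bend magnitude sets the page-flip rate, the bend direction sets which way the pages turn, and each flipped page would trigger a short vibration pulse.

```kotlin
import kotlin.math.abs

// Hypothetical bend reading: 0.0 = flat, 1.0 = fully bent (sign = direction).
// ReFlex's real sensor range and thresholds are not public; these numbers
// are illustrative only.
data class BendSample(val value: Double)

// Map bend magnitude to a page-flip rate: a gentle bend flips slowly,
// a more extreme bend speeds the flips up, as Vertegaal describes.
fun flipRatePagesPerSecond(sample: BendSample): Double {
    val magnitude = abs(sample.value)
    val deadZone = 0.1                     // ignore tiny, accidental bends
    if (magnitude < deadZone) return 0.0
    val maxRate = 8.0                      // assumed upper bound on flip speed
    return maxRate * (magnitude - deadZone) / (1.0 - deadZone)
}

// Direction of the flip: bending down on the right flips right to left.
fun flipForward(sample: BendSample): Boolean = sample.value > 0

fun main() {
    // A gentle bend vs. an extreme bend.
    for (v in listOf(0.2, 0.9)) {
        val s = BendSample(v)
        val rate = "%.1f".format(flipRatePagesPerSecond(s))
        println("bend=$v -> $rate pages/s, forward=${flipForward(s)}")
        // In the prototype, each flipped page would also fire a brief
        // vibration pulse so the user feels pages pass under the fingertips.
    }
}
```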
The smartphone itself uses many components commonly found in today’s gadgets, including a high-definition 720p OLED touchscreen and an Android 4.4 KitKat board. However, ReFlex gets its superpowers from the researchers’ newly developed bend sensors located behind the display.
“ReFlex also features a voice coil that allows the phone to simulate forces and friction through highly detailed vibrations of the display. Combined with the passive force feedback felt when bending the display, this allows for a highly realistic simulation of physical forces when interacting with virtual objects,” the team adds.
This technology makes for a richer user experience, as becomes evident when playing a game like Angry Birds and pulling back a virtual slingshot.
“As the rubber band expands, users experience vibrations that simulate those of a real stretching rubber band. When released, the band snaps, sending a jolt through the phone and sending the bird flying across the screen,” Vertegaal notes.
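The team hasn’t released its haptic rendering code, but a toy version of that slingshot loop, with made-up gains and units, could drive the voice coil’s vibration intensity from the bend and fire a single jolt on release:

```kotlin
import kotlin.math.abs

// Illustrative sketch only. The bend value stands in for how far the
// virtual rubber band is stretched; all constants are assumptions.

// Vibration intensity grows with stretch, mimicking a tightening band.
fun stretchVibration(bend: Double): Double {
    val stretch = abs(bend).coerceIn(0.0, 1.0)
    return 0.3 + 0.7 * stretch             // assumed baseline and gain
}

// On release, one strong pulse simulates the band snapping back,
// and the stored stretch becomes the bird's launch speed.
fun releaseSnap(bend: Double): Pair<Double, Double> {
    val stretch = abs(bend).coerceIn(0.0, 1.0)
    val snapIntensity = 1.0                 // full-strength jolt
    val launchSpeed = 12.0 * stretch        // arbitrary units for the example
    return snapIntensity to launchSpeed
}

fun main() {
    val bend = 0.8
    println("while stretching: vibration = ${stretchVibration(bend)}")
    val (snap, speed) = releaseSnap(bend)
    println("on release: jolt = $snap, launch speed = $speed")
}
```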
Intrigued? While this new way of interacting with our mobile devices may still be a few years away, you can see it in action below and read all about it here.
[Images: Queen’s University Human Media Lab]