A team of Cornell University students has designed a security lock that opens after verifying a stored gesture pattern.
“The idea is to create a box-like assembly in which the user places his hand, makes a defined gesture and unlocks the system. Basically, there is a mechanism that allows the user to save a gesture pattern,” a team rep wrote on the project’s official page.
“Once that is done, the system goes into a locked state. When the user puts his hand in the box, he tries to recreate the same pattern. If he is able to do so, the system unlocks. If not, the system remains locked.”
According to the rep, the project was inspired by a popular mobile phone unlock feature where a user draws a pattern on the screen to activate the device.
“We wanted to create a similar system which could be used in any security application, as simple as opening the door of the house based on the gesture,” the rep explained. “The attractive feature of the project is that the user makes the pattern in the air and not on any surface. Also, we have given the user the flexibility of changing the pattern whenever he wishes to do so.”
The gesture-based security lock is powered by Atmel’s versatile ATmega1284P microcontroller (MCU), a custom PCB and IR proximity sensors. Additional key components include a three-pin jumper cable, breadboard, power supply, toggle switch, push button, LEDs, 330 Ω resistors, assorted wires and a cardboard frame. On the software side, the project employs a series of algorithms for switches/inputs, store mode, pattern matching and four-channel ADC multiplexing.
“Overall, our system performs satisfactorily and can be effectively used to create a gesture-based secure unlock,” the team rep concluded. “Given more time and budget, we could have made the system 3D. Changes in the third dimension could be used to model the system, [thereby] increasing system accuracy and giving the user another dimension for creating the patterns.”
Interested in learning more about the Atmel-powered gesture-based security system? You can check out the project’s official page here and HackADay’s write-up here.
To analyze the gesture, the students use four SparkFun IR proximity sensors arranged in a linear array to sense how far away the hand is held. The ATmega1284P digitizes each sensor’s analog output for further processing. The project is extremely well documented, as the write-up doubles as the team’s final course report.
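Since the ATmega1284P has a single ADC, reading four sensors means rewriting the MUX bits of the ADMUX register between conversions — the “four channel ADC multiplexing” the report mentions. A rough sketch of that channel-selection arithmetic is below, written as plain C so it runs anywhere; on real hardware `admux` would be the memory-mapped ADMUX register, and the exact scan order used by the team is an assumption here.

```c
#include <stdint.h>

/* On the ATmega1284P, ADMUX holds the reference-selection bits in its
 * upper bits and the channel-selection bits MUX4..MUX0 in its lower
 * five bits; single-ended inputs ADC0..ADC3 are channels 0..3. */
#define MUX_MASK 0x1F

/* Return the new ADMUX value selecting single-ended channel 0..3,
 * preserving the reference/ADLAR bits in the upper part. */
uint8_t admux_select(uint8_t admux, uint8_t channel)
{
    return (uint8_t)((admux & (uint8_t)~MUX_MASK) | (channel & MUX_MASK));
}

/* Round-robin over the four sensors: 0 -> 1 -> 2 -> 3 -> 0 -> ... */
uint8_t next_channel(uint8_t channel)
{
    return (uint8_t)((channel + 1) & 0x03);
}
```

In the firmware loop, each conversion would be started after `admux_select`, its result stored, and the channel advanced with `next_channel`, giving one distance sample per sensor per sweep.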