Published on Nov 30, 2023
The goal of this project is to design a useful, fully functional real-world, real-time system that efficiently translates finger movements into American Sign Language.
American Sign Language (ASL) is a visual language based on hand gestures. It has been developed by the deaf community over the past two centuries and is often cited as the third most used language in the United States today.
Our motivation is two-fold: aside from helping deaf people communicate more easily, our product, a sign language coach (SLC), also teaches people ASL. The SLC has two modes of operation: Teach and Learn.
The SLC uses a glove to recognize hand positions and outputs the corresponding ASL letter on an LCD. The glove detects the position of each finger by monitoring the bending of its touch sensor.
Below is a summary of what we did and why:
1) Build a touch sensor circuit for each finger. Sew the touch sensors and accelerometer onto the glove to more accurately detect the bending and movement of the fingers.
2) Send the sensor circuit outputs to the MCU's A/D converter to parse the finger positions (a sampling sketch appears after this list).
3) Implement Teach mode. In Teach mode, the user "teaches" the MCU ASL using hand gestures. To prevent data corruption, the A/D converter output and the associated user-specified letter are saved to EEPROM, which can only be reset by reprogramming the chip (a storage sketch appears after this list).
4) Implement Learn mode. In Learn mode, the MCU randomly chooses a letter it has been taught and teaches it to the user. The user "learns" by matching their hand positions to those the MCU associates with the letter.
Using the LCD, the user can adjust their finger positions appropriately. The finger positions are matched to the appropriate ASL letter using an efficient matching algorithm (one plausible version is sketched below).
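To make step 2 concrete, here is a minimal sketch of the per-finger A/D sampling, assuming an AVR-class MCU with the five sensor circuits wired to ADC channels 0 through 4. The register names come from avr-libc's <avr/io.h>; the function names and channel assignment are illustrative, not the project's actual code.

```c
/* Minimal per-finger ADC sampling sketch, assuming an AVR-class MCU
 * (e.g. an ATmega) with one sensor circuit on each of ADC channels 0-4. */
#include <avr/io.h>
#include <stdint.h>

#define NUM_FINGERS 5

/* Configure the ADC: AVcc reference, clk/128 prescaler for a clean ADC clock. */
static void adc_init(void)
{
    ADMUX  = (1 << REFS0);                                /* AVcc reference    */
    ADCSRA = (1 << ADEN)                                  /* enable ADC        */
           | (1 << ADPS2) | (1 << ADPS1) | (1 << ADPS0);  /* clk/128 prescaler */
}

/* Blocking single conversion on one channel. */
static uint16_t adc_read(uint8_t channel)
{
    ADMUX = (ADMUX & 0xF0) | (channel & 0x0F);  /* select channel, keep reference */
    ADCSRA |= (1 << ADSC);                      /* start conversion               */
    while (ADCSRA & (1 << ADSC))                /* wait for completion            */
        ;
    return ADC;                                 /* 10-bit result                  */
}

/* Sample all five fingers into one "hand position" vector. */
void sample_hand(uint16_t position[NUM_FINGERS])
{
    for (uint8_t f = 0; f < NUM_FINGERS; f++)
        position[f] = adc_read(f);
}
```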
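For step 3, here is a minimal sketch of the EEPROM storage, assuming avr-libc's <avr/eeprom.h>. The record layout, table size, and names (gesture_t, teach_letter) are assumptions made for illustration; the stored data matches the write-up, i.e. the user-specified letter plus the A/D readings for that gesture.

```c
/* Hypothetical Teach-mode storage sketch using avr-libc's EEPROM API. */
#include <avr/eeprom.h>
#include <stdint.h>

#define NUM_FINGERS 5
#define MAX_LETTERS 26

typedef struct {
    char     letter;                 /* user-specified alphabet character */
    uint16_t position[NUM_FINGERS];  /* A/D reading per finger            */
} gesture_t;

/* EEPROM-resident table; persists until the chip is reprogrammed.
 * The initial count is written when the EEPROM image is flashed. */
static gesture_t EEMEM gesture_table[MAX_LETTERS];
static uint8_t   EEMEM gesture_count = 0;

/* Append one taught gesture to EEPROM. Returns 0 on success, -1 if full. */
int8_t teach_letter(char letter, const uint16_t position[NUM_FINGERS])
{
    uint8_t n = eeprom_read_byte(&gesture_count);
    if (n >= MAX_LETTERS)
        return -1;

    gesture_t g;
    g.letter = letter;
    for (uint8_t f = 0; f < NUM_FINGERS; f++)
        g.position[f] = position[f];

    eeprom_update_block(&g, &gesture_table[n], sizeof(g));
    eeprom_update_byte(&gesture_count, n + 1);
    return 0;
}
```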
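The write-up does not detail the matching algorithm itself, so the sketch below shows one plausible approach: a nearest-neighbor search over the taught gestures, using squared Euclidean distance on the five A/D readings, with an acceptance threshold to reject unknown hand shapes. All names and the threshold value are illustrative.

```c
/* Plausible gesture-matching sketch: nearest neighbor over taught gestures. */
#include <stdint.h>

#define NUM_FINGERS 5

typedef struct {
    char     letter;
    uint16_t position[NUM_FINGERS];
} gesture_t;

/* Squared Euclidean distance between a live reading and a stored gesture. */
static uint32_t gesture_distance(const uint16_t a[NUM_FINGERS],
                                 const uint16_t b[NUM_FINGERS])
{
    uint32_t d = 0;
    for (uint8_t f = 0; f < NUM_FINGERS; f++) {
        int32_t diff = (int32_t)a[f] - (int32_t)b[f];
        d += (uint32_t)(diff * diff);
    }
    return d;
}

/* Return the letter of the closest taught gesture, or '?' if nothing is
 * within the (illustrative) acceptance threshold. */
char match_gesture(const uint16_t live[NUM_FINGERS],
                   const gesture_t *taught, uint8_t count)
{
    const uint32_t THRESHOLD = 4000;  /* tune to sensor noise */
    uint32_t best = UINT32_MAX;
    char result = '?';

    for (uint8_t i = 0; i < count; i++) {
        uint32_t d = gesture_distance(live, taught[i].position);
        if (d < best) {
            best = d;
            result = taught[i].letter;
        }
    }
    return (best <= THRESHOLD) ? result : '?';
}
```

With at most 26 stored gestures of five readings each, a linear scan is cheap enough to run on every sample, which is one reasonable reading of "efficient" here.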