Presentation Link: https://docs.google.com/presentation/d/15ABuJPKuzlChrAYAgYay3wQsnwoiWCrQ/edit?usp=sharing&ouid=101720822677778888577&rtpof=true&sd=true
AI Brawlers
Bryan Tang, Nemo Kim, Davis Wang
SignScribe: An AI-powered, real-time sign language translation platform
Communication barriers exist for individuals who are deaf or hard of hearing, as sign language is not universally understood. This project aims to bridge that gap by developing a portable, AI-driven device capable of recognizing sign language gestures in real time using an Arduino TinyML kit. By translating hand gestures into text or speech, the system can facilitate smoother interactions, promoting inclusivity and accessibility for sign language users in various environments.
Develop a Real-Time Gesture Recognition Model: Train and deploy a deep learning model on the Arduino TinyML kit that accurately recognizes sign language gestures in real time.
Achieve High Accuracy and Responsiveness: Optimize the model to ensure reliable classification of various hand gestures with minimal delay, aiming for high throughput and low latency.
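The latency and throughput targets above can be checked with a simple timing harness. The sketch below is illustrative only: `classify()` is a placeholder standing in for the trained gesture model (a real measurement would run the deployed TensorFlow Lite model on the device itself).

```python
import time
import numpy as np

def classify(frame: np.ndarray) -> int:
    """Placeholder for the deployed gesture model; returns a class index."""
    return int(frame.sum() % 4)

def benchmark(n_frames: int = 50):
    """Measure average per-frame latency (seconds) and throughput (frames/s)."""
    frames = [np.random.randint(0, 256, (96, 96), dtype=np.uint8)
              for _ in range(n_frames)]
    start = time.perf_counter()
    for f in frames:
        classify(f)
    elapsed = time.perf_counter() - start
    return elapsed / n_frames, n_frames / elapsed

latency, fps = benchmark()
print(f"avg latency: {latency * 1000:.3f} ms, throughput: {fps:.1f} fps")
```

The same loop, with `classify()` replaced by the real on-device inference call, gives a concrete number to compare against the responsiveness goal.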
Arduino TinyML Kit, Arduino Nano 33 BLE; Camera; TensorFlow #2
AI-driven device that translates sign language gestures with high throughput and accuracy in real time. This device, powered by an Arduino TinyML kit, will allow users to perform basic sign language gestures that are then converted into text output, bridging communication between sign language users and those unfamiliar with it. This technology aims to enhance accessibility, enabling smoother, more inclusive interactions in daily life.
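The gesture-to-text pipeline described above can be sketched in two steps: preprocess a camera frame into the small input a TinyML vision model expects, then decode the model's class scores into a text label. Everything here is an assumption for illustration: the 96x96 grayscale input size, the four-gesture vocabulary, and the hand-rolled resize are placeholders, not the project's final design.

```python
import numpy as np

# Hypothetical gesture vocabulary; the real label set depends on the trained model.
LABELS = ["hello", "thank_you", "yes", "no"]

def preprocess(frame: np.ndarray, size: int = 96) -> np.ndarray:
    """Convert an RGB camera frame to a normalized grayscale square,
    matching a typical TinyML vision model input (assumed 96x96 here)."""
    gray = frame.mean(axis=2)                  # naive RGB -> grayscale
    h, w = gray.shape
    # Nearest-neighbour resize to size x size (no external dependencies).
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    small = gray[rows][:, cols]
    return (small / 255.0).astype(np.float32)  # scale pixel values to [0, 1]

def scores_to_text(scores: np.ndarray) -> str:
    """Map the model's class scores to a text label (argmax decoding)."""
    return LABELS[int(np.argmax(scores))]

# Usage with a synthetic frame and made-up scores:
frame = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
x = preprocess(frame)
print(x.shape)                                           # (96, 96)
print(scores_to_text(np.array([0.1, 0.7, 0.1, 0.1])))    # thank_you
```

On the Arduino Nano 33 BLE the equivalent preprocessing and argmax decoding would be written in C++ against TensorFlow Lite for Microcontrollers, but the data flow is the same.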
Week 1: Research
Week 2: Software Development
Week 3: Hardware Implementation
Week 4: Testing