SignCLI
A simple yet powerful command-line interface that teaches sign language.
Implementation
It uses OpenCV to capture video from the webcam and MediaPipe to detect and track hands. Two machine-learning models then work together to predict the sign being communicated: MediaPipe's hand-landmark model recognizes the hand and its shape, and a custom CNN classifies which letter each gesture corresponds to.
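A common preprocessing step between the hand detector and the gesture classifier is to normalize the detected landmarks so the classifier sees the same input regardless of where the hand sits in the frame or how large it appears. The sketch below assumes MediaPipe-style output of 21 (x, y) hand landmarks with the wrist at index 0; `normalize_landmarks` is a hypothetical helper for illustration, not part of this project's code.

```python
import numpy as np

def normalize_landmarks(landmarks):
    """Make landmarks translation- and scale-invariant.

    `landmarks` is a (21, 2) array of (x, y) image coordinates,
    in the layout produced by a hand detector such as MediaPipe
    Hands (landmark 0 is the wrist). Returns coordinates with the
    wrist at the origin and the farthest landmark at distance 1.
    """
    pts = np.asarray(landmarks, dtype=float)
    pts = pts - pts[0]                        # wrist becomes the origin
    scale = np.linalg.norm(pts, axis=1).max() # farthest point from wrist
    if scale > 0:
        pts = pts / scale                     # scale hand to unit size
    return pts
```

With this normalization, the same gesture made close to the camera or far away produces nearly identical inputs, which makes the downstream letter classifier much easier to train.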
Philosophy
I wanted to create a tool that anyone could use to learn sign language. By providing instant feedback on the user's signing accuracy, it turns practice into an interactive loop rather than guesswork.