
ASL Hand Gestures

Summer 2025 Internship Research Project


ASL Hand Gesture Abstract

*Draft - Development in Progress

 

Team Members

  • Tyler Pham - Project Lead

  • Kyle Lin

  • Shassh Umamaheswaran

Goal / Introduction

This project develops a real-time hand gesture recognition system to provide a hands-free alternative to traditional input methods. The goal is to enhance accessibility, safety, and user experience by allowing users to perform system-level functions—such as adjusting audio volume, toggling mute, or changing screen brightness—using only hand gestures. Beyond these functions, the project also explores gesture-driven computing for broader applications in communication, emergency shutoff, and human–computer interaction.

Methodology

The system was implemented in Python, combining MediaPipe for hand landmark detection and OpenCV for video processing. A standard webcam or compatible camera device captures video input. Recognized gestures are mapped to system actions through supporting libraries including pycaw for audio control and screen_brightness_control for brightness adjustment.
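As an illustrative sketch of how such a classifier can work (the abstract does not show the project's actual detection code), a fist can be recognized from MediaPipe-style hand landmarks by checking whether each fingertip lies below its middle (PIP) joint in image coordinates. The landmark indices follow MediaPipe Hands conventions; the coordinates here are synthetic, assumed values for demonstration:

```python
# Sketch of fist detection from MediaPipe-style hand landmarks.
# Assumptions (not from the abstract): landmarks are (x, y) pairs in
# normalized image coordinates, indexed as in MediaPipe Hands
# (fingertips at 8, 12, 16, 20; PIP joints at 6, 10, 14, 18).

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # corresponding PIP joints

def is_fist(landmarks):
    """Return True if all four fingers are curled (tip below its PIP joint).

    The image y-axis points downward, so a curled fingertip has a
    *larger* y value than its PIP joint.
    """
    return all(
        landmarks[tip][1] > landmarks[pip][1]
        for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
    )

# Synthetic hands: 21 points each, differing only in fingertip height.
curled = [(0.5, 0.5)] * 21
for pip in FINGER_PIPS:
    curled[pip] = (0.5, 0.4)
for tip in FINGER_TIPS:
    curled[tip] = (0.5, 0.6)    # tips below PIPs: fingers curled

open_hand = list(curled)
for tip in FINGER_TIPS:
    open_hand[tip] = (0.5, 0.2) # tips above PIPs: fingers extended

print(is_fist(curled))     # True
print(is_fist(open_hand))  # False
```

In a live pipeline this check would run on each frame's landmarks as returned by MediaPipe's hand-tracking solution, after OpenCV captures and converts the webcam image.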

A structured state management flow was developed to ensure reliability:

  • Command mode activation via a fist gesture.

  • Cooldown timers to prevent unintended actions.

  • Gesture-to-action mapping for consistent control.
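The state flow above can be sketched as a small controller. The gesture names, cooldown length, and action table below are illustrative assumptions, not the project's actual values; in the real system the action strings would route to pycaw and screen_brightness_control calls:

```python
import time

# Hypothetical gesture-to-action mapping for illustration only.
ACTIONS = {
    "thumbs_up":   "volume_up",
    "thumbs_down": "volume_down",
    "open_palm":   "toggle_mute",
}

class GestureController:
    """Command-mode activation plus cooldown, as in the flow above."""

    def __init__(self, cooldown=1.0, clock=time.monotonic):
        self.cooldown = cooldown        # seconds between accepted actions
        self.clock = clock              # injectable clock, eases testing
        self.command_mode = False
        self.last_action_time = -float("inf")

    def handle(self, gesture):
        """Return the triggered action name, or None if nothing fires."""
        if gesture == "fist":           # fist toggles command mode
            self.command_mode = not self.command_mode
            return None
        if not self.command_mode:
            return None                 # ignore gestures outside command mode
        now = self.clock()
        if now - self.last_action_time < self.cooldown:
            return None                 # still cooling down: reject action
        action = ACTIONS.get(gesture)
        if action is not None:
            self.last_action_time = now
        return action

# Usage with a fake clock to show the cooldown rejecting rapid repeats:
t = [0.0]
ctrl = GestureController(cooldown=1.0, clock=lambda: t[0])
print(ctrl.handle("thumbs_up"))   # None  (command mode not active)
ctrl.handle("fist")               # activate command mode
print(ctrl.handle("thumbs_up"))   # volume_up
print(ctrl.handle("thumbs_up"))   # None  (within cooldown window)
t[0] = 1.5
print(ctrl.handle("thumbs_down")) # volume_down
```

Injecting the clock keeps the controller deterministic under test while defaulting to `time.monotonic` in live use.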

Additional exploratory research considered integrating eye-tracking for precision, creating wearable gesture-recognition devices (e.g., smart glasses), and building 3D robotic hand visualizations in Unity or web platforms.

Results

  • Core Functionality: The system successfully interprets common hand gestures and maps them to system controls.

  • Reliability Features: Command mode and cooldowns reduce false positives.

  • Extended Applications: Demonstrated feasibility for accessibility tools, silent communication, and safety-critical shutoff systems.

  • Prototyping: Early exploration showed potential for multimodal inputs (hand + eye tracking) and integration with robotics/visualization platforms.

Conclusion

The ASL Hand Gesture project demonstrates the practicality of gesture-based interfaces as a supplement or alternative to traditional input methods. By combining robust computer vision frameworks with system control libraries, the project shows how gesture recognition can improve accessibility, enhance safety in noisy or constrained environments, and expand the possibilities of human–computer interaction. Future work will focus on refining gesture accuracy, integrating multimodal inputs, and scaling applications for broader adoption.
