Multi-Disciplinary Research Building, 218

701 East Martin Luther King Boulevard


The UTC Graduate School is pleased to announce that Nafiseh Ghaffar Nia will present doctoral research titled "Predicting Hand Grasping Orientation for Prosthetic Hand Control Using Multimodal Sensor Data (EEG, EMG, and IMU) with Machine Learning Approach" on 06/17/2024 at 2:00 PM in the College of Engineering and Computer Science Building (ECS), Room 347 (Conference Room). Everyone is invited to attend.

Computational Science

Chair: Dr. Erkan Kaplanoglu

Co-Chair: 

Abstract:
Enhancing the control of prosthetic hands is a crucial challenge that directly impacts the daily functionality of individuals with limb loss. Our research delves into advanced machine learning (ML) methodologies to accurately predict hand-grasping orientations, thereby improving the precision of prosthetic control. We present a comprehensive framework that combines inputs from multiple sensors: electroencephalography (EEG) to discern user intentions, electromyography (EMG) to evaluate muscle activity, and inertial measurement units (IMUs) to track features of hand movement. By integrating these varied data streams, our ML model strives to provide predictions of hand orientation that are more accurate and intuitive than those achievable with single-sensor systems. This approach has the potential to significantly elevate prosthetic functionality and user experience, enabling more precise and effortless execution of grasping tasks. Integrating these sensors within a single, cohesive ML framework allows for the dynamic assessment of various physical and neurological cues. This methodological synergy enhances the prosthetic's adaptability to each user's unique movement patterns and neural commands. Applying deep learning techniques, particularly through a combination of ML models, we propose a new model called AutoMerNet. This model further enhances our system's ability to learn from complex, multimodal sensor data, continually improving its predictive capabilities over time.
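The multimodal fusion idea described in the abstract can be sketched in a few lines of code. The details of AutoMerNet are not given here, so the example below is a generic late-fusion pipeline under assumed channel counts and window sizes: per-modality features from synthetic EEG, EMG, and IMU windows are concatenated into one feature vector, and a simple nearest-centroid classifier stands in for the learned model. The grasp-orientation labels and all signal shapes are illustrative, not taken from the research.

```python
import numpy as np

rng = np.random.default_rng(0)

def window_features(signal):
    """Summarize a (channels, samples) window with per-channel mean and variance."""
    return np.concatenate([signal.mean(axis=1), signal.var(axis=1)])

def fuse(eeg, emg, imu):
    """Late fusion: concatenate per-modality feature vectors into one input."""
    return np.concatenate([window_features(eeg),
                           window_features(emg),
                           window_features(imu)])

# Toy dataset: 3 hypothetical grasp orientations, 20 windows each.
# Channel counts and window lengths are assumptions for illustration.
X, y = [], []
for label, offset in enumerate([-1.0, 0.0, 1.0]):
    for _ in range(20):
        eeg = rng.normal(offset, 1.0, size=(8, 250))   # 8 EEG channels
        emg = rng.normal(offset, 1.0, size=(4, 500))   # 4 EMG channels
        imu = rng.normal(offset, 0.5, size=(6, 100))   # 3-axis accel + gyro
        X.append(fuse(eeg, emg, imu))
        y.append(label)
X, y = np.array(X), np.array(y)

# Nearest-centroid classifier as a stand-in for the learned fusion model.
centroids = np.array([X[y == k].mean(axis=0) for k in range(3)])

def predict(features):
    return int(np.argmin(np.linalg.norm(centroids - features, axis=1)))

acc = np.mean([predict(x) == label for x, label in zip(X, y)])
print(f"training accuracy: {acc:.2f}")
```

The point of the sketch is the fusion step: each modality is reduced to a fixed-length feature vector before concatenation, so modalities with different sampling rates and channel counts can feed one model.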

This research not only contributes to the technological advancement of prosthetic hands but also opens avenues for personalized prosthetic adjustments based on individual physiological and biomechanical characteristics. The enhanced control provided by our ML model holds promise for significantly improving the quality of life for prosthetic users, facilitating more natural and effective interaction with their environment.
