
💪 EMG Gesture Classification

Hand Gesture Recognition using 8-Channel EMG Signals for Prosthetic Control



🚀 Overview

This project builds a machine learning pipeline for hand gesture classification from 8-channel electromyography (EMG) data.

The goal is to classify hand gesture classes for prosthetic-control style applications, where muscle activity signals are used to recognize intended movement patterns.

This project demonstrates a practical biosignal classification workflow including:

  • dataset loading
  • preprocessing
  • normalization
  • model training
  • evaluation
  • confusion matrix generation
  • demo prediction

🎯 Project Objective

The main goals of this project are to:

  • classify hand gesture classes from EMG signals
  • build a baseline prosthetic-control style classifier
  • demonstrate biosignal preprocessing and machine learning workflow
  • generate clear evaluation outputs for model performance analysis

🧠 Project Overview

The model uses:

  • Input features: channel1 to channel8
  • Target label: class
  • Model: RandomForestClassifier
  • Preprocessing: missing-value removal + normalization using StandardScaler

This makes the project a strong starting point for EMG-based gesture recognition systems.
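The setup above can be sketched in a few lines. The snippet below substitutes synthetic data for `data/EMG-data.csv` so it runs standalone, but the column names, preprocessing, and model mirror the project:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

channels = [f"channel{i}" for i in range(1, 9)]

# Real project: df = pd.read_csv("data/EMG-data.csv")
# Synthetic stand-in so this sketch runs without the dataset:
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(scale=1e-4, size=(400, 8)), columns=channels)
df["class"] = rng.integers(0, 4, size=400)

# Preprocessing: drop rows with missing values, then normalize the channels
df = df.dropna(subset=channels + ["class"])
scaler = StandardScaler()
X = scaler.fit_transform(df[channels])
y = df["class"].to_numpy()

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

clf = RandomForestClassifier(random_state=42)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

On the real dataset the same steps apply unchanged; only the `read_csv` line differs.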


📊 Final Demo Result

Demo Prediction

Predicted class: 0
Actual class: 0
Input values: {
  'channel1': -2e-05,
  'channel2': 1e-05,
  'channel3': 0.0,
  'channel4': -7e-05,
  'channel5': -3e-05,
  'channel6': 1e-05,
  'channel7': 0.0,
  'channel8': -1e-05
}

This example shows that the trained model correctly predicted the class for one test sample.
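In the real project the fitted scaler and model come from `joblib.load` on the files in `models/`; the sketch below fits trivial in-memory stand-ins instead, to show how a single input dictionary becomes a prediction:

```python
import numpy as np
from sklearn.dummy import DummyClassifier
from sklearn.preprocessing import StandardScaler

# Stand-ins for joblib.load("models/scaler.pkl") and
# joblib.load("models/emg_random_forest.pkl"):
scaler = StandardScaler().fit(np.array([[0.0] * 8, [1e-4] * 8]))
model = DummyClassifier(strategy="most_frequent").fit(np.zeros((2, 8)), [0, 0])

sample = {
    "channel1": -2e-05, "channel2": 1e-05, "channel3": 0.0, "channel4": -7e-05,
    "channel5": -3e-05, "channel6": 1e-05, "channel7": 0.0, "channel8": -1e-05,
}

# Order the dict into the same channel order used during training
x = np.array([[sample[f"channel{i}"] for i in range(1, 9)]])
predicted = int(model.predict(scaler.transform(x))[0])
print(f"Predicted class: {predicted}")
```

The key detail is the explicit `channel1`..`channel8` ordering: features must reach the scaler and model in the same column order used at training time.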


📦 Dataset

The dataset file is stored in compressed form to keep the repository lightweight.

Expected dataset path

After extraction, the project should contain:

data/EMG-data.csv

Compressed dataset file

You can keep the dataset in the repository as:

data/EMG-data.zip

or

data/EMG-data.csv.zip

Then extract it so the CSV becomes:

data/EMG-data.csv

⚠️ Why the Dataset is Zipped

The raw CSV file can be relatively large, while the zip file is much smaller.

Keeping it compressed helps to:

  • reduce repository size
  • improve cloning and downloading speed
  • make the project easier to share with contributors

πŸ— Project Structure

EMG-Gesture-Classification/
β”‚
β”œβ”€β”€ data/
β”‚   β”œβ”€β”€ EMG-data.zip               # or EMG-data.csv.zip
β”‚   └── EMG-data.csv               # extracted dataset file
β”‚
β”œβ”€β”€ models/
β”‚   β”œβ”€β”€ emg_random_forest.pkl
β”‚   └── scaler.pkl
β”‚
β”œβ”€β”€ results/
β”‚   β”œβ”€β”€ accuracy_report.txt
β”‚   β”œβ”€β”€ confusion_matrix.png
β”‚   └── demo_prediction.txt
β”‚
β”œβ”€β”€ train_emg.ipynb
β”œβ”€β”€ README.md
└── .gitignore

⚙️ Machine Learning Pipeline

The classification workflow includes:

  1. loading EMG data
  2. selecting channels channel1 to channel8
  3. using class as the target label
  4. removing missing rows
  5. normalizing feature values
  6. splitting data into training and testing sets
  7. training a Random Forest model
  8. evaluating predictions
  9. saving model and result files
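Steps 8 and 9 can be sketched as follows; a tiny synthetic split stands in for steps 1-7, and a temporary directory stands in for `models/` and `results/` so the snippet runs anywhere:

```python
import tempfile
from pathlib import Path

import joblib
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the output of steps 1-7
rng = np.random.default_rng(1)
X_train, y_train = rng.normal(size=(80, 8)), rng.integers(0, 3, 80)
X_test, y_test = rng.normal(size=(20, 8)), rng.integers(0, 3, 20)

scaler = StandardScaler().fit(X_train)
clf = RandomForestClassifier(random_state=0).fit(scaler.transform(X_train), y_train)

# Step 8: evaluate predictions on the held-out set
y_pred = clf.predict(scaler.transform(X_test))
report = classification_report(y_test, y_pred, zero_division=0)
cm = confusion_matrix(y_test, y_pred)

# Step 9: persist the model, scaler, and report
out = Path(tempfile.mkdtemp())  # real project: models/ and results/
joblib.dump(clf, out / "emg_random_forest.pkl")
joblib.dump(scaler, out / "scaler.pkl")
(out / "accuracy_report.txt").write_text(report)
```

Saving the scaler alongside the model matters: at prediction time, new samples must be transformed with the *training-time* statistics, not re-fitted ones.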

🧰 Tech Stack

  • Python
  • NumPy
  • Pandas
  • Matplotlib
  • scikit-learn
  • SciPy
  • joblib

📦 Installation

Install Python 3.10+ and required libraries:

pip install numpy pandas matplotlib scikit-learn scipy joblib

🐍 Virtual Environment Setup

Windows PowerShell

python -m venv emg_env
.\emg_env\Scripts\Activate.ps1
pip install numpy pandas matplotlib scikit-learn scipy joblib

Windows CMD

python -m venv emg_env
emg_env\Scripts\activate.bat
pip install numpy pandas matplotlib scikit-learn scipy joblib

macOS / Linux

python3 -m venv emg_env
source emg_env/bin/activate
pip install numpy pandas matplotlib scikit-learn scipy joblib

📥 Extract the Dataset

If the dataset is EMG-data.zip

Windows PowerShell

Expand-Archive -Path .\data\EMG-data.zip -DestinationPath .\data -Force

macOS / Linux

unzip data/EMG-data.zip -d data

If the dataset is EMG-data.csv.zip

Windows PowerShell

Expand-Archive -Path .\data\EMG-data.csv.zip -DestinationPath .\data -Force

macOS / Linux

unzip data/EMG-data.csv.zip -d data

After extraction, verify that this file exists:

data/EMG-data.csv
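A quick Python check (the default path matches the expected layout; any other path can be passed in):

```python
from pathlib import Path

def dataset_ready(path: str = "data/EMG-data.csv") -> bool:
    """Return True if the extracted CSV exists and is non-empty."""
    p = Path(path)
    return p.is_file() and p.stat().st_size > 0
```

`dataset_ready()` should return `True` once extraction succeeded.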

▶️ Run the Project

If you are using the notebook:

jupyter notebook

Then open:

train_emg.ipynb

and run all cells.

Because this project ships as a notebook, avoid running python train_emg.ipynb; the Python interpreter cannot execute .ipynb files directly. If you later convert the notebook to a script, you can run:

python train_emg.py

πŸ“ Output Files

After running the project, the following files are generated:

results/
├── accuracy_report.txt
├── confusion_matrix.png
└── demo_prediction.txt

models/
├── emg_random_forest.pkl
└── scaler.pkl

πŸ“ Example Outputs

Accuracy Report

Contains:

  • overall model accuracy
  • classification report
  • class-wise performance summary

Confusion Matrix

Shows how well the model predicts each class.

Demo Prediction

Shows one sample prediction from the test set.


🖼 Visual Result

Confusion Matrix


This matrix helps visualize how well the classifier distinguishes between gesture classes.


🔬 Features Used

The baseline model uses these EMG channels as input:

  • channel1
  • channel2
  • channel3
  • channel4
  • channel5
  • channel6
  • channel7
  • channel8

Target Label

  • class

📌 Notes

  • This baseline version predicts the numeric class IDs stored in the class column.
  • The time and label columns are not used in the first version.
  • If a mapping between class IDs and real gesture names becomes available later, the project can be upgraded to display gesture names instead of numeric classes.
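If such a mapping becomes available, wiring it in is a small change. The gesture names below are placeholders for illustration, not the dataset's actual labels:

```python
# Hypothetical class-ID -> gesture-name mapping (placeholder names)
GESTURE_NAMES = {0: "rest", 1: "fist", 2: "wave_in", 3: "wave_out"}

def label_to_name(class_id: int) -> str:
    """Translate a numeric prediction into a readable gesture name."""
    return GESTURE_NAMES.get(int(class_id), f"class_{class_id}")
```

Unknown IDs fall back to a generic `class_N` string, so the demo output degrades gracefully if the mapping is incomplete.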

✅ What This Project Demonstrates

  • EMG biosignal preprocessing
  • multi-channel feature-based classification
  • normalization for stable model training
  • supervised machine learning workflow
  • confusion matrix based evaluation
  • prosthetic-control style gesture recognition baseline

⚡ Current Strengths

  • clean baseline machine learning pipeline
  • practical biosignal classification use case
  • easy-to-understand notebook workflow
  • reusable saved model and scaler
  • strong base for future signal-processing improvements

🛣 Future Improvements

  • add EMG signal filtering
  • implement window-based segmentation
  • extract engineered EMG features such as:
    • MAV
    • RMS
    • Variance
    • Waveform Length
  • compare with SVM, KNN, and XGBoost
  • map numeric predictions to gesture names
  • add live prediction support
  • build a prosthetic-control demo interface
  • evaluate feature importance across EMG channels
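The windowing and feature ideas above can be sketched as follows (window size and step are arbitrary illustration values; the feature formulas are the standard time-domain definitions):

```python
import numpy as np

def sliding_windows(signal, size=200, step=100):
    """Segment a 1-D signal into (possibly overlapping) windows."""
    sig = np.asarray(signal, dtype=float)
    return [sig[i:i + size] for i in range(0, len(sig) - size + 1, step)]

def emg_features(window):
    """Classic time-domain EMG features for one channel window."""
    w = np.asarray(window, dtype=float)
    return {
        "mav": float(np.mean(np.abs(w))),         # Mean Absolute Value
        "rms": float(np.sqrt(np.mean(w ** 2))),   # Root Mean Square
        "var": float(np.var(w)),                  # Variance
        "wl": float(np.sum(np.abs(np.diff(w)))),  # Waveform Length
    }
```

Applied per channel per window, this turns the 8 raw channels into 32 engineered features per window, which can then feed the same Random Forest pipeline.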


💡 Potential Applications

This kind of EMG gesture classification system can be extended for:

  • prosthetic hand control
  • human-computer interaction
  • wearable biosignal interfaces
  • rehabilitation systems
  • gesture-based assistive technology

📄 Example .gitignore

__pycache__/
*.pyc
emg_env/
models/
results/
data/*.csv

Adjust this depending on whether you want model files or images pushed to GitHub.


🤝 Contributing

Contributions are welcome.

If you'd like to improve preprocessing, add feature extraction, compare classifiers, or build a real-time EMG application, feel free to fork the repository and open a pull request.


📜 License

This project is licensed under the MIT License.


👨‍💻 Author

arshc0der


🌟 Repository Summary

This project is a practical implementation of:

  • EMG signal classification
  • hand gesture recognition
  • prosthetic-control style AI
  • biosignal preprocessing
  • machine learning classification
  • performance evaluation with confusion matrix

If you found this project useful, consider giving it a star on GitHub.