Commit 378643c

Author: Dhruva Shaw
Commit message: some changes
1 parent 92df898

2 files changed: +111 -3 lines changed

_projects/mcba.md

Lines changed: 56 additions & 2 deletions

@@ -4,6 +4,7 @@ title: Mind Controlled Bionic Arm with Sense of Touch
 description: Imagine a prosthetic arm that functions like your natural arm. You wear a headband, and when you think about moving the arm, the signal from your mind is relayed to the prosthetic, which responds accordingly, just like your real arm!
 tags: Bionic Arm Robotics Biotechnology Mind Control Prosthetics
 giscus_comments: true
+citation: true
 img: /assets/img/mcba_logo.jpeg
 date: 2024-12-12
 featured: true
@@ -43,7 +44,6 @@ authors:
 name: Lovely Professional University

 bibliography: papers.bib
-citation: true

 # Optionally, you can add a table of contents to your post.
 # NOTES:
# NOTES:
@@ -54,11 +54,65 @@ citation: true
 toc: true
 ---

-# Abstract
+## Abstract

 Advancements in bionic technology are transforming the possibilities for restoring hand function in individuals with amputations or paralysis. This paper introduces a cost-effective bionic arm design that offers mind-controlled functionality and integrates a sense of touch to replicate natural hand movements. The system uses a non-invasive EEG-based control mechanism, enabling users to operate the arm with brain signals that are processed into PWM commands driving the arm's servo motors. The design also incorporates a touch sensor (tactile feedback) in the gripper, providing sensory feedback that enhances user safety and dexterity.
 The proposed bionic arm prioritizes three essential features:
 1. Integrated Sensory Feedback: Providing users with a tactile experience that mimics the sense of touch (with signals routed directly to the brain). This capability is crucial for safe object manipulation and injury prevention.
 2. Mind-Control Potential: Harnessing EEG signals for seamless, thought-driven operation.
 3. Non-Invasive Nature: Ensuring user comfort by avoiding invasive surgical procedures.
 This novel approach aims to deliver an intuitive, natural, and efficient solution for restoring complex hand functions.
+
+---
+
+## Methodology
+
+### 1. Data Collection and Dataset Overview
+
+The model was developed on a publicly available EEG dataset comprising data from 60 volunteers performing 8 distinct activities [3]. The dataset includes a total of 8,680 four-second EEG recordings, collected using 16 dry electrodes configured according to the international 10-10 system [3].
+
+• Electrode Configuration: Monopolar configuration, with each electrode's potential measured relative to neutral electrodes placed on both earlobes (ground references).
+
+• Signal Sampling: EEG signals were sampled at 125 Hz and preprocessed using:
+
+  - A bandpass filter (5–50 Hz) to isolate the relevant frequencies [3].
+
+  - A notch filter (60 Hz) to remove powerline interference [3].
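The filtering chain described above (125 Hz sampling, a 5–50 Hz bandpass, a 60 Hz notch) can be sketched with SciPy. The filter orders, the notch Q factor, and the use of zero-phase `filtfilt` are illustrative assumptions; the dataset description specifies only the sampling rate and cutoff frequencies.

```python
import numpy as np
from scipy.signal import butter, iirnotch, filtfilt

FS = 125.0  # EEG sampling rate (Hz), per the dataset description

def preprocess_eeg(signal: np.ndarray) -> np.ndarray:
    """Apply the 5-50 Hz bandpass and 60 Hz notch described in the text.

    The filter order (4) and zero-phase filtering are illustrative choices,
    not taken from the dataset documentation.
    """
    # 4th-order Butterworth bandpass, 5-50 Hz
    b_bp, a_bp = butter(4, [5.0, 50.0], btype="bandpass", fs=FS)
    out = filtfilt(b_bp, a_bp, signal)
    # Notch at 60 Hz to suppress powerline interference
    b_n, a_n = iirnotch(w0=60.0, Q=30.0, fs=FS)
    return filtfilt(b_n, a_n, out)

# Example: a synthetic 4-second recording, as in the dataset,
# with a 10 Hz component (kept) and 60 Hz interference (removed)
t = np.arange(0, 4.0, 1.0 / FS)
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)
clean = preprocess_eeg(raw)
```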
+### 2. Data Preprocessing
+
+The dataset, originally provided in CSV format, underwent a comprehensive preprocessing workflow:
+
+• The data was split into individual CSV files for each of the 16 channels, increasing the file count from 74,441 to 1,191,056.
+
+• Each channel's EEG data was converted into an audio signal and saved in .wav format, allowing the brain signals to be analyzed audibly.
+
+• The entire preprocessing workflow was implemented in Python to ensure scalability and accuracy.
+
+The dataset captured brainwave signals corresponding to the following activities:
+
+1) BEO (Baseline with Eyes Open): One-time recording at the beginning of each run [3].
+2) CLH (Closing Left Hand): Five recordings per run [3].
+3) CRH (Closing Right Hand): Five recordings per run [3].
+4) DLF (Dorsal Flexion of Left Foot): Five recordings per run [3].
+5) PLF (Plantar Flexion of Left Foot): Five recordings per run [3].
+6) DRF (Dorsal Flexion of Right Foot): Five recordings per run [3].
+7) PRF (Plantar Flexion of Right Foot): Five recordings per run [3].
+8) Rest: Recorded between each task to capture the resting state [3] [4].
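The channel-splitting and .wav conversion steps can be sketched in Python, matching the text's statement that the workflow was implemented in Python. The CSV column layout, the peak normalization, and the `_chNN.wav` naming scheme are hypothetical, since the text does not describe them.

```python
import csv
import struct
import wave
from pathlib import Path

FS = 125  # EEG sampling rate (Hz)

def split_and_convert(csv_path: Path, out_dir: Path) -> None:
    """Split a 16-channel recording CSV into one mono .wav file per channel.

    Assumes one sample per row and one column per channel; the actual
    layout and naming used by the authors are not given in the text.
    """
    with open(csv_path) as f:
        rows = [list(map(float, r)) for r in csv.reader(f)]
    channels = list(zip(*rows))  # transpose: one tuple per channel
    out_dir.mkdir(parents=True, exist_ok=True)
    for i, ch in enumerate(channels):
        # Normalize to the 16-bit PCM range before writing
        peak = max(abs(min(ch)), abs(max(ch))) or 1.0
        pcm = [int(32767 * v / peak) for v in ch]
        with wave.open(str(out_dir / f"{csv_path.stem}_ch{i:02d}.wav"), "wb") as w:
            w.setnchannels(1)   # mono: one EEG channel per file
            w.setsampwidth(2)   # 16-bit samples
            w.setframerate(FS)  # keep the original 125 Hz rate
            w.writeframes(struct.pack(f"<{len(pcm)}h", *pcm))
```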
+### 3. Feature Extraction and Classification
+
+Feature extraction and activity classification were performed using transfer learning with YamNet [5], a deep neural network model.
+
+• Audio Representation: Audio files were imported into MATLAB using an audioDatastore [6]. Mel-spectrograms, a time-frequency representation of the audio signals, were extracted using the yamnetPreprocess function [7] [8].
+
+• Dataset Split: The data was divided into training (70%), validation (20%), and testing (10%) sets.
+
+• Transfer Learning with YamNet [5] [8]:
+
+  - The pre-trained YamNet model (86 layers) was adapted for an 8-class classification task:
+    - The initial layers of YamNet [5] were frozen to retain previously learned representations [8].
+    - A new classification layer was added to the model [8].
+  - Training details:
+    - Learning Rate: Initial rate of 3e-4, with an exponential learning-rate decay schedule [8].
+    - Mini-Batch Size: 128 samples per batch.
+    - Validation: Performed every 651 iterations.
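MATLAB's yamnetPreprocess produces mel-spectrogram patches for YamNet; for illustration only, here is a minimal NumPy sketch of a log-mel spectrogram in the same spirit. The frame, hop, and mel-band parameters (25 ms window, 10 ms hop, 64 bands at 16 kHz input) are stated assumptions about YamNet's front end, not values read from the MATLAB implementation.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def log_mel_spectrogram(x, fs=16000, win=400, hop=160, n_mels=64):
    """Log-mel features in the spirit of yamnetPreprocess (parameters assumed)."""
    # Frame the signal and apply a Hann window
    n_frames = 1 + (len(x) - win) // hop
    frames = np.stack([x[i * hop : i * hop + win] for i in range(n_frames)])
    frames = frames * np.hanning(win)
    spec = np.abs(np.fft.rfft(frames, axis=1))  # magnitude spectrum
    # Triangular mel filterbank between 0 Hz and Nyquist
    n_bins = spec.shape[1]
    mel_pts = np.linspace(hz_to_mel(0.0), hz_to_mel(fs / 2), n_mels + 2)
    bins = np.floor(win * mel_to_hz(mel_pts) / fs).astype(int)
    fbank = np.zeros((n_mels, n_bins))
    for m in range(1, n_mels + 1):
        l, c, r = bins[m - 1], bins[m], bins[m + 1]
        for k in range(l, c):
            fbank[m - 1, k] = (k - l) / max(c - l, 1)  # rising edge
        for k in range(c, r):
            fbank[m - 1, k] = (r - k) / max(r - c, 1)  # falling edge
    mel = spec @ fbank.T
    return np.log(mel + 1e-6)  # log compression with a small floor
```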
+### 4. Robotic Arm Design and Simulation
+
+A 3-degree-of-freedom (DOF) robotic arm was designed using the MATLAB Simulink and Simscape toolboxes. To ensure robust validation:
+
+• A virtual environment was developed in Simulink to simulate the interactions between the trained AI models and the robotic arm.
+
+• The simulations served as a testbed for evaluating the system's performance before real-world integration.
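The arm itself was modeled in Simulink/Simscape, as described above; as a rough stand-in, a planar 3-DOF forward-kinematics sketch shows the core computation such a simulation performs. The link lengths here are hypothetical, since the text gives no dimensions.

```python
import numpy as np

# Hypothetical link lengths (m) for a planar 3-DOF arm; the actual
# Simulink/Simscape model's parameters are not given in the text.
L1, L2, L3 = 0.30, 0.25, 0.15

def forward_kinematics(theta1, theta2, theta3):
    """End-effector (x, y) position and orientation for joint angles in radians."""
    a1 = theta1                    # absolute angle of link 1
    a2 = theta1 + theta2           # absolute angle of link 2
    a3 = theta1 + theta2 + theta3  # absolute angle of link 3 (end effector)
    x = L1 * np.cos(a1) + L2 * np.cos(a2) + L3 * np.cos(a3)
    y = L1 * np.sin(a1) + L2 * np.sin(a2) + L3 * np.sin(a3)
    return x, y, a3

# Fully extended along the x-axis when all joint angles are zero
x, y, phi = forward_kinematics(0.0, 0.0, 0.0)
```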
+### 5. Project Progress and Future Directions
+
+Completed Tasks:
+
+1. AI Model Development: Successfully trained models to classify human activities from EEG signals.
+2. Robotic Arm Design: Designed a functional 3-DOF robotic arm with simulated controls.
+3. Virtual Simulation: Validated AI-robotic arm interactions in a virtual environment.
+
+Future Directions:
+
+1. Hardware Integration: Deploy the developed AI models on physical robotic hardware for real-world testing.
+2. Real-Time EEG Acquisition: Develop a system for real-time EEG data acquisition and activity classification.
+3. Tactile Feedback System: Integrate tactile sensors with the robotic arm for real-world sensory feedback, complemented by Simulink-based simulations.

assets/bibliography/papers.bib

Lines changed: 55 additions & 1 deletion

@@ -1,4 +1,58 @@
 ---
 ---
+@misc{nprnews,
+  title   = {Scientists Bring The Sense Of Touch To A Robotic Arm},
+  journal = {NPR},
+  year    = {2021},
+  month   = {May},
+  url     = {https://www.npr.org/sections/health-shots/2021/05/20/998725924/a-sense-of-touch-boosts-speed-accuracy-of-mind-controlled-robotic-arm}
+}
+
+@misc{transferlearning_matlab,
+  title   = {Transfer Learning with Pretrained Audio Networks},
+  journal = {Mathworks.com},
+  year    = {2024},
+  url     = {https://in.mathworks.com/help/audio/ug/transfer-learning-with-pretrained-audio-networks.html}
+}
+
+@misc{classify_sounds_using_yamnet_2021,
+  title   = {yamnetPreprocess},
+  journal = {Mathworks.com},
+  year    = {2021},
+  url     = {https://in.mathworks.com/help/audio/ref/yamnetpreprocess.html}
+}
+
+@misc{audio_datastore,
+  title   = {audioDatastore},
+  journal = {Mathworks.com},
+  year    = {2021},
+  url     = {https://in.mathworks.com/help/audio/ref/audiodatastore.html}
+}
+
+@misc{yamnet_github,
+  author  = {Google and Ellis, Dan and Plakal, Manoj},
+  journal = {GitHub},
+  year    = {2024},
+  url     = {https://github.com/tensorflow/models/tree/master/research/audioset/yamnet}
+}
+
+@article{asanza_2023,
+  title   = {MILimbEEG: An EEG Signals Dataset based on Upper and Lower Limb Task During the Execution of Motor and Motorimagery Tasks},
+  author  = {Asanza, Victor and Montoya, Daniel and Lorente-Leyva, Leandro Leonardo and Peluffo-Ordóñez, Diego Hernán and González, Kléber},
+  journal = {Mendeley Data},
+  volume  = {2},
+  year    = {2023},
+  month   = {July},
+  doi     = {10.17632/x8psbz3f6x.2},
+  url     = {https://data.mendeley.com/datasets/x8psbz3f6x/2}
+}
+
+@misc{https://doi.org/10.5524/100295,
+  title     = {Supporting data for "EEG datasets for motor imagery brain computer interface"},
+  author    = {Cho, Hohyun and Ahn, Minkyu and Ahn, Sangtae and Kwon, Moonyoung and Jun, Sung Chan},
+  publisher = {GigaScience Database},
+  year      = {2017},
+  doi       = {10.5524/100295},
+  url       = {http://gigadb.org/dataset/100295},
+  keywords  = {ElectroEncephaloGraphy (EEG), motor imagery, brain computer interface, performance variation, subject-to-subject transfer},
+  language  = {en},
+  copyright = {CC0 1.0 Universal}
+}