Current version: 1.0. Feel free to contribute by creating a pull request 🚀
Learning to program is becoming an important part of children's education, and it should also be accessible to visually impaired children. Teachers and educators should explore new ways of teaching programming principles to children.
- Download the project from GitHub:

```bash
git clone https://github.com/jtiagodev/codebuddy.git
```
- Install the dependencies with npm and start each component of the project (run each one in a separate terminal, since `npm start` keeps the process running):

```bash
cd client && npm install && npm start
cd meta   && npm install && npm start
cd server && npm install && npm start
```
- Add your environment variables: create a `.env` file under `/client`, `/meta`, and `/server` (a minimal loading sketch follows below)
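The exact keys each component expects are not listed here; check each component's source for what it reads from `process.env`. As a minimal sketch, assuming the components load their `.env` with the common `dotenv` package (the key names below are illustrative placeholders, not the project's actual configuration):

```js
// server/index.js (hypothetical): loads /server/.env into process.env
require("dotenv").config();

// Placeholder keys, for illustration only
const PORT = process.env.PORT || 3001; // local server port from this README
const DB_URL = process.env.DATABASE_URL; // e.g. a realtime database endpoint

if (!DB_URL) {
  console.warn("DATABASE_URL is not set; check your .env file");
}
```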
- Access the application:
  - CodeBuddy's GUI at `localhost:3000`
  - CodeBuddy's Metadata Support at `localhost:1337`
  - The local server at `localhost:3001`
- Print the TopCodes (Tangible Object Placement Codes), which are available here
- (Optional) Connect one of the supported robots to your computer, e.g. WonderWorks Dash
- When properly configured, the system works with any grid type and size
- For early versions, we recommend using a 6x6 double LEGO grid and attaching the camera as centered as possible (as shown in the paper)
- Follow the paper to set up the block configuration already provided, with specific codes attached to each block (a hypothetical board sketch follows below)
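For illustration, a board for the grid above might be described in the Metadata config roughly as follows; the shape and field names are hypothetical, not the project's actual schema:

```js
// Hypothetical board definition for a 6x6 grid; field names are illustrative
const board = {
  size: { rows: 6, cols: 6 },
  start: { row: 0, col: 0, direction: "EAST" }, // start block cell and heading
  goal: { row: 5, col: 5 },                     // goal block cell
  walls: [{ row: 2, col: 3 }],                  // cells the robot cannot enter
};
```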
- Once the system is installed and running:
  - The system only starts once you identify yourself by saying "my name is (your_name)" (see the voice-command sketch after this list)
  - Either turn on map recognition to recognize a custom board, or choose/build a board through the Metadata config
  - The system validates the board, recognizing each block, including the start position, direction, and goal
  - The board is saved to the metadata database
  - Once you have a list of commands to execute, turn on command execution
  - The system validates the list of commands, simply excluding any invalid blocks (see the validation sketch after this list)
  - The system also converts the list of commands to interface with other systems (e.g., another work group's)
  - Commands are saved to the database either when the camera detects the "save" block or when you order it via voice command
  - The system auto-computes a solution for the currently selected board and game mode (not yet implemented)
  - Feedback is added in the form of additional actions (not yet implemented)
  - The result of the command list is displayed on screen: green means success, red means stop/failure (e.g., the robot was ordered to move toward a wall)
  - The local system is told to execute the computed solution, using the robot selected in the GUI
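The client's logic layer is built with RegEx patterns (see the component list below). A minimal sketch of how the "my name is (your_name)" identification phrase could be matched; the pattern and handler are illustrative, not the project's actual code:

```js
// Hypothetical RegEx-based matcher for the identification voice command
const IDENTIFY = /^my name is (\w+)$/i;

function handleTranscript(transcript) {
  const match = transcript.trim().match(IDENTIFY);
  if (match) {
    return { command: "identify", name: match[1] }; // start this user's session
  }
  return null; // not an identification phrase; fall through to other patterns
}

console.log(handleTranscript("My name is Ada")); // { command: "identify", name: "Ada" }
```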
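A minimal sketch of the command-validation step described above, which simply excludes invalid blocks; the block vocabulary here is an assumption, not the project's actual set:

```js
// Hypothetical command validation: unrecognized blocks are simply dropped
const VALID_COMMANDS = new Set(["forward", "back", "left", "right", "save"]);

function validateCommands(blocks) {
  return blocks.filter((block) => VALID_COMMANDS.has(block));
}

console.log(validateCommands(["forward", "jump", "left", "save"]));
// -> ["forward", "left", "save"] ("jump" is not a known block)
```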
- Client
- Initial GUI
- Voice Recognition Module
- Logic Layer built with RegEx Patterns
- Voice Synthesis Module
- Board Recognition Module
- Commands Recognition Module
- Meta
- Accessible Metadata Configuration
- Remote (Server)
- Save Map and Commands History to Realtime Database
- Result Computation Module
- Local (Server)
- Result Execution Interface
- Robot Execution Interface (Python)
- Robot Execution Scripts: WonderWorks Dash
- ChatBot Integration to replace RegEx Patterns (in progress)
- Further Integration with more Robots from WonderWorks, LEGO, etc.
- User Accounts and Analytics (Gamification)
- CLI
- CodeBuddy's Command Line Interface for project bootstrap
Paper available here
Project developed by Catarina Fitas, Daniel São Pedro && João Tiago
Advanced Interaction Techniques, Master in Computer Science (Professor Luís Carriço && Professor Tiago Guerreiro)
Faculty of Sciences of the University of Lisbon
LaSIGE - Large Scale Systems Laboratory
2019
CodeBuddy is distributed under the Apache 2.0 License



