Roomba Object Detection & Remote Navigation System

Project Overview

This project is an embedded systems and robotics application that transforms a Roomba into a remotely navigable robot with real-time object detection and environment mapping capabilities.

The robot is equipped with:

  • A sonar sensor and an infrared sensor mounted on a servo motor at the front of the robot
  • A Texas Instruments Tiva TM4C123GH6PM microcontroller to control motion, sensors, and communication

All register configurations and peripheral control were implemented directly using the microcontroller’s datasheet, without high-level abstraction libraries.

A Python-based GUI communicates with the robot over a Wi-Fi connection, continuously receiving telemetry data such as:

  • Robot position and angle
  • Detected objects
  • Floor obstacles
  • Sensor readings

The robot is designed to navigate an unknown test field with no prior knowledge of the environment, using only live sensor feedback and user input from the GUI.
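The telemetry stream above can be handled with a small parser on the GUI side. The actual wire format used by the firmware is not documented here, so the comma-separated layout below (`x,y,angle,sonar_cm,ir_raw`) and the `Telemetry` / `parse_telemetry` names are illustrative assumptions, not the project's real protocol:

```python
from dataclasses import dataclass

@dataclass
class Telemetry:
    x: float         # robot position, cm (assumed units)
    y: float
    angle: float     # heading in degrees
    sonar_cm: float  # sonar distance reading
    ir_raw: int      # raw IR ADC value

def parse_telemetry(line: str) -> Telemetry:
    """Parse one hypothetical CSV telemetry line, e.g. "12.5,30.0,90.0,55.2,2048"."""
    x, y, angle, sonar, ir = line.strip().split(",")
    return Telemetry(float(x), float(y), float(angle), float(sonar), int(ir))
```

A parser like this would sit between the Wi-Fi socket and the GUI's map display, turning each received line into a typed record.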


Basic System Architecture

[ Sensors (Sonar + IR) ]  
            ↓  
[ Servo Motor Sweep ]  
            ↓  
[ Tiva TM4C123GH6PM MCU ]  
            ↓  
[ Wi-Fi Module ] ⇄ [ Python GUI ]

Hardware Components

  • Roomba mobile robot base
  • Texas Instruments TM4C123GH6PM microcontroller
  • Ultrasonic distance sensor (sonar)
  • Infrared proximity sensor
  • Servo motor (for sensor sweep)
  • Wi-Fi communication module
  • Power distribution circuitry

Software Components

Embedded (C)

  • Bare-metal register configuration using TI documentation
  • GPIO, UART, PWM, timers, and interrupts configured manually
  • Sensor polling and servo sweep logic
  • Motion control and obstacle detection
  • Wi-Fi data transmission

GUI (Python)

  • Real-time data reception over Wi-Fi
  • Visualization of:
    • Robot orientation
    • Detected objects
    • Floor obstacles
  • Manual remote control of robot movement
  • Field mapping display
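One way the GUI can turn a raw servo sweep into "detected objects" is to group consecutive in-range readings and estimate each object's width from its angular span. This is a hedged sketch, not the project's actual algorithm; the `detect_objects` name, the 80 cm range cutoff, and the arc-length width approximation are all assumptions:

```python
import math

def detect_objects(sweep, max_range_cm=80.0):
    """Group a servo sweep into candidate objects.

    sweep: list of (angle_deg, distance_cm) pairs in increasing angle order.
    Consecutive readings closer than max_range_cm are treated as one
    object; linear width is approximated as arc length (mean distance
    times angular span in radians).
    """
    objects, run = [], []
    for angle, dist in sweep:
        if dist < max_range_cm:
            run.append((angle, dist))
        elif run:
            objects.append(_summarize(run))
            run = []
    if run:  # sweep ended while still on an object
        objects.append(_summarize(run))
    return objects

def _summarize(run):
    angles = [a for a, _ in run]
    dists = [d for _, d in run]
    mean_dist = sum(dists) / len(dists)
    return {
        "center_deg": (angles[0] + angles[-1]) / 2,
        "distance_cm": mean_dist,
        "width_cm": mean_dist * math.radians(angles[-1] - angles[0]),
    }
```

Each summarized object can then be drawn on the field map at its center angle and mean distance.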

Features

  • Real-time object detection
  • Live environment mapping
  • Servo-based scanning system
  • Remote navigation via GUI
  • Autonomous sensor feedback loop
  • Wi-Fi telemetry streaming
  • Low-level register-based MCU control

Intended Use

The system is designed to:

  • Navigate a test field with no preloaded map
  • Rely solely on live sensor data
  • Be manually controlled through the GUI
  • Track and display detected obstacles and robot orientation

The project demonstrates:

  • Embedded programming
  • Sensor integration
  • Networking
  • GUI development
  • Robotics control

How It Works

  1. The servo sweeps the sonar and IR sensors across a range of angles
  2. Distance and obstacle data are collected at each angle
  3. Position and orientation are computed
  4. Data is transmitted over Wi-Fi
  5. The Python GUI displays the robot and detected objects
  6. User inputs control movement in real time
