This repository contains a solution for the third project of the Udacity Computer Vision Nanodegree program.
In this project, we'll implement SLAM (Simultaneous Localization and Mapping) for a 2-dimensional world. We'll combine knowledge of robot sensor measurements and movement to create a map of an environment using only the sensor and motion data gathered by a robot over time. SLAM gives us a way to track the location of a robot in the world in real time and to identify the locations of landmarks such as buildings, trees, rocks, and other world features. This is an active area of research in robotics and autonomous systems.
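The approach used here is Graph SLAM: every motion and every landmark measurement is folded into a constraint matrix (omega) and a constraint vector (xi), and the best estimate of all robot poses and landmark positions comes out of a single linear solve. Below is a minimal 1-D sketch of that idea, assuming unit-strength constraints and a single landmark; the notebooks implement the full 2-D version, and the function name `slam_1d` is illustrative, not from the project code.

```python
# Minimal 1-D Graph SLAM sketch (illustrative only, not the notebook's
# full 2-D implementation). Each motion and measurement adds constraints
# to the information matrix omega and vector xi; solving omega * mu = xi
# yields the estimated robot poses and landmark position.
import numpy as np

def slam_1d(motions, measurements, num_landmarks=1):
    n = len(motions) + 1 + num_landmarks  # poses + landmarks
    omega = np.zeros((n, n))
    xi = np.zeros(n)

    # Anchor the initial pose at the world origin (x0 = 0).
    omega[0, 0] = 1.0

    # Motion constraints: x_{t+1} - x_t = d
    for t, d in enumerate(motions):
        omega[t, t] += 1.0
        omega[t + 1, t + 1] += 1.0
        omega[t, t + 1] -= 1.0
        omega[t + 1, t] -= 1.0
        xi[t] -= d
        xi[t + 1] += d

    # Measurement constraints: L - x_t = z (single landmark, last index)
    L = n - 1
    for t, z in measurements:
        omega[t, t] += 1.0
        omega[L, L] += 1.0
        omega[t, L] -= 1.0
        omega[L, t] -= 1.0
        xi[t] -= z
        xi[L] += z

    # Best estimate of all poses and the landmark position.
    return np.linalg.solve(omega, xi)

# Robot starts at 0, sees a landmark 10 units away, moves 5, sees it 5 away.
print(slam_1d(motions=[5.0], measurements=[(0, 10.0), (1, 5.0)]))
# -> approximately [0., 5., 10.]
```

Because each constraint only touches a few entries of omega, the matrix stays sparse even for long trajectories, which is what makes this formulation practical.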
Below is an example of a 2D 50x50 grid world with landmarks (purple x's) and the robot (a red 'o'), both located using only the sensor and motion data collected by that robot.
*Example of SLAM output (estimated final robot pose and landmark locations)*
The repository contains five files:
1. `Robot Moving and Sensing.ipynb`: robot moving and sensing
2. `Omega and Xi, Constraints.ipynb`: omega and xi constraints
3. `Landmark Detection and Tracking.ipynb`: landmark detection and tracking
4. `robot_class.py`: implementation of the Robot class
5. `helpers.py`: helper functions
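As a quick orientation, here is a hypothetical usage sketch of `robot_class.py`. The class name, constructor parameters, and method names (`make_landmarks`, `move`, `sense`) follow the common Udacity starter-code interface and are assumptions; check `robot_class.py` for the exact API.

```python
# Hypothetical usage of robot_class.py, assuming it follows the usual
# Udacity starter-code interface; verify names and signatures in the file.
from robot_class import robot  # assumed class name

# world_size=50 matches the 50x50 grid world shown above.
r = robot(world_size=50.0, measurement_range=5.0,
          motion_noise=0.2, measurement_noise=0.2)
r.make_landmarks(3)        # assumed: scatter 3 random landmarks

r.move(1.0, 2.0)           # attempt a noisy (dx, dy) motion
measurements = r.sense()   # assumed: [[landmark_index, dx, dy], ...]
print(measurements)
```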