The goal of this project is to research and implement Reinforcement Learning based methods for map coverage, together with path planning and obstacle avoidance. Ideally, the model should handle both known obstacles and unknown obstacles that are discovered at runtime through a sensor (e.g. LiDAR).
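To illustrate the task (this is a minimal sketch, not code from this repo; the class name, grid size, rewards, and sensor model are all assumptions), a map-coverage environment with obstacles revealed by a short-range sensor could look like:

```python
import numpy as np

class CoverageEnv:
    """Toy grid world: the agent must visit every free cell.

    Obstacles stay hidden until they enter the sensor's range,
    mimicking discovery through an on-board sensor such as a LiDAR.
    """
    MOVES = {0: (-1, 0), 1: (1, 0), 2: (0, -1), 3: (0, 1)}  # up, down, left, right

    def __init__(self, size=5, obstacles=((1, 1), (3, 2)), sensor_range=1):
        self.size = size
        self.obstacles = set(obstacles)
        self.sensor_range = sensor_range
        self.reset()

    def reset(self):
        self.pos = (0, 0)
        self.visited = {self.pos}
        self.known_obstacles = set()  # obstacles discovered so far
        self._sense()
        return self._obs()

    def _sense(self):
        # Reveal hidden obstacles within Chebyshev distance `sensor_range`
        r, c = self.pos
        for (obr, obc) in self.obstacles:
            if max(abs(obr - r), abs(obc - c)) <= self.sensor_range:
                self.known_obstacles.add((obr, obc))

    def _obs(self):
        # Observation: +1 for visited cells, -1 for discovered obstacles
        grid = np.zeros((self.size, self.size), dtype=np.float32)
        for cell in self.visited:
            grid[cell] = 1.0
        for cell in self.known_obstacles:
            grid[cell] = -1.0
        return grid

    def step(self, action):
        dr, dc = self.MOVES[action]
        r, c = self.pos[0] + dr, self.pos[1] + dc
        target = (r, c)
        if 0 <= r < self.size and 0 <= c < self.size and target not in self.obstacles:
            self.pos = target
        # Reward newly covered cells, penalize revisits and blocked moves
        reward = 1.0 if self.pos not in self.visited else -0.1
        self.visited.add(self.pos)
        self._sense()
        free_cells = self.size * self.size - len(self.obstacles)
        done = len(self.visited) == free_cells
        return self._obs(), reward, done
```

An episode ends once every free cell has been visited, which is the coverage objective the RL agent is trained to optimize.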
This section contains useful information about how to set up the project and its environment.
Create a new conda environment (optional but preferable):

```sh
conda create -n <env_name> python=3.10
conda activate <env_name>
```
- Clone the repo:

```sh
git clone https://github.com/your_username_/Project-Name.git
```
- Install the requirements (TODO):

```sh
pip install -r requirements.txt
```
Once you have configured the environment, you can proceed to train the agent using the run script:

```sh
python run.py <flags>
```
Available flags:
- `algo`: algorithm to train, one of `dqn`, `a2c`, `ppo` (default: `dqn`)
- `obs`: observation type, `linear` or `image`
- `sb`: use the Stable-Baselines implementation
- `drone`: use the drone environment (recommended)
- `log`: log the run to Weights & Biases (W&B)
- `path`: print the path followed by the drone
- `render`: render the environment
- `easy`: use the Easy environment (outdated)
- `st`: use `StEnv` (outdated)
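For example, a training run could be launched as follows (the flag syntax here is an assumption based on the list above; check `run.py` for the exact argument format):

```sh
# Hypothetical invocation: train PPO on the drone environment with
# image observations, rendering enabled, and logging to W&B
python run.py --algo ppo --obs image --drone --render --log
```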
- Simulate LiDAR
- Add drone orientation
- Add a rover for map exploration
- Clean up the code
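For the LiDAR simulation item, one possible starting point is a simple raycasting routine over a 2D occupancy grid (a sketch only; the function name, beam count, and grid convention are assumptions, not part of this repo):

```python
import math

def simulate_lidar(grid, pos, n_beams=8, max_range=10.0, step=0.1):
    """Cast `n_beams` evenly spaced rays from `pos` over a 2D occupancy
    grid (1 = obstacle, 0 = free) and return the distance each beam
    travels before hitting an obstacle or reaching `max_range`."""
    rows, cols = len(grid), len(grid[0])
    ranges = []
    for i in range(n_beams):
        angle = 2 * math.pi * i / n_beams
        dist = 0.0
        while dist < max_range:
            x = pos[0] + dist * math.cos(angle)
            y = pos[1] + dist * math.sin(angle)
            r, c = int(round(x)), int(round(y))
            if not (0 <= r < rows and 0 <= c < cols) or grid[r][c] == 1:
                break  # beam left the map or hit an obstacle
            dist += step  # march the ray forward in small increments
        ranges.append(min(dist, max_range))
    return ranges
```

The returned range vector could serve directly as (part of) the agent's observation, letting it discover unknown obstacles as it moves.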
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Distributed under the MIT License. See `LICENSE.txt` for more information.
Davide Buoso - davide.buoso@studenti.polito.it