NOTE: This repo is a basic implementation with a few limitations. Please refer to the newer repo at https://github.com/Paperspace/DinoRunTutorial, which uses a GPU VM for better results (scores up to 4000).
Accompanying tutorial: https://blog.paperspace.com/dino-run/
A Deep Convolutional Neural Network that learns to play Google Chrome's offline Dino Run game from raw visual input, using a model-free Reinforcement Learning algorithm.
Refer to the Jupyter notebook for the detailed implementation:
https://github.com/ravi72munde/Chrome-Dino-Reinforcement-Learning/blob/master/Reinforcement%20Learning%20Dino%20Run.ipynb
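The notebook builds a convolutional Q-network in Keras that maps stacked, preprocessed game frames to Q-values for the available actions. Below is a minimal sketch of that kind of architecture; the input size, frame-stack depth, layer sizes, and learning rate are illustrative assumptions, not necessarily the exact values used in the notebook.

```python
# Sketch of a convolutional Q-network for the Dino Run agent.
# Hyperparameters (80x80 frames, 4-frame stack, 2 actions) are illustrative.
from keras.models import Sequential
from keras.layers import Conv2D, Flatten, Dense
from keras.optimizers import Adam

IMG_ROWS, IMG_COLS = 80, 80   # downsampled, grayscale game frames
FRAME_STACK = 4               # consecutive frames stacked to capture motion
NUM_ACTIONS = 2               # 0 = do nothing, 1 = jump

def build_q_network():
    """CNN that maps a stack of game frames to one Q-value per action."""
    model = Sequential()
    model.add(Conv2D(32, (8, 8), strides=(4, 4), activation='relu',
                     input_shape=(IMG_ROWS, IMG_COLS, FRAME_STACK)))
    model.add(Conv2D(64, (4, 4), strides=(2, 2), activation='relu'))
    model.add(Conv2D(64, (3, 3), strides=(1, 1), activation='relu'))
    model.add(Flatten())
    model.add(Dense(512, activation='relu'))
    model.add(Dense(NUM_ACTIONS))  # linear output: Q-value per action
    model.compile(loss='mse', optimizer=Adam(lr=1e-4))
    return model
```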
Start by cloning the repository:
$ git clone https://github.com/ravi72munde/Chrome-Dino-Reinforcement-Learning.git
Dependencies can be installed with pip, or with conda in an Anaconda environment (see the install example after the list below).
Dependencies
- Python 3.6
- Selenium
- OpenCV
- PIL
- Keras
- Chromium driver for Selenium
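For example, with pip (the OpenCV and PIL packages are published on PyPI as opencv-python and Pillow; ChromeDriver must be downloaded separately and placed on your PATH):

$ pip install selenium opencv-python Pillow keras

Selenium and ChromeDriver are used to drive the game in the browser: the agent opens chrome://dino, sends key presses to act, and reads the game state through JavaScript. A minimal sketch of that interface is shown below, assuming Selenium 3-style APIs and the global Runner object exposed by the Dino game; the exact JavaScript property names may vary across Chrome versions.

```python
# Sketch of a Selenium wrapper around the Chrome Dino game.
# Assumes Selenium 3-style APIs and ChromeDriver available on PATH.
from selenium import webdriver
from selenium.webdriver.common.keys import Keys

class DinoGame:
    def __init__(self, chromedriver_path='chromedriver'):
        self.driver = webdriver.Chrome(executable_path=chromedriver_path)
        self.driver.get('chrome://dino')  # offline Dino Run page
        self.body = self.driver.find_element_by_tag_name('body')

    def jump(self):
        # Pressing the up arrow (or space) makes the dino jump.
        self.body.send_keys(Keys.ARROW_UP)

    def is_crashed(self):
        # The game exposes its state on a global Runner object.
        return self.driver.execute_script('return Runner.instance_.crashed')

    def restart(self):
        self.driver.execute_script('Runner.instance_.restart()')

    def close(self):
        self.driver.quit()
```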