A Python Bunkr downloader that fetches images and videos from URLs. It supports both Bunkr albums and individual file URLs, logs issues, and enables concurrent downloads for efficiency.
- Downloads multiple files from an album concurrently.
- Supports batch downloading via a list of URLs.
- Supports selective file downloading based on filename criteria.
- Provides progress indication during downloads.
- Automatically creates a directory structure for organized storage.
- Logs URLs that encounter errors for troubleshooting.
- Python 3
- BeautifulSoup (bs4) - for HTML parsing
- requests - for HTTP requests
- rich - for progress display in the terminal
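Assuming only the dependencies listed above, a minimal requirements.txt would contain something like the following (version pins omitted; the repository's own file is authoritative):

```
beautifulsoup4
requests
rich
```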
project-root/
├── helpers/
│ ├── crawlers/
│ │ └── crawler_utils.py # Utilities for extracting media download links
│ ├── downloaders/
│ │ ├── album_downloader.py # Manages the downloading of entire albums
│ │ ├── download_utils.py # Utilities for managing the download process
│ │ └── media_downloader.py # Manages the downloading of individual media files
│ ├── managers/
│ │ ├── live_manager.py # Manages a real-time live display
│ │ ├── log_manager.py # Manages real-time log updates
│ │ └── progress_manager.py # Manages progress bars
│ ├── bunkr_utils.py # Utilities for checking Bunkr status
│ ├── config.py # Manages constants and settings used across the project
│ ├── file_utils.py # Utilities for managing file operations
│ ├── general_utils.py # Miscellaneous utility functions
│ └── url_utils.py # Utilities for Bunkr URLs
├── downloader.py # Module for initiating downloads from specified Bunkr URLs
├── main.py # Main script to run the downloader
├── URLs.txt # Text file listing album URLs to be downloaded
└── session_log.txt # Log file for recording session details
- Clone the repository:
git clone https://github.com/Lysagxra/BunkrDownloader.git
- Navigate to the project directory:
cd BunkrDownloader
- Install the required dependencies:
pip install -r requirements.txt
To download media from a single URL, use downloader.py, running the script with a valid album or media URL:
python3 downloader.py <bunkr_url>
You can either download an entire album or a specific file:
python3 downloader.py https://bunkr.si/a/PUK068QE # Download album
python3 downloader.py https://bunkr.fi/f/gBrv5f8tAGlGW # Download single media
The script supports selective file downloads from an album, allowing you to exclude files using the Ignore List and include specific files with the Include List.
The Ignore List is specified using the --ignore argument on the command line. Any file in the album whose filename contains at least one of the specified strings is skipped. Items in the list should be separated by spaces.
python3 downloader.py <bunkr_album_url> --ignore <ignore_list>
This feature is particularly useful when you want to skip files with certain extensions, such as .zip files. For instance:
python3 downloader.py https://bunkr.si/a/PUK068QE --ignore .zip
The Include List is specified using the --include argument on the command line. A file from the album is downloaded only if its filename contains at least one of the specified strings. Items in the list should be separated by spaces.
python3 downloader.py <bunkr_album_url> --include <include_list>
python3 downloader.py https://bunkr.si/a/PUK068QE --include FullSizeRender
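The filtering rules above can be sketched as a small helper. Note that `should_download` is a hypothetical name for illustration, not a function from the project's codebase:

```python
def should_download(filename, ignore_list=None, include_list=None):
    """Apply the Ignore/Include List rules to a single filename.

    A file is skipped if its name contains any string from the ignore
    list; when an include list is given, the file is kept only if its
    name contains at least one of the include strings.
    """
    if ignore_list and any(term in filename for term in ignore_list):
        return False
    if include_list:
        return any(term in filename for term in include_list)
    return True
```

For example, `should_download("archive.zip", ignore_list=[".zip"])` returns `False`, while `should_download("FullSizeRender.mov", include_list=["FullSizeRender"])` returns `True`.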
To batch download from multiple URLs, you can use the main.py script. This script reads URLs from a file named URLs.txt and downloads each one using the media downloader.
- Create a file named URLs.txt in the root of your project, listing each URL on a new line.
- Example of URLs.txt:
https://bunkr.si/a/PUK068QE
https://bunkr.fi/f/gBrv5f8tAGlGW
https://bunkr.fi/a/kVYLh49Q
- Ensure that each URL is on its own line without any extra spaces.
- You can add as many URLs as you need, following the same format.
- Run the batch download script:
python3 main.py
- The downloaded files will be saved in the Downloads directory.
The application logs any issues encountered during the download process in a file named session_log.txt. Check this file for any URLs that may have been blocked or had errors.