Hey!
I created a Docker container with a Python app with the help of DeepSeek. I'm doing the first download run now and it's working through the directory. It's slow because it has to scrape the listing and then download the files one by one. I also added some basic resume functionality (it checks the download folder for existing items and skips them).
Here's the basic code; I'll try to upload it to GitHub to make it easier to grab:
1. Directory Structure
Ensure your directory structure looks like this (create a directory and put all these files in it):
pcloud/
├── docker-compose.yml
├── Dockerfile
├── download_folder.py
└── downloaded_files/ (this will be created automatically)
2. Docker Compose File
Create a folder called pcloud, then create a docker-compose.yml file in it: run "nano docker-compose.yml", paste the following, and save:
services:
  pcloud-downloader:
    build: .
    container_name: pcloud-downloader
    volumes:
      - ./downloaded_files:/app/downloaded_files
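The volumes line is what keeps your downloads (and therefore the resume state) on the host rather than inside the container, so interrupted runs can pick up where they left off. If you want the files stored somewhere else, change the host side of the mapping; for example (the /mnt/storage path here is only a placeholder):

    volumes:
      - /mnt/storage/pcloud:/app/downloaded_files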
3. Dockerfile
Ensure your Dockerfile is in the same directory and contains the following:
FROM python:3.9-slim
WORKDIR /app
COPY download_folder.py .
RUN pip install requests beautifulsoup4
RUN mkdir -p /app/downloaded_files
CMD ["python", "download_folder.py"]
4. Python Script
Create a file called download_folder.py (run "nano download_folder.py"), paste the following into it, and save:
import os
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, unquote
import time
import logging

base_url = "https://filedn.com/lgm4rog8XwDbvwRIvGBXqry/"
download_folder = "downloaded_files"
log_file = os.path.join(download_folder, "download_errors.log")

if not os.path.exists(download_folder):
    os.makedirs(download_folder)

logging.basicConfig(filename=log_file, level=logging.ERROR, format='%(asctime)s - %(message)s')
def download_file(url, folder, retries=10):
    """Download a file from the given URL and save it to the specified folder."""
    local_filename = os.path.join(folder, unquote(url.split('/')[-1]))
    for attempt in range(retries):
        try:
            with requests.get(url, stream=True) as r:
                r.raise_for_status()
                with open(local_filename, 'wb') as f:
                    for chunk in r.iter_content(chunk_size=8192):
                        f.write(chunk)
            print(f"Downloaded: {local_filename}")
            return local_filename
        except (requests.exceptions.ChunkedEncodingError, requests.exceptions.ConnectionError) as e:
            print(f"Attempt {attempt + 1} failed for {url}: {e}")
            if attempt < retries - 1:
                time.sleep(5)  # Wait 5 seconds before retrying
            else:
                logging.error(f"Failed to download {url} after {retries} attempts.")
                print(f"Failed to download {url} after {retries} attempts.")
                return None
        except Exception as e:
            logging.error(f"Error downloading {url}: {e}")
            print(f"Error downloading {url}: {e}")
            return None
def scrape_and_download(url, folder):
    """Scrape the folder listing and download all files and subfolders."""
    response = requests.get(url)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, 'html.parser')
    # Assumes the public folder page lists its contents as plain <a href> links,
    # with subfolder entries ending in '/'.
    for link in soup.find_all('a'):
        href = link.get('href')
        # Skip empty, sort-order, absolute, and parent-directory links
        if not href or href.startswith(('?', '#', '/', '..')):
            continue
        full_url = urljoin(url, href)
        if href.endswith('/'):
            # Subfolder: mirror it locally and recurse into it
            subfolder = os.path.join(folder, unquote(href.rstrip('/')))
            os.makedirs(subfolder, exist_ok=True)
            scrape_and_download(full_url, subfolder)
        else:
            # Basic resume support: skip files already in the download folder
            local_path = os.path.join(folder, unquote(href))
            if os.path.exists(local_path):
                print(f"Skipping (already downloaded): {local_path}")
                continue
            download_file(full_url, folder)

scrape_and_download(base_url, download_folder)
print("Download complete!")
5. Run Docker Compose
Navigate to the pcloud directory and run the following command:
docker-compose up
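The first run can take a long while. If it gets interrupted, just run docker-compose up again and the resume check will skip everything already in downloaded_files. If you change the script, rebuild the image at the same time with:

docker-compose up --build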