YouTube Disaster Plan
=====================

YouTube is going to shit, and I'd like to archive as many videos as possible before it shits the bed. This web application lets users scrape channels/playlists/videos that they enjoy and request specific videos to be downloaded (with subtitles, description, metadata, and even current SponsorBlock info). To handle creators who have some bangers but don't always bang, you can also mark videos as hidden to exclude them from your search results, so you only go through videos you haven't decided on yet.

TODO
----

- LDAP authentication for multi-user support
- Automatically download metadata for channels and playlists the user specifies
- Automatically download videos which match certain criteria (from a channel/playlist, title matches a search; could get crazy with vector search, since each video's title is vectorized)
- Display current scraping and downloading jobs to give the user some feedback on what is happening
- Not-shit UI

How-to
------

1. Make directories for the Docker containers to store their data. By default the docker-compose file uses `dump/backend` and `dump/typesense`.
2. Run `docker compose build && docker compose up`.
3. In the `frontend` directory, run `npm install`.
4. Run `npm start` to start the frontend.

Deploying
---------

- Pray
- Also: stop hardcoding the backend URL in the frontend, and put both Typesense and the Flask app behind a reverse proxy

Architecture
------------

The backend is a Flask app. It uses Python RQ + Redis to create worker queues for downloading metadata and downloading videos. The docker-compose file currently spawns two workers, but you can bump that number up to do more things at the same time. The search functionality is provided by Typesense.

The frontend is written in Vue and uses InstantSearch.js to provide a good search UI.
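
A download job on the RQ queue might look something like the sketch below. The yt-dlp option names (`writesubtitles`, `writeinfojson`, the `SponsorBlock` postprocessor) are real yt-dlp APIs, but the function names, queue name, and output template here are assumptions, not the app's actual code.

```python
def build_ydl_opts(out_dir):
    """Build yt-dlp options that grab subs, description, metadata,
    and current SponsorBlock segment info alongside the video."""
    return {
        # hypothetical output layout, one folder per uploader
        "outtmpl": f"{out_dir}/%(uploader)s/%(title)s.%(ext)s",
        "writesubtitles": True,    # subtitle files
        "writedescription": True,  # .description sidecar
        "writeinfojson": True,     # full metadata as .info.json
        "postprocessors": [
            # fetch current SponsorBlock segments into the metadata
            {"key": "SponsorBlock"},
        ],
    }

def download_video(url, out_dir="dump/backend"):
    # imported lazily so the module can be enqueued without
    # yt-dlp installed on the enqueuing side
    from yt_dlp import YoutubeDL
    with YoutubeDL(build_ydl_opts(out_dir)) as ydl:
        ydl.download([url])

def enqueue_download(url):
    # How the Flask app might hand the job to a worker
    # (requires a running Redis instance; queue name is assumed):
    from redis import Redis
    from rq import Queue
    Queue("videos", connection=Redis()).enqueue(download_video, url)
```

Keeping the job a plain module-level function matters for RQ: workers import it by dotted path, so closures or lambdas can't be enqueued.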
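
The hidden-video feature could map onto a Typesense `filter_by` clause, as in this sketch. `filter_by`/`query_by` are standard Typesense search parameters, but the collection name (`videos`) and field names (`title`, `hidden`) are assumptions about the app's schema.

```python
def video_search_params(query, include_hidden=False):
    """Build Typesense search parameters for a videos collection."""
    params = {
        "q": query,
        "query_by": "title",
    }
    if not include_hidden:
        # exclude videos the user has marked as hidden
        params["filter_by"] = "hidden:=false"
    return params

def search_videos(client, query):
    # client would be a typesense.Client pointed at the
    # typesense service from the compose file
    return client.collections["videos"].documents.search(
        video_search_params(query)
    )
```

Doing the exclusion server-side keeps pagination and result counts honest, instead of filtering hidden videos out of each page in the frontend.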