Ever wondered how long it will take to finish that YouTube course playlist you just found? Instead of manually checking each video, I built a small Python project called CourseTime Analyzer.
This tool automatically searches YouTube for a course playlist, fetches every video's duration, and calculates the total study time, all packed into a simple Tkinter GUI.
🔥 Why I Built This
Whenever I started a YouTube course, I always wanted to know “How much total time will this take?” Sure, YouTube shows individual durations, but for playlists with 50+ videos, adding them up by hand is painful.
So I automated it with Python + Selenium and wrapped it in a clean GUI using Tkinter.
To be honest, this was a time-pass project. I wasn't in the mood to continue my actual work (learning more about Softmax Regression), so I coded this as a fun escape.
⚙️ Features
- 🔍 Search YouTube for any course playlist
- 📺 Fetch the playlist title and creator details
- ⏱️ Calculate the total video duration in hours
- 🎥 Show the video count and individual durations (first 15 listed, the rest summarized)
- 🔗 Clickable playlist link directly inside the GUI
- 🖼️ Clean interface with a background image
- 🖥️ Supports GUI mode (graphics.py) and CLI mode (main.py)
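The heart of the total-duration feature is simple once the per-video durations are scraped. Here is a minimal sketch of that calculation; the helper names are illustrative, not the project's actual functions:

```python
# Sketch: summing YouTube-style duration strings ("MM:SS" or "HH:MM:SS")
# into a total in hours, as the analyzer does after scraping each video.

def parse_duration(text: str) -> int:
    """Convert a duration string like '12:34' or '1:02:03' to seconds."""
    parts = [int(p) for p in text.strip().split(":")]
    seconds = 0
    for part in parts:
        # Works for both MM:SS and HH:MM:SS without special-casing.
        seconds = seconds * 60 + part
    return seconds

def total_hours(durations: list[str]) -> float:
    """Sum a list of duration strings and return the total in hours."""
    return sum(parse_duration(d) for d in durations) / 3600

durations = ["10:30", "1:00:00", "45:15"]
print(f"Total study time: {total_hours(durations):.2f} hours")
```

Folding the colon-separated parts left to right (`seconds * 60 + part`) handles both two-part and three-part timestamps with the same loop.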
🧩 Modularized Approach
The project is structured to keep things clean and reusable:
- main.py: handles the core mechanism (YouTube scraping and analysis via Selenium).
- graphics.py: a wrapper around main.py that provides the Tkinter-based GUI.
You can run main.py independently in CLI mode; the GUI is just an additional layer.
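The wrapper pattern looks roughly like this. Note that `analyze_playlist` is a hypothetical stand-in for whatever main.py actually exports; the real project's function names may differ:

```python
# Sketch of how a graphics.py wrapper can sit on top of main.py:
# the GUI only collects input and displays output, while the core
# logic lives in an importable function.
import tkinter as tk

def analyze_playlist(query: str) -> str:
    """Placeholder for the Selenium-based scraper in main.py."""
    return f"Results for: {query}"

def run_gui() -> None:
    root = tk.Tk()
    root.title("CourseTime Analyzer")
    entry = tk.Entry(root, width=40)
    entry.pack(padx=10, pady=5)
    output = tk.Label(root, text="", justify="left")
    output.pack(padx=10, pady=5)

    def on_search() -> None:
        # Delegate all real work to the core module.
        output.config(text=analyze_playlist(entry.get()))

    tk.Button(root, text="Analyze", command=on_search).pack(pady=5)
    root.mainloop()

# Calling run_gui() launches the window; importing the core function
# directly gives you the CLI path with no Tkinter dependency.
```

Keeping the scraper importable is what makes the CLI mode free: the GUI is just one of two callers.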
GitHub Repository
All the code is open-source and available here:
CourseTime Analyzer on GitHub
💬 Feedback
Comments are open! Feel free to suggest improvements, criticize the approach, or even fork the repo and make it better. This was just a fun side project, so I'd love to see how others take it further.
Top comments (1)
To really crank up the CourseTime Analyzer for big playlists, you'll want to toss in some async I/O or threading to speed up that metadata fetch. Caching is also a must to stop redundant scraping from slowing things down. For extra points, throw in features like search, progress tracking, and a visual scheduler; it'll make the whole thing way more user-friendly. Personally, I track my course time using Python scripts with yt-dlp and dump progress into Google Sheets. If you're leaning into Selenium-based tools, consider undetected-chromedriver, Playwright, or aiohttp to boost both speed and reliability.
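The commenter's threading-plus-caching idea can be sketched in a few lines. Everything here is illustrative, assuming a hypothetical `fetch_duration` function in place of the real per-video scrape:

```python
# Sketch: fetch video metadata concurrently with a thread pool and
# memoize results so repeated video IDs are never scraped twice.
from concurrent.futures import ThreadPoolExecutor
from functools import lru_cache

@lru_cache(maxsize=None)
def fetch_duration(video_id: str) -> int:
    """Hypothetical placeholder for a network fetch of one video's
    duration in seconds; a real version would scrape or call an API."""
    return 60 * (len(video_id) + 1)  # fake value for demonstration

def fetch_all(video_ids: list[str]) -> dict[str, int]:
    # Threads help because the real work is I/O-bound (network waits),
    # and lru_cache makes duplicate IDs essentially free.
    with ThreadPoolExecutor(max_workers=8) as pool:
        results = pool.map(fetch_duration, video_ids)
    return dict(zip(video_ids, results))

print(fetch_all(["abc", "defg", "abc"]))
```

For a 50+ video playlist this turns a long sequential crawl into a handful of parallel batches, and the cache means re-analyzing the same playlist costs almost nothing.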