Playlist was made in ~2018
YouTube tends to take down videos, whether by the uploaders themselves, forcefully by a claimant, or just YouTube being YouTube. Regardless, I got sick and tired of losing track of which videos disappear from my playlists, so I decided to take a sort of census of them. This endeavor began way back in 2018, but only recently did I find the motivation to see it through to the end. Of the 2056 videos you see on the left here, only about 1.9K or so still remain. It's standard for videos to go missing, leaving many playlists looking like cartoonish blocks of cheese - full of holes. I'm sad those videos are gone (not so much the ones uploaded without permission), but I don't dwell on them too much. You can always find a replacement. My effort here is just to record which videos were actually taken down, so I know which ones to replace, and maybe one day archive the whole playlist for my own personal use.
Here's me searching Google about web scraping, which at first I confused with web crawling. Similar, yet different things: scraping pulls data out of pages, while crawling follows links to discover pages.
Here's me looking up videos on web scraping with Python. I'm bad at Python despite its relatively easy syntax; I have little to no experience with the language, but I'll try my best to learn it along the way.
One of the earlier scripts I tried copying from a tutorial. I kept getting errors with it. I'd already written several different Python files in a similar fashion to this one. Most of the guides say to use BeautifulSoup to read the HTML data.
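For reference, this is a minimal sketch of the kind of BeautifulSoup approach those tutorials teach: fetch the playlist page and look for video links in the HTML. The playlist URL is a placeholder, and (as I'd find out later) this barely turns up anything on YouTube, because most of the page is built by JavaScript rather than served as plain HTML.

```python
# Tutorial-style sketch: fetch a playlist page and parse it with BeautifulSoup.
# The playlist ID below is a placeholder.
import requests
from bs4 import BeautifulSoup

PLAYLIST_URL = "https://www.youtube.com/playlist?list=PLxxxxxxxxxxxxxxxx"

response = requests.get(PLAYLIST_URL)
soup = BeautifulSoup(response.text, "html.parser")

# Look for anchor tags that point at watch pages. On modern YouTube this
# finds little or nothing, since the video list is rendered client-side.
for link in soup.find_all("a", href=True):
    if "/watch?v=" in link["href"]:
        print(link["href"], link.get_text(strip=True))
```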
Honestly, I gave up on coding the scraper for a while and turned to manually typing out the data for each video in the playlist. This was super important to do regardless of whether my coding succeeded, because it gave me insight into how I would go about scraping the videos. A YouTube playlist still retains an unavailable video as an index entry; even though there's no way to watch it anymore, you can still find the link to it.
Now, this might seem useless to the untrained eye, but these somewhat dead links can still help us find out what video was there before. Simply copy the link and paste it into a search engine, and with luck you can find out whether any other sites have linked to it before. If that fails, you can use the Web Archive's Wayback Machine to find an archived copy of it (hopefully).
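That Wayback Machine lookup can even be automated. Here's a minimal sketch using the public availability API, assuming you already have the dead video's ID (the ID below is a made-up placeholder).

```python
# Minimal sketch: ask the Wayback Machine's availability API whether it has
# a snapshot of a dead video's watch page. The video ID is a placeholder.
import requests

VIDEO_ID = "XXXXXXXXXXX"  # placeholder; substitute a dead video's ID
watch_url = f"https://www.youtube.com/watch?v={VIDEO_ID}"

resp = requests.get(
    "https://archive.org/wayback/available",
    params={"url": watch_url},
    timeout=30,
)
data = resp.json()

snapshot = data.get("archived_snapshots", {}).get("closest")
if snapshot:
    print("Archived copy:", snapshot["url"])
else:
    print("No snapshot found for", watch_url)
```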
I later went back to dabbling in Python to code this scraper, but then I realised it was going to be more complex than that. The scraper can't perfectly replicate what I do to track down the dead video links, since there's nothing left on the page for it to scrape. So it would only work for the majority of videos that are still up. I began doubting the point of this whole coding part.
I lament-posted this on my WhatsApp status and someone suggested just going through with the web scraper and finding the missing videos manually myself. Good idea. I exported my playlist using Google's export tool, which only exported the video IDs. Thankfully it gave the IDs of all the videos instead of only the working ones. I continued trying to code.
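Even just the IDs are useful. As a sketch, assuming the export is a CSV with a "Video ID" column (the file name and column name here are guesses at the export format), you can rebuild the watch URL for every entry, dead or alive.

```python
# Minimal sketch, assuming the export is a CSV with a "Video ID" column.
# The file name and column name are assumptions about the export format.
import csv

urls = []
with open("playlist_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        video_id = row["Video ID"].strip()
        if video_id:
            urls.append(f"https://www.youtube.com/watch?v={video_id}")

# Every entry gets a URL, including unavailable videos, since the export
# keeps the ID even when the video itself is gone.
for url in urls:
    print(url)
```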
I finally realised why the code couldn't run properly. Apparently much of YouTube's page data isn't in the HTML at all; it's embedded as JSON inside JavaScript, which Beautiful Soup can't make sense of (it's mainly meant for HTML). Well, at least I now know it's not my incompetence at coding but rather my ignorance LOLMAO.
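For what it's worth, that embedded JSON can still be reached without a full browser: YouTube pages commonly ship their data in a JavaScript variable called ytInitialData, so one workaround is to pull that blob out of the raw page text and parse it with the json module. This is a rough sketch of the idea under that assumption, not a stable solution, since the page structure can change at any time.

```python
# Rough sketch: extract the ytInitialData JSON blob from a playlist page's
# raw HTML and parse it with json instead of BeautifulSoup.
# The playlist ID is a placeholder, and this regex is fragile by nature.
import json
import re

import requests

PLAYLIST_URL = "https://www.youtube.com/playlist?list=PLxxxxxxxxxxxxxxxx"

html = requests.get(PLAYLIST_URL).text

# ytInitialData is assigned as a JSON object inside a <script> tag.
match = re.search(r"ytInitialData\s*=\s*(\{.*?\})\s*;\s*</script>", html, re.DOTALL)
if match:
    data = json.loads(match.group(1))
    print("Top-level keys:", list(data.keys()))
else:
    print("Couldn't find ytInitialData in the page.")
```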