Automatically download Youtube videos to Plex on TrueNAS using youtube-dl

This guide will go through how to set up youtube-dl to automatically download videos from your favorite Youtube channels to a Plex server running on TrueNAS.

Start by logging in to your TrueNAS admin panel and creating a new custom jail; we are not using a plugin for this.

  1. Because of my network configuration I will be using the advanced jail creation option.
  2. Give the jail a name, for instance youtube-dl.
  3. Make sure you set jail type to Clone Jail. You can read more about the differences in the TrueNAS forum.
  4. Choose the latest available release.
  5. Since we are downloading videos this jail will need networking.
    1. Activate VNET.
    2. Activate Berkeley Packet Filter.
    3. Set vnet_default_interface to none.
    4. Set IPv4 interface to vnet0.
    5. Pick an IP-address and netmask.
  6. I’m putting this jail on a VLAN so I have already created a bridge I want to use with it. Therefore I go to Network Properties and then change interfaces to vnet0:mybridge. Here is some more information about using jails with VLANs.
  7. Create the jail.
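If you prefer working from the shell, the jail settings above map to a single iocage invocation, roughly like this (a sketch; the release name, bridge, and IP address are examples from my setup, so substitute your own):

```
iocage create -n youtube-dl -r 12.2-RELEASE \
  vnet=on bpf=yes vnet_default_interface=none \
  interfaces="vnet0:mybridge" \
  ip4_addr="vnet0|192.168.20.10/24"
```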

From here on I like to switch over to SSH instead. So SSH to your TrueNAS server and start the jail if you haven’t already.

iocage start youtube-dl

Open a console to it.

iocage console youtube-dl

Install yt-dlp (the original youtube-dl package is no longer available in the FreeBSD repositories, but yt-dlp is an actively maintained fork that works the same way). I will also install nano, but you can install whatever text editor you want.

pkg install nano ca_root_nss yt-dlp

Add a new user for youtube-dl: run the adduser command and answer the questions. I used the following options.

  • Username: youtube-dl
  • Password: random
  • Full name: youtube-dl
  • Uid: 1002 (an id that doesn’t already exist)
  • Class: default
  • Home: default
  • Home mode: default
  • Shell: bash
  • Locked: no
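If you would rather skip the interactive prompts, the same user can be created non-interactively with pw (a sketch using the values above; note that bash comes from the shells/bash package and must be installed separately):

```
# -n name, -c comment, -u uid, -m create home directory, -s shell
pw useradd -n youtube-dl -c youtube-dl -u 1002 -m -s /usr/local/bin/bash
# -h 0 tells pw to read the new password from stdin
echo 'random' | pw usermod -n youtube-dl -h 0
```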

We should now go back to the TrueNAS admin interface and create a dataset where we will store our videos. This dataset will be mounted into both the youtube-dl and Plex jails so that youtube-dl can download to it and Plex can read from it.

When you have created the dataset you need to edit the ACL for it. You do that by going to Storage -> Pools -> find the dataset and click on the three dot icon to the right of it -> Edit ACL. Add another ACL Item and set the following permissions for it:

  • Who: User
  • User: The UID you earlier set while creating the youtube-dl user.
  • ACL Type: Allow
  • Permission Type: Basic
  • Permissions: Full Control
  • Flags Type: Basic
  • Flags: Inherit

What you are doing here is granting the youtube-dl user full control of what is stored on this dataset, so that it can write videos to it. There are more settings on this page; go through them and make sure they are set the way you want. If you are unsure what everything does, I recommend you watch this video.

You will need to add one more ACL Item to grant the Plex jail access to the dataset. Start by finding out the UID for the Plex user.

iocage console nameofplexjail
id plex

In my case the UID is 972, so create another ACL Item with the exact same settings as the previous one but change the user to 972.

With the permissions in place, shut down the youtube-dl jail. Then edit its mount points and add a new mount point for the dataset you just created. You can add a mount point by going to Jails -> maximize the jail -> Mount points -> Actions -> Add. I will mount it under /mnt/youtube inside the jail, but you can mount it wherever you like.
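The same mount can also be added from the TrueNAS shell with iocage while the jail is stopped (the dataset path here is an example; use your own pool and dataset names):

```
iocage fstab -a youtube-dl /mnt/tank/youtube /mnt/youtube nullfs rw 0 0
```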

Start the jail again, open up a console to it again like we did before and su to the user we created earlier.

iocage start youtube-dl
iocage console youtube-dl
su - youtube-dl

yt-dlp reads its configuration from a few default locations; if we create a config file in one of them it will be picked up automatically, so that is what we will do.

mkdir -p ~/.config/yt-dlp
nano ~/.config/yt-dlp/config

In that file I have put the following.

--batch-file "~/.config/yt-dlp/channels"
--output "/mnt/youtube/%(uploader)s/%(uploader)s: %(title)s.%(ext)s"
--dateafter now-1days
--download-archive "~/.config/yt-dlp/archive"
--add-metadata
--write-thumbnail
--playlist-end 10
--ignore-errors
--cookies "~/.config/yt-dlp/cookies"
--limit-rate 2M
  • batch-file: File to read a list of Youtube channels from.
  • output: Where to put downloaded videos and what to name them.
  • dateafter: Only download videos uploaded within the last day.
  • download-archive: Location of the file that keeps track of which videos have already been downloaded.
  • add-metadata: Add metadata to the downloaded video files.
  • write-thumbnail: Download the video thumbnail as a separate image file next to the video. If you want the thumbnail embedded in the video file itself, add --embed-thumbnail as well.
  • playlist-end: Without this, youtube-dl will scan every video in the playlist, which is often the entire channel; that takes a lot of time and causes unnecessary API calls. Setting it to 10 only scans the 10 most recently uploaded videos in the playlist. Since I run the script multiple times a day and only want to download the latest videos, scanning the 10 newest per channel is plenty.
  • ignore-errors: Don’t exit on errors.
  • cookies: I had to add this because I got blocked, you can read more about this in the documentation.
  • limit-rate: youtube-dl saturated my internet connection, so this caps the download speed to make sure it plays nice with everything else that needs bandwidth.

Any command line option can go in the config file and youtube-dl will read it. There are a lot of options, so if you need something else, check the documentation.

Now let’s add a list of channels to download videos from.

nano ~/.config/yt-dlp/channels

It’s important to remember that youtube-dl works with playlists. And when I say playlists, I’m not only referring to the Youtube feature called “playlists”: essentially everything on Youtube is a playlist. For example, the list of all the uploaded videos on a channel is also a playlist. So if you want to download the latest video from a channel, you need the playlist of all of that channel’s videos. Luckily, that is easy:

https://www.youtube.com/c/CHANNELNAME/videos

Newer channels use a handle instead of a custom name; for those, the equivalent is https://www.youtube.com/@HANDLE/videos.

Or you can just navigate to the channel, click the “Videos” tab, and copy the link from your browser. Add each link to your channels file, one link per line. Now you can run yt-dlp and it will download the videos from your chosen channels.
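To make the format concrete, a channels file might end up looking something like this (the channel URLs are placeholders):

```
# Lines starting with '#' are treated as comments
https://www.youtube.com/c/SomeChannel/videos
https://www.youtube.com/@somehandle/videos
```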

yt-dlp

Keep in mind that it will not only download the latest video. Since there are no videos in your archive file yet, it will download the ten latest videos from each channel. There are ways around this, but it’s probably easier to just let it download everything and then remove what you don’t want. The next time you run it, it will only download new videos.

Now let’s add a cron job that automatically runs yt-dlp.

crontab -e

I added the following cron job, which runs yt-dlp at 30 minutes past every hour.

30 * * * * /usr/local/bin/yt-dlp
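By default cron mails any output to the user. If you would rather keep a log of what was downloaded, redirect the output in the cron line instead (the log path here is an assumption; put it wherever you like):

```
30 * * * * /usr/local/bin/yt-dlp >> /home/youtube-dl/yt-dlp.log 2>&1
```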

You could also add another cron job to remove old videos, for instance:

15 * * * * find /mnt/youtube/ -type f -mtime +7 -delete

This also runs every hour, but at 15 minutes past the hour instead of 30, and removes files older than 7 days.
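Note that this prune also deletes the thumbnail images written alongside the videos, and it leaves empty channel directories behind; a second find with -empty can tidy those up. Here is a self-contained sketch of the pruning behaviour, run against a scratch directory instead of /mnt/youtube/:

```shell
# Build a scratch tree that mimics the download layout
demo=$(mktemp -d)
mkdir -p "$demo/SomeChannel" "$demo/DeadChannel"
touch "$demo/SomeChannel/new.mp4"                     # modified just now
touch -t 202001010000 "$demo/SomeChannel/old.mp4"     # backdated well past 7 days
touch -t 202001010000 "$demo/DeadChannel/old.mp4"

find "$demo" -type f -mtime +7 -delete                # removes both old.mp4 files
find "$demo" -mindepth 1 -type d -empty -delete       # removes the now-empty DeadChannel

ls "$demo/SomeChannel"                                # only new.mp4 remains
```

For the real cron jobs, swap the scratch directory for /mnt/youtube/.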

However, Plex cannot access any of the videos yet. To fix that, mount the dataset you created earlier in the Plex jail as well: shut down the Plex jail, add the mount point the same way as before, and then create a new library in Plex that points to it. Your Youtube videos should now show up in Plex.

Replies to “Automatically download Youtube videos to Plex on TrueNAS using youtube-dl”

  1. Thanks so much for the article! I’m running into a problem with just the first few commands, off the bat when trying to run `pkg install nano ca_root_nss youtube-dl` and the output I’m getting is “pkg: No packages available to install matching ‘youtube-dl’ have been found in the repositories”

    Do you think the package was removed? I heard on reddit that it got DMCA’d ~2 years ago.

    1. Yes, I see the same thing on TrueNAS 12.2, so it seems like they changed the package name. I updated the article, thanks for the info!
