Offloading PeerTube Media to S3

My last post documented offloading Azuracast media to Digital Ocean’s Spaces, and this is a follow-up on the offloading theme, only this time for my PeerTube instance hosted at bava.tv. I’ll be honest, this could have been a very painful process if Taylor hadn’t blogged about how he did it with Backblaze last December; in many ways his work made the whole thing a piece of cake, so thank you, Taylor!

I made all media public in the Digital Ocean Spaces bucket, and also created a CORS rule to give the bava.tv domain full control over the bucket.
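
For the record, here is a rough sketch of how a rule like that could be set from the command line with the aws CLI pointed at the Spaces endpoint; the origins, methods, and file name below are illustrative assumptions rather than a copy of my exact rule:

# cors.json: a hypothetical rule letting bava.tv talk to the bucket
cat > cors.json <<'EOF'
{
  "CORSRules": [
    {
      "AllowedOrigins": ["https://bava.tv"],
      "AllowedMethods": ["GET", "HEAD", "PUT", "POST", "DELETE"],
      "AllowedHeaders": ["*"],
      "MaxAgeSeconds": 3000
    }
  ]
}
EOF

# apply the rule to the Spaces bucket (aws CLI configured with the Spaces keys)
aws s3api put-bucket-cors --bucket bavatv \
  --endpoint-url https://nyc3.digitaloceanspaces.com \
  --cors-configuration file://cors.json

After that, I just got the keys, endpoint, region, and bucket name, and I was good to go: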

object_storage:
  enabled: true
  endpoint: 'nyc3.digitaloceanspaces.com' 
  upload_acl: 
    public: 'public-read'
    private: 'private' 
  videos: 
    bucket_name: 'bavatv'
    prefix: 'videos/' 
  streaming_playlists: 
    bucket_name: 'bavatv' 
    prefix: 'streaming-playlists/' 
  credentials: 
    access_key_id: 'youraccesskey' 
    secret_access_key: 'yoursupersecretkeytothekingdom'
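
If you are running the Docker install like me, swapping that block in and restarting comes down to a couple of commands. This is a loose sketch that assumes the standard docker-compose layout and that the service is named peertube in your docker-compose.yml:

# open the config and replace the object_storage section with the block above
nano /home/peertube/docker-volume/config/production.yaml

# restart the PeerTube container so it picks up the new settings
docker-compose restart peertube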

Once I replaced the existing object_storage code at the bottom of the production.yaml file (found in /home/peertube/docker-volume/config/) with the settings above and restarted the container, the site was seamlessly offloading new videos to my bucket in Spaces. Now I just needed to move all the existing videos to that bucket, which this command did perfectly:

docker-compose exec -u peertube peertube npm run create-move-video-storage-job -- --to-object-storage --all-videos

After it was done running, close to 120 GB had been cleared off the server, and my videos were loading from the Spaces bucket. The videos load a bit slower, but that is to be expected, and the next piece will be finding a way to speed them up using a proxy of some kind. But for now I am offloading media like it’s my job, well, actually, it kind of is.
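
If you want to see the reclaimed space for yourself, a quick before-and-after check of the local video storage gives a rough number. This is only a sketch; the path is my assumption based on the standard docker-volume layout, so adjust it to wherever your instance keeps its videos:

# check how much the local video storage is holding before and after the move
du -sh /home/peertube/docker-volume/data/videos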
