Automatic copy to amazon when download finished

rtorrent
copy
rclone
rutorrent

#1

Hello!

I’m working on an automatic and “easy” way to transfer my finished downloads to Amazon via rclone.
So, I created this topic to share what I’ve done and to ask for your help improving it!

What I’ve done:

  • Installing rclone: http://rclone.org/install/

  • Changing /etc/lshell.conf to allow rtorrent users to use the rclone command

  • Connecting with your rtorrent user (dl for me) and creating the new remote for Amazon Drive (or another cloud provider):

How to configure your rclone’s Amazon remote : http://rclone.org/amazonclouddrive/

  • Adding this line to the .rtorrent.rc file of your rtorrent user:
    system.method.set_key = event.download.finished,copy_rclone,"execute=rclone,copy,-v,--log-file=log_copy,$get_directory=,test:test"
    You will need to replace “test:test” with the rclone remote you configured earlier.
    You can also enter just “remote_name:” (“test:” in my example); it will copy your finished torrents to the root path of your rclone remote.

  • Reloading rtorrent
  • Downloading a torrent and checking the logfile named “log_copy” in your rtorrent user’s home directory to see that everything is OK
  • It works!
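Before wiring the hook into rtorrent, the remote can be sanity-checked by hand (assuming it is named “test:” as in the example; the scratch paths below are placeholders):

```shell
# list the top-level directories on the remote
rclone lsd test:
# upload a scratch file and list it back to confirm the remote works
touch /tmp/rclone_check
rclone copy -v /tmp/rclone_check test:rclone_check
rclone ls test:rclone_check
```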

So now I need help to improve it, because here’s what happens when a download finishes:
It launches the command: rclone copy -v --log-file=log_copy $get_directory= test:test
My problem is the “$get_directory=” variable, which gives me, in my case, /home/dl/torrents/rtorrent, so it will copy all new files contained in /home/dl/torrents/rtorrent, even the ones that haven’t finished downloading.

What I would like:
I created several folders in /home/dl/torrents/rtorrent (music, movies, shows, …) and I would like to find the variable that gives me the folder where I put my torrent (e.g. /home/dl/torrents/rtorrent/music).
I have already tried these variables:
$d.directory_base
$d.get_directory
$d.get_directory_base
$directory.default
$get_directory
$d.base_path
$d.get_base_path

$directory.default and $get_directory copy into the folder I want (e.g. if I send a torrent to /home/dl/torrents/rtorrent/music, it copies to test:test/music/); the others copy into the root folder of my rclone remote.

More details : https://gist.github.com/ahocquet/ea8318661a68177e0be1a442076833d5

I think I also need to replace “test:test” with something like “test:test/$get_directory=” (I already tried, but it just creates a folder literally named $get_directory=) so rclone doesn’t have to scan the whole remote.
The goal is to save time; scanning many TB takes quite a while.
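One way around the literal “$get_directory=” problem would be to have rtorrent call a small wrapper script with $d.base_path= as an argument and let the script derive the subfolder itself. This is an untested sketch: the script path, BASE, and REMOTE values are assumptions to adapt to your own setup.

```shell
#!/bin/bash
# copy_done.sh -- hypothetical helper, called from .rtorrent.rc with e.g.:
# system.method.set_key = event.download.finished,copy_rclone,"execute={/home/dl/bin/copy_done.sh,$d.base_path=}"
# $1 is the finished torrent's path, e.g. /home/dl/torrents/rtorrent/music/Some.Album

BASE="/home/dl/torrents/rtorrent"   # local download root (assumption)
REMOTE="test:test"                  # your rclone remote (assumption)

SRC="$1"
# strip the local base so only "music/Some.Album" remains
SUB="${SRC#"$BASE"/}"

# echo first for a dry run; remove the echo to copy for real
# (assumes a multi-file torrent, i.e. SRC is a directory)
echo rclone copy -v --log-file="$HOME/log_copy" "$SRC" "$REMOTE/$SUB"
```

Because the destination is “$REMOTE/$SUB”, only that subpath is scanned on the remote instead of the whole drive.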

I don’t know if my request is clear (my English is really not perfect); let me know if you need more explanations.


#2

What is the point in having rtorrent trigger the upload?
I have rtorrent trigger filebot for post-processing (unraring, renaming, getting additional files), and then it hardlinks so the torrents still seed. I then have a separate rclone job that uploads those. I suppose I could trigger this based on inotify, but timed uploads every 10 min seem fine. I have my upload scripts set up not to run concurrently, i.e. a run won’t start if one is already running.

You will have issues because of the time it takes to upload. You’re better off breaking it into a separate task like I did, or spawning a separate process to do it.


#3

Thanks for your answer. I’ve been thinking about your solution, but I don’t know how you set it up so it doesn’t run concurrently; could you please share your script?

Instead of creating a cron job every 10 minutes, couldn’t you launch your script automatically when a download finishes?
By adding system.method.set_key = event.download.finished,copy,"execute=script.sh" to the .rtorrent.rc?


#4

I use flock. I found out yesterday about --no-traverse: it stops rclone from walking through the existing remote content, which really cuts down on the upload time.

My cronjob:
*/20 * * * * /home/user/bin/upload_main.sh

My upload_main.sh (-x takes an exclusive lock, -n makes flock give up immediately if a run is already in progress):
flock -xn /home/user/bin/locks/upload.lck -c "/home/user/bin/upload_silent.sh"

my upload_silent.sh:

#!/bin/bash
# upload finished files; --no-traverse skips listing the whole destination
/home/user/bin/rclone move --transfers=6 --no-traverse /home/user/torrents/rtorrent/complete/Movies/ gdrive-wh:media/Movies --quiet
/home/user/bin/rclone move --transfers=6 --no-traverse /home/user/torrents/rtorrent/complete/"TV Shows"/ gdrive-wh:media/"TV Shows" --quiet

# remove the empty directories rclone move leaves behind
find /home/user/torrents/rtorrent/complete/Movies/* -empty -type d -delete 2>/dev/null
find /home/user/torrents/rtorrent/complete/"TV Shows"/* -empty -type d -delete 2>/dev/null

#5

Thanks a lot!

Do you have any idea how we could use Sickrage + rclone?

I configured it; Sickrage can browse the rclone remote but can’t move/copy/create/rename in it.
My idea would be to create a tmp folder where Sickrage puts its post-processed files, and to configure an extra script in Sickrage to move the files to the rclone remote.

What’s your opinion? Better idea?
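For what it’s worth, the extra-script idea could look roughly like this (untested sketch: the tmp path and remote name are placeholders, and the assumption that Sickrage passes the final file path as the first argument should be checked against its extra_scripts documentation):

```shell
#!/bin/bash
# Hypothetical Sickrage extra script.
# Assumes Sickrage passes the final post-processed file path as $1.

TMP_ROOT="/home/dl/sickrage_tmp"   # local post-processing folder (assumption)
REMOTE="test:TV Shows"             # rclone remote/path (assumption)

FILE="$1"
# keep the Show/Season structure by stripping the local tmp prefix
REL="${FILE#"$TMP_ROOT"/}"

# echo first for a dry run; remove the echo to move for real
echo rclone move -v "$FILE" "$REMOTE/$(dirname "$REL")"
```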


#6

If you compile rclone from source, the mount can be written to, so Sickrage can change things. You should really never count on that, however. Writing to the mount with acd_cli or with rclone is a bad idea: it will never have the performance or reliability of the upload commands, and both authors say so.

I use Filebot’s AMC script to manage my renaming, downloading of posters, subtitles, etc.

So:

  • sickrage uses my rclone mount for shows and passes the missing episodes to rtorrent
  • rtorrent downloads
  • filebot with AMC post-processes by hardlinking the files, renaming them, and gathering posters, subtitles, etc. into a separate directory that I call ‘complete’. I use the AMC ‘plex’ formatting so the shows have the proper file structure and names for Plex.
  • rclone scans this complete directory and uploads anything in there to ACD and Google Drive. I do a copy to gdrive and a move to ACD.
  • sickrage scans my rclone mount to see that the show is there

I do not have sickrage or couchpotato do any post processing. I purposefully disable all of that as its handled by filebot.
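The copy-to-gdrive / move-to-ACD step above might look something like this (the local path and remote names are placeholders, not the poster’s actual config):

```shell
# copy leaves the local files in place (so the move below can still see them)
rclone copy --no-traverse /home/user/complete/ gdrive:media
# move uploads to ACD and then deletes the local files
rclone move --no-traverse /home/user/complete/ acd:media
```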


#7

RXWatcher, how does your script create a hard link?


#8

Filebot does that… Look up filebot AMC script


#9

Set --action hardlink
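For reference, a filebot AMC call with hardlinking might look roughly like this (untested sketch; the output and input paths are placeholders — see the filebot AMC documentation for the full option list):

```shell
filebot -script fn:amc \
  --output "/home/user/torrents/rtorrent/complete" \
  --action hardlink \
  -non-strict \
  "/home/user/torrents/rtorrent/downloading/Some.Torrent"
```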


#10

thanks for sharing. I’m not sure it does exactly what I want. The hard-link function looks good, but I don’t want filebot to sort by music, etc.

I like how AutoTools sorts into per-tracker directories; I would like to replicate that, and I can’t figure out how exactly, except by running a script that checks each directory for new files, which isn’t very elegant.

I haven’t tested it, but I guess it would go something like this:

#!/bin/bash

# rtorrent.rc:
# system.method.set_key=event.download.finished,filebot,"execute={move-torrents.sh,$d.get_base_path=,$d.get_name=,$d.get_custom1=}"
# $1 = base path, $2 = torrent name, $3 = custom1 (the tracker label)

if [[ ":$3:" == *":tracker1:"* ]]; then
    rclone move --ignore-existing /home/user/finished/tracker1.com/ amazon:/finished/tracker1.com/ \
        && rm -r "$1" \
        && ln -s "/media/amazon/finished/tracker1/$2" "$1"   # symlink back to the mount so rtorrent keeps seeding
fi
if [[ ":$3:" == *":tracker2:"* ]]; then
    rclone move --ignore-existing /home/user/finished/tracker2.com/ amazon:/finished/tracker2.com/
fi

#11

Hi there o/

I have the exact same setup as you do:
Sickrage/Couchpotato + rclone + ACD

And my only problem is adding new shows to Sickrage.
I mount rclone read-only for stability, so Sickrage can’t write the TV-show folder to the ACD mount point; that is done later via the upload.
How did you get around this issue? Are you mounting ACD rw?

Thanks in advance.