Hello selfhosted! Sometimes I have to transfer big files or large amounts of small files in my homelab. I used rsync, but specifying the IP address, the folders, and everything else is a bit fiddly. I thought about writing a bash script, but before I do that I wanted to ask you about your favourite way to achieve this. Maybe I am missing out on an awesome tool I wasn’t even thinking about.
Edit: I settled on SFTP in my GUI file manager for now. When I have some spare time I will look into the other options too. Thank you for the helpful information.
Not gonna lie, I just map a network share and copy and paste through the gui.
Sounds very straightforward. Do you have a Samba docker container running on your server, or how do you do that?
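Not the original commenter, but one common way to run such a share is a small Samba container. A hedged sketch using the dperson/samba image — the share name, host path, and settings below are placeholders, not anyone's actual setup:

```
# docker-compose.yml (sketch)
services:
  samba:
    image: dperson/samba
    ports:
      - "139:139"
      - "445:445"
    volumes:
      - /srv/share:/share          # host directory to expose
    # -s takes "name;path;browsable;read-only;guest-ok"
    command: '-s "share;/share;yes;no;yes"'
```

After that, the share shows up as `\\server\share` (Windows) or `smb://server/share` (Linux/macOS file managers).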
I just type `sftp://[ip, domain or SSH alias]` into my file manager and browse it as a regular folder.
Same lol, somebody please enlighten me on a faster way!
Rsync and NFS for me.
scp

scp is deprecated.
SCP, the protocol, is deprecated. scp, the command, just uses the SFTP protocol these days. I find its syntax convenient.
Checks username… yeah that tracks
rclone. I have a few helper functions:
```
fn mount { rclone mount http: X: --network-mode }
fn kdrama {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/KDrama/$x --filter-from ~/.config/filter.txt }
fn tv {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/TV/$x --filter-from ~/.config/filter.txt }
fn downloads {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/Downloads/$x --filter-from ~/.config/filter.txt }
```

So I download something to my seedbox, then use
`rclone lsd http:` to get the exact name of the folder/files, and run `tv "filename"` and it runs my function. Pulls all the files (based on filter.txt) using multiple threads to the correct folder on my NAS. Works great, and maxes out my connection.

Resilio Sync
- sftp for quick shit like config files off a random server, because it’s easy
- rsync for big one-time moves
- smb for client-facing network shares
- NFS for SAN usage (mostly storage for virtual machines)
rsync if it’s a from/to I don’t need very often
More common transfer locations are done via NFS
I’d say use something like zeroconf(?) for local computer names. Or give them names in either your DNS forwarder (router), hosts file, or ssh config. Along with shell autocompletion, that might do the job. I use scp, rsync, and I have an NFS share on the NAS and some bookmarks in Gnome’s file manager, so I just click on that or type scp or rsync.
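For the ssh config route, a minimal sketch — the alias `tea`, the IP, and the username below are made up for illustration:

```
# ~/.ssh/config
Host tea
    HostName 192.168.34.16
    User me
```

After that, `scp file tea:/tmp/` or `rsync -a dir/ tea:dir/` works without remembering the IP, and shell completion picks up the alias.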
Magic wormhole is pretty dead simple https://magic-wormhole.readthedocs.io/en/latest/welcome.html#installation
I use this a lot at work for moving stuff between different test vms, as you don’t need to check IPs/hostnames
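For reference, a typical wormhole exchange looks like this (the code phrase below is made up; each transfer generates its own one-time code):

```
# On the sending machine
$ wormhole send bigfile.iso
Wormhole code is: 7-crossover-clockwork

# On the receiving machine, anywhere with internet access
$ wormhole receive 7-crossover-clockwork
```

You just read the short code phrase to the other side; no IPs, hostnames, or key setup needed.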
People have already covered most of the tools I typically use, but one I haven’t seen listed yet that is sometimes convenient is
`python3 -m http.server`, which runs a small web server that shares whatever is in the directory you launched it from. I’ve used that to download files onto my phone when I didn’t have the right USB cables/adapters handy, as well as for getting data out of VMs when I didn’t want to bother setting up something more complex.

You can use a regular FTP server with administrator and user rights: give write access to those who contribute files, and guest read-only access to those who just take. At home I transfer files this way from computer to computer without connecting them to a common network. What could be simpler? Why invent schemes with keys or bash scripts when there is 40-year-old technology that just works great, and to open FTP it is enough to enter the IP address in the file explorer.
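A quick self-contained demo of the `python3 -m http.server` trick — the directory, file contents, and port 8123 are all made up for the example:

```shell
# Create a directory with a file to share
mkdir -p /tmp/httpdemo
echo "hello from the server" > /tmp/httpdemo/sample.txt

# Serve that directory in the background
(cd /tmp/httpdemo && exec python3 -m http.server 8123 >/dev/null 2>&1) &
server_pid=$!
sleep 1   # give the server a moment to bind

# Fetch the file back over HTTP, as a phone or another machine would
fetched=$(curl -s http://127.0.0.1:8123/sample.txt)
echo "$fetched"

kill "$server_pid"
```

From another device you'd just open `http://<your-ip>:8123/` in a browser and click the file you want.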
What do you mean by specifying IP address?
`rsync -are ssh from to@pc:/dir`

> I used rsync but specifying the IP address and the folders and everything is a bit fiddly.
If this is Linux:
- You can throw an entry in `/etc/hosts` to add a hostname mapping, even if you don’t want to set up DNS:

  ```
  192.168.34.16 tea
  ```

- It looks like bash has tab-completion for `rsync` paths set up by default these days on my Debian box. Will probably need to have passwordless or ssh-agent-based pub/privkey authentication set up in some fashion.

- If you don’t need `rsync`’s functionality – replicating attributes and reasonable resuming and doing partial copies – I tend to use `lftp`, which supports sftp, an ssh-based file-transfer mechanism; one uses something like `sftp://remotehost/remotepath`. That’s more comfortable to browse: you can hop around the local and remote filesystems with tab-completing `cd`, `lcd`, `pwd`, and `lpwd`, and list remote files. The `mirror` and `mirror -R` commands will move a file tree from or to the remote system.

- If you do want rsync’s “update changed files” functionality, and also want bidirectional synchronization instead of rsync’s unidirectional functionality — I do this to maintain a synchronized directory of videos that I want to be able to watch offline on various devices, among other things — I use `unison`. You can set this up to auto-synchronize a set of directories. If you’ve got a preconfigured set of directories to synchronize set up, just type `unison`, and bam.

- If I’m doing a lot of browsing and work on the remote end, I use emacs’s dired plus TRAMP as a rough approximation of a “two-pane file manager”, more properly called an orthodox file manager. `emacs /ssh:remotehost:/remotepath localpath` will fire it up. TRAMP is pretty clever and tries to do stuff on the remote machine as much as possible – like, you can do version control on remote directories and it’ll use git on the remote machine, grep through the remote file tree without transferring the files locally, etc. That being said, this is probably mostly applicable to users who already heavily use emacs, but it’s a handy technique if you have only a low-bandwidth connection and need to work on files remotely. I have emacs set up to default to using the “other pane” as a target for things like copy and move commands, using the following text in my `init.el`:

  ```
  ;; Try suggesting dired targets
  (setq dired-dwim-target t)
  ```

- If I’m wanting to play a video file remotely, I’ll sometimes do a sshfs mount using FUSE:

  ```
  $ mkdir videos
  $ sshfs remotehost:/videos `pwd`/videos
  $ mpv videos/vid01.mkv
  $ umount videos
  ```
What’s wrong with rsync? If you don’t like IP addresses, use a domain name. If you use certificate authentication, you can tab complete the folders. It’s a really nice UX IMO.
If you’ll do this a lot, just mount the target directory with sshfs or NFS. Then use rsync or a GUI file manager.






