Copy a large number of files to a remote server using nohup, tar and ssh

This command copies a large number of files to a remote server, compressing and decompressing on the fly, which saves time and bandwidth. It runs in the background under nohup, so you can detach from your current login session.

nohup sh -c "tar -c /any/directory/at/source/server/ | gzip -2 | ssh server-alias 'cat | tar xz -C /target/directory/of/target/server/'" > /dev/null 2>&1 &
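
Because the transfer runs in the background, you can log back in later and check on it. A minimal sketch (the pattern and paths below are simply the ones used in the command above):

# Check whether the transfer is still running.
pgrep -af 'tar -c /any/directory/at/source/server/'

# Roughly track progress by watching the size of the target directory.
ssh server-alias 'du -sh /target/directory/of/target/server/'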

Here nohup's output is sent to /dev/null because I don't want any output from it. You can adjust the redirection to suit your needs.
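
For example, if you would rather keep a record of the transfer, redirect the output to a log file instead of /dev/null (a sketch; the log path /tmp/copy.log is just an example):

nohup sh -c "tar -c /any/directory/at/source/server/ | gzip -2 | ssh server-alias 'cat | tar xz -C /target/directory/of/target/server/'" > /tmp/copy.log 2>&1 &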

I have used this command for millions of files occupying more than 500 GB.
