Linux - how to split / break big files into smaller chunks

Just did a backup and noticed that the backup tar file is more than 2 GB, which fails to copy to a FAT32 thumb drive. The mysqldump output text file is more than 5 GB, so there is no point even trying, as it cannot fit on any FAT32-formatted drive. Even an ordinary file larger than 10 MB cannot be emailed to a Yahoo or Gmail account.

These are the scenarios where big files need to be broken into smaller chunks before they can be transferred. Below is a good way of doing it:

To split the big file (the -d flag gives the chunks numeric suffixes):
split -d -b <size of each chunk in bytes> <source-file> <prefix>
split -d -b 1500000000 very-huge-file split-chunk-file-

The output files will be named "split-chunk-file-00", "split-chunk-file-01", "split-chunk-file-02" and so on (without -d, split uses alphabetic suffixes "aa", "ab", ... instead). Every chunk except the last will be exactly 1,500,000,000 bytes, roughly 1.5 GB.
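The same command can be tried at a safe scale before committing a multi-gigabyte file to it. This sketch (file names "sample.bin" and "sample-chunk-" are made up for the demo, and -d assumes GNU split) creates a 3 KB file and splits it into 1 KB numbered chunks:

```shell
# Create a small 3 KB sample file to stand in for the huge backup.
dd if=/dev/zero of=sample.bin bs=1024 count=3 2>/dev/null

# Split into 1 KB chunks with numeric suffixes, same shape as the
# real command, just with a tiny -b value.
split -d -b 1024 sample.bin sample-chunk-

# List the resulting chunks.
ls sample-chunk-*
# sample-chunk-00  sample-chunk-01  sample-chunk-02
```

Each chunk here is exactly 1024 bytes, mirroring how every chunk of the real file (except possibly the last) will be exactly the -b size.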

To join the chunks back into the original file (use ">" rather than ">>" so an existing file is overwritten instead of appended to):
cat <files in order> > file-name
cat split-chunk-file-00 split-chunk-file-01 split-chunk-file-02 > very-huge-file

Since the numeric suffixes sort correctly, a glob does the same job: cat split-chunk-file-* > very-huge-file
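It is worth verifying that the rejoined file is byte-identical to the original before deleting anything. A small round-trip check (file names "original.bin", "part-" and "rebuilt.bin" are placeholders for this demo):

```shell
# Make a 5 KB file of random data as the stand-in original.
dd if=/dev/urandom of=original.bin bs=1024 count=5 2>/dev/null

# Split into 2 KB chunks: part-00, part-01, part-02 (last one smaller).
split -d -b 2048 original.bin part-

# Rejoin with a glob; ">" truncates any stale rebuilt.bin.
cat part-* > rebuilt.bin

# cmp exits 0 only if the two files are byte-for-byte identical.
cmp original.bin rebuilt.bin && echo "files match"
# files match
```

On a real backup, comparing md5sum or sha256sum output of the original and the rebuilt file achieves the same check across two machines.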

Additionally, one might want to write the chunks directly onto the destination media: just "cd" into the destination directory, then run split with the path to the source file
split -d -b 1500000000 /path/to/very-huge-file split-chunk-file-
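A runnable sketch of this cd-then-split approach, using mktemp directories in place of the real mount point (on an actual system the destination would be something like the thumb drive's mount directory):

```shell
# Temporary stand-ins: "src" holds the big file, "dst" plays the
# role of the mounted FAT32 drive.
src=$(mktemp -d)
dst=$(mktemp -d)
dd if=/dev/zero of="$src/big.bin" bs=1024 count=2 2>/dev/null

# cd into the destination first, then point split at the source by
# its full path; the chunks land in the current directory.
cd "$dst"
split -d -b 1024 "$src/big.bin" split-chunk-file-

ls split-chunk-file-*
```

The key detail is that split writes its output files into the current working directory, so only the source file needs a path.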

Voilà !!!

1 comment:

Al said...

Thank you for posting this, just what I needed to put some large backup images on a FAT32 disk.

Many thanks :-)