There are scenarios where a big file needs to be broken into smaller chunks in order to transfer it. Below is a good way of doing it:
To split the big file:
split -d -b <bytes per chunk> <source-file> <prefix>
e.g.
split -d -b 1500000000 very-huge-file split-chunk-file-
The output files will be "split-chunk-file-00", "split-chunk-file-01", "split-chunk-file-02" and so on (the -d flag gives the numeric suffixes). Each file will be approximately 1.5 GB, except the last one, which holds whatever is left over.
To join the chunks back into the original file:
cat <files in order> > file-name
e.g.
cat split-chunk-file-00 split-chunk-file-01 split-chunk-file-02 > very-huge-file
Note the single > rather than >>, so that an existing file of the same name is overwritten instead of appended to.
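The whole round trip can be sanity-checked on a small test file before trusting it with a real multi-gigabyte one. This is just a sketch: the 10 MB size, the 4 MB chunk size, and all the file names here are made up for illustration.

```shell
# Make a 10 MB test file; in real use this would be the huge file.
dd if=/dev/urandom of=very-huge-file bs=1M count=10 2>/dev/null

# Split into 4 MB chunks with numeric suffixes (-d): 00, 01, 02.
split -d -b 4194304 very-huge-file split-chunk-file-

# Reassemble. The shell glob expands in lexical order, so the numeric
# suffixes keep the chunks in sequence. Single > truncates any stale
# copy of the target instead of appending to it.
cat split-chunk-file-* > very-huge-file-restored

# Byte-for-byte comparison against the original.
cmp very-huge-file very-huge-file-restored && echo "files match"
```

If cmp prints nothing and "files match" appears, the split and join were lossless.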
Additionally, if you want to write the chunks onto the destination media directly, just "cd" into the destination directory and then execute:
split -d -b 1500000000 /path/to/very-huge-file split-chunk-file-
(Use the full path to the source file, since the current directory is now the destination.)
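For instance, with a temporary directory standing in for the mounted drive (the paths and sizes below are hypothetical, scaled down for a quick demo):

```shell
# A temp dir stands in for the USB/FAT32 mount point.
dest=$(mktemp -d)

# A small 2 MB stand-in for the huge source file.
dd if=/dev/urandom of=/tmp/big-file bs=1M count=2 2>/dev/null

# cd into the destination, then split using the full source path;
# the chunks land directly in the destination directory.
cd "$dest"
split -d -b 1048576 /tmp/big-file split-chunk-file-

ls   # split-chunk-file-00  split-chunk-file-01
```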
Voilà!