I have a website whose data file has grown to 2 GB. I took a backup of it and would like to transfer it to my local Linux machine. The problem is that the maximum file size I can download from my hosting provider is 400 MB. Could you please let me know if there is a script that can break the large file into chunks of 400 MB (or less), so that once downloaded they can be merged back into the original large file?
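For reference, the standard Unix `split` and `cat` tools handle exactly this. A minimal sketch, assuming the backup is named `backup.tar.gz` (hypothetical filename; substitute your actual file):

```shell
# On the server: split the backup into 400 MB pieces.
# Produces backup_part_aa, backup_part_ab, backup_part_ac, ...
split -b 400M backup.tar.gz backup_part_

# Record a checksum so the merged file can be verified later.
md5sum backup.tar.gz > backup.md5

# On the local Linux machine, after downloading every part:
# concatenate the pieces back together in order. The shell
# expands backup_part_* in sorted order, which matches the
# order split created them in.
cat backup_part_* > backup.tar.gz

# Verify the reassembled file matches the original.
md5sum -c backup.md5
```

This requires shell access on the server (SSH or a cron job); if the host only offers a web control panel, a small PHP script calling `split` via `exec()` or reading/writing the file in fixed-size blocks would be the usual workaround.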