When transferring large amounts of data (several GB) over various file transfer protocols such as FTP, SFTP, NFS, and Samba, they all suffer from the same issue: many small files drag throughput down to MB/s or even KB/s at times, even over a 10 Gbps link.
However, if I zip, tar, or rar the entire folder before transferring, the network link gets fully saturated.
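To illustrate the workaround I mean, here is a minimal sketch of streaming a directory as a single tar archive straight over a TCP connection (the host, port, and path are placeholders I made up, and the receiving end would need a matching stream-mode extraction):

```python
import socket
import tarfile

# Hypothetical example: stream a directory as one continuous tar byte
# stream over a raw TCP socket, instead of thousands of per-file
# open/transfer/close round trips. Host, port, and path are placeholders.
HOST, PORT = "fileserver.example", 9000
SRC_DIR = "/data/many_small_files"

with socket.create_connection((HOST, PORT)) as sock:
    # tarfile can write to any file-like object; sock.makefile() wraps the
    # socket as one, so no intermediate archive ever touches the disk.
    with sock.makefile("wb") as stream:
        # mode="w|" is tarfile's non-seekable streaming mode.
        with tarfile.open(fileobj=stream, mode="w|") as tar:
            tar.add(SRC_DIR, arcname=".")
```

Even this uncompressed on-the-fly archiving saturates the link for me, which is why I suspect the cost is per-file overhead rather than raw bandwidth.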
What is it that causes this effect?
What can be done to improve the performance of large transfers with many small individual files over a network?
Out of the available file transfer protocols, which is best suited for this?
I have full administrative control over the network, so all configuration options are available: setting the MTU and buffer sizes on network interfaces, or toggling the sync/async and encryption settings in the file server configuration, to name a couple of throwaway ideas.
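As an example of the buffer-size knob at the application level, this is the kind of per-socket tuning I have in mind (a sketch only; the 4 MB value is an arbitrary assumption, not a recommendation, and the kernel may clamp it to its own limits):

```python
import socket

# Sketch of per-socket send/receive buffer tuning. The OS may clamp the
# requested size to kernel limits (e.g. net.core.rmem_max /
# net.core.wmem_max on Linux), so the readback below can differ.
BUF_SIZE = 4 * 1024 * 1024  # 4 MB, chosen arbitrarily for illustration

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, BUF_SIZE)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, BUF_SIZE)

# Check what the kernel actually granted (Linux typically reports double
# the requested value to account for bookkeeping overhead).
print("SO_SNDBUF:", sock.getsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF))
print("SO_RCVBUF:", sock.getsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF))
sock.close()
```

I am happy to tune at any layer (interface, kernel, protocol, or server configuration) if someone can point me at the settings that actually matter for the many-small-files case.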