I would agree with breaking it up using a file compression utility for something this size; then if something goes cranky, we can use tools on both sides to generate a signature (checksum) of each piece and find out which one needs re-sending, saving the time of redoing the whole lot. Likewise, if network connectivity is interrupted, it is easier to resume where you left off: you only have to redo the chunk that failed and any subsequent chunks still waiting to be sent.
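As a rough sketch of that checksum idea (all file and chunk names here are made up for illustration), assuming GNU coreutils are available on both ends:

```shell
# Hypothetical example: make a stand-in for the large archive,
# split it into pieces, and record a signature for each piece.
set -e
head -c 10485760 /dev/urandom > backup.7z        # 10 MiB stand-in file
split -b 4194304 backup.7z backup.7z.part.       # 4 MiB pieces: .aa, .ab, .ac
sha256sum backup.7z.part.* > checksums.txt       # one signature per piece

# On the receiving side, the same command with -c flags any bad piece,
# so only that piece (not the whole archive) needs re-sending:
sha256sum -c checksums.txt
```

Any piece that fails the `-c` check is the one to re-transfer; the rest can stay put.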
That said, there's no limit per se on the server side, but different clients (including browsers) may have their own limits or preferences, so the actual size you break the pieces down to is a matter of experimentation.
I would think up to 2GB per piece would normally be fine for a file of this size, and I have seen both larger and smaller values used successfully.
7-Zip or WinZip are both fine. If your source file is on a Unix box and you want to transfer it directly from there, the 'split' command-line tool also works; on the Windows side, the 'copy' command with the '/b' switch and '+' operator can merge a file that has been 'split' this way back together.
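A minimal sketch of that split-and-merge round trip (file names are assumptions for illustration); the Windows merge step is shown as a comment since it runs under cmd.exe rather than a Unix shell:

```shell
# Hypothetical example: split a file on the Unix side, then show
# that concatenating the pieces in order reproduces the original.
set -e
head -c 5242880 /dev/urandom > source.bin        # 5 MiB stand-in file
split -b 2097152 source.bin source.bin.part.     # 2 MiB pieces: .aa, .ab, .ac
cat source.bin.part.* > rejoined.bin             # Unix-side merge
cmp source.bin rejoined.bin                      # confirm byte-identical

# The equivalent merge on Windows (cmd.exe), using /b for binary mode:
#   copy /b source.bin.part.aa + source.bin.part.ab + source.bin.part.ac source.bin
```

The key detail on the Windows side is the '/b' switch: without it, 'copy' treats the input as text and can corrupt binary data during the merge.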