Every byte is 8 bits. ASCII, however, defines only 7 significant bits per character; the eighth bit was historically used as a parity bit for error checking, or simply left at zero. Binary data uses all 8 bits as significant bits. Plain-text files are ASCII, whereas word processing documents, images, archives, and other data files are binary.
If a binary file is transferred in ASCII mode, the transfer program feels free to rewrite the data as text: it may strip the eighth bit, and it will translate line-ending bytes between conventions. Either change mangles arbitrary binary data, so you inevitably corrupt a binary file by sending it as ASCII, and the result is unusable.
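To make the corruption concrete, here is a minimal sketch of the line-ending translation an ASCII-mode transfer performs. The function name `ascii_mode_transfer` is hypothetical, and the LF-to-CRLF rewrite modeled here is the common Unix-to-Windows case, not a complete FTP implementation. The PNG file signature is used as the victim because it deliberately contains CR and LF bytes precisely so this kind of mangling is caught early:

```python
def ascii_mode_transfer(data: bytes) -> bytes:
    """Simulate an ASCII-mode transfer (hypothetical sketch):
    every LF byte is rewritten as CRLF, as a Unix-to-Windows
    text-mode transfer would do."""
    return data.replace(b"\n", b"\r\n")

# The 8-byte PNG file signature contains both CR and LF bytes.
png_signature = b"\x89PNG\r\n\x1a\n"

received = ascii_mode_transfer(png_signature)
print(len(received))                # 10 -- two bytes were inserted
print(received == png_signature)    # False: the "PNG" is now corrupt
```

Any program expecting a valid PNG will reject the received bytes, because the signature check fails on the very first line-ending byte that was rewritten.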
But sending ASCII files as binary works much of the time, provided the operating systems on both ends use the same text conventions. You certainly CAN get into trouble with Unix/Mac/Windows mismatches, though: Unix ends lines with LF, classic Mac OS with CR, and Windows with CRLF, and a binary-mode transfer passes those bytes through untranslated.
So a JavaScript file, being a text-only (7-bit) file, will usually transfer just fine in either mode, but strictly speaking it is ASCII, as far as I know. The same is true for HTML files.
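If you want to check whether a given file really is 7-bit text before choosing a transfer mode, a rough heuristic is to test whether every byte fits in 7 bits. The function name `looks_like_ascii` is my own illustration, not a standard API, and this check says nothing about UTF-8 or other 8-bit text encodings:

```python
def looks_like_ascii(data: bytes) -> bool:
    """Rough heuristic: True if every byte fits in 7 bits,
    i.e. the data is plain ASCII text."""
    return all(b < 0x80 for b in data)

print(looks_like_ascii(b"<html></html>\n"))    # True: safe either way
print(looks_like_ascii(b"\x89PNG\r\n\x1a\n"))  # False: binary, use binary mode
```

A file that passes this check is safe to send in either mode (line endings aside); a file that fails it must be sent as binary.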