Why would an encrypted file be ~35% larger than an unencrypted one?
Most likely, the encrypted file is base64 encoded, which would account for a 33.3% increase in file size (you encode three bytes of data in four bytes of base64 data). Inserting a newline every 64 characters to make it easier to read (as is done by the ASCII armor in openssl, GPG, and PGP) increases the size by a further factor of 65/64.
Combining these two effects makes the new file (4/3) × (65/64) ≈ 135.4% of the size of the original, i.e. an increase in file size of about 35.4%.
I've gone through the calculation in this answer here.
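To make the arithmetic concrete, here is a minimal Python sketch (the 64-character wrap width is an assumption matching typical ASCII armor) that base64-encodes some random bytes, wraps the lines, and prints the resulting size ratio:

```python
import base64
import os

data = os.urandom(192_000)  # arbitrary sample size, a multiple of 48 to avoid padding noise

# Base64: every 3 input bytes become 4 output characters (4/3 growth).
b64 = base64.b64encode(data).decode("ascii")

# ASCII-armor-style wrapping: a newline after every 64 characters (65/64 growth).
wrapped = "\n".join(b64[i:i + 64] for i in range(0, len(b64), 64)) + "\n"

print(len(wrapped) / len(data))  # ~1.354, i.e. the ~35.4% increase described above
```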
You are correct, though, that encryption should not need to significantly change the file size. It may add a couple of blocks of data if there is a header, an initialization vector/nonce, some padding to fill out the last block, and/or a MAC to check integrity, but these additions are insignificant for large files (e.g., adding four 16-byte blocks to a 1 MB AES-encrypted file would make it only about 0.006% larger).
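As a rough illustration, here is a sketch using the third-party `cryptography` package (AES-256-CBC with PKCS7 padding is just an assumed scheme; a real tool may also prepend a header or append a MAC). The raw ciphertext ends up only a block or two larger than the plaintext:

```python
import os

from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(32)               # AES-256 key
iv = os.urandom(16)                # one block of IV, stored alongside the ciphertext
plaintext = os.urandom(1_000_000)  # ~1 MB of data

# PKCS7 pads the plaintext up to a multiple of the 16-byte block size.
padder = padding.PKCS7(128).padder()
padded = padder.update(plaintext) + padder.finalize()

encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = iv + encryptor.update(padded) + encryptor.finalize()

# Only 32 extra bytes here (16-byte IV + 16 bytes of padding): ~0.003% of 1 MB.
print(len(ciphertext) - len(plaintext))
```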
However, to avoid increasing the file size at all, you need to be fine with storing and passing around the encrypted data as an arbitrary binary file. Arbitrary binaries are often blocked over email to prevent spreading computer viruses, and they are difficult to open outside of hex editors. Base64-encoded files are easier to pass around and are a more portable format than binary files of an unknown type.
If the files are being compressed then you might see this discrepancy.
Compression algorithms work best on non-random data. Encryption aims to make data indistinguishable from random. Ordinary information is generally easy to compress because it has patterns; once you encrypt it, those patterns (and, without the key, the information itself) are no longer visible.
Example: 2.75 GB of email archive files can easily be compressed down to less than 0.5 GB. If those email archives were encrypted first, however, the compressed version would stay much closer to 2.75 GB.
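A quick way to see this is to compress patterned data and random data (standing in for encrypted data) side by side, e.g. with Python's zlib:

```python
import os
import zlib

# Repetitive, patterned text stands in for the email archives.
patterned = b"From: alice@example.com\nSubject: weekly report\nBody: nothing new.\n" * 40_000
# Uniformly random bytes stand in for the same data after encryption.
random_like = os.urandom(len(patterned))

print(len(zlib.compress(patterned)) / len(patterned))      # tiny ratio: compresses very well
print(len(zlib.compress(random_like)) / len(random_like))  # ~1.0: essentially no savings
```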
Usually, growth expressed as a percentage suggests that the file was Base64 encoded after encryption, and possibly given a checksum over each block to detect corruption. Base64 represents every 6 bits of input as one 8-bit character, which makes the file about 33% larger simply because more characters are required to represent the same data. Add a per-block checksum and you are up to roughly 35%.
The encryption itself normally adds some overhead as well: a header and footer, possibly an encrypted key, parameters, salts, a checksum, and up to one block size minus one byte of padding, because plaintext that is not evenly divisible by the block size has to be padded out to a full block.
But everything listed in the previous sentence adds a fixed amount of data to every file, regardless of whether the file is 1 GB or 100 GB.
Growth expressed as a percentage of the file size therefore suggests a re-encoding process like Base64 or something similar.
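The distinction is easy to check numerically. In this sketch the fixed per-file overhead is a made-up assumption (header + salt + IV + MAC + worst-case padding), while the 4/3 factor is the Base64 expansion:

```python
# Hypothetical fixed per-file overhead: header + salt + IV + MAC + worst-case padding.
FIXED_OVERHEAD = 16 + 16 + 16 + 32 + 15   # 95 bytes, regardless of file size

for size in (1_000, 1_000_000, 1_000_000_000):
    base64_size = (size + 2) // 3 * 4     # Base64 output length: 4 * ceil(size / 3)
    print(f"{size:>13,} bytes: "
          f"fixed overhead +{FIXED_OVERHEAD / size:.4%}, "
          f"base64 +{(base64_size - size) / size:.1%}")
```

The fixed overhead shrinks toward 0% as files grow, whereas the Base64 expansion stays at roughly 33% no matter the size.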