cc1plus: error: include: Value too large for defined data type when compiling with g++
I have found a solution, on Ubuntu at least. Like you, I have noticed that the error only occurs on mounted Samba shares; it seems to come from g++ stat()ing the file, where the server-assigned inode number is too large for the data type the compiler was built with.
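If you want to confirm that the inode is the culprit, here is a minimal diagnostic sketch (assuming glibc on Linux; building 32-bit with gcc -m32 requires multilib and is only illustrative). Without large file support, stat() fails with EOVERFLOW when the inode number does not fit in ino_t, and strerror(EOVERFLOW) is exactly "Value too large for defined data type":

/* stat a file on the share and report its inode number; on a 32-bit
   build without -D_FILE_OFFSET_BITS=64, an oversized server inode
   makes stat() fail with EOVERFLOW. */
#include <stdio.h>
#include <errno.h>
#include <string.h>
#include <sys/stat.h>

int main(int argc, char **argv)
{
    struct stat sb;
    const char *path = (argc > 1) ? argv[1] : ".";

    if (stat(path, &sb) != 0) {
        fprintf(stderr, "stat(%s): %s\n", path, strerror(errno));
        return 1;
    }
    printf("%s: inode %llu\n", path, (unsigned long long)sb.st_ino);
    return 0;
}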
When mounting the share, add ,nounix,noserverino to the options, e.g.:
mount -t cifs -o user=me,pass=secret,nounix,noserverino //server/share /mount
I found the info at http://bbs.archlinux.org/viewtopic.php?id=85999
I had a similar problem. I compiled a project on a CIFS-mounted Samba share. With one Linux kernel the compilation succeeded, but with another kernel (2.6.32.5) I got a similar error message: "Value too large for defined data type". Using the proposed "nounix,noserverino" CIFS mount options fixed the problem. So in that case the issue was with the CIFS mount, and the error message is misleading: there were no big files involved.
From the GNU Core Utils FAQ, "Value too large for defined data type":
It means that your version of the utilities was not compiled with large file support enabled. The GNU utilities do support large files if they are compiled to do so. You may want to compile them again and make sure that large file support is enabled. On most systems this support is configured automatically by autoconf, but it is possible that on your particular system autoconf could not determine how to do that and therefore concluded that your system did not support large files.
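If you want to check whether a given compiler-and-flags combination has large file support enabled, here is a quick probe (a sketch, assuming a POSIX system; compile it with the same flags as the failing utility):

/* Prints the width of off_t: 4 bytes means 32-bit file offsets
   (no large file support), 8 bytes means 64-bit offsets. */
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    printf("off_t is %zu bytes (%zu-bit file offsets)\n",
           sizeof(off_t), 8 * sizeof(off_t));
    return 0;
}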
The message "Value too large for defined data type" is a system error message reported when an operation on a large file is attempted using a non-large-file data type. Large files are defined as anything larger than a signed 32-bit integer can represent (2^31 - 1 = 2,147,483,647 bytes), or stated differently, larger than 2GB.
Many system calls that deal with files return values in a "long int" data type. On 32-bit hardware a long int is 32 bits, and this imposes the 2GB limit on the size of files. When this was invented, that was HUGE, and it was hard to conceive of needing anything that large. Time has passed and files can be much larger today. On native 64-bit systems the file size limit is usually 2^63 - 1 bytes (roughly 8EB), which we will again think is huge.
On a 32-bit system with a 32-bit "long int" you find that you can't make the type any bigger and still maintain compatibility with previous programs; changing it would break many things! But many systems make it possible to switch into a new programming model in which all of the file operations use a 64-bit interface. Instead of "long" they use a new data type called "off_t", which is constructed to be 64 bits in size in that mode. Program source code must be written to use the off_t data type instead of the long data type. This is typically done by defining -D_FILE_OFFSET_BITS=64 or some such; it is system dependent. Once done, and once switched into this new mode, most programs will support large files just fine.
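As a concrete sketch of that model (the file name sparse.bin is just illustrative): the program below seeks 3GB into a file, an offset a 32-bit off_t cannot even represent. Built with gcc -D_FILE_OFFSET_BITS=64 big.c it works on 32-bit systems too; on native 64-bit systems off_t is already 64 bits.

/* Seek past the 2GB line and write one byte, creating a sparse file;
   this requires a 64-bit off_t. */
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/types.h>

int main(void)
{
    off_t offset = (off_t)3 * 1024 * 1024 * 1024;  /* 3GB */
    int fd = open("sparse.bin", O_WRONLY | O_CREAT, 0644);
    if (fd < 0) { perror("open"); return 1; }

    /* lseek takes and returns off_t, so the program must be built in
       the 64-bit file-offset mode for this call to succeed. */
    if (lseek(fd, offset, SEEK_SET) == (off_t)-1) {
        perror("lseek");
        close(fd);
        return 1;
    }
    if (write(fd, "x", 1) != 1) perror("write");
    close(fd);
    return 0;
}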
The FAQ's next question covers the case where you have inadvertently created a large file and now need some way to deal with it.