concurrent file read/write
You can run into race conditions.
To avoid them, if you only need to append data you can use
file_put_contents($file, $data, FILE_APPEND | LOCK_EX);
and not worry about your data integrity.
If you need more complex operations you can use flock() (suited for the simple reader/writer problem).
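A minimal sketch of the flock() reader/writer pattern, assuming a counter file whose path and increment logic are illustrative, not taken from the answer above:

```php
<?php
// Writer: take an exclusive lock while reading and rewriting the counter.
$file = sys_get_temp_dir() . '/counter.txt';   // hypothetical path

$fp = fopen($file, 'c+');            // read/write, create if missing, no truncate
if (flock($fp, LOCK_EX)) {           // blocks until the exclusive lock is held
    $count = (int) stream_get_contents($fp);
    ftruncate($fp, 0);
    rewind($fp);
    fwrite($fp, (string) ($count + 1));
    fflush($fp);                     // flush to disk before releasing the lock
    flock($fp, LOCK_UN);
}
fclose($fp);

// Reader: a shared lock lets concurrent readers proceed,
// but waits for any writer holding LOCK_EX.
$fp = fopen($file, 'r');
if (flock($fp, LOCK_SH)) {
    echo stream_get_contents($fp);
    flock($fp, LOCK_UN);
}
fclose($fp);
```

Note that flock() is advisory: it only protects you against other processes that also call flock() on the same file.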
For your PHP counter script I suggest something like this:
//> Register this impression
file_put_contents( $file, "\n", FILE_APPEND|LOCK_EX );
//> Read the total number of impressions
echo count(file($file));
This way you don't have to implement a blocking mechanism yourself, and you keep both the system and your script lighter.
Addendum
To avoid counting the array returned by file(),
you can make it even lighter:
//> Register this impression
file_put_contents( $file, '1', FILE_APPEND|LOCK_EX );
//> Read the total number of impressions
echo filesize($file);
Basically, to read your counter you just need the file size, since each impression adds exactly one byte to the file.
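One caveat worth knowing: PHP caches the results of filesize(), so if you append and then read the size within the same request, call clearstatcache() first. A self-contained sketch of the byte-per-impression counter (the temp-file path is illustrative):

```php
<?php
$file = sys_get_temp_dir() . '/impressions.cnt';  // hypothetical path
@unlink($file);                                    // start from a clean counter

// Simulate three impressions: each appends exactly one byte.
for ($i = 0; $i < 3; $i++) {
    file_put_contents($file, '1', FILE_APPEND | LOCK_EX);
}

clearstatcache();        // filesize() results are cached per request
echo filesize($file);    // prints 3
```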
No, requests will not be queued: readers will get corrupted data and writers will overwrite each other, damaging your data.
You can try to use flock() and the 'x' mode of fopen().
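The 'x' mode fails if the file already exists, so it can serve as an atomic create-or-fail lock file. A sketch, with illustrative file names:

```php
<?php
// 'x' mode: create the file for writing, or fail if it already exists.
// This atomic create-or-fail makes a simple lock file.
$lock = sys_get_temp_dir() . '/myapp.lock';   // hypothetical lock path

$fp = @fopen($lock, 'x');
if ($fp === false) {
    echo "Another process holds the lock\n";
} else {
    // ... critical section: safely read/write the shared file ...
    fclose($fp);
    unlink($lock);   // release the lock
}
```

A crashed process can leave the lock file behind, which is one reason hand-rolled schemes like this are fragile in practice.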
It's not easy to implement a correct locking mutex, so try to find an existing solution, or move the data from a file into a database.