Live streaming through MP4
Here are my thoughts; some of this might be right, other parts way off. I plead ignorance, because no one has really documented this process fully, so it's all an educated guess.
AVAssetWriter only encodes to a file; there seems to be no way to get encoded video directly into memory. Reading the file while it is being written, from a background thread, and sending the bytes to a socket results in an elementary stream. This is essentially an m4v: a container with H.264/AAC media data but no moov atom (in other words, no header). No Apple-supplied player can play this stream, but a modified player based on ffplay should be able to decode and play it. This should work because ffplay uses libavformat, which can decode elementary streams. One caveat: since there is no file-length information, some things have to be determined by the player, such as the DTS and PTS, and the player can't seek within the stream.
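If it helps, here is a minimal sketch of the read-while-writing idea in Swift. It assumes you already have the writer's output URL and an open OutputStream wrapped around your socket; both are stand-ins here, not anything AVFoundation gives you.

```swift
import Foundation

// Minimal sketch: tail the file AVAssetWriter is writing and forward any
// newly appended bytes to an open OutputStream (e.g. wrapping a TCP socket).
final class FileTailer {
    private let handle: FileHandle
    private let output: OutputStream
    private var offset: UInt64 = 0
    private let queue = DispatchQueue(label: "file-tailer")

    init?(fileURL: URL, output: OutputStream) {
        guard let handle = try? FileHandle(forReadingFrom: fileURL) else { return nil }
        self.handle = handle
        self.output = output
    }

    // Poll for newly appended bytes; call repeatedly from a timer or loop.
    func pump() {
        queue.async {
            self.handle.seek(toFileOffset: self.offset)
            let data = self.handle.readDataToEndOfFile()
            guard !data.isEmpty else { return }
            self.offset += UInt64(data.count)
            data.withUnsafeBytes { raw in
                guard let base = raw.bindMemory(to: UInt8.self).baseAddress else { return }
                // NOTE: a real implementation must handle short/partial socket writes.
                _ = self.output.write(base, maxLength: data.count)
            }
        }
    }
}
```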
Alternatively, the raw NALUs from the m4v stream can be used to construct an RTMP stream.
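For what it's worth, the NALUs inside an MP4/m4v sample are stored in AVCC format (length-prefixed, not start-code-prefixed), so before building RTMP packets you have to split the sample data on those prefixes. A sketch, assuming the default 4-byte big-endian lengths:

```swift
import Foundation

// Split an AVCC-format sample (as stored in an MP4 mdat) into raw NAL units.
// Assumes 4-byte big-endian length prefixes, the usual MP4 default.
// Each returned Data is one NALU, ready to be wrapped into an RTMP/FLV video
// tag (or prefixed with 0x00000001 start codes for an Annex-B stream).
func nalUnits(fromAVCCSample sample: Data) -> [Data] {
    var units: [Data] = []
    var cursor = sample.startIndex
    while cursor + 4 <= sample.endIndex {
        // Read the 4-byte big-endian NALU length.
        let length = sample[cursor..<cursor + 4].reduce(0) { ($0 << 8) | Int($1) }
        let start = cursor + 4
        guard length > 0, start + length <= sample.endIndex else { break }
        units.append(sample.subdata(in: start..<start + length))
        cursor = start + length
    }
    return units
}
```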
If you want to discuss this further, you can contact me directly.
How you get at the data:
Since you're going to have to rebuild the file on the receiving side anyway, I guess you could just segment it. Steve Mcfarin wrote a little appleSegmentedEcorder you can find on his GitHub page; this solves some of the moov-atom issues, since each closed segment has all of its file info.
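In case it's useful, here is roughly what that segmenting approach looks like with AVAssetWriter. The class and its hooks are my own names for illustration, not from Steve's code; the point is just that each finished file is a complete MP4 with its own moov atom.

```swift
import AVFoundation

// Sketch of the segmenting idea: rotate AVAssetWriter instances so each
// closed file is a complete, playable MP4 that can be shipped to the receiver.
final class SegmentedRecorder {
    private var writer: AVAssetWriter?
    private var input: AVAssetWriterInput?
    private var segmentIndex = 0

    func startNextSegment(in directory: URL) throws {
        let url = directory.appendingPathComponent("segment-\(segmentIndex).mp4")
        segmentIndex += 1
        let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
        // nil outputSettings = pass-through; supply H.264 settings to encode.
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: nil)
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        self.writer = writer
        self.input = input
    }

    // Append frames to the current segment.
    func append(_ buffer: CMSampleBuffer) {
        guard let writer = writer else { return }
        if writer.status == .unknown {
            writer.startWriting()
            writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(buffer))
        }
        if input?.isReadyForMoreMediaData == true {
            input?.append(buffer)
        }
    }

    // Close the current segment; the finished file now has a valid moov atom.
    func finishSegment(completion: @escaping (URL) -> Void) {
        guard let writer = writer else { return }
        input?.markAsFinished()
        writer.finishWriting {
            completion(writer.outputURL) // e.g. hand off to your uploader
        }
    }
}
```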
You may use fragmented MP4. A fragmented MP4 file is built as follows:
moov [moof mdat]+
The moov box then only contains basic information about the tracks (how many, their type, codec initialization, and so on) but no information about the samples in them. The information about sample locations and sample sizes is in the moof boxes; each moof box is followed by an mdat box that contains the samples described in the preceding moof. Typically one would choose the length of a (moof, mdat) pair to be around 2, 4, or 8 seconds (there is no specification for this, but these values seem reasonable for most use cases).
This is a way to construct a never-ending MP4 stream.
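If you want to check what a file actually contains, a tiny box walker is enough to see that moov [moof mdat]+ layout (preceded by the usual ftyp). A sketch, assuming ordinary 32-bit box sizes with no 64-bit largesize handling:

```swift
import Foundation

// Walk the top-level boxes of an MP4 file and print their types, so you can
// verify a fragmented file really looks like ftyp moov [moof mdat]+.
func printTopLevelBoxes(of url: URL) throws {
    let data = try Data(contentsOf: url)
    var cursor = data.startIndex
    while cursor + 8 <= data.endIndex {
        // Box header: 4-byte big-endian size, then 4-byte ASCII type.
        let size = data[cursor..<cursor + 4].reduce(0) { ($0 << 8) | Int($1) }
        let type = String(bytes: data[cursor + 4..<cursor + 8], encoding: .ascii) ?? "????"
        print("\(type) (\(size) bytes)")
        guard size >= 8 else { break } // malformed or 64-bit size; stop here
        cursor += size
    }
}

// A live fMP4 capture prints something like: ftyp moov moof mdat moof mdat ...
```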