HTTP Live Streaming

This might help in Swift:

    import UIKit
    import MediaPlayer

    class ViewController: UIViewController {

        // NSURL(string:) returns an optional, so force-unwrap it for this example
        var streamPlayer: MPMoviePlayerController = MPMoviePlayerController(
            contentURL: NSURL(string: "http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8")!)

        override func viewDidLoad() {
            super.viewDidLoad()

            // Add the player's view so it fills the screen, then start playback
            streamPlayer.view.frame = self.view.bounds
            self.view.addSubview(streamPlayer.view)

            streamPlayer.fullscreen = true
            // Play the movie!
            streamPlayer.play()
        }
    }

MPMoviePlayerController is deprecated as of iOS 9. Use AVPlayerViewController or AVPlayer instead. Have a look:

    import AVKit
    import AVFoundation
    import UIKit

AVPlayerViewController:

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)

        let videoURL = NSURL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")
        let player = AVPlayer(URL: videoURL!)
        let playerViewController = AVPlayerViewController()
        playerViewController.player = player
        // Present the player and start playback once it is on screen
        self.presentViewController(playerViewController, animated: true) {
            playerViewController.player!.play()
        }
    }

AVPlayer:

    override func viewDidAppear(animated: Bool) {
        super.viewDidAppear(animated)

        let videoURL = NSURL(string: "https://clips.vorwaerts-gmbh.de/big_buck_bunny.mp4")
        let player = AVPlayer(URL: videoURL!)
        // Draw the video into a layer that fills the view
        // (the same code works with an HLS .m3u8 URL)
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = self.view.bounds
        self.view.layer.addSublayer(playerLayer)
        player.play()
    }

HTTP Live Streaming

HTTP Live Streaming is a streaming standard proposed by Apple. See the latest draft standard.

The files involved are:

  • .m4a for audio (if you want an audio-only stream).
  • .ts for video. This is an MPEG-2 transport stream, usually with an H.264/AAC payload. Each segment contains about 10 seconds of video and is created by splitting your original video file or by converting live video.
  • .m3u8 for the playlist. This is a UTF-8 version of the WinAmp playlist (M3U) format.

Even though it's called live streaming, there is usually a delay of a minute or so while the video is converted, the .ts and .m3u8 files are written, and your client refreshes the .m3u8 file.

All these files are static files on your server. In live events, however, more .ts files are added and the .m3u8 file is updated as the event progresses.
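
To make that concrete, a media playlist is just a plain-text file listing segment durations and URLs. A minimal illustrative example (the segment names are made up) could look like this:

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    fileSequence0.ts
    #EXTINF:10.0,
    fileSequence1.ts
    #EXTINF:10.0,
    fileSequence2.ts
    #EXT-X-ENDLIST

For a live event the #EXT-X-ENDLIST tag is omitted and new #EXTINF/segment entries are appended as they become available; a video-on-demand playlist ends with #EXT-X-ENDLIST as shown.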

Since you tagged this question iOS, it is relevant to mention the related App Store rules:

  • You can only use progressive download for videos shorter than 10 minutes or that transfer less than 5 MB every 5 minutes; otherwise you must use HTTP Live Streaming.
  • If you use HTTP Live Streaming you must provide at least one stream at 64 Kbps or lower bandwidth (the low-bandwidth stream may be audio-only or audio with a still image).

Example

Get the streaming tools

To download the HTTP Live Streaming Tools, do this:

  • Get a Mac or iPhone developer account.
  • Go to https://developer.apple.com and search for "HTTP Live Streaming Tools", or look around at https://developer.apple.com/streaming/.

Command line tools installed:

    /usr/bin/mediastreamsegmenter
    /usr/bin/mediafilesegmenter
    /usr/bin/variantplaylistcreator
    /usr/bin/mediastreamvalidator
    /usr/bin/id3taggenerator

Descriptions from the man pages:

  • Media Stream Segmenter: Create segments from MPEG-2 Transport streams for HTTP Live Streaming.
  • Media File Segmenter: Create segments for HTTP Live Streaming from media files.
  • Variant Playlist Creator: Create a playlist for stream switching from HTTP Live Streaming segments created by mediafilesegmenter.
  • Media Stream Validator: Validates HTTP Live Streaming streams and servers (see the example after this list).
  • ID3 Tag Generator: Create ID3 tags.
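
For example, once a playlist is published you can run mediastreamvalidator against its URL as a sanity check (the URL below is just a placeholder):

    mediastreamvalidator http://example.com/stream/mystream.m3u8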

Create the video

Install MacPorts, go to the terminal, and run sudo port install ffmpeg. Then convert the video to a transport stream (.ts) using this FFmpeg script:

    # bitrate, width, and height, you may want to change this
    BR=512k
    WIDTH=432
    HEIGHT=240
    input=${1}

    # strip off the file extension
    output=$(echo ${input} | sed 's/\..*//' )

    # works for most videos
    ffmpeg -y -i ${input} -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s ${WIDTH}x${HEIGHT} -vcodec libx264 -b ${BR} -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 7 -trellis 0 -refs 0 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate ${BR} -bufsize ${BR} -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 30 -qmax 51 -qdiff 4 -level 30 -aspect ${WIDTH}:${HEIGHT} -g 30 -async 2 ${output}-iphone.ts

This will generate one .ts file. Now we need to split the file into segments and create a playlist containing all those segments. We can use Apple's mediafilesegmenter for this:

    mediafilesegmenter -t 10 myvideo-iphone.ts

This will generate one .ts file for each 10 seconds of the video plus a .m3u8 file pointing to all of them.

Set up a web server

To play an .m3u8 file on iOS, we point Mobile Safari to it. Of course, we first need to put the files on a web server. For Safari (or another player) to recognize the files, we need to add their MIME types. In Apache:

    AddType application/x-mpegURL m3u8
    AddType video/MP2T ts

In lighttpd:

    mimetype.assign = ( ".m3u8" => "application/x-mpegURL", ".ts" => "video/MP2T" )

To link this from a web page:

    <html><head>
        <meta name="viewport" content="width=320, initial-scale=1.0, maximum-scale=1.0, user-scalable=no"/>
    </head><body>
        <video width="320" height="240" src="stream.m3u8" controls></video>
    </body></html>

To detect the device orientation, see Detect and Set the iPhone & iPad's Viewport Orientation Using JavaScript, CSS and Meta Tags.

Other things you can do include creating different bitrate versions of the video, embedding metadata that can be read during playback (sketched below), and of course having fun programming with MPMoviePlayerController and AVPlayer.
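
As a rough sketch of the metadata part (in current Swift/AVFoundation syntax, with a placeholder URL, so adapt as needed), timed metadata embedded in the stream can be read through an AVPlayerItemMetadataOutput:

    import AVFoundation

    final class MetadataReader: NSObject, AVPlayerItemMetadataOutputPushDelegate {
        let player: AVPlayer

        init(streamURL: URL) {
            let item = AVPlayerItem(url: streamURL)
            player = AVPlayer(playerItem: item)
            super.init()

            // Ask AVFoundation to push timed metadata (e.g. ID3 tags added with id3taggenerator) to us
            let output = AVPlayerItemMetadataOutput(identifiers: nil)
            output.setDelegate(self, queue: .main)
            item.add(output)
        }

        func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                            didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                            from track: AVPlayerItemTrack?) {
            // Called whenever a metadata group becomes current during playback
            for item in groups.flatMap({ $0.items }) {
                print("metadata:", item.identifier?.rawValue ?? "?", item.stringValue ?? "?")
            }
        }
    }

    // Usage (placeholder URL): keep a strong reference to the reader, then start playback
    // let reader = MetadataReader(streamURL: URL(string: "https://example.com/stream.m3u8")!)
    // reader.player.play()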


Another explanation, from Cloudinary (http://cloudinary.com/documentation/video_manipulation_and_delivery#http_live_streaming_hls):

HTTP Live Streaming (also known as HLS) is an HTTP-based media streaming communications protocol that provides mechanisms that are scalable and adaptable to different networks. HLS works by breaking down a video file into a sequence of small HTTP-based file downloads, with each download loading one short chunk of a video file.

As the video stream is played, the client player can select from a number of different alternate video streams containing the same material encoded at a variety of data rates, allowing the streaming session to adapt to the available data rate with high quality playback on networks with high bandwidth and low quality playback on networks where the bandwidth is reduced.
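
On iOS this switching is handled automatically by AVPlayer. If you want to constrain it yourself (for example on cellular), one option is to hint a maximum bitrate on the player item; a small sketch in current Swift syntax, with a placeholder URL:

    let item = AVPlayerItem(url: URL(string: "https://example.com/stream.m3u8")!)  // placeholder URL
    item.preferredPeakBitRate = 600_000   // ask AVFoundation not to select variants above ~600 kbps
    let player = AVPlayer(playerItem: item)
    player.play()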

At the start of the streaming session, the client software downloads a master M3U8 playlist file containing the metadata for the various sub-streams which are available. The client software then decides what to download from the media files available, based on predefined factors such as device type, resolution, data rate, size, etc.
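
As an illustration, a master playlist offering several variants (including a low-bandwidth audio-only stream, which also satisfies the App Store rule mentioned earlier) could look like the following; the bandwidths, resolutions, and file names are made up:

    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=64000,CODECS="mp4a.40.2"
    audio_only.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=600000,RESOLUTION=432x240
    low/index.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=640x360
    mid/index.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
    high/index.m3u8

Each entry points to a sub-playlist of the kind shown earlier, and the client switches between them as its measured bandwidth changes.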