Trying to understand CMTime
A CMTime struct represents a length of time that is stored as a rational number (see the CMTime Reference). A CMTime has a value and a timescale field, and represents the time value/timescale seconds.
CMTimeMake is a function that returns a CMTime structure, for example:
CMTime t1 = CMTimeMake(1, 10); // 1/10 second = 0.1 second
CMTime t2 = CMTimeMake(2, 1); // 2 seconds
CMTime t3 = CMTimeMake(3, 4); // 3/4 second = 0.75 second
CMTime t4 = CMTimeMake(6, 8); // 6/8 second = 0.75 second
The last two times, t3 and t4, represent the same time value, and therefore

CMTimeCompare(t3, t4) == 0
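A quick way to verify this in Swift (a minimal sketch; recent SDKs spell the constructor CMTimeMake(value:timescale:)):

import CoreMedia

let t3 = CMTimeMake(value: 3, timescale: 4) // 3/4 second
let t4 = CMTimeMake(value: 6, timescale: 8) // 6/8 second

// CMTimeCompare returns 0 for times that represent the same instant,
// even though their value/timescale fields differ.
print(CMTimeCompare(t3, t4) == 0) // true
print(t3 == t4)                   // also true; Swift's == for CMTime performs the same comparison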
If you set the videoMinFrameDuration of an AVCaptureConnection, it does not make a difference whether you set
connection.videoMinFrameDuration = CMTimeMake(1, 20); // or
connection.videoMinFrameDuration = CMTimeMake(2, 40);
In both cases the minimum time interval between frames is set to 1/20 = 0.05 seconds.
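As a sketch of how you might apply this today (videoMinFrameDuration on AVCaptureConnection has since been deprecated; the equivalent setting on current SDKs is activeVideoMinFrameDuration on AVCaptureDevice, which is what this assumes):

import AVFoundation

// Cap an already-configured capture device at 20 frames per second.
func capFrameRate(of device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    // 1/20 s and 2/40 s are the same rational time, so either spelling works.
    device.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: 20)
    device.unlockForConfiguration()
}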
My experience differs.
For let testTime = CMTime(seconds: 3.833, preferredTimescale: 100)
if you set a breakpoint and look in the debugger side window, it says:
"383 100ths of a second"
Testing by seeking to a fixed offset in a video in AVPlayer has confirmed this.
So put the actual number of seconds in the seconds parameter, and the precision in the preferredTimescale parameter: a timescale of 100 means a precision of hundredths of a second.
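For example, a seek test along these lines reproduces it (a sketch; the file URL is a placeholder, and zero tolerance is passed so the seek lands as precisely as the media allows):

import AVFoundation

let player = AVPlayer(url: URL(fileURLWithPath: "/path/to/video.mp4")) // placeholder path
let target = CMTime(seconds: 3.833, preferredTimescale: 100) // stored as 383/100 s
player.seek(to: target, toleranceBefore: .zero, toleranceAfter: .zero)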
Doing
let testTime = CMTime(seconds: 3.833, preferredTimescale: 1000)
still seeks to the same place in the video, but it displays in the debugger side window as "3833 1000ths of a second".
Doing
let testTime = CMTime(seconds: 3.833, preferredTimescale: 1)
does not seek to the same place in the video, because the time has been truncated, and it displays in the debugger side window as "3 seconds". Notice that the .833 part has been lost due to the preferredTimescale.
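The precision loss is easy to see by printing what each preferredTimescale actually stores (a small sketch):

import CoreMedia

for scale: CMTimeScale in [1, 100, 1000] {
    let t = CMTime(seconds: 3.833, preferredTimescale: scale)
    // value/timescale is what CMTime stores; .seconds converts back to a Double.
    print("timescale \(scale): \(t.value)/\(t.timescale) = \(t.seconds) s")
}
// With a timescale of 1 the stored value is 3/1, so the .833 is gone.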