IVSBackgroundImageSource
Objective-C
@protocol IVSBackgroundImageSource <IVSCustomImageSource>
Swift
protocol IVSBackgroundImageSource : IVSCustomImageSource
An extension of IVSCustomImageSource that is used to pre-encode an image or video to be rendered when the application goes into the background.
The timing information on the samples provided via onSampleBuffer is ignored by this image source; every image submitted is encoded as the next frame, based on the targetFramerate of the provided IVSVideoConfiguration.
Note
Samples submitted will be processed on the invoking thread. For large numbers of samples, submit them on a background queue.
Generating large numbers of samples from MP4 files is fairly straightforward using AVFoundation, and there are multiple ways to do it:
You can use AVAssetImageGenerator and generateCGImagesAsynchronously to generate an image at every 1 / FPS increment. Be certain to set requestedTimeToleranceAfter and requestedTimeToleranceBefore to .zero, otherwise it may return the same frame multiple times (see the sketch after this note).
You can also use an AVPlayer instance with AVPlayerItemVideoOutput and a CADisplayLink, using the copyPixelBuffer API from the video output.
Both can provide a series of CVPixelBuffers to submit to this API in order to broadcast a looping clip while in the background.
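For example, a minimal sketch of the AVAssetImageGenerator approach (the file path and the 30 fps target below are placeholder assumptions, not part of the SDK):

import AVFoundation
import CoreGraphics

// A minimal sketch, assuming a local "clip.mp4" and a 30 fps target (both placeholders).
let asset = AVURLAsset(url: URL(fileURLWithPath: "clip.mp4"))
let generator = AVAssetImageGenerator(asset: asset)

// Without zero tolerances, the generator may snap to nearby frames and
// return the same image for several requested times.
generator.requestedTimeToleranceBefore = .zero
generator.requestedTimeToleranceAfter = .zero

let fps = 30.0
let times = stride(from: 0.0, to: asset.duration.seconds, by: 1.0 / fps).map {
    NSValue(time: CMTime(seconds: $0, preferredTimescale: 600))
}

generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, _ in
    guard result == .succeeded, let cgImage = cgImage else { return }
    // Wrap each CGImage in a CVPixelBuffer (or CMSampleBuffer) and submit it to the
    // background image source; timing information is ignored, so only order matters.
}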
finish()
Signals that no more images will be submitted for encoding and that final processing should begin. Any errors that happen during this process will be emitted through the callback provided to createAppBackgroundImageSource.
Declaration
Objective-C
- (void)finish;
Swift
func finish()
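A minimal usage sketch follows. It assumes the background source was obtained from the broadcast session's createAppBackgroundImageSource API (whose exact signature is not shown here) and that an array of pixel buffers has already been prepared; errors from final processing arrive through that API's callback rather than from finish() itself.

import AmazonIVSBroadcast
import CoreVideo
import Foundation

// Sketch only: `backgroundSource` is assumed to come from createAppBackgroundImageSource,
// and `pixelBuffers` is assumed to be prepared elsewhere (e.g., via the AVFoundation
// approaches described in the note above).
func submitBackgroundClip(_ pixelBuffers: [CVPixelBuffer],
                          to backgroundSource: IVSBackgroundImageSource) {
    // Samples are processed on the invoking thread, so submit off the main queue.
    DispatchQueue.global(qos: .utility).async {
        for pixelBuffer in pixelBuffers {
            backgroundSource.add(pixelBuffer)
        }
        // No more images will be submitted; final processing begins now.
        backgroundSource.finish()
    }
}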
add(_:)
A convenience API that doesn’t require creating a CMSampleBufferRef to provide to the IVSCustomImageSource API, since timing data is ignored for the background source.
Declaration
Objective-C
- (void)addPixelBuffer:(nonnull CVPixelBufferRef)pixelBuffer;
Swift
func add(_ pixelBuffer: CVPixelBuffer)
Parameters
pixelBuffer
The CVPixelBuffer to be encoded.
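Because add(_:) takes a CVPixelBuffer, frames produced as CGImages (for example by the AVAssetImageGenerator approach above) need to be wrapped in pixel buffers first. One hypothetical helper, assuming 32-bit BGRA output, might look like this:

import CoreGraphics
import CoreVideo

// Hypothetical helper (not part of the SDK): draws a CGImage into a newly created
// BGRA CVPixelBuffer so it can be passed to add(_:).
func makePixelBuffer(from image: CGImage) -> CVPixelBuffer? {
    let attrs: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     image.width,
                                     image.height,
                                     kCVPixelFormatType_32BGRA,
                                     attrs as CFDictionary,
                                     &pixelBuffer)
    guard status == kCVReturnSuccess, let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Draw the image into the buffer's backing memory.
    let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                            width: image.width,
                            height: image.height,
                            bitsPerComponent: 8,
                            bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue |
                                        CGBitmapInfo.byteOrder32Little.rawValue)
    context?.draw(image, in: CGRect(x: 0, y: 0, width: image.width, height: image.height))
    return buffer
}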