Protocols
The following protocols are available globally.
-
An extension of IVSAudioDevice that allows for submitting CMSampleBuffers manually. This can be used to submit PCM audio directly to the SDK. (A minimal usage sketch follows the declaration below.)
Note
Make sure you have an IVSMixerSlotConfiguration that requests the preferredAudioInput value of IVSDeviceTypeUserAudio.
Declaration
Objective-C
@protocol IVSCustomAudioSource <IVSAudioDevice>
Swift
protocol IVSCustomAudioSource : IVSAudioDevice
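A minimal sketch of using a custom audio source, assuming the SDK-sample style APIs createAudioSource(withName:), attach(_:toSlotWithName:onComplete:), and onSampleBuffer(_:); the slot name "pcmSlot" is hypothetical, and exact signatures should be verified against the current SDK.

import AVFoundation
import AmazonIVSBroadcast

// 1. When building the IVSBroadcastConfiguration (before creating the session),
//    include a mixer slot that requests user audio:
func makePCMSlot() throws -> IVSMixerSlotConfiguration {
    let slot = IVSMixerSlotConfiguration()
    try slot.setName("pcmSlot")            // hypothetical slot name
    slot.preferredAudioInput = .userAudio  // IVSDeviceTypeUserAudio
    return slot
}

// 2. After creating the session, create a custom audio source and attach it
//    to that slot:
func startSubmittingAudio(to session: IVSBroadcastSession) -> IVSCustomAudioSource {
    let audioSource = session.createAudioSource(withName: "pcm-audio")
    session.attach(audioSource, toSlotWithName: "pcmSlot", onComplete: nil)
    return audioSource
}

// Each buffer of PCM audio is then submitted manually:
// audioSource.onSampleBuffer(sampleBuffer)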
-
An extension of IVSAudioDevice that represents a physical microphone accessible by the host device. (See the attachment sketch after the declaration below.)
Declaration
Objective-C
@protocol IVSMicrophone <IVSAudioDevice>
Swift
protocol IVSMicrophone : IVSAudioDevice
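A minimal sketch of attaching a physical microphone, assuming listAvailableDevices() and the descriptor-based attach(_:toSlotWithName:onComplete:) overload behave as in the SDK samples; verify against the current API reference.

import AmazonIVSBroadcast

func attachFirstMicrophone(to session: IVSBroadcastSession) {
    // Enumerate devices the host can provide and keep only microphones.
    let microphones = IVSBroadcastSession.listAvailableDevices()
        .filter { $0.type == .microphone }

    guard let descriptor = microphones.first else { return }

    // Attaching by descriptor yields an IVSDevice that conforms to IVSMicrophone.
    session.attach(descriptor, toSlotWithName: nil) { device, error in
        if let error = error {
            print("Failed to attach microphone: \(error)")
        } else if let microphone = device as? IVSMicrophone {
            print("Attached microphone: \(microphone)")
        }
    }
}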
-
A delegate that provides updates about the attached microphone.
Declaration
Objective-C
@protocol IVSMicrophoneDelegate
Swift
protocol IVSMicrophoneDelegate
-
Provide a delegate to receive status updates and errors from the SDK. Updates may be delivered on arbitrary threads rather than the main thread. (See the delegate sketch after the declaration below.)
Declaration
Objective-C
@protocol IVSBroadcastSessionDelegate <NSObject>
Swift
protocol Delegate : NSObjectProtocol
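A minimal delegate sketch. The two callbacks shown (state changes and emitted errors) follow the pattern used in the SDK samples; because updates may arrive on arbitrary threads, UI work is dispatched to the main queue.

import AmazonIVSBroadcast

class BroadcastDelegate: NSObject, IVSBroadcastSession.Delegate {
    func broadcastSession(_ session: IVSBroadcastSession,
                          didChange state: IVSBroadcastSession.State) {
        DispatchQueue.main.async {
            // React to connecting / connected / disconnected / error states.
            print("Broadcast state changed: \(state.rawValue)")
        }
    }

    func broadcastSession(_ session: IVSBroadcastSession,
                          didEmitError error: Error) {
        DispatchQueue.main.async {
            print("Broadcast error: \(error)")
        }
    }
}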
-
Represents an input device such as a camera or microphone. (See the enumeration sketch after the declaration below.)
Declaration
Objective-C
@protocol IVSDevice <NSObject>
Swift
protocol IVSDevice : NSObjectProtocol
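A minimal sketch of enumerating the devices currently attached to a session. It assumes listAttachedDevices() and descriptor() behave as in the SDK samples; field names on IVSDeviceDescriptor may differ between SDK versions.

import AmazonIVSBroadcast

func logAttachedDevices(of session: IVSBroadcastSession) {
    for device in session.listAttachedDevices() {
        let descriptor = device.descriptor()
        print("Attached device: \(descriptor.friendlyName) [\(descriptor.type)]")
    }
}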
-
An extension of IVSImageDevice that allows for submitting CMSampleBuffers manually. The currently supported pixel formats are:
kCVPixelFormatType_32BGRA
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
On devices that support it, the Lossless and Lossy equivalents of these formats are also supported. (A minimal submission sketch follows the declaration below.)
Note
Make sure you have an IVSMixerSlotConfiguration that requests the preferredVideoInput value of IVSDeviceTypeUserVideo.
Declaration
Objective-C
@protocol IVSCustomImageSource <IVSImageDevice>
Swift
protocol IVSCustomImageSource : IVSImageDevice
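A minimal sketch of submitting a CVPixelBuffer to a custom image source. The Core Media calls are standard Apple APIs; the slot name "customSlot", createImageSource(withName:), and onSampleBuffer(_:) are assumed to match the SDK samples and should be verified against the current reference.

import AVFoundation
import AmazonIVSBroadcast

func submitFrame(_ pixelBuffer: CVPixelBuffer,
                 at time: CMTime,
                 to imageSource: IVSCustomImageSource) {
    // Describe the pixel buffer (must use one of the supported formats,
    // e.g. kCVPixelFormatType_32BGRA).
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &formatDescription)
    guard let format = formatDescription else { return }

    // Wrap the pixel buffer in a CMSampleBuffer with presentation timing.
    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: time,
                                    decodeTimeStamp: .invalid)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer,
                                             formatDescription: format,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &sampleBuffer)
    if let sampleBuffer = sampleBuffer {
        imageSource.onSampleBuffer(sampleBuffer)
    }
}

// Creating and attaching the source (hypothetical slot name):
// let source = session.createImageSource(withName: "custom-video")
// session.attach(source, toSlotWithName: "customSlot", onComplete: nil)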
-
An extension of IVSCustomImageSource that is used to pre-encode an image or video to be rendered when the application goes into the background.
The timing information on the samples provided via onSampleBuffer is ignored on this image source; every image submitted will be encoded as the next frame based on the targetFramerate on the provided IVSVideoConfiguration.
Note
Samples submitted will be processed on the invoking thread. For large numbers of samples, submit them on a background queue.
Generating large numbers of samples from MP4 files is fairly straightforward using AVFoundation, and there are multiple ways to do it. You can use AVAssetImageGenerator and generateCGImagesAsynchronously to generate an image at every 1 / FPS increment; be certain to set requestedTimeToleranceAfter and requestedTimeToleranceBefore to .zero, otherwise it will batch the same frame multiple times. You can also use an AVPlayer instance with AVPlayerItemVideoOutput and a CADisplayLink, using the copyPixelBuffer API from the video output. Both approaches can provide a series of CVPixelBuffers to submit to this API in order to broadcast a looping clip while in the background. (A sketch of the AVAssetImageGenerator approach follows the declaration below.)
Declaration
Objective-C
@protocol IVSBackgroundImageSource <IVSCustomImageSource>
Swift
protocol IVSBackgroundImageSource : IVSCustomImageSource
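A minimal sketch of the AVAssetImageGenerator approach described above: pull one frame per 1 / FPS increment from an MP4 with zero time tolerances. The handler callback is a placeholder, not an SDK API; how the background image source itself is created, and the CGImage-to-CMSampleBuffer conversion, are left as comments to be filled in from the SDK reference.

import AVFoundation

func generateBackgroundFrames(from url: URL,
                              targetFramerate: Int,
                              handler: @escaping (CGImage) -> Void) {
    let asset = AVAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    // Zero tolerances, otherwise the generator may return the same frame
    // for several requested times.
    generator.requestedTimeToleranceBefore = .zero
    generator.requestedTimeToleranceAfter = .zero

    // Request one frame every 1 / targetFramerate seconds across the clip.
    let duration = asset.duration.seconds
    let step = 1.0 / Double(targetFramerate)
    let times = stride(from: 0.0, to: duration, by: step).map {
        NSValue(time: CMTime(seconds: $0, preferredTimescale: 600))
    }

    generator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, result, _ in
        guard result == .succeeded, let image = image else { return }
        // Wrap `image` in a CVPixelBuffer/CMSampleBuffer and submit it via
        // onSampleBuffer(_:) on the background image source. Timing is ignored;
        // each submitted frame advances by 1 / targetFramerate.
        handler(image)
    }
}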