Protocols
The following protocols are available globally.
-
An extension of IVSAudioDevice that allows for submitting CMSampleBuffers manually. This can be used to submit PCM audio directly to the SDK.
Note
Make sure you have an IVSMixerSlotConfiguration that requests the preferredAudioInput value of IVSDeviceTypeUserAudio (see the sketch after this entry).
Declaration
Objective-C
@protocol IVSCustomAudioSource <IVSAudioDevice>
Swift
protocol IVSCustomAudioSource : IVSAudioDevice
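A minimal sketch of driving a custom audio source. The createAudioSource(withName:) factory, the onSampleBuffer(_:) submission call, and the Swift spelling .userAudio for IVSDeviceTypeUserAudio are assumptions; verify them against your SDK version.
Swift
import AmazonIVSBroadcast
import CoreMedia

let discovery = IVSDeviceDiscovery()
// Assumed factory method; returns an IVSCustomAudioSource.
let audioSource = discovery.createAudioSource(withName: "app-audio")

// The mixer slot that renders this source must prefer user audio.
let slot = IVSMixerSlotConfiguration()
slot.preferredAudioInput = .userAudio

// Feed PCM wrapped in CMSampleBuffers, for example from an
// AVCaptureAudioDataOutput callback or an AVAudioEngine tap.
func submit(_ sampleBuffer: CMSampleBuffer) {
    audioSource.onSampleBuffer(sampleBuffer)
}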
-
An extension of IVSAudioDevice that represents a physical microphone accessible by the host device.
Declaration
Objective-C
@protocol IVSMicrophone <IVSAudioDevice, IVSMultiSourceDevice>
Swift
protocol IVSMicrophone : IVSAudioDevice, IVSMultiSourceDevice
-
A delegate that provides updates about the attached microphone.
Declaration
Objective-C
@protocol IVSMicrophoneDelegate
Swift
protocol IVSMicrophoneDelegate
-
Provide a delegate to receive status updates and errors from the SDK. Updates may be delivered on arbitrary threads rather than the main thread (see the sketch after this entry).
Declaration
Objective-C
@protocol IVSBroadcastSessionDelegate <NSObject>
Swift
protocol Delegate : NSObjectProtocol
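A minimal sketch of a conforming delegate. The state-change and error callbacks shown follow the common IVSBroadcastSession.Delegate pattern; treat the exact signatures as assumptions for your SDK version. Because updates may arrive on arbitrary threads, UI work is hopped to the main queue.
Swift
import Foundation
import AmazonIVSBroadcast

final class BroadcastObserver: NSObject, IVSBroadcastSession.Delegate {
    func broadcastSession(_ session: IVSBroadcastSession,
                          didChange state: IVSBroadcastSession.State) {
        DispatchQueue.main.async {
            // Update UI for connecting / connected / disconnected states.
            print("Broadcast state changed: \(state)")
        }
    }

    func broadcastSession(_ session: IVSBroadcastSession,
                          didEmitError error: Error) {
        DispatchQueue.main.async {
            print("Broadcast error: \(error)")
        }
    }
}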
-
Represents an input device such as a camera or microphone.
Declaration
Objective-C
@protocol IVSDevice <IVSErrorSource>
Swift
protocol IVSDevice : IVSErrorSource
-
Use this delegate to be notified when devices are added or removed.
Declaration
Objective-C
@protocol IVSDeviceDiscoveryDelegate <NSObject>
Swift
protocol IVSDeviceDiscoveryDelegate : NSObjectProtocol
-
An object capable of emitting errors.
Declaration
Objective-C
@protocol IVSErrorSource <NSObject>
Swift
protocol IVSErrorSource : NSObjectProtocol
-
Provide a delegate to receive errors emitted from IVSErrorSource objects. Updates may be delivered on arbitrary threads rather than the main thread.
Declaration
Objective-C
@protocol IVSErrorDelegate <NSObject>
Swift
protocol IVSErrorDelegate : NSObjectProtocol
-
An extension of IVSImageDevice that allows for submitting CMSampleBuffers manually. The currently supported pixel formats are:
kCVPixelFormatType_32BGRA
kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
On devices that support it, the Lossless and Lossy equivalents of these formats are also supported.
Note
Make sure you have an IVSMixerSlotConfiguration that requests the preferredVideoInput value of IVSDeviceTypeUserVideo (see the sketch after this entry).
Declaration
Objective-C
@protocol IVSCustomImageSource <IVSImageDevice>
Swift
protocol IVSCustomImageSource : IVSImageDevice
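A minimal sketch of submitting video frames to a custom image source. The createImageSource(withName:) factory, the onSampleBuffer(_:) submission call, and the Swift spelling .userVideo for IVSDeviceTypeUserVideo are assumptions; verify them against your SDK version. The pixel buffer must use one of the pixel formats listed above.
Swift
import AmazonIVSBroadcast
import CoreMedia
import CoreVideo

let discovery = IVSDeviceDiscovery()
// Assumed factory method; returns an IVSCustomImageSource.
let imageSource = discovery.createImageSource(withName: "app-video")

// The mixer slot that renders this source must prefer user video.
let slot = IVSMixerSlotConfiguration()
slot.preferredVideoInput = .userVideo

/// Wraps a CVPixelBuffer in a CMSampleBuffer and submits it to the custom image source.
func submit(_ pixelBuffer: CVPixelBuffer, at pts: CMTime) {
    var formatDescription: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &formatDescription)
    guard let formatDescription = formatDescription else { return }

    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: pts,
                                    decodeTimeStamp: .invalid)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer,
                                             formatDescription: formatDescription,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &sampleBuffer)
    if let sampleBuffer = sampleBuffer {
        imageSource.onSampleBuffer(sampleBuffer)
    }
}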
-
An extension of IVSCustomImageSource that is used to pre-encode an image or video to be rendered when the application goes into the background.
The timing information on samples provided via onSampleBuffer is ignored on this image source; every image submitted will be encoded as the next frame based on the targetFramerate of the provided IVSVideoConfiguration.
Note
Samples submitted will be processed on the invoking thread. For large numbers of samples, submit them on a background queue.
Generating large numbers of samples from MP4 files is straightforward using AVFoundation, and there are multiple ways to do it. You can use AVAssetImageGenerator and generateCGImagesAsynchronously to generate an image at every 1 / FPS increment. Be certain to set requestedTimeToleranceAfter and requestedTimeToleranceBefore to .zero, otherwise the generator will batch the same frame multiple times. You can also use an AVPlayer instance with AVPlayerItemVideoOutput and a CADisplayLink, using the copyPixelBuffer API from the video output. Both approaches can provide a series of CVPixelBuffers to submit to this API in order to broadcast a looping clip while in the background (see the sketch after this entry).
Declaration
Objective-C
@protocol IVSBackgroundImageSource <IVSCustomImageSource>
Swift
protocol IVSBackgroundImageSource : IVSCustomImageSource
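A minimal sketch of the AVAssetImageGenerator approach described above. The clip name and the frame rate are placeholders; match the frame rate to the targetFramerate of the IVSVideoConfiguration you use for the background image source. Converting each CGImage to a CVPixelBuffer/CMSampleBuffer for onSampleBuffer is omitted here.
Swift
import AVFoundation

// Placeholder: a looping clip bundled with the app.
let clipURL = Bundle.main.url(forResource: "clip", withExtension: "mp4")!
let asset = AVAsset(url: clipURL)
let generator = AVAssetImageGenerator(asset: asset)

// Without zero tolerances the generator may return the same frame for several requests.
generator.requestedTimeToleranceBefore = .zero
generator.requestedTimeToleranceAfter = .zero

let fps = 15.0 // placeholder; match your video configuration's targetFramerate
let times = stride(from: 0.0, to: asset.duration.seconds, by: 1.0 / fps).map {
    NSValue(time: CMTime(seconds: $0, preferredTimescale: 600))
}

generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, _ in
    guard result == .succeeded, let frame = cgImage else { return }
    // Convert `frame` to a CVPixelBuffer, wrap it in a CMSampleBuffer, and submit it
    // to the background image source via onSampleBuffer.
    _ = frame
}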
-
An extension of IVSImageDevice that represents a physical camera accessible by the host device.
Declaration
Objective-C
@protocol IVSCamera <IVSImageDevice, IVSMultiSourceDevice>
Swift
protocol IVSCamera : IVSImageDevice, IVSMultiSourceDevice
-
A delegate that provides updates about the attached camera on the main queue.
Declaration
Objective-C
@protocol IVSCameraDelegate <NSObject>
Swift
protocol IVSCameraDelegate : NSObjectProtocol
-
Protocol for messages parsed out of image frames.
Declaration
Objective-C
@protocol IVSBroadcastImageFrameMessage <NSObject>
Swift
protocol IVSImageFrameMessage : NSObjectProtocol
-
A mixed audio device that can accept multiple audio sources to be mixed together to produce a final output. A mixed audio device with no input sources will still produce output, but the output will be silent.
Note
Multiple audio sources will have their audio added together. If there are too many loud sources sending samples at the same time, the output may be clipped. Turn down the gain on individual sources to compensate if necessary (see the sketch after this entry).
This can be created through IVSDeviceDiscovery.
Declaration
Objective-C
@protocol IVSMixedAudioDevice <IVSAudioDevice, IVSMixedDevice>
Swift
protocol IVSMixedAudioDevice : IVSAudioDevice, IVSMixedDevice
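A minimal sketch of compensating for clipping by lowering per-source gain, assuming IVSAudioDevice exposes a setGain(_:) setter; verify that name against your SDK version.
Swift
import AmazonIVSBroadcast

/// Attenuates each attached source so the summed output stays below full scale.
func balance(sources: [IVSAudioDevice]) {
    let perSourceGain = Float(1.0) / Float(max(sources.count, 1))
    for source in sources {
        source.setGain(perSourceGain) // assumed setter on IVSAudioDevice
    }
}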
-
A mixed device is a device that can accept multiple input sources to be mixed together to produce a final output. A single IVSMixedDevice instance will contain either all audio or all video sources, and the sources and configurations attached will be of the same type. A mixed device with no input sources will still produce output, but the output will be a black image or silent audio, depending on the mixed device type.
A mixed device can be attached to an IVSBroadcastSession via attachDevice, or to an IVSStage by wrapping it in an IVSLocalStageStream (see the sketch after this entry).
Declaration
Objective-C
@protocol IVSMixedDevice
Swift
protocol IVSMixedDevice
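A minimal sketch of the two attachment paths described above. The slot name, the attach(_:toSlotWithName:onComplete:) overload, and the IVSLocalStageStream(device:) initializer are assumptions; verify them against your SDK version.
Swift
import AmazonIVSBroadcast

// Path 1: low-latency broadcast. Attach the mixed device to a mixer slot on the session.
func attachToBroadcast(_ device: IVSMixedDevice & IVSDevice,
                       session: IVSBroadcastSession) {
    session.attach(device, toSlotWithName: "mixed") { error in // assumed overload
        if let error = error { print("Attach failed: \(error)") }
    }
}

// Path 2: real-time stage. Wrap the mixed device in a local stream to publish
// through your IVSStage / IVSStageStrategy.
func makeStageStream(_ device: IVSMixedDevice & IVSDevice) -> IVSLocalStageStream {
    IVSLocalStageStream(device: device) // assumed initializer
}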
-
A mixed image device that can accept multiple image sources to be mixed together to produce a final output. A mixed image device with no input sources will still produce output, but the output will be a black image.
Note
Previews for this device will be delayed slightly from the input sources due to the time it takes to composite and render the various sources into a single output stream.
Warning
This device will stop mixing when the application is in the background, because access to Metal shaders is limited while backgrounded.
This can be created through IVSDeviceDiscovery.
Declaration
Objective-C
@protocol IVSMixedImageDevice <IVSImageDevice, IVSMixedDevice>
Swift
protocol IVSMixedImageDevice : IVSImageDevice, IVSMixedDevice
-
A delegate that provides information about the associated IVSRemoteStageStream.
Declaration
Objective-C
@protocol IVSRemoteStageStreamDelegate <IVSStageStreamDelegate>
Swift
protocol IVSRemoteStageStreamDelegate : IVSStageStreamDelegate
-
A protocol to implement that can be used to build user interfaces. Implementing an IVSStageRenderer provides all the necessary information about a Stage to create a complete UI.
Declaration
Objective-C
@protocol IVSStageRenderer <NSObject>
Swift
protocol IVSStageRenderer : NSObjectProtocol
-
The Strategy is the decision engine associated with an IVSStage. It is how the Stage asks the host application what actions to take. If the host application wants to change its answer to a question, it can call [IVSStage refreshStrategy] (see the sketch after this entry).
Declaration
Objective-C
@protocol IVSStageStrategy
Swift
protocol IVSStageStrategy
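A minimal sketch of a strategy the Stage consults for publish and subscribe decisions. The three callbacks shown follow the common IVSStageStrategy pattern from the real-time SDK samples; treat the exact signatures as assumptions for your SDK version.
Swift
import AmazonIVSBroadcast

final class SimpleStrategy: IVSStageStrategy {
    var wantsToPublish = false
    var streams = [IVSLocalStageStream]()

    func stage(_ stage: IVSStage,
               streamsToPublishForParticipant participant: IVSParticipantInfo) -> [IVSLocalStageStream] {
        streams
    }

    func stage(_ stage: IVSStage,
               shouldPublishParticipant participant: IVSParticipantInfo) -> Bool {
        wantsToPublish
    }

    func stage(_ stage: IVSStage,
               shouldSubscribeToParticipant participant: IVSParticipantInfo) -> IVSStageSubscribeType {
        .audioVideo
    }
}

// When an answer changes (for example the user taps "Go live"), update the state and
// ask the Stage to re-query the strategy:
//   strategy.wantsToPublish = true
//   stage.refreshStrategy()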
-
A delegate that provides information about the associated IVSStageStream.
Declaration
Objective-C
@protocol IVSStageStreamDelegate <NSObject>
Swift
protocol IVSStageStreamDelegate : NSObjectProtocol