Protocols

The following protocols are available globally.

  • This represents an IVSDevice that provides audio samples.

    See more

    Declaration

    Objective-C

    @protocol IVSAudioDevice <IVSDevice>

    Swift

    protocol IVSAudioDevice : IVSDevice
An extension of IVSAudioDevice that allows for submitting CMSampleBuffers manually. This can be used to submit PCM audio directly to the SDK; a usage sketch appears after this list.

    Note

    Make sure you have an IVSMixerSlotConfiguration that requests the preferredAudioInput value of IVSDeviceTypeUserAudio.
    See more

    Declaration

    Objective-C

    @protocol IVSCustomAudioSource <IVSAudioDevice>

    Swift

    protocol IVSCustomAudioSource : IVSAudioDevice
  • An extension of IVSAudioDevice that represents a physical microphone accessible by the host device.

    See more

    Declaration

    Objective-C

    @protocol IVSMicrophone <IVSAudioDevice, IVSMultiSourceDevice>

    Swift

    protocol IVSMicrophone : IVSAudioDevice, IVSMultiSourceDevice
  • A delegate that provides updates about the attached microphone.

    See more

    Declaration

    Objective-C

    @protocol IVSMicrophoneDelegate

    Swift

    protocol IVSMicrophoneDelegate
Provide a delegate to receive status updates and errors from the SDK. Updates may be delivered on arbitrary threads rather than the main thread; a threading sketch appears after this list.

    See more

    Declaration

    Objective-C

    @protocol IVSBroadcastSessionDelegate <NSObject>

    Swift

    protocol Delegate : NSObjectProtocol
  • Represents an input device such as a camera or microphone.

    See more

    Declaration

    Objective-C

    @protocol IVSDevice <IVSErrorSource>

    Swift

    protocol IVSDevice : IVSErrorSource
  • Represents an input device such as a camera or microphone with multiple underlying input sources.

    See more

    Declaration

    Objective-C

    @protocol IVSMultiSourceDevice <IVSDevice>

    Swift

    protocol IVSMultiSourceDevice : IVSDevice
Use this delegate to be notified about added and removed devices.

    See more

    Declaration

    Objective-C

    @protocol IVSDeviceDiscoveryDelegate <NSObject>

    Swift

    protocol IVSDeviceDiscoveryDelegate : NSObjectProtocol
  • An object capable of emitting errors.

    See more

    Declaration

    Objective-C

    @protocol IVSErrorSource <NSObject>

    Swift

    protocol IVSErrorSource : NSObjectProtocol
Provide a delegate to receive errors emitted from IVSErrorSource objects. Updates may be delivered on arbitrary threads rather than the main thread.

    See more

    Declaration

    Objective-C

    @protocol IVSErrorDelegate <NSObject>

    Swift

    protocol IVSErrorDelegate : NSObjectProtocol
  • This represents an IVSDevice that provides video samples.

    See more

    Declaration

    Objective-C

    @protocol IVSImageDevice <IVSDevice>

    Swift

    protocol IVSImageDevice : IVSDevice
An extension of IVSImageDevice that allows for submitting CMSampleBuffers manually. The currently supported pixel formats are kCVPixelFormatType_32BGRA, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, and kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange. On devices that support it, the Lossless and Lossy equivalents of these formats are also supported. A submission sketch appears after this list.

    Note

    Make sure you have an IVSMixerSlotConfiguration that requests the preferredVideoInput value of IVSDeviceTypeUserVideo.
    See more

    Declaration

    Objective-C

    @protocol IVSCustomImageSource <IVSImageDevice>

    Swift

    protocol IVSCustomImageSource : IVSImageDevice
An extension of IVSCustomImageSource that is used to pre-encode an image or video to be rendered when the application goes into the background.

The timing information on the samples provided via onSampleBuffer is ignored on this image source; every image submitted will be encoded as the next frame, based on the targetFramerate of the provided IVSVideoConfiguration.

    Note

Samples submitted will be processed on the invoking thread. For large numbers of samples, submit them on a background queue.

Generating large numbers of samples from MP4 files is fairly straightforward using AVFoundation, and there are multiple ways to do it. You can use AVAssetImageGenerator and generateCGImagesAsynchronously to generate an image at every 1 / FPS increment. Be certain to set requestedTimeToleranceAfter and requestedTimeToleranceBefore to .zero, otherwise it will return the same frame multiple times. A sketch of this approach appears after this list.

You can also use an AVPlayer instance with AVPlayerItemVideoOutput and a CADisplayLink, using the copyPixelBuffer API from the video output.

    Both can provide a series of CVPixelBuffers to submit to this API in order to broadcast a looping clip while in the background.

    See more

    Declaration

    Objective-C

    @protocol IVSBackgroundImageSource <IVSCustomImageSource>

    Swift

    protocol IVSBackgroundImageSource : IVSCustomImageSource
  • An extension of IVSImageDevice that represents a physical camera accessible by the host device.

    Declaration

    Objective-C

    @protocol IVSCamera <IVSImageDevice, IVSMultiSourceDevice>

    Swift

    protocol IVSCamera : IVSImageDevice, IVSMultiSourceDevice
A protocol to implement that can be used to build user interfaces. Implementing an IVSStageRenderer provides all the necessary information about a Stage to create a complete UI.

    See more

    Declaration

    Objective-C

    @protocol IVSStageRenderer <NSObject>

    Swift

    protocol IVSStageRenderer : NSObjectProtocol
The Strategy is the decision engine associated with an IVSStage. It is how the Stage asks the host application what actions to take. If the host application wants to change its answer to a question, it can call [IVSStage refreshStrategy]. A conformance sketch appears after this list.

    See more

    Declaration

    Objective-C

    @protocol IVSStageStrategy

    Swift

    protocol IVSStageStrategy
  • A delegate that provides information about the associated IVSStageStream.

    See more

    Declaration

    Objective-C

    @protocol IVSStageStreamDelegate <NSObject>

    Swift

    protocol IVSStageStreamDelegate : NSObjectProtocol
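
The sketch below illustrates the IVSCustomAudioSource note above: a mixer slot that requests IVSDeviceTypeUserAudio, a custom audio source attached to it, and PCM submitted through onSampleBuffer. The factory and attach spellings (setName, createAudioSource(withName:), attach(_:toSlotWithName:)), the slot name, and the session setup are assumptions to verify against the broadcast session reference, not guaranteed API.

    Swift

    import AmazonIVSBroadcast
    import CoreMedia

    // Build a session whose mixer slot prefers user-provided audio,
    // per the IVSCustomAudioSource note, and attach a custom PCM source.
    func makeCustomAudioSession() throws -> (IVSBroadcastSession, IVSCustomAudioSource) {
        let slot = IVSMixerSlotConfiguration()
        try slot.setName("custom-audio")          // assumed throwing setter; name is illustrative
        slot.preferredAudioInput = .userAudio     // IVSDeviceTypeUserAudio

        let config = IVSBroadcastConfiguration()
        config.mixer.slots = [slot]               // assumed plain property

        let session = try IVSBroadcastSession(configuration: config,
                                              descriptors: nil,
                                              delegate: nil)

        // Assumed factory / attach spellings; verify against the session reference.
        let pcmSource = session.createAudioSource(withName: "my-pcm")
        session.attach(pcmSource, toSlotWithName: "custom-audio") { _ in }
        return (session, pcmSource)
    }

    // Whenever a CMSampleBuffer of PCM audio becomes available:
    //     pcmSource.onSampleBuffer(sampleBuffer)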
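
As noted for IVSBroadcastSessionDelegate above, callbacks may arrive off the main thread, so UI work should be dispatched back to the main queue. A minimal sketch; the didChange/didEmitError method spellings are written from memory of the Swift surface and should be checked against the protocol reference.

    Swift

    import AmazonIVSBroadcast
    import Foundation

    final class BroadcastObserver: NSObject, IVSBroadcastSession.Delegate {
        // Assumed selector spellings; delegate calls can arrive on arbitrary
        // threads, so hop to the main queue before touching UI.
        func broadcastSession(_ session: IVSBroadcastSession,
                              didChange state: IVSBroadcastSession.State) {
            DispatchQueue.main.async {
                print("Broadcast state changed: \(state.rawValue)")
            }
        }

        func broadcastSession(_ session: IVSBroadcastSession,
                              didEmitError error: Error) {
            DispatchQueue.main.async {
                print("Broadcast error: \(error.localizedDescription)")
            }
        }
    }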
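
For IVSCustomImageSource, the sketch below wraps a kCVPixelFormatType_32BGRA pixel buffer (one of the supported formats listed above) in a CMSampleBuffer using standard Core Media calls and submits it via onSampleBuffer. Obtaining the source itself (for example from a createImageSource(withName:)-style factory, attached to a slot whose preferredVideoInput is IVSDeviceTypeUserVideo) is assumed and not shown.

    Swift

    import AmazonIVSBroadcast
    import CoreMedia
    import CoreVideo

    // Wrap a BGRA CVPixelBuffer in a CMSampleBuffer and hand it to the custom source.
    func submit(_ pixelBuffer: CVPixelBuffer,
                at time: CMTime,
                to imageSource: IVSCustomImageSource) {
        var formatDescription: CMVideoFormatDescription?
        CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                     imageBuffer: pixelBuffer,
                                                     formatDescriptionOut: &formatDescription)
        guard let format = formatDescription else { return }

        var timing = CMSampleTimingInfo(duration: .invalid,
                                        presentationTimeStamp: time,
                                        decodeTimeStamp: .invalid)
        var sampleBuffer: CMSampleBuffer?
        CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescription: format,
                                                 sampleTiming: &timing,
                                                 sampleBufferOut: &sampleBuffer)
        if let sampleBuffer = sampleBuffer {
            imageSource.onSampleBuffer(sampleBuffer)   // onSampleBuffer per the docs above
        }
    }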
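
The AVAssetImageGenerator approach described for IVSBackgroundImageSource is sketched below: one image per 1 / targetFramerate step, with both time tolerances set to .zero. Converting each CGImage into a CMSampleBuffer and submitting it to the background source via onSampleBuffer is left to the supplied closure; the framerate and timescale values are illustrative.

    Swift

    import AVFoundation

    // Generate one CGImage per 1 / targetFramerate step of an MP4 clip.
    func generateBackgroundFrames(from url: URL,
                                  targetFramerate: Int,
                                  duration: Double,
                                  onImage: @escaping (CGImage) -> Void) {
        let asset = AVAsset(url: url)
        let generator = AVAssetImageGenerator(asset: asset)

        // Zero tolerances, otherwise the generator snaps to nearby frames
        // and returns the same image multiple times.
        generator.requestedTimeToleranceBefore = .zero
        generator.requestedTimeToleranceAfter = .zero

        let step = 1.0 / Double(targetFramerate)
        let times = stride(from: 0.0, to: duration, by: step).map {
            NSValue(time: CMTime(seconds: $0, preferredTimescale: 600))
        }

        generator.generateCGImagesAsynchronously(forTimes: times) { _, cgImage, _, result, _ in
            guard result == .succeeded, let cgImage = cgImage else { return }
            // Convert to a CVPixelBuffer / CMSampleBuffer and submit to the
            // background image source (timing is ignored by that source).
            onImage(cgImage)
        }
    }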
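
Finally, a sketch of an IVSStageStrategy conformance. The requirement names and related types below (streamsToPublishForParticipant, shouldPublishParticipant, shouldSubscribeToParticipant, IVSParticipantInfo, IVSLocalStageStream, IVSStageSubscribeType) are written from memory of the Stages API and are assumptions to verify against the protocol reference; the point is that each answer is recomputed whenever the host application calls [IVSStage refreshStrategy].

    Swift

    import AmazonIVSBroadcast

    // NOTE: requirement spellings below are assumptions; verify against IVSStageStrategy.
    final class SimpleStrategy: IVSStageStrategy {
        var publishedStreams: [IVSLocalStageStream] = []
        var wantsPublish = false
        var wantsAudioVideo = true

        func stage(_ stage: IVSStage,
                   streamsToPublishForParticipant participant: IVSParticipantInfo) -> [IVSLocalStageStream] {
            return publishedStreams
        }

        func stage(_ stage: IVSStage,
                   shouldPublishParticipant participant: IVSParticipantInfo) -> Bool {
            return wantsPublish
        }

        func stage(_ stage: IVSStage,
                   shouldSubscribeToParticipant participant: IVSParticipantInfo) -> IVSStageSubscribeType {
            return wantsAudioVideo ? .audioVideo : .audioOnly
        }
    }

    // After changing any of these answers, ask the Stage to re-query the strategy:
    //     stage.refreshStrategy()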