Protocols

The following protocols are available globally.

  • This represents an IVSDevice that provides audio samples.

    See more

    Declaration

    Objective-C

    @protocol IVSAudioDevice <IVSDevice>

    Swift

    protocol IVSAudioDevice : IVSDevice
  • An extension of IVSAudioDevice that allows for submitting CMSampleBuffers manually. This can be used to submit PCM audio directly to the SDK.

    Note

    Make sure you have an IVSMixerSlotConfiguration that requests the preferredAudioInput value of IVSDeviceTypeUserAudio.
    See more

    Declaration

    Objective-C

    @protocol IVSCustomAudioSource <IVSAudioDevice>

    Swift

    protocol IVSCustomAudioSource : IVSAudioDevice
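
    As a sketch of how this protocol might be used (the slot name "customAudio" is a hypothetical example; the session, slot, and attach calls follow the SDK's documented shape but are not verified against a specific version):

```swift
import AmazonIVSBroadcast
import CoreMedia

// Per the note above, the mixer slot must request a preferredAudioInput
// of .userAudio for a custom audio source to be routed into the mix.
let configuration = IVSBroadcastConfiguration()
let slot = IVSMixerSlotConfiguration()
try slot.setName("customAudio")
slot.preferredAudioInput = .userAudio
configuration.mixer.slots = [slot]

let session = try IVSBroadcastSession(configuration: configuration,
                                      descriptor: nil,
                                      delegate: nil)

// createAudioSource(withName:) returns an IVSCustomAudioSource that
// accepts CMSampleBuffers containing PCM audio.
let audioSource = session.createAudioSource(withName: "customAudio")
session.attach(audioSource, toSlotWithName: "customAudio", onComplete: nil)

// Later, for each PCM CMSampleBuffer the app produces:
func submit(_ sampleBuffer: CMSampleBuffer) {
    audioSource.onSampleBuffer(sampleBuffer)
}
```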
  • An extension of IVSAudioDevice that represents a physical microphone accessible by the host device.

    See more

    Declaration

    Objective-C

    @protocol IVSMicrophone <IVSAudioDevice, IVSMultiSourceDevice>

    Swift

    protocol IVSMicrophone : IVSAudioDevice, IVSMultiSourceDevice
  • A delegate that provides updates about the attached microphone.

    See more

    Declaration

    Objective-C

    @protocol IVSMicrophoneDelegate

    Swift

    protocol IVSMicrophoneDelegate
  • Provide a delegate to receive status updates and errors from the SDK. Updates may be delivered on arbitrary threads, not necessarily the main thread.

    See more

    Declaration

    Objective-C

    @protocol IVSBroadcastSessionDelegate <NSObject>

    Swift

    protocol Delegate : NSObjectProtocol
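
    Because these callbacks may arrive off the main thread, UI work should be dispatched to the main queue inside the delegate. A minimal conforming sketch (the class name is illustrative):

```swift
import AmazonIVSBroadcast

final class BroadcastEventHandler: NSObject, IVSBroadcastSession.Delegate {
    func broadcastSession(_ session: IVSBroadcastSession,
                          didChange state: IVSBroadcastSession.State) {
        // The callback itself may run on any thread; hop to main before
        // touching UI.
        DispatchQueue.main.async {
            print("Broadcast state changed: \(state.rawValue)")
        }
    }

    func broadcastSession(_ session: IVSBroadcastSession,
                          didEmitError error: Error) {
        DispatchQueue.main.async {
            print("Broadcast error: \(error.localizedDescription)")
        }
    }
}
```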
  • Represents an input device such as a camera or microphone.

    See more

    Declaration

    Objective-C

    @protocol IVSDevice <IVSErrorSource>

    Swift

    protocol IVSDevice : IVSErrorSource
  • Represents an input device such as a camera or microphone with multiple underlying input sources.

    See more

    Declaration

    Objective-C

    @protocol IVSMultiSourceDevice <IVSDevice>

    Swift

    protocol IVSMultiSourceDevice : IVSDevice
  • Use this delegate to be notified about added and removed devices.

    See more

    Declaration

    Objective-C

    @protocol IVSDeviceDiscoveryDelegate <NSObject>

    Swift

    protocol IVSDeviceDiscoveryDelegate : NSObjectProtocol
  • An object capable of emitting errors.

    See more

    Declaration

    Objective-C

    @protocol IVSErrorSource <NSObject>

    Swift

    protocol IVSErrorSource : NSObjectProtocol
  • Provide a delegate to receive errors emitted from IVSErrorSource objects. Updates may be delivered on arbitrary threads, not necessarily the main thread.

    See more

    Declaration

    Objective-C

    @protocol IVSErrorDelegate <NSObject>

    Swift

    protocol IVSErrorDelegate : NSObjectProtocol
  • This represents an IVSDevice that provides video samples.

    See more

    Declaration

    Objective-C

    @protocol IVSImageDevice <IVSDevice>

    Swift

    protocol IVSImageDevice : IVSDevice
  • An extension of IVSImageDevice that allows for submitting CMSampleBuffers manually. The currently supported pixel formats are:

    - kCVPixelFormatType_32BGRA
    - kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    - kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange

    On devices that support it, the Lossless and Lossy equivalents of these formats are also supported.

    Note

    Make sure you have an IVSMixerSlotConfiguration that requests the preferredVideoInput value of IVSDeviceTypeUserVideo.
    See more

    Declaration

    Objective-C

    @protocol IVSCustomImageSource <IVSImageDevice>

    Swift

    protocol IVSCustomImageSource : IVSImageDevice
  • An extension of IVSCustomImageSource that is used to pre-encode an image or video to be rendered when the application goes into the background.

    The timing information on the samples provided via onSampleBuffer is ignored on this image source; every image submitted will be encoded as the next frame, based on the targetFramerate of the provided IVSVideoConfiguration.

    Note

    Samples submitted will be processed on the invoking thread. For large numbers of samples, submit them on a background queue.

    Generating large numbers of samples from MP4 files is fairly straightforward using AVFoundation, and there are multiple ways to do it. You can use AVAssetImageGenerator and generateCGImagesAsynchronously to generate an image at every 1 / FPS increment. Be certain to set requestedTimeToleranceAfter and requestedTimeToleranceBefore to .zero; otherwise it will batch the same frame multiple times.

    You can also use an AVPlayer instance with AVPlayerItemVideoOutput and a CADisplayLink, using the copyPixelBuffer API from the video output.

    Both can provide a series of CVPixelBuffers to submit to this API in order to broadcast a looping clip while in the background.

    See more

    Declaration

    Objective-C

    @protocol IVSBackgroundImageSource <IVSCustomImageSource>

    Swift

    protocol IVSBackgroundImageSource : IVSCustomImageSource
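
    The AVAssetImageGenerator approach described above needs one request time per frame, spaced 1 / FPS apart. A small SDK-independent helper for computing those times (the function name is illustrative; the resulting values would be wrapped in CMTime when calling generateCGImagesAsynchronously):

```swift
import Foundation

/// Returns the timestamps (in seconds) at which to request frames from an
/// asset of `duration` seconds, spaced 1 / `framerate` apart. Computing each
/// time from its index avoids floating-point drift from repeated addition.
func frameRequestTimes(duration: Double, framerate: Int) -> [Double] {
    guard framerate > 0, duration > 0 else { return [] }
    let frameCount = Int((duration * Double(framerate)).rounded())
    return (0..<frameCount).map { Double($0) / Double(framerate) }
}
```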
  • An extension of IVSImageDevice that represents a physical camera accessible by the host device.

    See more

    Declaration

    Objective-C

    @protocol IVSCamera <IVSImageDevice, IVSMultiSourceDevice>

    Swift

    protocol IVSCamera : IVSImageDevice, IVSMultiSourceDevice
  • A delegate that provides updates about the attached camera on the main queue.

    See more

    Declaration

    Objective-C

    @protocol IVSCameraDelegate <NSObject>

    Swift

    protocol IVSCameraDelegate : NSObjectProtocol
  • Protocol for messages parsed out of image frames.

    Declaration

    Objective-C

    @protocol IVSBroadcastImageFrameMessage <NSObject>

    Swift

    protocol IVSImageFrameMessage : NSObjectProtocol
  • A mixed audio device that can accept multiple audio sources to be mixed together to produce a final output. A mixed audio device with no input sources will still produce output, but the output will be silent.

    Note

    Multiple audio sources will have their audio added together. If there are too many loud sources sending samples at the same time, the output may be clipped. Turn down the gain on individual sources to compensate if necessary.

    This can be created through IVSDeviceDiscovery.

    See more

    Declaration

    Objective-C

    @protocol IVSMixedAudioDevice <IVSAudioDevice, IVSMixedDevice>

    Swift

    protocol IVSMixedAudioDevice : IVSAudioDevice, IVSMixedDevice
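
    One simple way to apply the gain advice in the note above is to scale per-source gains so their total does not exceed 1.0. A sketch of that normalization (the function is illustrative, not part of the SDK; the computed values would be applied to each source's gain):

```swift
import Foundation

/// Scales a set of per-source gains so their sum does not exceed 1.0 while
/// preserving their relative levels. Since mixed audio sources are added
/// together, keeping the total at or below 1.0 avoids clipping when many
/// loud sources send samples at the same time.
func normalizedGains(_ gains: [Float]) -> [Float] {
    let total = gains.reduce(0, +)
    guard total > 1.0 else { return gains }
    return gains.map { $0 / total }
}
```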
  • A mixed device is a device that can accept multiple input sources to be mixed together to produce a final output. A single IVSMixedDevice instance will contain either all audio or all video sources, and the sources and configurations attached will be of the same type. A mixed device with no input sources will still produce output, but the output will be a black image or silent audio, depending on the mixed device type.

    A mixed device can be attached to an IVSBroadcastSession via attachDevice or to an IVSStage by wrapping it in a IVSLocalStageStream.

    Declaration

    Objective-C

    @protocol IVSMixedDevice

    Swift

    protocol IVSMixedDevice
  • A mixed image device that can accept multiple image sources to be mixed together to produce a final output. A mixed image device with no input sources will still produce output, but the output will be a black image.

    Note

    Previews for this device will be delayed slightly from the input sources due to the time it takes to composite and render the various sources into a single output stream.

    Warning

    This device will stop mixing when the application is in the background, as a result of limited access to Metal shaders while backgrounded.

    This can be created through IVSDeviceDiscovery.

    See more

    Declaration

    Objective-C

    @protocol IVSMixedImageDevice <IVSImageDevice, IVSMixedDevice>

    Swift

    protocol IVSMixedImageDevice : IVSImageDevice, IVSMixedDevice
  • A delegate that provides information about the associated IVSRemoteStageStream.

    See more

    Declaration

    Objective-C

    @protocol IVSRemoteStageStreamDelegate <IVSStageStreamDelegate>

    Swift

    protocol IVSRemoteStageStreamDelegate : IVSStageStreamDelegate
  • A protocol to implement that can be used to build user interfaces. Implementing an IVSStageRenderer provides all the necessary information about a Stage to create a complete UI.

    See more

    Declaration

    Objective-C

    @protocol IVSStageRenderer <NSObject>

    Swift

    protocol IVSStageRenderer : NSObjectProtocol
  • The Strategy is the decision engine associated with an IVSStage. It is how the Stage asks the host application what actions to take. If the host application wants to change its answer to a question, it can call [IVSStage refreshStrategy].

    See more

    Declaration

    Objective-C

    @protocol IVSStageStrategy

    Swift

    protocol IVSStageStrategy
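
    A minimal conforming sketch: publish the app's local streams and subscribe to audio and video from every participant (the class name and stored streams property are illustrative; method names follow the SDK's documented strategy callbacks):

```swift
import AmazonIVSBroadcast

final class SimpleStrategy: IVSStageStrategy {
    // Hypothetical place the app keeps its local streams (camera, mic).
    var streams: [IVSLocalStageStream] = []

    func stage(_ stage: IVSStage,
               streamsToPublishForParticipant participant: IVSParticipantInfo) -> [IVSLocalStageStream] {
        return streams
    }

    func stage(_ stage: IVSStage,
               shouldPublishParticipant participant: IVSParticipantInfo) -> Bool {
        return true
    }

    func stage(_ stage: IVSStage,
               shouldSubscribeToParticipant participant: IVSParticipantInfo) -> IVSStageSubscribeType {
        return .audioVideo
    }
}
```

    If the app later changes what these methods return (for example, to stop publishing), it should call refreshStrategy on the stage so the SDK re-queries the strategy.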
  • A delegate that provides information about the associated IVSStageStream.

    See more

    Declaration

    Objective-C

    @protocol IVSStageStreamDelegate <NSObject>

    Swift

    protocol IVSStageStreamDelegate : NSObjectProtocol