Classes
The following classes are available globally.
-
Declaration
Swift
@objcMembers public class DefaultEventAnalyticsController : NSObject, EventAnalyticsController
-
Declaration
Swift
@objcMembers public class MeetingHistoryEvent : NSObject
-
SDKEvent defines an event composed of an event name and attributes that describe the event.
Declaration
Swift
@objcMembers public class SDKEvent : NSObject
-
AudioVideoConfiguration represents the configuration to be used for audio and video during a meeting session.
Declaration
Swift
@objcMembers public class AudioVideoConfiguration : NSObject
-
Declaration
Swift
@objcMembers public class DefaultAudioVideoController : NSObject, AudioVideoControllerFacade
-
Declaration
Swift
@objcMembers public class DefaultAudioVideoFacade : NSObject, AudioVideoFacade
-
Declaration
Swift
@objcMembers public class DefaultActiveSpeakerDetector : NSObject, ActiveSpeakerDetectorFacade, RealtimeObserver
-
Declaration
Swift
@objcMembers public class DefaultActiveSpeakerPolicy : NSObject, ActiveSpeakerPolicy
-
ContentShareSource contains the media sources to attach to the content share.
Declaration
Swift
@objcMembers public class ContentShareSource : NSObject
-
ContentShareStatus indicates a status received regarding the content share.
Declaration
Swift
@objcMembers public class ContentShareStatus : NSObject
-
Declaration
Swift
@objcMembers public class DefaultContentShareController : NSObject, ContentShareController
-
InAppScreenCaptureSource is used to share screen capture from within the app. When the app is in the background, no samples are sent to the handler and screen sharing is paused. InAppScreenCaptureSource is only available on iOS 11+ because of the RPScreenRecorder.startCapture(handler:completionHandler:) method. InAppScreenCaptureSource does not support rotation while capture is in progress.
Declaration
Swift
@available(iOS 11.0, *) @objcMembers public class InAppScreenCaptureSource : NSObject, VideoCaptureSource
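A rough sketch of how the content share pieces above fit together: an InAppScreenCaptureSource is started, attached to a ContentShareSource, and handed to the content share controller. The initializer parameters and facade method names below are assumptions, not confirmed by this page.
Swift
// Sketch: share the app's own screen as a content share.
// Assumes `audioVideo` is an AudioVideoFacade and `logger` is a Logger.
let screenCaptureSource = InAppScreenCaptureSource(logger: logger)
let contentShareSource = ContentShareSource()
contentShareSource.videoSource = screenCaptureSource

screenCaptureSource.start()
audioVideo.startContentShare(source: contentShareSource)

// Later, stop sharing.
audioVideo.stopContentShare()
screenCaptureSource.stop()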
-
ReplayKitSource repackages CMSampleBuffer objects from ReplayKit into SDK-usable VideoFrame objects. It currently supports resending video frames to maintain a minimum frame rate. It does not directly contain any system library calls that actually capture the screen. Builders can use InAppScreenCaptureSource to share the screen from only their application. For device-level screen broadcast, take a look at the SampleHandler in AmazonChimeSDKDemoBroadcast.
Declaration
Swift
@objcMembers public class ReplayKitSource : VideoSource
-
Declaration
Swift
@objcMembers public class AttendeeInfo : NSObject, Comparable
-
Data message received from the server.
Declaration
Swift
@objcMembers public class DataMessage : NSObject
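A minimal sketch of sending and receiving data messages through the realtime facade. The observer registration, send method, and callback signature below are assumptions about the realtime API and are not confirmed by this page.
Swift
// Sketch: subscribe to a topic and send a message on it.
// Assumes `audioVideo` is an AudioVideoFacade; names and labels are assumptions.
class ChatObserver: DataMessageObserver {
    func dataMessageDidReceived(dataMessage: DataMessage) {
        // text() decodes the payload as a UTF-8 string when possible.
        print(dataMessage.text() ?? "")
    }
}

let chatObserver = ChatObserver()
audioVideo.addRealtimeDataMessageObserver(topic: "chat", observer: chatObserver)
try? audioVideo.realtimeSendDataMessage(topic: "chat", data: "hello", lifetimeMs: 1000)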
-
Declaration
Swift
@objcMembers public class SignalUpdate : NSObject
-
Declaration
Swift
@objcMembers public class VolumeUpdate : NSObject
-
See the Using Amazon Chime SDK live transcription developer guide for details about transcription message types and data guidelines.
Declaration
Swift
@objcMembers public class Transcript : NSObject, TranscriptEvent
-
See the Using Amazon Chime SDK live transcription developer guide for details about transcription message types and data guidelines.
Declaration
Swift
@objcMembers public class TranscriptAlternative : NSObject
-
See the Using Amazon Chime SDK live transcription developer guide for details about transcription message types and data guidelines.
Declaration
Swift
@objcMembers public class TranscriptEntity : NSObject
-
See the Using Amazon Chime SDK live transcription developer guide for details about transcription message types and data guidelines.
Declaration
Swift
@objcMembers public class TranscriptItem : NSObject
-
See the Using Amazon Chime SDK live transcription developer guide for details about transcription message types and data guidelines.
Declaration
Swift
@objcMembers public class TranscriptLanguageWithScore : NSObject
-
See the Using Amazon Chime SDK live transcription developer guide for details about transcription message types and data guidelines.
Declaration
Swift
@objcMembers public class TranscriptResult : NSObject
-
See the Using Amazon Chime SDK live transcription developer guide for details about transcription message types and data guidelines.
Declaration
Swift
@objcMembers public class TranscriptionStatus : NSObject, TranscriptEvent
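Both Transcript and TranscriptionStatus conform to TranscriptEvent, so a handler typically switches on the concrete type. The callback name below is an assumption about the transcript observer API; only the class names come from this page.
Swift
// Sketch: handle transcript events delivered by the SDK (callback name assumed).
func transcriptEventDidReceive(transcriptEvent: TranscriptEvent) {
    switch transcriptEvent {
    case let status as TranscriptionStatus:
        // Start/stop/interrupt style status updates.
        print("Transcription status: \(status)")
    case let transcript as Transcript:
        // A Transcript carries results, alternatives, and items per the classes above.
        print("Transcript: \(transcript)")
    default:
        break
    }
}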
-
Declaration
Swift
@objcMembers public class DefaultVideoRenderView : UIImageView, VideoRenderView
-
Declaration
Swift
@objcMembers public class DefaultVideoTileController : NSObject, VideoTileController
-
Configuration for a local video or content share to be sent.
Declaration
Swift
@objcMembers public class LocalVideoConfiguration : NSObject
-
A video source available in the current meeting. RemoteVideoSource objects need to be consistent between remoteVideoSourcesDidBecomeAvailable and updateVideoSourceSubscriptions because they are used as keys in maps that may be updated; i.e. when setting up a map for updateVideoSourceSubscriptions, do not construct RemoteVideoSource objects yourself, or the configuration may not be updated. See the sketch after the declaration below.
Declaration
Swift
@objcMembers public class RemoteVideoSource : NSObject
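A minimal sketch of the pattern described above: cache the sources the SDK hands you and reuse those exact objects as keys. The updateVideoSourceSubscriptions(addedOrUpdated:removed:) signature is an assumption, not confirmed by this page.
Swift
// Sketch: reuse SDK-provided RemoteVideoSource objects as subscription keys.
// Assumes `audioVideo` is an AudioVideoFacade; signature details are assumptions.
var availableSources: [RemoteVideoSource] = []

func remoteVideoSourcesDidBecomeAvailable(sources: [RemoteVideoSource]) {
    availableSources.append(contentsOf: sources)

    // Do not construct new RemoteVideoSource values here.
    var configs: [RemoteVideoSource: VideoSubscriptionConfiguration] = [:]
    for source in sources {
        configs[source] = VideoSubscriptionConfiguration()
    }
    audioVideo.updateVideoSourceSubscriptions(addedOrUpdated: configs, removed: [])
}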
-
Video bitrates for regular and high-resolution meetings.
Declaration
Swift
@objc public class VideoBitrateConstants : NSObject
-
VideoFrame is a class which contains a VideoFrameBuffer and the metadata necessary for transmission. It is typically produced via a VideoSource and consumed via a VideoSink.
Declaration
Swift
@objcMembers public class VideoFrame : NSObject
-
VideoFramePixelBuffer is a buffer which contains a single video frame in the form of a CVPixelBuffer.
Declaration
Swift
@objcMembers public class VideoFramePixelBuffer : NSObject, VideoFrameBuffer
-
Customizable video resolution parameters for a remote video source.
Declaration
Swift
@objc public class VideoResolution : NSObject
-
Configuration for a specific video source. The values are intentionally mutable so that a map of all current configurations can be kept and updated as needed. VideoSubscriptionConfiguration is used to contain the priority and resolution of the remote video sources and content shares to be received.
Declaration
Swift
@objcMembers public class VideoSubscriptionConfiguration : NSObject
-
BackgroundFilterProcessor is a processor that uses a SegmentationProcessor to process a frame by creating the alpha mask of the foreground image and blending the mask with the input image, which is then rendered on top of a background image.
Declaration
Swift
public class BackgroundFilterProcessor
-
NoopSegmentationProcessor is a processor that does nothing except pass image frames in and out. It is used as a placeholder for implementations of SegmentationProcessor that cannot be initialized.
Declaration
Swift
public class NoopSegmentationProcessor : SegmentationProcessor
-
Declaration
Swift
@objcMembers public class BackgroundBlurConfiguration : NSObject
-
BackgroundBlurVideoFrameProcessor is a processor which receives video frames via a VideoSource, applies a Gaussian blur to each video frame, and renders the foreground on top of the blurred image. The Gaussian blur is applied using the built-in CIGaussianBlur CIFilter.
Declaration
Swift
@objcMembers public class BackgroundBlurVideoFrameProcessor : NSObject, VideoSource, VideoSink
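Because the processor is both a VideoSink and a VideoSource, it is typically placed between a camera capture source and the local video sender. A rough sketch follows; the BackgroundBlurConfiguration parameters and the startLocalVideo(source:) call are assumptions, not confirmed by this page.
Swift
// Sketch: insert the blur processor between the camera source and local video.
// Assumes `audioVideo` is an AudioVideoFacade and `logger` is a Logger.
let camera = DefaultCameraCaptureSource(logger: logger)
let blurProcessor = BackgroundBlurVideoFrameProcessor(
    backgroundBlurConfiguration: BackgroundBlurConfiguration(logger: logger)
)

// Camera frames flow into the processor (VideoSink), and the processor
// acts as the VideoSource for the local video.
camera.addVideoSink(sink: blurProcessor)
camera.start()
audioVideo.startLocalVideo(source: blurProcessor)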
-
Declaration
Swift
@objcMembers public class BackgroundReplacementConfiguration : NSObject
-
BackgroundReplacementVideoFrameProcessor is a processor which receives video frames via a VideoSource and then creates the foreground image, which is rendered on top of a background image.
Declaration
Swift
@objcMembers public class BackgroundReplacementVideoFrameProcessor : NSObject, VideoSource, VideoSink
-
Declaration
Swift
@objcMembers public class DefaultCameraCaptureSource : NSObject, CameraCaptureSource
extension DefaultCameraCaptureSource: AVCaptureVideoDataOutputSampleBufferDelegate
-
VideoCaptureFormat describes a given capture format that may be possible to apply to a VideoCaptureSource. Note that VideoCaptureSource implementations may ignore or adjust unsupported values.
Declaration
Swift
@objcMembers public class VideoCaptureFormat : NSObject
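A small sketch of applying a capture format to a camera capture source. The width/height/maxFrameRate initializer labels and the settable format property are assumptions about the capture API, not taken from this page.
Swift
// Sketch: apply a 720p, 15 fps capture format (labels assumed).
let camera = DefaultCameraCaptureSource(logger: logger)
camera.format = VideoCaptureFormat(width: 1280, height: 720, maxFrameRate: 15)
camera.start()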
-
Declaration
Swift
@objcMembers public class DefaultDeviceController : NSObject, DeviceController
-
MediaDevice represents an iOS audio/video device.
Declaration
Swift
@objcMembers public class MediaDevice : NSObject
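Devices are usually listed and chosen through a DeviceController such as DefaultDeviceController above. The listAudioDevices/chooseAudioDevice method names and the label/type properties below are assumptions, not confirmed by this page.
Swift
// Sketch: list the available audio devices and choose one.
// Assumes `audioVideo` exposes the DeviceController facade.
let devices: [MediaDevice] = audioVideo.listAudioDevices()
for device in devices {
    print("\(device.label) (\(device.type))")
}
if let first = devices.first {
    audioVideo.chooseAudioDevice(mediaDevice: first)
}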
-
Declaration
Swift
@objcMembers public class DefaultEventReporter : NSObject, EventReporter
-
Declaration
Swift
@objcMembers public class DefaultMeetingEventReporterFactory : EventReporterFactory
-
IngestionConfiguration defines the configuration needed for the ingestion service. It will be passed down to DefaultEventReporter.
Declaration
Swift
@objcMembers public class IngestionConfiguration : NSObject
-
IngestionConfigurationBuilder helps to create an IngestionConfiguration by providing a builder pattern.
Declaration
Swift
@objcMembers public class IngestionConfigurationBuilder : NSObject
-
Event data that will be sent to the ingestion server.
Declaration
Swift
@objcMembers public class IngestionEvent : NSObject, Codable
-
IngestionEventConverter converts data from a payload into MeetingEventItem/DirtyEventItem or vice versa.
Declaration
Swift
@objcMembers public class IngestionEventConverter : NSObject
-
Declaration
Swift
@objcMembers public class IngestionPayload : NSObject, Codable
-
IngestionRecord is the format of the data that will be consumed on the ingestion server.
Declaration
Swift
@objcMembers public class IngestionRecord : NSObject, Codable
-
MeetingEventClientConfiguration is one type of EventClientConfiguration that contains information about the meeting.
Declaration
Swift
@objcMembers public class MeetingEventClientConfiguration : NSObject, EventClientConfiguration
-
Declaration
Swift
@objcMembers public class NoopEventReporterFactory : EventReporterFactory
-
Declaration
Swift
@objcMembers public class DefaultRealtimeController : NSObject, RealtimeControllerFacade
-
Declaration
Swift
@objcMembers public class CreateAttendeeResponse : NSObject
-
Declaration
Swift
@objcMembers public class Attendee : NSObject
-
Declaration
Swift
@objcMembers public class CreateMeetingResponse : NSObject
-
Declaration
Swift
@objcMembers public class Meeting : NSObject
-
Declaration
Swift
@objcMembers public class MediaPlacement : NSObject
-
Declaration
Swift
@objcMembers public class MeetingFeatures : NSObject
-
Declaration
Swift
@objcMembers public class DefaultMeetingSession : NSObject, MeetingSession
-
MeetingSessionConfiguration contains the information necessary to start a session. Construct a MeetingSessionConfiguration with a chime:CreateMeetingResponse and a chime:CreateAttendeeResponse response, and an optional custom URLRewriter that rewrites the given URLs to new URLs.
Declaration
Swift
@objcMembers public class MeetingSessionConfiguration : NSObject
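Putting the session pieces together: a configuration built from the two Chime API responses is handed to DefaultMeetingSession. The initializer labels below are assumptions, and meetingResponse/attendeeResponse are hypothetical values deserialized from your own backend's chime:CreateMeeting and chime:CreateAttendee calls.
Swift
// Sketch: build a configuration from server responses and start the session.
let configuration = MeetingSessionConfiguration(
    createMeetingResponse: meetingResponse,   // hypothetical, from your backend
    createAttendeeResponse: attendeeResponse  // hypothetical, from your backend
)
let meetingSession = DefaultMeetingSession(
    configuration: configuration,
    logger: ConsoleLogger(name: "MeetingSession")
)
// start() may throw; error handling is elided in this sketch.
try? meetingSession.audioVideo.start()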
-
MeetingSessionCredentials includes the credentials used to authenticate the attendee on the meeting.
Declaration
Swift
@objcMembers public class MeetingSessionCredentials : NSObject, Codable
-
MeetingSessionStatus indicates a status received regarding the session.
Declaration
Swift
@objcMembers public class MeetingSessionStatus : NSObject
-
MeetingSessionURLs contains the URLs that will be used to reach the meeting service.
Declaration
Swift
@objcMembers public class MeetingSessionURLs : NSObject, Codable
-
URLRewriterUtils is a class that defines the default URL rewrite behavior.
Declaration
Swift
@objcMembers public class URLRewriterUtils : NSObject
-
DefaultModality is a backwards-compatible extension of the attendee id (UUID string) and session token (base64 string) schemas. It appends # and the modality to either string, which indicates the modality of the participant. For example:
attendeeId: "abcdefg"
contentAttendeeId: "abcdefg#content"
DefaultModality(id: contentAttendeeId).base: "abcdefg"
DefaultModality(id: contentAttendeeId).modality: "content"
DefaultModality(id: contentAttendeeId).isOfType(type: .content): true
Declaration
Swift
@objcMembers public class DefaultModality : NSObject
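The example above translates directly to code; this small sketch only uses the members shown there (the DefaultModality(id:) initializer, base, modality, and isOfType(type:)).
Swift
// Distinguish a content-share attendee id from its base attendee id.
let contentAttendeeId = "abcdefg#content"
let contentModality = DefaultModality(id: contentAttendeeId)
contentModality.base                      // "abcdefg"
contentModality.modality                  // "content"
contentModality.isOfType(type: .content)  // true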
-
Declaration
Swift
@objcMembers public class Versioning : NSObject
-
ConsoleLogger writes logs to the console.
// working with the ConsoleLogger
let logger = ConsoleLogger(name: "demo") // default level is LogLevel.INFO
logger.info(msg: "info")
logger.debug(msg: "debug")
logger.fault(msg: "fault")
logger.error(msg: "error")

// setting logging levels
let verboseLogger = ConsoleLogger(name: "demo", level: .INFO)
verboseLogger.debug(msg: "debug") // does not print
verboseLogger.setLogLevel(level: .DEBUG)
verboseLogger.debug(msg: "debug") // prints
Declaration
Swift
@objcMembers public class ConsoleLogger : NSObject, Logger