Optional Readonly addColumnName — Flag to add column name information to CSV output files for S3 data lake integration
Optional Readonly bucketFolder — S3 bucket folder name for organizing migrated data with hierarchical structure
Readonly bucketName — S3 bucket name for the DMS data migration destination in data lake architecture
Optional Readonly cannedAclForObjects — Predefined access control list (ACL) for S3 objects created during data migration
Optional Readonly cdcInsertsAndUpdates — Flag to capture CDC INSERT and UPDATE operations to S3 files for change tracking
Optional Readonly cdcInsertsOnly — Flag to capture CDC INSERT operations only to S3 files for insert-focused change tracking
Optional Readonly cdcMaxBatchInterval — Maximum batch interval in seconds for CDC file output to S3 for time-based file creation
Optional Readonly cdcMinFileSize — Minimum file size in kilobytes for CDC file output to S3 for size-based file creation
Optional Readonly cdcPath — CDC folder path for change data capture file organization in S3
Optional Readonly compressionType — Compression type for S3 target files to optimize storage and transfer performance
Optional Readonly csvDelimiter — Column delimiter for CSV files in S3 data lake integration
Optional Readonly csvNoSupValue — String value for columns not included in the supplemental log during CDC CSV operations
Optional Readonly csvNullValue — Null value representation for CSV files in S3 data lake operations
Optional Readonly csvRowDelimiter — Row delimiter for CSV files in S3 data lake integration
Optional Readonly dataFormat — Data format for S3 output files in data lake architecture
Optional Readonly dataPageSize — Data page size in bytes for Parquet file format optimization
Optional Readonly datePartitionDelimiter — Date partition delimiter for S3 folder partitioning
Optional Readonly datePartitionEnabled — Flag to enable date-based folder partitioning for S3 bucket organization
Optional Readonly datePartitionSequence — Date format sequence for folder partitioning in the S3 data lake
Optional Readonly datePartitionTimezone — Time zone for date partition folder creation and CDC file naming
Optional Readonly dictPageSizeLimit — Maximum dictionary page size for Parquet column encoding optimization
Optional Readonly enableStatistics — Flag to enable statistics for Parquet pages and row groups for query optimization
Optional Readonly encodingType — Encoding type for Parquet file compression and storage optimization
Optional Readonly externalTableDefinition — External table definition for S3 source configuration in data lake integration
Optional Readonly ignoreHeaderRows — Number of header rows to ignore in CSV files for S3 source processing
Optional Readonly includeOpForFullLoad — Flag to include INSERT operation indicators in full-load CSV output for consistency with CDC operations
Optional Readonly maxFileSize — Maximum CSV file size in KB for the S3 target during full-load migration
Optional Readonly parquetTimestampInMillisecond — Flag to write TIMESTAMP columns with millisecond precision in Parquet files for Athena and Glue compatibility
Optional Readonly parquetVersion — Apache Parquet format version for S3 data lake columnar storage
Optional Readonly preserveTransactions — Flag to preserve transaction order for CDC loads on the S3 target for data consistency
Optional Readonly rfc4180 — Flag to enable RFC 4180 compliance for CSV quotation mark handling in S3 operations
Optional Readonly rowGroupLength — Number of rows in a Parquet row group for read/write performance optimization
Readonly serverSideEncryptionKmsKeyId — KMS key ID for server-side encryption when using SSE_KMS encryption mode for S3 data lake security
Optional Readonly serviceAccessRoleArn — IAM role ARN granting the DMS service access to the S3 bucket for data lake integration
Optional Readonly timestampColumnName — Timestamp column name for adding migration timing information to S3 data lake files
Optional Readonly useCsvNoSupValue — Flag to use CsvNoSupValue for columns not in the supplemental log during CDC CSV operations
Optional Readonly useTaskStartTimeForFullLoadTimestamp — Flag to use the task start time for the full-load timestamp column instead of the data arrival time
Provides information that defines an Amazon S3 endpoint. Modified from the equivalent L1 construct to prevent the use of plaintext credentials and to enforce KMS encryption. This information includes the output format of records applied to the endpoint and details of transaction and control table data. For more information about the available settings, see Extra connection attributes when using Amazon S3 as a source for AWS DMS and Extra connection attributes when using Amazon S3 as a target for AWS DMS in the AWS Database Migration Service User Guide.
Struct
See: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-dms-endpoint-s3settings.html
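To illustrate how these settings fit together, here is a minimal sketch in TypeScript. The `S3SettingsSketch` interface and the example values are illustrative only, not the construct's actual API; the property names assume the camelCase form of the CloudFormation `AWS::DMS::Endpoint` S3Settings schema, and the required `serverSideEncryptionKmsKeyId` reflects this struct's enforcement of KMS encryption.

```typescript
// Illustrative subset of the settings described above. Property names
// mirror the documented struct; this is a sketch, not the real interface.
interface S3SettingsSketch {
  bucketName: string;                    // required: target S3 bucket
  serverSideEncryptionKmsKeyId: string;  // required here: SSE_KMS key (KMS enforced)
  bucketFolder?: string;                 // optional folder prefix inside the bucket
  dataFormat?: 'csv' | 'parquet';        // output file format
  compressionType?: 'NONE' | 'GZIP';     // target file compression
  cdcInsertsOnly?: boolean;              // capture only CDC INSERTs
  datePartitionEnabled?: boolean;        // date-based folder partitioning
  maxFileSize?: number;                  // full-load CSV file size cap, in KB
}

// Example: Parquet output with date partitioning; bucket, key ARN, and
// folder values are placeholders.
const settings: S3SettingsSketch = {
  bucketName: 'example-data-lake-bucket',
  serverSideEncryptionKmsKeyId:
    'arn:aws:kms:us-east-1:111111111111:key/example-key-id',
  bucketFolder: 'migrated',
  dataFormat: 'parquet',
  datePartitionEnabled: true,
};
```

A settings object shaped like this would typically be passed to the endpoint construct alongside connection details; required fields are enforced at compile time, while the optional tuning knobs (Parquet page sizes, CDC batching, CSV delimiters) can be added as needed.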