AWS DMS itself does not support streaming data directly into S3 Glacier, but a Lambda function can be used to move the data from DMS into S3 Glacier. The following code example illustrates this workaround:
Create a Lambda function and copy the source code below into it.
import json
import time
import gzip
from io import BytesIO

import boto3

s3_client = boto3.client('s3')

def lambda_handler(event, context):
    # Capture the incoming event records
    input_data = event['Records']

    # S3 bucket configuration - replace with your bucket & folder name
    bucket_name = 'my-glacier-bucket'
    folder_name = 'my_data_folder'

    # Build a unique object key for this batch
    key = folder_name + '/dms-' + str(time.time()) + '.json.gz'
    byte_data = json.dumps(input_data).encode('UTF-8')

    # Compress data using GZIP compression
    gzip_data = BytesIO()
    with gzip.GzipFile(fileobj=gzip_data, mode='wb') as gz:
        gz.write(byte_data)
    data = gzip_data.getvalue()

    # Upload the compressed object directly into the S3 Glacier storage class
    s3_client.put_object(
        Bucket=bucket_name,
        Key=key,
        Body=data,
        StorageClass='GLACIER'
    )

    # Log output to CloudWatch
    print('Data uploaded to S3 Glacier.')

    # Return success message
    return 'Success'
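The compression step in the handler can be verified locally without any AWS calls; a minimal round-trip sketch (the sample records are hypothetical stand-ins for the `event['Records']` payload):

```python
import json
import gzip
from io import BytesIO

# Hypothetical records mimicking the event['Records'] payload
records = [{"data": {"id": 1, "name": "alice"}},
           {"data": {"id": 2, "name": "bob"}}]

byte_data = json.dumps(records).encode('UTF-8')

# Same GZIP compression as in the Lambda handler
buf = BytesIO()
with gzip.GzipFile(fileobj=buf, mode='wb') as gz:
    gz.write(byte_data)
compressed = buf.getvalue()

# Decompress and confirm the round trip is lossless
restored = json.loads(gzip.decompress(compressed).decode('UTF-8'))
assert restored == records
```

Because the object is stored as plain gzipped JSON, anything restored from Glacier later can be read back with standard tooling.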
Set the Lambda function as the target endpoint of the DMS task and map the task's output to the Lambda function's input parameters. Before starting the DMS task in the AWS Management Console, make sure the table mappings are created in the following format:
{
  "rules": [
    {
      "rule-type": "selection",
      "rule-id": "1",
      "rule-name": "include-my-source-table",
      "object-locator": {
        "schema-name": "my_source_schema",
        "table-name": "my_source_table"
      },
      "rule-action": "include"
    }
  ]
}
Note that the target endpoint ARN, the replication instance ARN, and the migration type (for example full-load) are properties of the replication task itself, not of the table-mapping document.
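When creating the replication task programmatically, DMS expects the table-mapping document as a JSON string. A minimal sketch (schema/table names and the task identifier are placeholders; the boto3 call is shown as a comment because it requires live AWS credentials and real ARNs):

```python
import json

# Build the DMS table-mapping document (placeholder schema/table names)
table_mappings = {
    "rules": [
        {
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-my-source-table",
            "object-locator": {
                "schema-name": "my_source_schema",
                "table-name": "my_source_table",
            },
            "rule-action": "include",
        }
    ]
}

# Serialize: create_replication_task takes TableMappings as a string
mappings_json = json.dumps(table_mappings)

# With real ARNs, the task would be created like this:
#
#   import boto3
#   dms = boto3.client('dms')
#   dms.create_replication_task(
#       ReplicationTaskIdentifier='glacier-archive-task',
#       SourceEndpointArn='arn:aws:dms:...:endpoint:SOURCE',
#       TargetEndpointArn='arn:aws:dms:...:endpoint:TARGET',
#       ReplicationInstanceArn='arn:aws:dms:...:rep:INSTANCE',
#       MigrationType='full-load',
#       TableMappings=mappings_json,
#   )

# Sanity-check that the document round-trips as valid JSON
assert json.loads(mappings_json)["rules"][0]["rule-action"] == "include"
```

Building the document in code and serializing it with `json.dumps` avoids the hand-edited-JSON syntax errors that DMS rejects at task creation time.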