Commit: added CDK datasync example (Co-authored-by: Michael Kaiser <[email protected]>)
Showing 13 changed files with 446 additions and 0 deletions.
.gitignore (10 additions)
*.swp
package-lock.json
__pycache__
.pytest_cache
.venv
*.egg-info

# CDK asset staging directory
.cdk.staging
cdk.out
README.md (132 additions)
# Welcome to your CDK Python project!

This repo uses the AWS CDK to enable data movement between S3 buckets using the AWS DataSync service. This example supports use cases such as backups, bucket consolidation, data lake creation, and so on.

This CDK example application creates the following resources:
- New S3 buckets (optional)
- One or more DataSync S3 locations
- An IAM Role and policy that allow the DataSync service to read from and write to the S3 buckets
- DataSync task(s) to synchronize content between source and destination bucket pairs

When you run `cdk deploy`, the CDK application creates two CloudFormation Stacks:
1. `cdk-datasync-s3-to-s3-iam` - creates the necessary IAM Roles. This Stack is implemented in [datasync_s3_to_s3_IAM_stack.py](datasync_s3_to_s3/datasync_s3_to_s3_IAM_stack.py)
2. `cdk-datasync-s3-to-s3` - creates the S3 buckets and DataSync resources. This Stack is implemented in [datasync_s3_to_s3_stack.py](datasync_s3_to_s3/datasync_s3_to_s3_stack.py)

## Steps to use
1. Follow the steps for initializing the CDK environment below, and ensure that the virtual env is activated. See **Configuring Virtual Env**.
2. Ensure that you have exported AWS credentials, an IAM profile, or an EC2 instance role with permissions to create IAM Roles and DataSync resources.
3. Add the source and destination bucket names in `cdk.context.json`, and define one or more DataSync tasks to move data between source and destination pairs. See **Setting CDK Context**.
4. From the directory where `cdk.json` is present, run the `cdk diff` command. Adjust `app.py` if needed.
5. Run `cdk deploy --all` to create the resources. The task outputs are shown upon successful deployment.
6. Start the DataSync task using the AWS CLI: `aws datasync start-task-execution --task-arn <task-arn>` (see the example below).
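
For example, a typical run after the context has been configured looks roughly like the following. This is a sketch: `<task-arn>` is the task ARN shown in the stack outputs, and the `--query`/`--output` flags are only used here to capture the execution ARN returned by `start-task-execution`.

```
$ cdk deploy --all
$ EXEC_ARN=$(aws datasync start-task-execution --task-arn <task-arn> \
      --query TaskExecutionArn --output text)
$ aws datasync describe-task-execution --task-execution-arn $EXEC_ARN
```
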
## Cleanup
1. Follow steps 1 through 3 above.
2. Run `cdk destroy --all` to delete the previously created Stacks.

## Setting CDK Context
This CDK application operates on two input lists: one for DataSync locations and another for DataSync tasks. Each list can be populated with any number of configuration items. Below is an example of copying content from an existing source bucket to a new destination bucket.
```
{
    "S3_datasync_locations": [
        {
            "bucketName": "cdk-example-datasync-source-bucket",
            "create": false,
            "storageClass": "STANDARD",
            "subDirectory": "",
            "tags": [
                {
                    "key": "Project",
                    "value": "CDK-example"
                }
            ]
        },
        {
            "bucketName": "cdk-example-datasync-destination-bucket",
            "create": true,
            "storageClass": "STANDARD",
            "subDirectory": "",
            "tags": []
        }
    ],
    "S3_datasync_tasks": [
        {
            "source": "cdk-example-datasync-source-bucket",
            "destination": "cdk-example-datasync-destination-bucket"
        }
    ]
}
```

The configuration elements are described below; a second context example with multiple tasks follows the table.

| Key | Description | Example |
|---|---|---|
| S3_datasync_locations | List of S3 location configurations | |
| bucketName | Name of the S3 bucket backing the location | `cdk-example-datasync-source-bucket` |
| create | Whether the stack should create the bucket (`true`) or use an existing bucket (`false`) | `true` |
| storageClass | S3 storage class used by the DataSync location | `STANDARD` |
| subDirectory | Subdirectory within the bucket that DataSync reads from or writes to (empty for the bucket root) | `""` |
| tags | List of `key`/`value` tags applied to the location | `[{"key": "Project", "value": "CDK-example"}]` |
| S3_datasync_tasks | List of DataSync task configurations | |
| source | Source S3 bucket name for the DataSync task | `cdk-example-datasync-source-bucket` |
| destination | Destination S3 bucket name for the DataSync task | `cdk-example-datasync-destination-bucket` |
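
As a second, purely hypothetical example (the bucket names below are not from this repository), the lists can also describe a bucket-consolidation setup in which two existing source buckets are synchronized into one new destination bucket, assuming the stack maps each task's `source` and `destination` to the locations defined above. Each entry in `S3_datasync_tasks` defines its own DataSync task.

```
{
    "S3_datasync_locations": [
        { "bucketName": "team-a-bucket", "create": false, "storageClass": "STANDARD", "subDirectory": "", "tags": [] },
        { "bucketName": "team-b-bucket", "create": false, "storageClass": "STANDARD", "subDirectory": "", "tags": [] },
        { "bucketName": "consolidated-bucket", "create": true, "storageClass": "STANDARD", "subDirectory": "", "tags": [] }
    ],
    "S3_datasync_tasks": [
        { "source": "team-a-bucket", "destination": "consolidated-bucket" },
        { "source": "team-b-bucket", "destination": "consolidated-bucket" }
    ]
}
```
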
## Configuring Virtual Env
The `cdk.json` file tells the CDK Toolkit how to execute your app.

This project is set up like a standard Python project. The initialization process also creates a virtualenv within this project, stored under the `.venv` directory. To create the virtualenv it assumes that there is a `python3` (or `python` for Windows) executable in your path with access to the `venv` package. If for any reason the automatic creation of the virtualenv fails, you can create the virtualenv manually.

To manually create a virtualenv on macOS and Linux:

```
$ python3 -m venv .venv
```

After the init process completes and the virtualenv is created, you can use the following step to activate your virtualenv.

```
$ source .venv/bin/activate
```

If you are on a Windows platform, you would activate the virtualenv like this:

```
% .venv\Scripts\activate.bat
```

Once the virtualenv is activated, you can install the required dependencies.

```
$ pip install -r requirements.txt
```

At this point you can synthesize the CloudFormation template for this code.

```
$ cdk synth
```

To add additional dependencies, for example other CDK libraries, just add them to your `setup.py` file and rerun the `pip install -r requirements.txt` command.
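
For example, assuming the standard project layout generated by `cdk init` (the project name and version pins below are illustrative, and `boto3` simply stands in for whatever library you are adding), `setup.py` might look like this:

```
from setuptools import setup, find_packages

setup(
    name="datasync-s3-to-s3",          # illustrative project name
    version="0.0.1",
    packages=find_packages(),
    install_requires=[
        "aws-cdk-lib>=2.0.0",
        "constructs>=10.0.0,<11.0.0",
        "boto3",                       # example of an additional dependency
    ],
)
```
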
## Useful commands

* `cdk ls`      list all stacks in the app
* `cdk synth`   emits the synthesized CloudFormation template
* `cdk deploy`  deploy this stack to your default AWS account/region
* `cdk diff`    compare deployed stack with current state
* `cdk docs`    open CDK documentation

Enjoy!
app.py (20 additions)
#!/usr/bin/env python3

import aws_cdk as cdk

from datasync_s3_to_s3.datasync_s3_to_s3_IAM_stack import DataSyncS3toS3StackIAM
from datasync_s3_to_s3.datasync_s3_to_s3_stack import DataSyncS3toS3Stack


app = cdk.App()

# Create the Stack defined in: datasync_s3_to_s3/datasync_s3_to_s3_IAM_stack.py
iam_stack = DataSyncS3toS3StackIAM(app, "cdk-datasync-s3-to-s3-iam")

# Create the Stack defined in: datasync_s3_to_s3/datasync_s3_to_s3_stack.py
datasync_stack = DataSyncS3toS3Stack(app, "cdk-datasync-s3-to-s3")

# Ensure the IAM stack has completed provisioning before the DataSync stack deploys
datasync_stack.add_dependency(iam_stack)

app.synth()
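
With the virtualenv activated, `cdk ls` run from this directory should list the two stacks wired together above (output sketched here, not captured from a real run):

```
$ cdk ls
cdk-datasync-s3-to-s3-iam
cdk-datasync-s3-to-s3
```
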
cdk.context.json (29 additions)
{
    "S3_datasync_locations": [
        {
            "bucketName": "cdk-example-datasync-source-bucket",
            "create": false,
            "storageClass": "STANDARD",
            "subDirectory": "",
            "tags": [
                {
                    "key": "Project",
                    "value": "CDK-example"
                }
            ]
        },
        {
            "bucketName": "cdk-example-datasync-dest-bucket",
            "create": false,
            "storageClass": "STANDARD",
            "subDirectory": "",
            "tags": []
        }
    ],
    "S3_datasync_tasks": [
        {
            "source": "cdk-example-datasync-source-bucket",
            "destination": "cdk-example-datasync-dest-bucket"
        }
    ]
}
cdk.json (51 additions)
{
  "app": "python3 app.py",
  "watch": {
    "include": [
      "**"
    ],
    "exclude": [
      "README.md",
      "cdk*.json",
      "requirements*.txt",
      "source.bat",
      "**/__init__.py",
      "python/__pycache__",
      "tests"
    ]
  },
  "context": {
    "@aws-cdk/aws-lambda:recognizeLayerVersion": true,
    "@aws-cdk/core:checkSecretUsage": true,
    "@aws-cdk/core:target-partitions": [
      "aws",
      "aws-cn"
    ],
    "@aws-cdk-containers/ecs-service-extensions:enableDefaultLogDriver": true,
    "@aws-cdk/aws-ec2:uniqueImdsv2TemplateName": true,
    "@aws-cdk/aws-ecs:arnFormatIncludesClusterName": true,
    "@aws-cdk/aws-iam:minimizePolicies": true,
    "@aws-cdk/core:validateSnapshotRemovalPolicy": true,
    "@aws-cdk/aws-codepipeline:crossAccountKeyAliasStackSafeResourceName": true,
    "@aws-cdk/aws-s3:createDefaultLoggingPolicy": true,
    "@aws-cdk/aws-sns-subscriptions:restrictSqsDescryption": true,
    "@aws-cdk/aws-apigateway:disableCloudWatchRole": true,
    "@aws-cdk/core:enablePartitionLiterals": true,
    "@aws-cdk/aws-events:eventsTargetQueueSameAccount": true,
    "@aws-cdk/aws-iam:standardizedServicePrincipals": true,
    "@aws-cdk/aws-ecs:disableExplicitDeploymentControllerForCircuitBreaker": true,
    "@aws-cdk/aws-iam:importedRoleStackSafeDefaultPolicyName": true,
    "@aws-cdk/aws-s3:serverAccessLogsUseBucketPolicy": true,
    "@aws-cdk/aws-route53-patters:useCertificate": true,
    "@aws-cdk/customresources:installLatestAwsSdkDefault": false,
    "@aws-cdk/aws-rds:databaseProxyUniqueResourceName": true,
    "@aws-cdk/aws-codedeploy:removeAlarmsFromDeploymentGroup": true,
    "@aws-cdk/aws-apigateway:authorizerChangeDeploymentLogicalId": true,
    "@aws-cdk/aws-ec2:launchTemplateDefaultUserData": true,
    "@aws-cdk/aws-secretsmanager:useAttachedSecretResourcePolicyForSecretTargetAttachments": true,
    "@aws-cdk/aws-redshift:columnId": true,
    "@aws-cdk/aws-stepfunctions-tasks:enableEmrServicePolicyV2": true,
    "@aws-cdk/aws-ec2:restrictDefaultSecurityGroup": true,
    "@aws-cdk/aws-apigateway:requestValidatorUniqueId": true
  }
}
datasync_s3_to_s3/__init__.py (empty file)

python/datasync-s3/datasync_s3_to_s3/datasync_s3_to_s3_IAM_stack.py (72 additions)
import uuid
from constructs import Construct
from aws_cdk import (
    Stack,
    aws_iam as iam,
    CfnOutput
)


class DataSyncS3toS3StackIAM(Stack):

    # Function to create IAM Roles for DataSync
    def create_datasync_roles(self, bucket_configs):
        # Suffix appended to each bucket ARN to grant object-level permissions in the IAM policy
        suffix = "/*"
        i = 0
        datasync_s3_roles = []

        for bc in bucket_configs:
            # Create an IAM Role that DataSync assumes to read and write to the S3 bucket
            role_name = "CDKDataSyncS3Access-" + bc["bucketName"]
            s3_role = iam.Role(
                self, "CDKDataSyncS3AccessRole" + str(i),
                assumed_by=iam.ServicePrincipal("datasync.amazonaws.com"),
                description="CDK Datasync role for S3",
                role_name=role_name
            )

            # Bucket-level permissions
            stmt1 = iam.PolicyStatement(
                effect=iam.Effect.ALLOW,
                actions=["s3:GetBucketLocation", "s3:ListBucket", "s3:ListBucketMultipartUploads"],
                resources=[bc["arn"]]
            )

            # Object-level permissions
            stmt2 = iam.PolicyStatement(
                effect=iam.Effect.ALLOW,
                actions=["s3:AbortMultipartUpload", "s3:DeleteObject", "s3:GetObject",
                         "s3:ListMultipartUploadParts", "s3:PutObjectTagging",
                         "s3:GetObjectTagging", "s3:PutObject"],
                resources=[bc["arn"] + suffix]
            )

            s3_policy = iam.ManagedPolicy(self, "CDKDataSyncS3Policy" + str(i), statements=[stmt1, stmt2], roles=[s3_role])

            datasync_s3_roles.append(s3_role)

            # Export the Role ARN using the same format as the Role name.
            # This export is imported by the downstream Stack.
            CfnOutput(self, role_name, value=s3_role.role_arn, export_name=role_name)

            i = i + 1

        return datasync_s3_roles

    # Main function
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Store bucket configs in an array
        bucket_configs = self.node.try_get_context("S3_datasync_locations")
        if bucket_configs:
            # Add the arn to each bucket config, if it is not provided already
            for b in bucket_configs:
                if "arn" not in b:
                    b["arn"] = "arn:aws:s3:::" + b["bucketName"]

            self.create_datasync_roles(bucket_configs)
        else:
            print("ERROR: Please set a context variable for S3_datasync_locations")
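
The companion stack `datasync_s3_to_s3/datasync_s3_to_s3_stack.py` (the `cdk-datasync-s3-to-s3` stack referenced by `app.py` and the README) is part of this commit but did not render in this view. Purely for orientation, below is a minimal sketch of what such a stack could look like, based only on the README's description and the role-ARN exports made by the IAM stack above. It is not the committed implementation; the construct IDs, default values, and handling of optional keys are assumptions. It uses the `CfnLocationS3` and `CfnTask` L1 constructs from `aws_cdk.aws_datasync`.

```
from constructs import Construct
from aws_cdk import (
    Stack,
    Fn,
    CfnOutput,
    aws_s3 as s3,
    aws_datasync as datasync,
)


class DataSyncS3toS3Stack(Stack):
    # Illustrative sketch only; the committed stack may differ.

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        bucket_configs = self.node.try_get_context("S3_datasync_locations") or []
        task_configs = self.node.try_get_context("S3_datasync_tasks") or []

        locations = {}  # bucketName -> CfnLocationS3

        for i, bc in enumerate(bucket_configs):
            bucket_arn = bc.get("arn", "arn:aws:s3:::" + bc["bucketName"])

            # Optionally create the bucket when "create" is true
            bucket = None
            if bc.get("create"):
                bucket = s3.Bucket(self, "Bucket" + str(i), bucket_name=bc["bucketName"])

            # Role ARN exported by the IAM stack as CDKDataSyncS3Access-<bucketName>
            role_arn = Fn.import_value("CDKDataSyncS3Access-" + bc["bucketName"])

            location = datasync.CfnLocationS3(
                self, "Location" + str(i),
                s3_bucket_arn=bucket_arn,
                s3_config=datasync.CfnLocationS3.S3ConfigProperty(
                    bucket_access_role_arn=role_arn
                ),
                s3_storage_class=bc.get("storageClass", "STANDARD"),
                subdirectory=bc.get("subDirectory") or "/",
            )
            if bucket is not None:
                # Make sure the bucket exists before the DataSync location is created
                location.node.add_dependency(bucket)
            locations[bc["bucketName"]] = location

        for i, tc in enumerate(task_configs):
            task = datasync.CfnTask(
                self, "Task" + str(i),
                source_location_arn=locations[tc["source"]].attr_location_arn,
                destination_location_arn=locations[tc["destination"]].attr_location_arn,
            )
            # Surface the task ARN so it can be used with
            # `aws datasync start-task-execution --task-arn <task-arn>`
            CfnOutput(self, "TaskArn" + str(i), value=task.attr_task_arn)
```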