Import JSON Data into DynamoDB

Amazon DynamoDB is a fully managed NoSQL database service: maintenance, administration, operations, and scaling are all handled by AWS. It delivers single-digit-millisecond latency, even at terabyte scale, which makes it a good fit for applications that require very fast reads.

Below is a breakdown of the tasks needed to automatically import data from a file uploaded to S3 into a DynamoDB table:

  1. Create an Amazon DynamoDB table
  2. Create an S3 bucket and upload a JSON file
  3. Create and configure a Lambda function
  4. Test the JSON data import using a mock test event in Lambda
  5. Add event triggers in Lambda for the S3 bucket
  6. Test the Lambda S3 trigger and verify the imported data in the DynamoDB table

Note: I will use the US East (N. Virginia) us-east-1 region to provision all required AWS resources.

Step 1: Create a DynamoDB Table

On the DynamoDB dashboard, click on Create table and provide the following values (a boto3 equivalent is sketched after this list):

  • Table name: Enter thabolebelo_blog
  • Partition key: Enter post_id, choose String from the drop-down, and click on Create table.
  • The table is ready to use once its Status changes to Active. You can verify the status by navigating to the Tables menu in the DynamoDB dashboard.
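
If you prefer to script this step rather than use the console, the table creation corresponds roughly to the boto3 sketch below. The PAY_PER_REQUEST billing mode is my assumption here, not what the console defaults to; choose whatever billing mode suits your workload.

import boto3

dynamodb = boto3.client('dynamodb', region_name='us-east-1')

# Create the table with post_id as the string partition key.
# BillingMode is an assumption; pick what suits your workload.
dynamodb.create_table(
    TableName='thabolebelo_blog',
    AttributeDefinitions=[{'AttributeName': 'post_id', 'AttributeType': 'S'}],
    KeySchema=[{'AttributeName': 'post_id', 'KeyType': 'HASH'}],
    BillingMode='PAY_PER_REQUEST',
)

# Block until the table status is Active.
dynamodb.get_waiter('table_exists').wait(TableName='thabolebelo_blog')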

Step 2: Create an S3 bucket and upload a JSON File

Navigate to the S3 page by clicking on the Services menu at the top; S3 is available under the Storage section. Click on Create bucket and provide the following values:

  • Bucket name: Enter thabos3todynamo (bucket names must be globally unique)
  • Make sure the bucket is created in the US East (N. Virginia) us-east-1 region.
  • Create a postdata.json file on your local machine; this data will be imported into the DynamoDB table:
[
    {
        "post_id": "1",
        "title": "Ilse",
        "tag": "Eadie",
        "date": "2022-03-17"
    },
    {
        "post_id": "2",
        "title": "Peri",
        "tag": "Edee",
        "date": "2022-05-12"
    },
    {
        "post_id": "3",
        "title": "Netty",
        "tag": "Sabina",
        "date": "2022-05-12"
    },
    {
        "post_id": "4",
        "title": "Phedra",
        "tag": "Andree",
        "date": "2022-03-11"
    }
]
This file contains the post data in JSON format.
  • Upload the postdata.json file to the thabos3todynamo S3 bucket (a scripted upload is sketched below).
  • Once the file has been uploaded successfully, you will see it inside the bucket.
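
If you would rather script the bucket setup and upload, a minimal boto3 sketch follows. Note that in us-east-1, create_bucket takes no location constraint:

import boto3

s3 = boto3.client('s3', region_name='us-east-1')

# Create the bucket (us-east-1 requires no CreateBucketConfiguration).
s3.create_bucket(Bucket='thabos3todynamo')

# Upload the local JSON file under the same key name.
s3.upload_file('postdata.json', 'thabos3todynamo', 'postdata.json')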

Step 3: Create a Lambda Function

Navigate to Lambda by clicking on the Lambda service under Compute in the AWS Console. Make sure you are in the N. Virginia region, click on Create function, and:

  • Select Author from scratch
  • Function name: json_s3_dynamodb
  • Runtime: Python 3.8 (choose from the drop-down)
  • Click on Change default execution role, then select Create a new role from AWS policy templates
  • Select the policy templates Amazon S3 object read-only permissions and Simple microservice permissions
  • Click on Create function
  • Copy and paste the code below into the code editor and save the file as lambda_function.py
import json
import urllib.parse

import boto3

s3client = boto3.client('s3')
dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('thabolebelo_blog')

def lambda_handler(event, context):

    # Pull the bucket name and object key out of the S3 event record.
    # Keys in S3 events are URL-encoded, so decode them before use.
    bucketname = event['Records'][0]['s3']['bucket']['name']
    jsonfilename = urllib.parse.unquote_plus(
        event['Records'][0]['s3']['object']['key'])

    print(bucketname)
    print(jsonfilename)

    # Fetch the uploaded file from S3 and parse its JSON body.
    jsonobject = s3client.get_object(Bucket=bucketname, Key=jsonfilename)
    jsondict = json.loads(jsonobject['Body'].read())

    # Write each post as one item in the DynamoDB table.
    for item in jsondict:
        print(item)
        table.put_item(Item=item)

    return {
        'statusCode': 200,
        'body': json.dumps('JSON Data Imported')
    }
This function reads the uploaded JSON file from S3 and writes each item to DynamoDB.

Note: Update the table name in dynamodb.Table('thabolebelo_blog') to your chosen table name.
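
The function writes items one put_item call at a time, which is fine for a small file. For larger imports, boto3's batch_writer groups items into BatchWriteItem requests behind the scenes. A minimal sketch, reusing the table and jsondict names from the function above:

# Inside lambda_handler, the put_item loop could be replaced with a
# batch writer, which buffers writes into BatchWriteItem calls.
with table.batch_writer() as batch:
    for item in jsondict:
        batch.put_item(Item=item)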

  • After deploying the code, go to the Configuration tab, click on Edit under General configuration, change the Timeout value to 1 minute, and click on Save.
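
If you prefer the API over the console, the same timeout change can be made with boto3:

import boto3

lambdaclient = boto3.client('lambda', region_name='us-east-1')

# Raise the function timeout to 60 seconds.
lambdaclient.update_function_configuration(
    FunctionName='json_s3_dynamodb',
    Timeout=60,
)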

Step 4: Test the JSON Data Import using a mock test in Lambda

On the json_s3_dynamodb Lambda function page, click the Test dropdown, then click on Configure test event and fill it in as follows (a scripted invocation is sketched after this list):

  • Event Name: json
  • Template: s3-put
  • Under S3 → bucket → name →  enter thabos3todynamo
  • Under S3 → object → key → enter postdata.json
  • Click on Test to trigger the Lambda function.
  • Once the Lambda function has successfully executed, you will be able to see a detailed success message.
  • Navigate to the DynamoDB table named thabolebelo_blog to see the imported data.
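
You can reproduce the same mock test from your own machine by invoking the function with a hand-built S3 event. This is a sketch: it includes only the bucket and key fields that the handler actually reads, not a full s3-put event.

import json

import boto3

# Minimal stand-in for the s3-put test event; the handler only
# reads the bucket name and object key.
payload = {
    'Records': [{
        's3': {
            'bucket': {'name': 'thabos3todynamo'},
            'object': {'key': 'postdata.json'},
        }
    }]
}

lambdaclient = boto3.client('lambda', region_name='us-east-1')
response = lambdaclient.invoke(
    FunctionName='json_s3_dynamodb',
    Payload=json.dumps(payload),
)
print(json.loads(response['Payload'].read()))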

Step 5: Add Event Triggers in Lambda for the S3 Bucket

Open the json_s3_dynamodb Lambda function, then click on Add trigger. Configure the S3 trigger as follows:

  • Bucket: thabos3todynamo
  • Event type: All object create events
  • Suffix: .json
  • Accept the Recursive invocation warning and click on Add.

Note: each time a file with the .json suffix is uploaded to the thabos3todynamo S3 bucket, it will trigger the json_s3_dynamodb Lambda function. The equivalent configuration can be scripted as sketched below.
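
The console wires up two things behind the scenes: permission for S3 to invoke the function, and the bucket notification itself. A boto3 sketch of both, assuming a placeholder function ARN (substitute your own account ID):

import boto3

# Placeholder ARN: replace 123456789012 with your account ID.
FUNCTION_ARN = 'arn:aws:lambda:us-east-1:123456789012:function:json_s3_dynamodb'

# Allow S3 to invoke the Lambda function.
boto3.client('lambda').add_permission(
    FunctionName='json_s3_dynamodb',
    StatementId='s3-invoke-json-import',
    Action='lambda:InvokeFunction',
    Principal='s3.amazonaws.com',
    SourceArn='arn:aws:s3:::thabos3todynamo',
)

# Fire the function for every object created with a .json suffix.
boto3.client('s3').put_bucket_notification_configuration(
    Bucket='thabos3todynamo',
    NotificationConfiguration={
        'LambdaFunctionConfigurations': [{
            'LambdaFunctionArn': FUNCTION_ARN,
            'Events': ['s3:ObjectCreated:*'],
            'Filter': {'Key': {'FilterRules': [
                {'Name': 'suffix', 'Value': '.json'},
            ]}},
        }]
    },
)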

Step 6: Test the Lambda S3 Trigger

  • Open the postdata.json file locally, add new posts, and save it as newpostdata.json:
[
    {
        "post_id": "1",
        "title": "Ilse",
        "tag": "Eadie",
        "date": "2022-03-17"
    },
    {
        "post_id": "2",
        "title": "Peri",
        "tag": "Edee",
        "date": "2022-05-12"
    },
    {
        "post_id": "3",
        "title": "Netty",
        "tag": "Sabina",
        "date": "2022-05-12"
    },
    {
        "post_id": "4",
        "title": "Phedra",
        "tag": "Andree",
        "date": "2022-03-11"
    },
    {
        "post_id": "5",
        "title": "Dacia",
        "tag": "Meg",
        "date": "2022-05-27"
    },
    {
        "post_id": "6",
        "title": "Doralynne",
        "tag": "Halette",
        "date": "2022-03-31"
    }
]
  • Upload the newpostdata.json file to the thabos3todynamo S3 bucket.
  • This upload event triggers the Lambda function, which imports the JSON data into the DynamoDB table.
  • Navigate to the thabolebelo_blog DynamoDB table to see the changes; click the refresh button if the items have not yet appeared. A scripted check is sketched below.
  • You can see that the JSON data has been successfully imported into the DynamoDB table.
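
To confirm the import without the console, a quick boto3 scan works for a dataset this small (Scan reads the whole table, so it is only suitable for small tables):

import boto3

table = boto3.resource('dynamodb', region_name='us-east-1').Table('thabolebelo_blog')

# Scan is fine here because the table only holds a handful of items.
response = table.scan()
for item in response['Items']:
    print(item)
print('Total items:', response['Count'])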

Summary

We have successfully created an Amazon DynamoDB table and a Lambda function, and configured an S3 trigger so that the Lambda function runs on every file upload.

Delete the DynamoDB table once you are done, as you will be billed for it.
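
A minimal boto3 cleanup sketch, assuming you also want to remove the bucket (it must be emptied before it can be deleted):

import boto3

# Remove the table.
boto3.client('dynamodb', region_name='us-east-1').delete_table(
    TableName='thabolebelo_blog')

# Empty the bucket, then delete it.
s3 = boto3.resource('s3', region_name='us-east-1')
bucket = s3.Bucket('thabos3todynamo')
bucket.objects.all().delete()
bucket.delete()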