
Migrating From AWS SDK V2 to AWS SDK V3


Abdullah Muhammad

Published on May 17, 2026 · 5 min read


Introduction

So far, we have explored how we can use the AWS SDK in conjunction with AWS IAM to programmatically access and communicate with various AWS services.

The ability to create and scope a user to certain permissions before using their credentials to access AWS services was a quick and secure way of working with the desired services.

However, AWS has released a newer version of its SDK, named V3. Today, we are going to explore some of the changes that have been made to the SDK and how the refactoring has made the library more useful.

If you are not familiar with the SDK, IAM, or any other AWS service, please refer to this tutorial link. We are going to be working with the same codebase from that tutorial in addition to some changes reflecting the new SDK version.

Project Setup

As seen before, we are going to be working with a full stack application with a separate front-end and back-end (React/Node.js) that will communicate with each other to complete desired tasks.

The main activity takes place in the back-end where the AWS SDK is used to programmatically access the AWS S3 service through the use of an IAM identity. This allows us to insert and delete objects stored inside the S3 bucket.

All this is covered in the aforementioned tutorial link. If you are not familiar with this setup, please refer to that tutorial before proceeding with this one:

Full stack application incorporating the AWS SDK, IAM identity to access the AWS S3 service

That is all there is to it. The codebase will be exactly the same except for the changes reflecting the new SDK version.

Code Overview

You can follow along by cloning this repository. The directory we will work with is /demos/Demo28_AWS_SDK_Migration.

Here, you will find three directories: two back-end directories, one each for V2 and V3 of the AWS SDK, and a front-end directory containing all the relevant client code.

As seen in the aforementioned tutorial, the main file we will focus on is the controller file in the back-end. It handles the insert and delete object requests against an S3 bucket.


V2 Implementation

The older implementation using AWS SDK V2 can be found in /backend_v2/Controller/PhotoController.js:

require("dotenv").config({ path: '.env' });
const AWS = require("aws-sdk");
const uuid = require('uuid');
const fs = require('fs'); // Built-in modules for working with file data
const { picturePath } = require('../util/picturePath');

// Set up configuration using an AWS IAM identity to programmatically access AWS S3 using SDK
// Set up your user to have appropriate permissions

let configuration = {
    accessKeyId: process.env.AWS_ACCESS_ID,
    secretAccessKey: process.env.AWS_SECRET_KEY,
    region: process.env.AWS_REGION    
}

// Setting the S3 object to work with user identification and region
let S3 = new AWS.S3(configuration);

exports.uploadPhotoController = (req, res) => {

    // Read the file contents and use them as the object body when uploading to the S3 bucket
    fs.readFile(picturePath, (err, fileData) => {
        if (err) {
            res.status(400).json({
                message: "Could not read file data. " + err
            });
        }
        else {
            // Using the upload function to send sunflower picture to AWS S3 bucket
            S3.upload({
                Bucket: process.env.AWS_S3_BUCKET_NAME,
                Key: String(uuid.v4().split('-')[0]) + '.png', // Defining a unique key to identify the object
                Body: fileData
            }, (err, bucketData) => {
                if (err){
                    res.status(400).json({
                        message: "Could not upload sunflower picture. " + err
                    });
                }
                else {
                    res.status(201).json({
                        message: "File uploaded successfully!"
                    });
                }
            });
        }
    });
}

exports.deletePhotoController = (req, res) => {
    const { pictureId } = JSON.parse(req.body.body);

    // Delete requested object stored in S3 bucket using key
    S3.deleteObject({ 
        Bucket: process.env.AWS_S3_BUCKET_NAME, 
        Key: pictureId + '.png'
    }, (err, data) => {
        if (err) {
            res.status(400).json({
                message: "Could not delete object"
            });
        }
        else {
            res.status(200).json({
                message: "Object deleted successfully!"
            });
        }
    });
}
PhotoController.js file containing upload and delete functions handling insert and delete object requests

In typical fashion, we pass in the configuration of an AWS IAM identity using its access key ID, secret access key, and the AWS region from which we would like to access the AWS service.
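As an aside, since these values come from environment variables, a small validation helper can fail fast when any are missing. This helper is purely illustrative and not part of the tutorial codebase:

```javascript
// Illustrative helper (not part of the tutorial codebase): builds the
// SDK configuration object from environment variables and throws early
// if any required value is missing.
function buildAwsConfig(env = process.env) {
    const required = ["AWS_ACCESS_ID", "AWS_SECRET_KEY", "AWS_REGION"];
    const missing = required.filter((name) => !env[name]);
    if (missing.length > 0) {
        throw new Error("Missing environment variables: " + missing.join(", "));
    }
    return {
        accessKeyId: env.AWS_ACCESS_ID,
        secretAccessKey: env.AWS_SECRET_KEY,
        region: env.AWS_REGION
    };
}
```

Calling a helper like this at startup surfaces a missing .env entry immediately instead of failing later with an opaque SDK error.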

After that, we make use of the AWS SDK to access the S3 service and utilize the ready-made functions the package has to offer such as the upload() and deleteObject() functions.

We ensure that the AWS IAM identity has been assigned the appropriate permissions which will allow it to programmatically access and perform these actions.

If you are not familiar with setting up an AWS IAM identity and assigning it permissions, please refer to the aforementioned tutorial to figure out how to do that.

The same goes for setting up the S3 bucket. The tutorial covers how to create an S3 bucket and assign it the appropriate policies for usage.


V3 Implementation

There is adequate documentation on the web that extensively covers how one can migrate an existing codebase from AWS SDK V2 to V3.

AWS itself provides a helpful jumpstart guide that developers can use in their own projects. The link can be found here.

The newer implementation using AWS SDK V3 can be found in /backend_v3/Controller/PhotoController.js and is as follows:

require("dotenv").config({ path: '.env' });
const { DeleteObjectCommand, PutObjectCommand, S3Client } = require('@aws-sdk/client-s3');
const uuid = require('uuid');
const fs = require('fs'); // Built-in modules for working with file data
const { picturePath } = require('../util/picturePath');

// Set up configuration using an AWS IAM identity to programmatically access AWS S3 using SDK
// Set up your user to have appropriate permissions

// Setting the S3 object to work with user identification and region
let S3 = new S3Client({ region: process.env.AWS_REGION, 
                        credentials: { 
                            accessKeyId: process.env.AWS_ACCESS_ID,
                            secretAccessKey: process.env.AWS_SECRET_KEY
                        }
                     });

exports.uploadPhotoController = (req, res) => {
    // Read the file contents and use them as the object body when uploading to the S3 bucket
    fs.readFile(picturePath, (err, fileData) => {
        if (err) {
            res.status(400).json({
                message: "Could not read file data. " + err
            });
        }
        else {
            // Using the upload function to send sunflower picture to AWS S3 bucket
            S3.send(new PutObjectCommand({
                Bucket: process.env.AWS_S3_BUCKET_NAME,
                Key: String(uuid.v4().split("-")[0]) + '.png', // Defining a unique key to identify the object
                Body: fileData
            }))
            .then(() => {
                res.status(201).json({
                    message: "File uploaded successfully!"
                });
            })
            .catch(err => {
                res.status(400).json({
                    message: "Could not upload sunflower picture. " + err
                });
            });
        }
    });
}

exports.deletePhotoController = (req, res) => {
    const { pictureId } = JSON.parse(req.body.body);

    // Delete requested object stored in S3 bucket using key
    S3.send(new DeleteObjectCommand({
        Bucket: process.env.AWS_S3_BUCKET_NAME,
        Key: pictureId + '.png'
    }))
    .then(() => {
        res.status(200).json({
            message: "Object deleted successfully!"
        });
    })
    .catch((err) => {
        res.status(400).json({
            message: "Could not delete object. " + err
        });
    });
}
PhotoController.js file containing the SDK V3 implementation for working with an S3 bucket

One of the main benefits of working with V3 is that packages are modularized. In V2, we needed to add the entire aws-sdk package.

However, V3's modular packages let us install only the service-specific packages the codebase needs.

Each service's package is installed individually using the following naming format:

@aws-sdk/client-<service name>

This scopes down the install considerably: we import only the desired service client and the functions it provides that we would like to work with.
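Purely to illustrate the convention (this is not an official SDK helper), the mapping from a service name to its V3 package name looks like:

```javascript
// Illustrative only: derives the AWS SDK V3 package name for a service
// following the @aws-sdk/client-<service name> convention.
function v3PackageName(serviceName) {
    return "@aws-sdk/client-" + serviceName.toLowerCase();
}

// e.g. v3PackageName("s3")       -> "@aws-sdk/client-s3"
//      v3PackageName("DynamoDB") -> "@aws-sdk/client-dynamodb"
```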

In this case, we install the @aws-sdk/client-s3 package in order to work with the AWS S3 service.

Each package contains functions which correspond to the various actions that can be performed with the given service.

For instance, when working with the S3 service, we can GET, PUT, and DELETE objects stored inside a bucket. We can also perform actions on the bucket itself, such as creating and deleting buckets.

In the codebase, we import DeleteObjectCommand and PutObjectCommand which correspond to the PUT and DELETE object actions.

We also import the S3Client and pass in an object containing the region from where we would like to access the bucket as well as a credentials object which contains the AWS IAM identity access id and secret key enabling programmatic access.

The S3Client exposes a promise-based send() function, to which we pass a command object; the command itself takes an object containing the details pertaining to the action (bucket name, object key, object data, etc.).

We simply resolve the promise with then()/catch() handlers, which indicate whether or not the specific request completed successfully.
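As an alternative to the .then()/.catch() chains in the controller, the same promises can be resolved with async/await inside a try-catch block. Below is a sketch of that style, using a stubbed client in place of a real S3Client so the example runs without AWS credentials:

```javascript
// Stub client standing in for S3Client so the example runs without AWS
// credentials; its send() is promise-based, like the real client's.
const stubClient = {
    send: (command) => Promise.resolve({ ok: true, command })
};

// async/await alternative to the .then()/.catch() chains in the controller
async function deleteObject(client, command) {
    try {
        await client.send(command);
        return { status: 200, message: "Object deleted successfully!" };
    } catch (err) {
        return { status: 400, message: "Could not delete object. " + err };
    }
}
```

The behavior is identical; which style to use is a matter of preference.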

We follow this exact process in both the uploadPhotoController() function as well as the deletePhotoController() function.

That is all there is to it. Most, if not all, AWS services are modularized this way in SDK V3.

Feel free to explore other services such as DynamoDB, RDS, and so on. We have explored working with each of these services in the past using SDK V2.

AWS provides extensive documentation related to SDK V3 so feel free to check it out here.

Demo Time!

The demo is straightforward. We will follow the same procedure we did when working with the S3 bucket using AWS SDK V2.

Before proceeding, make sure your AWS IAM identity is set up with the appropriate permissions enabling S3 access. Also make sure you have created an S3 bucket and assigned it the appropriate bucket and CORS policies.

All this is covered in the aforementioned tutorial (AWS SDK V2) so feel free to refer to that before proceeding with the demo.
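For reference, a bucket CORS configuration along these lines is typical for this kind of setup. This is an example only (the exact policy from the tutorial may differ), and the allowed origin should match your front-end URL:

```json
[
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["GET", "PUT", "POST", "DELETE"],
        "AllowedOrigins": ["http://localhost:3000"],
        "ExposeHeaders": []
    }
]
```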

We will be working with the backend_v3 directory as that contains the AWS SDK V3 code which we would like to test.

Make sure you have the correct dependencies installed in each of the backend_v3 and frontend directories by running npm install.

You will also need to add a .env file in the root location of the backend_v3 directory containing the following:

AWS_ACCESS_ID=''
AWS_SECRET_KEY=''
AWS_REGION=''
AWS_S3_BUCKET_NAME=''
.env file containing AWS IAM and S3 credentials for programmatic access

As usual, the back-end server will run on port 5000 and the front-end server will run on the default port 3000.

Running npm start in each directory in a separate terminal should yield the following:

Home page of the React-Node.js-AWS S3 demo application

Navigating to the upload picture section, you should see the following:

Upload picture page allowing users to upload a picture with a click of a button

All you need to do is click the button. The application will make an API call to the back-end and upload a photo.

The photo is a sunflower patch located at /backend/util/sunflower.png.

Assuming everything was set up correctly, you should be notified like this:

Upload picture page notifying user that the object was uploaded successfully

If you head over to your S3 bucket using the AWS console, you should notice a new object uploaded with a unique identifier as its name and .png as its extension:

Object uploaded to the S3 bucket and visible from the console

The key we use to upload objects makes use of the uuid library. We can see that in action here.

You can select the object to evaluate details related to it.

Now, copy the id from the object name (without the extension), head over to the delete picture section of the application, and paste the id into the form:

Object stored in the S3 bucket successfully deleted

Head back to your S3 bucket and refresh the page to see the list of bucket objects stored:

An empty S3 bucket

You should notice that the lone object has been removed from the S3 bucket.

That concludes the demo!

Conclusion

All in all, we did a deep dive into AWS SDK V3 and looked at the changes made since the previous version (V2).

We worked with modularized packages, service-specific commands for the S3 service, credential setup via S3Client, and more.

Feel free to explore the AWS SDK V3 in greater detail. Each AWS service comes with its own modularized package containing helpful functions for completing tasks programmatically.

In the list below, you will find links to the GitHub repository (used in this article), AWS SDK V3 documentation, AWS SDK V3 GitHub repository, and a Node.js/AWS S3 guide:

AWS SDK V2 is expected to be in maintenance mode by the end of 2023. From here on out, we will be using AWS SDK V3 when working with AWS services.

I hope you found this article helpful and look forward to more in the future.

Thank you!


Abdullah Muhammad

Blogger. Software Engineer. Designer.
