AWS SDK 2.0 - S3 Object Operations using Spring Boot

Carvia Tech | August 05, 2019 | 4 min read | AWS Tutorials


In this tutorial, we will walk through the new AWS SDK 2.0 for performing object-level operations on an S3 bucket. We will specifically cover the PutObject, GetObject and GetUrl operations on S3 objects using AWS SDK 2.0.

Table of contents

  1. Configure Gradle Build

  2. Creating Singleton Bean for S3 service client

  3. Uploading an object to S3 Bucket

  4. Fetching object from S3 bucket

  5. S3Utilities to getUrl for an object

Why AWS SDK 2.0

The AWS SDK for Java 2.0 is a major rewrite of the version 1.x code base. It’s built on top of Java 8+ and adds several frequently requested features. These include support for non-blocking I/O and the ability to plug in a different HTTP implementation at run time.

For more information, see the AWS SDK for Java 2.0 Developer Guide.
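The non-blocking API is exposed through S3AsyncClient, and the HTTP layer can be swapped when the client is built. The snippet below is a minimal sketch (not from the original article) assuming the software.amazon.awssdk:netty-nio-client module is on the classpath and a hypothetical bucket and key; credentials come from the default provider chain.

AsyncS3Example.java
import java.util.concurrent.CompletableFuture;
import software.amazon.awssdk.core.ResponseBytes;
import software.amazon.awssdk.core.async.AsyncResponseTransformer;
import software.amazon.awssdk.http.nio.netty.NettyNioAsyncHttpClient;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3AsyncClient;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

public class AsyncS3Example {

    public static void main(String[] args) {
        // Plug in the Netty-based non-blocking HTTP implementation explicitly
        S3AsyncClient s3AsyncClient = S3AsyncClient.builder()
                .region(Region.AP_SOUTH_1)                      // assumed region
                .httpClientBuilder(NettyNioAsyncHttpClient.builder())
                .build();

        // getObject returns a CompletableFuture instead of blocking the caller
        CompletableFuture<ResponseBytes<GetObjectResponse>> future = s3AsyncClient.getObject(
                GetObjectRequest.builder().bucket("my-bucket").key("my-key").build(), // hypothetical bucket/key
                AsyncResponseTransformer.toBytes());

        future.whenComplete((bytes, error) -> {
            if (error != null) {
                error.printStackTrace();
            } else {
                System.out.println("Downloaded " + bytes.asByteArray().length + " bytes");
            }
            s3AsyncClient.close();
        });
    }
}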

1. Gradle dependencies

AWS has changed the dependency versions and naming conventions in SDK 2.0.

You can follow the AWS Developer Guide for more details

To use the AWS SDK for Java in your Gradle project, import the SDK's Bill of Materials (BOM) using Gradle's platform() dependency so that all SDK modules resolve to consistent versions.

build.gradle
group 'foo.bar'
version '1.0'

apply plugin: 'java'

sourceCompatibility = 1.8

repositories {
    mavenCentral()
}

dependencies {
	implementation platform('software.amazon.awssdk:bom:2.5.29')
	implementation 'software.amazon.awssdk:s3'  (1)
}
(1) Gradle automatically resolves the correct version of your SDK dependencies using the information from the BOM; there is no need to specify a version for individual service client libraries.

2. Creating a singleton bean for the S3 service client

To make requests to AWS, you first need to create a service client object (S3Client for example). AWS SDK 2.0 provides service client builders to facilitate creation of service clients.

AWS SDK 2.0 has changed the class naming convention and removed the Amazon prefix from most classes; AmazonS3Client has been replaced with S3Client. When using Spring Boot, we can simply create a bean of S3Client and use it as and when required.

Creating Bean for S3 Service Client
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;

@Configuration
public class S3Config {

    // externalized configuration (property names are illustrative, adjust to your setup)
    @Value("${aws.s3.region}")
    private String region;

    @Value("${aws.accessKey}")
    private String accessKey;

    @Value("${aws.secretKey}")
    private String secretKey;

    @Bean(destroyMethod = "close")
    public S3Client s3Client() {
        return S3Client.builder()
                .region(Region.of(region))
                .credentialsProvider(StaticCredentialsProvider.create(AwsBasicCredentials.create(accessKey, secretKey)))
                .build();
    }
}
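With the @Value placeholders above, a matching application.properties could look like the following. The property names and values are illustrative (not defined by the SDK or Spring Boot) and simply need to match the placeholders used in the configuration and service classes.

application.properties
aws.s3.region=ap-south-1
aws.s3.bucket=my-example-bucket
aws.accessKey=YOUR_ACCESS_KEY
aws.secretKey=YOUR_SECRET_KEY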
Service Client Lifecycle

Service clients in the SDK are thread-safe. For best performance, treat them as long-lived objects. Each client has its own connection pool resource that is released when the client is garbage collected. The clients in the AWS SDK for Java 2.0 now extend the AutoCloseable interface. As a best practice, explicitly close a client by calling the close method.
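Outside of a Spring-managed context, the AutoCloseable contract lets you scope a short-lived client with try-with-resources. A minimal sketch, assuming credentials come from the default provider chain:

ShortLivedClientExample.java
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.ListBucketsResponse;

public class ShortLivedClientExample {

    public static void main(String[] args) {
        // The client (and its connection pool) is released automatically at the end of the block
        try (S3Client s3 = S3Client.builder().region(Region.AP_SOUTH_1).build()) {
            ListBucketsResponse buckets = s3.listBuckets();
            buckets.buckets().forEach(b -> System.out.println(b.name()));
        }
    }
}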

3. Uploading object to S3 bucket

Now that the service client bean is ready, we can inject it into a service and start uploading an object to the S3 bucket with a specified key name.

S3Service - uploading an object
import java.net.URL;
import java.nio.ByteBuffer;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.http.MediaType;
import org.springframework.stereotype.Service;
import software.amazon.awssdk.core.exception.SdkClientException;
import software.amazon.awssdk.core.exception.SdkServiceException;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetUrlRequest;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.s3.model.PutObjectResponse;

@Service
public class S3Service {

    private static final Logger logger = LoggerFactory.getLogger(S3Service.class);

    @Autowired
    private S3Client s3Client;

    @Value("${aws.s3.bucket}")
    private String bucket;

    public String upload(String keyName, byte[] attachment) {
        try {
            logger.info("Uploading a PDF to S3 - {}", keyName);
            PutObjectResponse putObjectResult = s3Client.putObject(
                    PutObjectRequest.builder()
                            .bucket(bucket)
                            .key(keyName)
                            .contentType(MediaType.APPLICATION_PDF.toString())
                            .contentLength((long) attachment.length)
                            .build(),
                    RequestBody.fromByteBuffer(ByteBuffer.wrap(attachment)));
            final URL reportUrl = s3Client.utilities().getUrl(GetUrlRequest.builder().bucket(bucket).key(keyName).build());
            logger.info("putObjectResult = {}", putObjectResult);
            logger.info("reportUrl = {}", reportUrl);
            return reportUrl.toString();
        } catch (SdkServiceException ase) {
            logger.error("Caught an SdkServiceException, which means the request made it to Amazon S3, "
                    + "but was rejected with an error response for some reason.", ase);
            logger.info("Error Message: {}", ase.getMessage());
            logger.info("Key: {}", keyName);
            throw ase;
        } catch (SdkClientException ace) {
            logger.error("Caught an SdkClientException, which means the client encountered an internal error "
                    + "while trying to communicate with S3, such as not being able to access the network.", ace);
            logger.error("Error Message: {}, {}", keyName, ace.getMessage());
            throw ace;
        }
    }
}

S3Client exposes an S3Utilities object (via s3Client.utilities()) that helps us get the URL for a given S3 object.
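For reference, the same lookup can be isolated into a small helper; S3UrlResolver below is a hypothetical class (not from the original article) that wraps the S3Utilities call already shown in the upload method. Note that getUrl builds a plain, unsigned URL, so the object must be publicly readable for the link to be usable.

S3UrlResolver.java
import java.net.URL;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.S3Utilities;
import software.amazon.awssdk.services.s3.model.GetUrlRequest;

public class S3UrlResolver {

    private final S3Client s3Client;

    public S3UrlResolver(S3Client s3Client) {
        this.s3Client = s3Client;
    }

    // Returns the (unsigned) URL of an object; no request is made to S3 for this
    public String resolveUrl(String bucket, String keyName) {
        S3Utilities utilities = s3Client.utilities();
        URL url = utilities.getUrl(GetUrlRequest.builder()
                .bucket(bucket)
                .key(keyName)
                .build());
        return url.toString();
    }
}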

4. Get object from S3 bucket

We can compose a GetObjectRequest using the builder pattern, specifying the bucket name and key, and then use the S3 service client to get the object and save it into a byte array or a file (a file-based variant is sketched after the example below).

Get object from S3 bucket
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
import software.amazon.awssdk.core.ResponseBytes;
import software.amazon.awssdk.core.exception.SdkClientException;
import software.amazon.awssdk.core.exception.SdkServiceException;
import software.amazon.awssdk.core.sync.ResponseTransformer;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

@Service
public class S3Service {

    private static final Logger logger = LoggerFactory.getLogger(S3Service.class);

    @Autowired
    private S3Client s3Client;

    @Value("${aws.s3.bucket}")
    private String bucket;

    public byte[] getObject(String keyName) {
        try {
            logger.info("Retrieving file from S3 for key: {}/{}", bucket, keyName);
            ResponseBytes<GetObjectResponse> s3Object = s3Client.getObject(
                    GetObjectRequest.builder().bucket(bucket).key(keyName).build(),
                    ResponseTransformer.toBytes());
            return s3Object.asByteArray();
        } catch (SdkServiceException ase) {
            logger.error("Caught an SdkServiceException, which means the request made it to Amazon S3, "
                    + "but was rejected with an error response for some reason: " + keyName, ase);
            throw ase;
        } catch (SdkClientException ace) {
            logger.error("Caught an SdkClientException, which means the client encountered an internal error "
                    + "while trying to communicate with S3, such as not being able to access the network: " + keyName, ace);
            throw ace;
        }
    }
}
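The same request can also be streamed straight to disk instead of into memory. The method below is a sketch (assumed to live in the same S3Service as above) that uses ResponseTransformer.toFile; note that the SDK refuses to overwrite an existing file, so the target path must not already exist.

Downloading an object to a file (sketch)
import java.nio.file.Path;
import java.nio.file.Paths;
import software.amazon.awssdk.core.sync.ResponseTransformer;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

    // Downloads the object directly to the given file path (the file must not already exist)
    public GetObjectResponse getObjectToFile(String keyName, String targetFile) {
        Path destination = Paths.get(targetFile);
        return s3Client.getObject(
                GetObjectRequest.builder().bucket(bucket).key(keyName).build(),
                ResponseTransformer.toFile(destination));
    }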
