Image Resizer

Introduction

This project is my submission for the DevOps & Software Testing workshop at INSAT.

We were required to come up with a project, implement its Unit Tests, Integration Tests, and E2E Tests, and then implement a CI/CD pipeline.

Project

The project is a simple Spring Boot application that communicates with an AWS S3 bucket.

Through POST /uploadImage, an image embedded in the body of the request under the image key will be resized and saved in an S3 bucket.

Through GET /getImage/{image}, where image is the name of the image you sent, you can get your picture back.
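
For illustration, the two endpoints could be declared along these lines in a Spring controller (a minimal sketch; the class and method signatures here are assumptions, not the actual source):

import java.io.IOException;
import javax.imageio.ImageIO;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.multipart.MultipartFile;

@RestController
public class ImageController {

    @Autowired
    private ImageServiceS3 imageServiceS3; // the service that talks to S3

    // Resize the image sent under the "image" key and save it to the bucket
    @PostMapping("/uploadImage")
    public ResponseEntity<String> uploadImage(@RequestParam("image") MultipartFile image) throws IOException {
        imageServiceS3.uploadImage(image.getOriginalFilename(), ImageIO.read(image.getInputStream()));
        return ResponseEntity.ok("Image uploaded and resized");
    }

    // Send the previously uploaded (and resized) image back to the client
    @GetMapping(value = "/getImage/{image}", produces = MediaType.IMAGE_JPEG_VALUE)
    public ResponseEntity<byte[]> getImage(@PathVariable String image) throws IOException {
        return ResponseEntity.ok(imageServiceS3.getImage(image));
    }
}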

The project requires two environment variables, AWS_ACCESS_KEY and AWS_SECRET_KEY, to be able to connect to the S3 bucket.
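
A minimal sketch of how they could be wired into the AWS SDK client the service uses (the configuration class and the region are assumptions):

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class S3Config {

    @Bean
    public AmazonS3 amazonS3() {
        // Read the two required variables from the environment
        BasicAWSCredentials credentials = new BasicAWSCredentials(
                System.getenv("AWS_ACCESS_KEY"),
                System.getenv("AWS_SECRET_KEY"));
        return AmazonS3ClientBuilder.standard()
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .withRegion(Regions.EU_WEST_1) // assumption: the bucket's region
                .build();
    }
}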

The project contains:

  • .github: the pipelines.
  • assets: assets necessary for the README.
  • e2e: the E2E test.
  • infra: simple IaC using Terraform.
  • src: the Spring Boot code and tests.
  • images, resizedimages and retrive: directories that the project requires.
  • pom.xml: the XML file that contains information about the project and configuration details used by Maven to build it.

Tests

Unit Tests

Click here to go to the Unit Tests.

To mock the calls to S3, I used Mockito, a framework that allows the creation of test doubles (mock objects) in automated unit tests, typically for test-driven development (TDD) or behavior-driven development (BDD).

@Mock
AmazonS3 s3;

@Mock
private ImageServiceS3 imageServiceS3;

@Before
public void setUp() {
    MockitoAnnotations.openMocks(this); // initialize the @Mock fields
    ReflectionTestUtils.setField(imageServiceS3, // inject into this object
            "property", // assign to this field
            "value"); // object to be injected
}

@Test
public void uploadFileFromMultipartFileTestCase() throws IOException {
    Mockito.when(s3.putObject(anyString(), anyString(), anyString())).thenReturn(new PutObjectResult()); // mock the call to S3
    Mockito.when(imageServiceS3.getS3()).thenReturn(s3); // hand the mocked client to the service
    BufferedImage bufferedImage = ImageIO.read(Paths.get(resourcePath + "/test_image.jpg").toFile());
    Mockito.when(imageServiceS3.uploadImage(uploadedFileName, bufferedImage)).thenCallRealMethod(); // call the real method under test
    // rest of the test
}

Integration Tests

In this part, we test the interfaces, meaning everything related to the communication with the S3 bucket.

For that, we test retrieval and upload against a test bucket, which is simply another bucket, separate from the one the application uses.

You can find the tests here.
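
In outline, such a test could look like the following (the bucket name is an assumption, and the client is built from the default credentials chain):

import static org.junit.Assert.assertEquals;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import org.junit.Test;

public class ImageServiceS3IntegrationTest {

    private static final String TEST_BUCKET = "image-resizer-test-bucket"; // assumption

    // Real client, no mocks: this test actually talks to S3
    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

    @Test
    public void uploadThenRetrieveRoundTrip() {
        // Upload a small object to the test bucket...
        s3.putObject(TEST_BUCKET, "it-test-key", "hello");
        // ...read it back to verify the round trip...
        assertEquals("hello", s3.getObjectAsString(TEST_BUCKET, "it-test-key"));
        // ...and delete it so the test bucket stays empty
        s3.deleteObject(TEST_BUCKET, "it-test-key");
    }
}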

E2E Tests

As for the E2E test, since we only have a backend project, I opted for a Python script that tests the main workflow.

The script does the following (a sketch is shown after the list):

  1. Send POST /uploadImage with our test image.
  2. Check that the response status is 200 and that the response message is clear.
  3. Send GET /getImage/testImage.
  4. Check that the image is retrieved correctly and that it has been resized.
  5. Remove the image from S3 to clean everything up.
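
A condensed sketch of steps 1 to 4, assuming the app runs locally on port 8080 (the URL and file names are assumptions):

import requests
from PIL import Image

BASE_URL = "http://localhost:8080"  # assumption: where the app is reachable

original_size = Image.open("test_image.jpg").size

# 1. Send POST /uploadImage with the test image under the "image" key
with open("test_image.jpg", "rb") as f:
    response = requests.post(f"{BASE_URL}/uploadImage", files={"image": f})

# 2. Check the response status and message
assert response.status_code == 200, response.text

# 3. Send GET /getImage/{image} to get the picture back
response = requests.get(f"{BASE_URL}/getImage/test_image.jpg")
assert response.status_code == 200

# 4. Check that the retrieved image was actually resized
with open("retrieved.jpg", "wb") as f:
    f.write(response.content)
assert Image.open("retrieved.jpg").size != original_size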

To clean everything up, we use this method.

# Remove the local temp file, then delete the uploaded object from S3
def clean_up(s3):  # s3: a boto3 S3 client
    os.remove(TEMP_FILEPATH)
    s3.delete_object(Bucket=BUCKET_NAME, Key=FILENAME)

CI/CD Pipeline

Dockerfile

As you can see in the Dockerfile, I opted for a multi-stage build (a sketch follows the list).

  • Stage one:

    1. Copy pom.xml;
    2. Install the Maven dependencies;
    3. Copy ./src;
    4. Build the .jar file.
  • Stage two:

    1. Make the directories necessary for the project;
    2. Copy the .jar file from stage one;
    3. Expose the port and announce the env variables;
    4. Run the .jar file.
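
Put together, the Dockerfile follows roughly this shape (the base images and paths are assumptions, not the exact file):

# Stage one: build the .jar with Maven
FROM maven:3.8-openjdk-11 AS build
WORKDIR /app
# Copy pom.xml first and cache the dependencies before copying sources
COPY pom.xml .
RUN mvn dependency:go-offline
COPY ./src ./src
# Build the .jar (tests run in their own pipeline step)
RUN mvn package -DskipTests

# Stage two: run the .jar on a slim JRE image
FROM openjdk:11-jre-slim
WORKDIR /app
# Make the directories the project requires
RUN mkdir -p images resizedimages retrive
COPY --from=build /app/target/*.jar app.jar
# Expose the port and announce the env variables (values injected at run time)
EXPOSE 8080
ENV AWS_ACCESS_KEY="" AWS_SECRET_KEY=""
ENTRYPOINT ["java", "-jar", "app.jar"]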

Preparing the Infrastructure Using IaC

First, we need to prepare the EC2 instance.

For that, the infra directory contains the Terraform code to provision the EC2 instance, set up the Security Groups, and install Docker on the instance.

For all of this to work, we need to do the following.

  1. Prepare our RSA keys with ssh-keygen -t rsa -m PEM and put them in the ./infra/keys directory;
  2. Add the values to the terraform.tfvars file;
  3. Go to terraform.io and prepare the workspace;
  4. Generate an API access token in Terraform Cloud;
  5. Add the private key, the terraform.tfvars content, and the API access token to the GitHub secrets.

Here's an example of terraform.tfvars:

aws-region = "AWS region"
aws-access-key = "AWS access key"
aws-secret-key = "AWS secret key"
ec2-public-key = "The public key generated"
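
On the Terraform side, these values feed variable declarations along these lines (a sketch; the actual infra code may differ):

variable "aws-region" {
  description = "AWS region to provision the EC2 instance in"
  type        = string
}

variable "aws-access-key" {
  type      = string
  sensitive = true
}

variable "aws-secret-key" {
  type      = string
  sensitive = true
}

variable "ec2-public-key" {
  description = "Public half of the generated RSA key pair"
  type        = string
}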

And we're all set.

CI/CD Pipeline on Push to the Main Branch

  1. Tests: Run the Unit Tests and Integration Tests.
  2. Build and Release: Build the Docker image and push it to Dockerhub.
  3. E2E Tests: Run the Python script.
  4. Run the IaC: Apply the changes in the Terraform code.
  5. Deploy (see the sketch after this list):
    1. SSH into the EC2 instance.
    2. Kill the Docker container that's currently running and remove it.
    3. Pull the new image.
    4. Run the new image.
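
On the instance, the deploy step boils down to commands of this shape (the image and container names are assumptions):

# Kill and remove the currently running container (ignore errors on first deploy)
docker rm -f image-resizer || true
# Pull the image that was just pushed to Dockerhub
docker pull <dockerhub-user>/image-resizer:latest
# Run the new image with the S3 credentials injected
docker run -d --name image-resizer -p 8080:8080 \
  -e AWS_ACCESS_KEY="$AWS_ACCESS_KEY" -e AWS_SECRET_KEY="$AWS_SECRET_KEY" \
  <dockerhub-user>/image-resizer:latest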

(Figure: the CI/CD pipeline on push to main)

CI Pipeline on Pull Requests to Any Branch

A simple pipeline to run the Unit Tests and Integration Tests.

(Figure: the CI pipeline on pull requests)


Thank you for your time and don't forget to leave a 🌟