How to set up a continuous deployment pipeline to deploy versions of an application on AWS Elastic Beanstalk using AWS CodePipeline (Part 2)?

Reading Time: 11 minutes

AWS Elastic Beanstalk allows us to quickly deploy and manage applications without having to worry about the underlying infrastructure. One of its major benefits is that it reduces the complexity of managing infrastructure without taking away control over it, making it easy for developers to focus on building the application rather than on the deployment architecture.

We can use AWS Elastic Beanstalk to deploy and scale web applications through an easy-to-use interface. Elastic Beanstalk supports the deployment of Java, .NET, PHP, Python, Node.js, Ruby, and other applications.

This variety of options makes it a popular service amongst developers for quickly deploying web applications. All the service requires is the code base uploaded as a .zip file; Elastic Beanstalk then automatically handles the deployment process, taking care of everything required to maintain and manage the infrastructure.

The service does not charge you a fixed amount; you only pay for the resources needed to run your applications.

AWS CodePipeline is a web service that automates the release process of an application. A pipeline consists of stages such as Source, Build, Approval (optional), and Deploy. The Source stage can be integrated with third-party tools, AWS CodeCommit, or Amazon S3 to pick up updates to the code base.

The Build stage can be integrated with Amazon S3 to store build logs, and the Deploy stage can be integrated with multiple services, including AWS Elastic Beanstalk. By automating the deployment of new application versions, you can push version updates quickly and make frequent releases easier.

Refer to part 1 of the blog here.

AWS Elastic Beanstalk – Recap


Amazon ECS (Amazon Elastic Container Service) is a highly scalable, fast container management service that simplifies starting, stopping, and managing containers in a cluster. The containers you use to run individual tasks, or tasks within a service, are defined in a task definition.

Simply upload your code, and Elastic Beanstalk will take care of everything else, including capacity provisioning, load balancing, auto-scaling, and application health monitoring. You have complete control over the AWS resources that power your application and can access them at any time. Elastic Beanstalk is free; you only pay for the resources you use.

Hands-on


In this hands-on, we will see how we can make use of AWS CodePipeline to set up a continuous deployment pipeline that deploys versions of an application on AWS Elastic Beanstalk. We will first navigate to the AWS Elastic Beanstalk dashboard to explore the pricing structure and its use cases, and then create a new environment, configuring the advanced settings in a private VPC.

We will then navigate to the demo source code URL to download the code base, traverse to the Amazon S3 dashboard, create a new bucket with versioning enabled, and upload the downloaded zip file of the code base to the Amazon S3 bucket from the local machine. Moving on to the AWS CodePipeline dashboard, we will create a new pipeline with the Amazon S3 bucket that contains the code base as the source and the AWS Elastic Beanstalk environment on which the code is to be deployed as the destination. Uploading the code file to the S3 bucket will trigger the pipeline, initiating the deployment on the Elastic Beanstalk environment.

We will then navigate to the application URL to test the code changes and check whether the version deployment was successful. Next, we will extract the code files from the zip file to the local machine, alter the code in the main file, navigate to the newly created Amazon S3 bucket, and re-upload the new zip file. After uploading, we will be able to see two versions of the zip file. Uploading the new code file to the S3 bucket will then trigger the pipeline again and initiate the deployment of the new code base on the Elastic Beanstalk environment. Finally, we will navigate to the application URL once more to verify that the new version was deployed successfully.

To implement this, we will do the following:

  • Navigate to the dashboard after logging into your AWS console.
  • Navigate to the Amazon Elastic Beanstalk dashboard.
  • Explore the Amazon Elastic Beanstalk dashboard to view its offerings and pricing structure.
  • Create a new environment and configure the advanced settings for the environment.
  • Select a VPC in which you want to create a new environment.
  • Navigate to the provided URL and download a demo source code.
  • Create a new bucket in the Amazon S3 console.
  • Upload the downloaded zip file to the bucket.
  • Navigate to the AWS CodePipeline dashboard.
  • Create a new pipeline with the required stages.
  • Wait for the code to be deployed to the Elastic Beanstalk environment.
  • Navigate to the Elastic Beanstalk environment dashboard and check the health status after the Deploy stage of the pipeline completes.
  • Click on the URL provided for the application version and test out the deployment.
  • Extract the code files to your local machine from the downloaded zip file.
  • Make new changes in the code base as required.
  • Add the newly created code file to the zip file.
  • Navigate to the S3 dashboard and upload the zip file with the new code base.
  • Check if the code pipeline is triggered.
  • Once the deployment completes, navigate back to the application version URL and check if the deployment was a success and if the new code is reflected on the UI.
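If you prefer scripting some of the steps below instead of clicking through the console, the sketches in this post use boto3, the AWS SDK for Python. As a starting point, the following illustrative snippet (not part of the original walkthrough) confirms that your credentials and default region are configured:

```python
# Minimal sanity check before starting: confirm which account and region
# your AWS credentials resolve to.
import boto3

session = boto3.session.Session()
sts = session.client("sts")

identity = sts.get_caller_identity()
print("Account:", identity["Account"])
print("Caller ARN:", identity["Arn"])
print("Default region:", session.region_name)
```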

Log in to the AWS console and navigate to the dashboard.

Search for Elastic Beanstalk and click on the service.

You will be navigated to the Elastic Beanstalk dashboard. Click on Environments in the left navigation pane.


Click on Create a new environment.

Select Web server environment from the drop-down menu and click Select.


Enter a name for the application; the Environment name will be pre-populated, and you can change it if needed.

If you have a registered domain, you can add the domain and a description under Environment information.


Select PHP as the platform from the dropdown; the platform version will be populated automatically.

Scroll down and click on Configure more options.

You can change the presets as per your preference.

Scrolling down, you will see there are configurations available for different sections. You may choose to alter the default configurations.


Click on Edit beside the Network section.

Select a VPC from the dropdown. Check the Public IP address option and select at least two instance subnets.


Similarly, for the database settings, select both subnets. Click on Save.

Once done, click on Create environment.

The environment creation process will be initiated. Wait a few minutes for the environment to be created.
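If you would rather script the environment creation, a rough boto3 equivalent of the console steps above might look like the sketch below. The application name, environment name, and the PHP solution stack string are placeholder assumptions, not values from this walkthrough; list the stacks available in your region and pick a current PHP one.

```python
# Hypothetical sketch: create the Elastic Beanstalk application and a PHP
# web server environment. All names below are placeholders.
import boto3

eb = boto3.client("elasticbeanstalk")

eb.create_application(ApplicationName="my-demo-app")

# Pick a current PHP platform for your region, e.g. by inspecting:
#   eb.list_available_solution_stacks()["SolutionStacks"]
solution_stack = "64bit Amazon Linux 2 v3.5.0 running PHP 8.1"  # assumed example string

eb.create_environment(
    ApplicationName="my-demo-app",
    EnvironmentName="my-demo-env",
    SolutionStackName=solution_stack,
    Tier={"Name": "WebServer", "Type": "Standard"},
)
```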


Meanwhile, navigate to the URL below. Go to the dist folder and select the zip file. Click on View raw, and a zip file of the code base will be downloaded to your system.

https://github.com/awslabs/aws-codepipeline-s3-aws-codedeploy_linux

Once the environment creation process completes, you will see a success message.

Refresh the page and you will see the environment created with the health status as Ok.
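Instead of refreshing the page, you can also poll the environment from a script. A small sketch, reusing the placeholder environment name from the earlier example:

```python
# Poll the environment until it reaches the Ready state.
import time

import boto3

eb = boto3.client("elasticbeanstalk")

while True:
    env = eb.describe_environments(EnvironmentNames=["my-demo-env"])["Environments"][0]
    print(env["Status"], env["Health"])
    if env["Status"] == "Ready":
        break
    time.sleep(15)
```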


Search for the S3 service in the search bar and navigate to its dashboard.

Click on Create bucket.

Enter a name for the bucket and select the AWS region.

Check the box for Block all public access and under Bucket versioning, select Enable.


Scroll down and click on Create bucket.

On success, you will see a confirmation message.

Open the newly created bucket and click on Upload.

Select the zip file that you downloaded above from the local machine and click on Upload.

On success, you will see the file added to your bucket.
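The bucket creation, versioning, and upload steps can also be scripted. In the sketch below, the bucket name, region, file name, and object key are placeholders rather than values from this walkthrough:

```python
# Hypothetical sketch: create a private, versioned S3 bucket and upload the
# sample bundle. Bucket name, region, file name, and key are placeholders.
import boto3

bucket = "my-codepipeline-source-bucket"
s3 = boto3.client("s3", region_name="us-east-1")

# Outside us-east-1, create_bucket also needs
# CreateBucketConfiguration={"LocationConstraint": "<region>"}.
s3.create_bucket(Bucket=bucket)

# Block all public access on the bucket.
s3.put_public_access_block(
    Bucket=bucket,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Enable versioning so re-uploading the same key creates a new version.
s3.put_bucket_versioning(
    Bucket=bucket,
    VersioningConfiguration={"Status": "Enabled"},
)

# Upload the downloaded sample zip.
s3.upload_file("sample-app.zip", bucket, "sample-app.zip")
```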

Now, search for AWS CodePipeline and click on the service.

Click on Create pipeline.


Enter a name for the pipeline and select the New service role option.

Under Advanced settings, choose the artifact store and encryption key options as needed (the defaults are fine for this walkthrough). Click on Next.


Select Amazon S3 as the source provider, select the newly created bucket, and add the object key (the name of the uploaded zip file). Click on Next and skip the Build stage.

For the Deploy stage, select AWS Elastic Beanstalk as the deploy provider, then select the region, the application name you created above, and the Elastic Beanstalk environment name. Click on Next.

Review the changes.

Click Create pipeline at the bottom of the page.
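For reference, a two-stage pipeline like the one configured above (an S3 source and an Elastic Beanstalk deploy, with no build stage) can also be defined with boto3. In this sketch the role ARN, bucket, object key, and application/environment names are placeholders; the service role must allow CodePipeline to read from the bucket and deploy to Elastic Beanstalk.

```python
# Hypothetical sketch of the two-stage pipeline; all names and the role ARN
# are placeholders, not values from this walkthrough.
import boto3

cp = boto3.client("codepipeline")

cp.create_pipeline(
    pipeline={
        "name": "my-beanstalk-pipeline",
        "roleArn": "arn:aws:iam::123456789012:role/my-codepipeline-role",
        "artifactStore": {"type": "S3", "location": "my-codepipeline-source-bucket"},
        "stages": [
            {
                "name": "Source",
                "actions": [
                    {
                        "name": "Source",
                        "actionTypeId": {
                            "category": "Source",
                            "owner": "AWS",
                            "provider": "S3",
                            "version": "1",
                        },
                        "configuration": {
                            "S3Bucket": "my-codepipeline-source-bucket",
                            "S3ObjectKey": "sample-app.zip",
                        },
                        "outputArtifacts": [{"name": "SourceOutput"}],
                    }
                ],
            },
            {
                "name": "Deploy",
                "actions": [
                    {
                        "name": "Deploy",
                        "actionTypeId": {
                            "category": "Deploy",
                            "owner": "AWS",
                            "provider": "ElasticBeanstalk",
                            "version": "1",
                        },
                        "configuration": {
                            "ApplicationName": "my-demo-app",
                            "EnvironmentName": "my-demo-env",
                        },
                        "inputArtifacts": [{"name": "SourceOutput"}],
                    }
                ],
            },
        ],
    }
)
```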


The pipeline will run: once the Source stage picks up the object from the S3 bucket, the deployment process will begin.


If you navigate back to the Beanstalk environment, the environment will start updating.

Once the deployment completes, you will see the updates under the Recent events tab. When the Health status changes to Ok, click on the environment URL below the environment name.

You will see that the application has been deployed and the sample page will load.
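To verify the deployment from a script rather than the browser, you can look up the environment's CNAME and request the page. A sketch, again using the placeholder environment name:

```python
# Fetch the environment URL and check that the deployed page responds.
import urllib.request

import boto3

eb = boto3.client("elasticbeanstalk")
env = eb.describe_environments(EnvironmentNames=["my-demo-env"])["Environments"][0]

url = "http://" + env["CNAME"]
with urllib.request.urlopen(url) as response:
    print(url, "->", response.status)
```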


Now, extract the index.html file from the downloaded zip file and make whatever changes you want in it. Once you have made the changes, add the updated file back into the same zip, replacing the old index.html file.
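If you prefer to rebuild and re-upload the zip programmatically rather than by hand, a sketch along these lines should work; the file names, bucket, and key are placeholders, and the new bundle is uploaded to the same object key so that the pipeline sees it as a new version. Otherwise, continue with the console steps below.

```python
# Hypothetical sketch: rebuild the zip with the edited index.html and upload
# it to the same S3 key. File names, bucket, and key are placeholders.
import zipfile

import boto3

SOURCE_ZIP = "sample-app.zip"       # original downloaded bundle
UPDATED_ZIP = "sample-app-new.zip"  # rebuilt bundle to upload
EDITED_FILE = "index.html"          # your locally edited copy

# Copy every entry except the old index.html, then add the edited one.
with zipfile.ZipFile(SOURCE_ZIP) as src, zipfile.ZipFile(
    UPDATED_ZIP, "w", zipfile.ZIP_DEFLATED
) as dst:
    for item in src.infolist():
        if item.filename.endswith("index.html"):
            continue
        dst.writestr(item, src.read(item.filename))
    dst.write(EDITED_FILE, "index.html")

# Upload to the same object key so a new version of the source object is created.
boto3.client("s3").upload_file(UPDATED_ZIP, "my-codepipeline-source-bucket", "sample-app.zip")
```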


Navigate to the S3 bucket dashboard. Click on Upload.

Upload the new zip file. Click on Upload.


Toggle Show versions on and you will see the new version of the zip file added to the bucket.

The pipeline will be triggered and will deploy the new version.
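You can also watch the pipeline's progress from a script; a sketch using the placeholder pipeline name:

```python
# Print the latest execution status of each stage in the pipeline.
import boto3

cp = boto3.client("codepipeline")
state = cp.get_pipeline_state(name="my-beanstalk-pipeline")

for stage in state["stageStates"]:
    status = stage.get("latestExecution", {}).get("status", "Unknown")
    print(stage["stageName"], "->", status)
```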

Once the deployment completes, navigate back to the same URL and refresh the page.

You will see the updated website reflecting the changes you made in the code base, confirming that the new version of the application was deployed.


Conclusion

In this blog, we saw how we can make use of AWS CodePipeline to set up a continuous deployment pipeline to deploy versions of an application on AWS Elastic Beanstalk.

We first navigated to the AWS Elastic Beanstalk dashboard to explore the pricing structure and its use cases, and then created a new environment, configuring the advanced settings in a private VPC. We then navigated to the demo source code URL to download the code base, traversed to the Amazon S3 dashboard, created a new bucket with versioning enabled, and uploaded the downloaded zip file of the code base to the Amazon S3 bucket from the local machine.

Moving on to the AWS CodePipeline dashboard, we created a new pipeline with the Amazon S3 bucket that contained the code base as the source and the AWS Elastic Beanstalk environment on which the code was to be deployed as the destination. Uploading the code file to the S3 bucket triggered the pipeline, initiating the deployment on the Elastic Beanstalk environment.

We then navigated to the application URL to test the code changes and check whether the version deployment was successful. Next, we extracted the code files from the zip file to the local machine, altered the code in the main file, navigated to the newly created Amazon S3 bucket, and re-uploaded the new zip file. After uploading, we saw the two versions of the zip file in the bucket.

Uploading the new code file to the S3 bucket triggered the pipeline again and initiated the deployment of the new code base on the Elastic Beanstalk environment. We then navigated to the application URL once more to verify that the new version of the application was deployed successfully.

We will discuss more use cases for the services used in our upcoming blogs. Stay tuned to keep getting all updates about our upcoming new blogs on AWS and relevant technologies.

Meanwhile …

Keep Exploring -> Keep Learning -> Keep Mastering

This blog is part of our effort towards building a knowledgeable and kick-ass tech community. At Workfall, we strive to provide the best tech and pay opportunities to AWS-certified talents. If you’re looking to work with global clients, build kick-ass products while making big bucks doing so, give it a shot at workfall.com/partner today.
