How to use AWS SSM Parameter Store in AWS Elastic Beanstalk instances?

Reading Time: 9 minutes

Almost every developer uses environment variables. They are excellent for defining values that change from one environment to another. Unfortunately, they don't scale well. There are ways to make them scale, but none of them are enjoyable. Here's how AWS can replace environment variables (or at least reduce your reliance on them): the AWS Systems Manager Parameter Store makes building modern cloud applications simple.

In a real cloud setup, a Parameter Store like this is used for a variety of purposes, including supplying configuration to CloudFormation templates. If you use container technologies like ECS or EKS/Kubernetes, these parameters can also be exposed to individual containers as environment variables.

In this blog, we will cover:

  • What is AWS Systems Manager, its functions & use cases
  • AWS Systems Manager Parameter Store, its features, & types
  • What is AWS Elastic Beanstalk?
  • Hands-on
  • Conclusion

What is AWS Systems Manager?

AWS Systems Manager is a set of tools that can assist you in managing the infrastructure and applications that are hosted on the AWS Cloud. Systems Manager makes it easier to manage applications and resources, speeds up the process of identifying and fixing operational issues, and aids in the safe, scalable management of your AWS resources.

Functions of Systems Manager

Systems Manager groups its capabilities into several functional categories, such as operations management, application management, change management, and node management.

To know more about AWS Systems Manager, you can check our blog: How to run commands remotely on an EC2 instance using AWS Systems Manager

Common use cases of SSM

  • Storing configuration for runtime initialization of Docker containers
  • Keeping secrets for Lambda functions and applications
  • Referencing SSM parameters in CloudFormation templates

What is a Parameter?

Any piece of information saved in the AWS Parameter Store, such as a block of text, a list of names, a password, an AMI ID, a licensing key, and so on, is referred to as a parameter. This information can be referenced securely and centrally in your scripts, commands, and SSM documents.

AWS Systems Manager Parameter Store

AWS Systems Manager is a tool created to make it easier for you to manage sizable groups of servers installed in the cloud. For instance, it offers remote access to systems, scalability for various administrative activities, remote command execution, and security and patch updates.

Additionally, it offers a function known as the Parameter Store. A great place to keep centralized data like API keys, database strings, passwords, and other configuration information is the parameter store.

Parameter Store Types

There are three types of parameters, as mentioned below:

  • String: Strings are precisely what you would imagine. Any block of text, such as Hello World, test, or wow, this is a fantastic blog article, is a string.
  • StringList: Again, StringList is quite understandable. A StringList is a group of strings that have been separated by commas. Examples of string lists include Tiger,Lion,Puma and Venus,Neptune,Saturn.
  • SecureString: API keys and other sensitive data are stored in SecureString. AWS Key Management Service-managed keys are used to encrypt the data included in a SecureString parameter.
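For illustration, a parameter of each type can be created with the AWS CLI. The names, values, and label below are made up, and the commands require AWS credentials:

```shell
# Hypothetical parameter names and values, shown for each of the three types.
aws ssm put-parameter --name /my-app/greeting --type String --value "Hello World"
aws ssm put-parameter --name /my-app/planets --type StringList --value "Venus,Neptune,Saturn"
aws ssm put-parameter --name /my-app/api-key --type SecureString --value "s3cr3t"
# Attach a label to the latest version so it can be filtered later
# (the hands-on section below filters on a label in the same way):
aws ssm label-parameter-version --name /my-app/api-key --labels YourLabel
```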

How can a company benefit from a Parameter Store?

These benefits are provided by Parameter Store:

  • Use a secure, scalable, hosted secrets management service with no servers to manage
  • Improve your security posture by separating your data from your code
  • Store configuration data and encrypted strings in hierarchies, with version tracking
  • Control and audit access at granular levels
  • Parameter Store is hosted across multiple Availability Zones within an AWS Region, so parameters are stored durably and safely

Who ought to utilize the Parameter Store?

  • Anyone who uses AWS and wishes to handle configuration data in a centralized manner.
  • Software developers who want to store credentials and configuration data and reference them securely.
  • Administrators who want to be notified when their passwords and secrets are updated, or when they are not.

Features of Parameter Store

  • Change notification: Both parameters and parameter policies provide the ability to configure change notifications and launch automatic actions.
  • Organize & control access: You can tag each of your parameters separately to make it easier for you to recognize one or more parameters depending on the tags you’ve given them. You may, for instance, tag parameters for particular environments, divisions, users, groups, or times. By establishing an AWS Identity and Access Management (IAM) policy that details the tags that a user or group can access, you can also limit access to parameters.
  • Label versions: By defining labels, you can give different iterations of your parameter an alias. When there are numerous variants of a parameter, labels might assist you to recall what it is for.
  • Data validation: You can create parameters that reference an Amazon Elastic Compute Cloud (Amazon EC2) instance, and Parameter Store validates them to ensure that the expected resource type is referenced, that the resource exists, and that the customer has permission to use it.
  • Reference secrets: When using other AWS services that already enable references to Parameter Store parameters, such as AWS Secrets Manager, you can get Secrets Manager secrets thanks to the integration between Parameter Store and AWS Secrets Manager.

What is AWS Elastic Beanstalk?

AWS Elastic Beanstalk is an easy-to-use service for deploying and scaling web applications and services developed with Java, .NET, PHP, Node.js, Python, Ruby, Go, and Docker on familiar servers such as Apache, Nginx, Passenger, and IIS.

Simply upload your code, and Elastic Beanstalk handles everything from capacity provisioning, load balancing, and auto scaling to application health monitoring. You retain full control over the AWS resources powering your application and can access them at any time.

There is no additional charge for Elastic Beanstalk; you pay only for the AWS resources needed to store and run your applications.

If you want to know more about AWS Elastic Beanstalk, refer to our blog: How to set up a continuous deployment pipeline to deploy versions of an application on AWS Elastic Beanstalk using AWS CodePipeline (Part 1)?


For this example, we will use a Node.js application deployed on Elastic Beanstalk.

To keep vital information such as API keys and login credentials safe, it is good practice not to include it in your code.

The typical best practice is to use a .env file and add it to .gitignore so that the file is not tracked. It could be a disaster for your API keys, for example, to end up in a public repository.

The most common way to do this is to manually copy the .env file to the server you are deploying the application to, then export the variables in the shell running the “start” script so that the Node.js application can access them at runtime.

Elastic Beanstalk provides a way to add environment variables that the application you are deploying will use.

Problem Statement

Elastic Beanstalk environment properties have a combined size limit of 4,096 bytes. Once that limit is reached, you cannot add further environment variables, which is frustrating.

Hands-on: Using SSM Parameter Store in Elastic Beanstalk Instances

For this solution, we will use Platform Hooks combined with a pre-deployment script to create a .env file for PM2 to use. To know more about platform hooks, refer here. The pre-deployment script in our case runs in the pipeline which creates a .env file and we’ve zipped it together with the other artifacts which will be deployed on Elastic Beanstalk.

Create a “.platform” folder at the root of your application repository, with the following structure:

`-- .platform/
    |-- hooks/                # Application deployment hooks
    |   `-- postdeploy/
    |       `-- <your hook script>
    `-- confighooks/          # Configuration deployment hooks
        `-- postdeploy/
            `-- <your hook script>
The file is as follows:


#!/bin/bash
# exit on any failure
set -e
# keep track of the last executed command
trap 'last_command=$current_command; current_command=$BASH_COMMAND' DEBUG
# echo a status message before exiting (fires on both error and success)
trap 'echo "\"${last_command}\" command completed with exit code $?."' EXIT

echo "Running script to fetch parameter store values and add them to the /opt/elasticbeanstalk/deployment/env file."

# Check the Elastic Beanstalk environment properties to find out which path
# to use for the parameter store values to fetch. Only the parameters under
# that path will be fetched, allowing each Beanstalk config to specify a
# different path if desired.
apt-get install -y jq   # use "yum install -y jq" on Amazon Linux platforms

readarray eb_env_vars < /opt/elasticbeanstalk/deployment/env
for i in "${eb_env_vars[@]}"; do
    if [[ $i == *"parameter_store_path"* ]]; then
        parameter_store_path=$(echo "$i" | grep -Po "([^\=]*$)")
    fi
done

if [ -z "${parameter_store_path+x}" ]; then
    echo "Error: parameter_store_path is unset on the Elastic Beanstalk environment properties."
    echo "You must add a property named parameter_store_path with the path prefix to your SSM parameters."
    exit 1
fi
echo "Success: parameter_store_path is set to '$parameter_store_path'"

# Create a copy of the environment variable file.
cp /opt/elasticbeanstalk/deployment/env /opt/elasticbeanstalk/deployment/custom_env_var

# Fetch the parameters under the configured path and append them to the
# copy as KEY=VALUE lines. Note that each Name keeps its full path prefix.
aws ssm get-parameters-by-path \
    --path "$parameter_store_path" \
    --parameter-filters Key=Label,Values=YourLabel,Option=Equals \
    --with-decryption \
    --recursive \
    --region us-east-1 \
    --output json \
    | jq -r '.Parameters[] | "\(.Name)=\(.Value)"' >> /opt/elasticbeanstalk/deployment/custom_env_var

# Replace the environment file with the augmented copy and export the values.
cp /opt/elasticbeanstalk/deployment/custom_env_var /opt/elasticbeanstalk/deployment/env
export $(cat /opt/elasticbeanstalk/deployment/env | xargs)

# Remove the temporary working file.
rm -f /opt/elasticbeanstalk/deployment/custom_env_var
# Remove duplicate files upon deployment.
rm -f /opt/elasticbeanstalk/deployment/*.bak
N.B.: parameter_store_path must be set in the Elastic Beanstalk environment properties.
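To see how the hook pulls that path out of the environment properties file, here is the extraction step in isolation, with a made-up property line:

```shell
#!/bin/bash
# A made-up line as it would appear in /opt/elasticbeanstalk/deployment/env
line="parameter_store_path=/my-app/production"
# The hook keeps everything after the last '=' (requires GNU grep for -P)
parameter_store_path=$(echo "$line" | grep -Po "([^\=]*$)")
echo "$parameter_store_path"
# -> /my-app/production
```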

To access the Elastic Beanstalk environment properties, click Configuration in the left navigation, then click Edit under the Software category.

The above solution will still spin up an unhealthy instance on Elastic Beanstalk; this is where the pre-deployment script comes in.

The pre-deployment script is as follows:

#!/bin/bash
# exit on any failure
set -e
# keep track of the last executed command
trap 'last_command=$current_command; current_command=$BASH_COMMAND' DEBUG
# echo a status message before exiting (fires on both error and success)
trap 'echo "\"${last_command}\" command completed with exit code $?."' EXIT

# Install the build and CLI dependencies (the pipeline image is Debian-based).
export DEBIAN_FRONTEND=noninteractive
apt-get update && \
apt-get install -y \
    apt-utils \
    curl \
    git \
    python3 python3-dev python3-pip libpython3-dev \
    awscli \
    rpl \
    build-essential \
    libkrb5-dev \
    locales \
    jq \
    zip unzip && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* && \
localedef -i en_US -c -f UTF-8 -A /usr/share/locale/locale.alias

export LANG=en_US.UTF-8
export PATH=/root/.local/bin:$PATH
python3 --version

# Install pip for the current user (the download URL was omitted in the
# original; the standard get-pip.py location is assumed here).
curl -O https://bootstrap.pypa.io/get-pip.py
python3 get-pip.py --user
pip --version
pip install awsebcli --upgrade --user
pip3 install --upgrade awscli

# Fetch all labeled parameters and write them to a .env file, which is
# zipped together with the deployment artifacts.
touch .env
aws ssm get-parameters-by-path \
    --path / \
    --parameter-filters Key=Label,Values=YourLabel,Option=Equals \
    --with-decryption \
    --recursive \
    --region us-east-1 \
    --output json \
    | jq -r '.Parameters[] | "\(.Name)=\(.Value)"' >> .env
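The jq filter at the end of the script converts the CLI's JSON response into KEY=VALUE lines. With a made-up sample response, the transformation looks like this:

```shell
#!/bin/bash
# A made-up sample of the JSON that get-parameters-by-path returns
json='{"Parameters":[{"Name":"/my-app/DB_HOST","Value":"localhost"},{"Name":"/my-app/DB_PORT","Value":"5432"}]}'
echo "$json" | jq -r '.Parameters[] | "\(.Name)=\(.Value)"'
# -> /my-app/DB_HOST=localhost
#    /my-app/DB_PORT=5432
```

Note that each Name keeps its full path prefix, so the resulting variable names will include it unless you strip the prefix yourself.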

Last but not least, we have to update the “start” script in package.json to declare the variables in the shell environment for PM2 to use (your script might be different):

"start": "export $(cat .env | xargs) && node dist/main",
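The export pattern in that start script can be exercised locally; the variable names and values below are invented:

```shell
#!/bin/bash
# Write a sample .env file (made-up values)
printf 'DB_HOST=localhost\nDB_PORT=5432\n' > .env
# The same pattern the "start" script uses to load the file into the shell
export $(cat .env | xargs)
echo "$DB_HOST:$DB_PORT"
# -> localhost:5432
```

Note that this pattern breaks if a value contains spaces; `set -a; . ./.env; set +a` is a more robust alternative.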

To confirm that the scripts have been executed, check the Elastic Beanstalk logs.

Click the request logs dropdown and pick either full logs or the last 100 lines (full logs are more detailed and categorized, hence recommended). Click Download, and a zip file will be saved to your computer; extract it to view its contents.

Two log files will be of interest: eb-engine.log and eb-hooks.log. eb-engine.log is the go-to if the deployment fails because a script did not execute successfully, while eb-hooks.log keeps track of all the commands executed by the platform hooks.


In this blog, we explored what a parameter is, AWS Systems Manager with its functions and common use cases, the AWS Systems Manager Parameter Store with its types and features, and Elastic Beanstalk. We also demonstrated how to supply Parameter Store values to a Node.js application deployed on Elastic Beanstalk. We will come up with more such use cases in our upcoming blogs.

Meanwhile …

If you are an aspiring AWS Professional and want to explore more about the above topics, here are a few of our blogs for your reference:

How to set up a continuous deployment pipeline to deploy versions of an application on AWS Elastic Beanstalk using AWS CodePipeline (Part 2)?

How to create CI/CD workflow using AWS CodeStar?

Stay tuned to get all the updates about our upcoming blogs on the cloud and the latest technologies.

Keep Exploring -> Keep Learning -> Keep Mastering 

At Workfall, we strive to provide the best tech and pay opportunities to kickass coders around the world. If you’re looking to work with global clients, build cutting-edge products and make big bucks doing so, give it a shot today!
