Today, the cost of Amazon Web Services (AWS) to the company is everyone’s concern. Be it management or QA, everybody wants to implement cost-effective measures. But the question is: how can we do it? Is it really possible?
From my brief experience with AWS services, I have identified a few things that lead to cost leakage. Let’s look at them in detail -
Running unused EC2 instances:
EC2 instances are the VMs created on the AWS platform. We need these instances to be operational to test our probe. Every running instance incurs a cost, even if we try to control it manually, i.e., by starting and stopping it. These instances run across 10 regions, and there is a fair chance that one might forget to shut them down. This becomes hard to monitor on every occasion, because several people are working on the same server simultaneously.
Every table created in AWS also adds cost, although the rate can vary depending upon your requirement.
Hit and Try:
Hit and Try is another area which leads to leakage. Take ECS as an example – to configure it, you need to deal with several other AWS services, such as EC2 instances, clusters, and Docker.
Sometimes, during investigations, we may create instances and images/repositories that incur a lot of cost when not in use. And since you are doing it for the first time, you don’t know which resources to keep running and which to shut down.
Environment setup takes time and money:
Setting up the environment can involve all of the above-mentioned points, which may increase the cost as well as prolong the process.
How can we automate all the processes and make them more efficient?
Let’s take examples from the following tasks in AWS services:
- Creating instances (EC2)
- Stopping instances (EC2)
- Deleting instances (EC2)
- Creating, deleting, and populating tables; batch writes; scans; get item, put item, etc. (DynamoDB)
What if we had something that could do all the above-mentioned tasks from the backend, without logging in to AWS from the web?
This is where Python and Boto come to the rescue!
What are Python and Boto?
- Python is a scripting language that can do almost anything, including backend automation, DB automation, and browser automation.
- Boto is a Python interface to Amazon Web Services: an integrated interface to current and future infrastructural services offered by AWS.
- Install Python 2.7; download it from https://www.Python.org/downloads/
- Open cmd
- Go to the Scripts directory: cd C:\Python27\Scripts
- Type pip install boto
Once you have Boto, you can just play with AWS.
That’s it! Your Python and Boto installation is complete.
Connecting to the AWS server and playing with the EC2 component
You can connect to the AWS server with a few simple commands; the connection requires your AWS credentials.
Once you are logged in, you can count the number of active instances.
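For example, connecting and counting the running instances might look like the sketch below, using boto’s EC2 module. The region name and the placeholder credentials are assumptions; boto can also pick up credentials from environment variables or its config file.

```python
import boto.ec2

# Connect to a region; the credentials here are placeholders -
# boto can also read them from the environment or ~/.boto.
conn = boto.ec2.connect_to_region(
    'us-east-1',                        # assumed region
    aws_access_key_id='YOUR_KEY',
    aws_secret_access_key='YOUR_SECRET',
)

# get_all_instances() returns reservations; each holds one or more instances.
reservations = conn.get_all_instances()
instances = [i for r in reservations for i in r.instances]
running = [i for i in instances if i.state == 'running']
print('Running instances: %d' % len(running))
```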
If you want to know the status of your instances and start or stop them, you can use functions like the ones below:
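A minimal sketch of such helpers with boto’s EC2 API (the region and the example instance id are assumptions):

```python
import boto.ec2

conn = boto.ec2.connect_to_region('us-east-1')  # assumed region

def instance_status(instance_id):
    """Return the current state of an instance: 'running', 'stopped', etc."""
    reservations = conn.get_all_instances(instance_ids=[instance_id])
    return reservations[0].instances[0].state

def start_instance(instance_id):
    """Start a stopped instance."""
    conn.start_instances(instance_ids=[instance_id])

def stop_instance(instance_id):
    """Stop a running instance so it no longer accrues compute cost."""
    conn.stop_instances(instance_ids=[instance_id])

# Example usage (the instance id is a placeholder):
# print(instance_status('i-0123456789abcdef0'))
```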
You can even launch a new instance:
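Launching might look like this with boto’s run_instances; the AMI id, key pair, security group, and region below are all placeholders:

```python
import boto.ec2

conn = boto.ec2.connect_to_region('us-east-1')  # assumed region

# Launch one instance; every parameter here is an assumed placeholder.
reservation = conn.run_instances(
    'ami-12345678',
    key_name='my-keypair',
    instance_type='t2.micro',
    security_groups=['default'],
)
instance = reservation.instances[0]
print('Launched instance %s' % instance.id)
```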
Working with DynamoDB
Creating tables manually and inserting/deleting/updating data are tedious tasks, especially when you have to generate the data in your tables in order to verify the metrics.
Here’s how we can do it automatically through Python –
Making a connection is mandatory before using any of Boto’s features.
Once the connection is established, you can obtain the list of already existing tables through the following command:
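A sketch of connecting to DynamoDB and listing the existing tables (the region is an assumption; credentials come from boto’s usual sources):

```python
import boto.dynamodb

# Connect to DynamoDB in an assumed region.
conn = boto.dynamodb.connect_to_region('us-east-1')

# List the tables that already exist in this region.
print(conn.list_tables())
```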
To create a table
You can create a table with the help of the create_schema and create_table methods, as specified below:
DynamoDB tables are created with the Layer2.create_table method. We first need to create a schema defining the table’s hash key element and the optional range key element.
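Using boto’s Layer2 interface, this might look as follows; the table name, attribute names, and throughput values are assumptions:

```python
import boto.dynamodb

conn = boto.dynamodb.connect_to_region('us-east-1')  # assumed region

# Define the schema: a hash key plus an optional range key.
schema = conn.create_schema(
    hash_key_name='employee_id',    # assumed attribute name
    hash_key_proto_value=str,
    range_key_name='department',    # assumed attribute name
    range_key_proto_value=str,
)

# Create the table with provisioned read/write capacity.
table = conn.create_table(
    name='employees',               # assumed table name
    schema=schema,
    read_units=5,
    write_units=5,
)
```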
You can add the data through table.new_item and then save it to DynamoDB through put():
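For example (the table name, key values, and attributes are placeholders):

```python
import boto.dynamodb

conn = boto.dynamodb.connect_to_region('us-east-1')  # assumed region
table = conn.get_table('employees')                  # assumed table name

# Build a new item and save it to DynamoDB.
item = table.new_item(
    hash_key='E1001',
    range_key='QA',
    attrs={'name': 'John Doe', 'experience_years': 5},
)
item.put()
```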
Similarly, you can retrieve data from the table automatically.
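Retrieval can be done by key with get_item, or over the whole table with scan; again, the table name and key values here are placeholders:

```python
import boto.dynamodb

conn = boto.dynamodb.connect_to_region('us-east-1')  # assumed region
table = conn.get_table('employees')                  # assumed table name

# Fetch one item by its hash and range keys...
item = table.get_item(hash_key='E1001', range_key='QA')
print(item['name'])

# ...or scan the whole table (reads every item; costly on large tables).
for item in table.scan():
    print(item)
```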
The above-mentioned examples demonstrate how effortlessly we can manage AWS. Once you have a basic understanding of Python and Boto, you are ready for smoother, hassle-free execution of these essential tasks. With this foundation, you can dive deeper into more advanced areas.