AWS Code and Build Services, with Names and Descriptions

Mukund Mundhra
9 min read · May 28, 2021

So to start with, let’s first discuss the release process to understand things better. It’s basically divided into four primary parts: source, build, test, and production.

Firstly, the source stage deals with the source code: it identifies the Java and other formatted files, and a new code review takes place. Thereafter we move on to the build stage, where the code compiles and unit testing takes place. Next comes the test stage, which includes integration tests with other systems, load testing, UI tests, and penetration testing. Finally, we move on to production, where deployment to the production environments takes place. To put it another way: during a software release, AWS CodePipeline works from source all the way to production. For the specifics, AWS CodeCommit works during the source stage, AWS CodeBuild works during the build stage, third-party testing toolkits are available for the test stage, and AWS CodeDeploy comes into the picture while deploying. Okay, so now let’s understand the AWS Code Services one by one, which are:

  1. AWS CodePipeline
  2. AWS CodeDeploy
  3. AWS CodeCommit
  4. AWS CodeBuild

— AWS CodePipeline

So when we talk about AWS in particular, we come across something called CodePipeline, which lets us simplify our processes by creating channels, or a pipeline, where automation takes place. In general terms, CodePipeline is nothing but a continuous delivery service, and we can use it to model, visualize, and automate the steps required while releasing software. To understand this better, let’s first understand continuous delivery and integration. What actually happens is that the gap between the developers and the operations team is reduced by automating the process of integration and delivery. On a bigger scale, say there are sixty to eighty developers working on a project in parallel, and a certain set of resources is shared by everyone. A problem that might occur in such a setup: say one developer is working on a piece of code and another developer is working on it as well, and there is a central system where all of this is stored. Now assume both of them make a change and save it to the system. When reviewing, the developers will need a fresh copy of that code, and which version to select would be ambiguous. So what if the central system were smart enough that, whenever code is submitted, it runs tests and checks whether that version is the more relevant one? That would let developers access the best piece of code on review. This process of committing the code, integrating it, and automating the whole workflow — so that the code keeps flowing and gets delivered and deployed to production with the required tests — is called continuous integration and delivery. Once a piece of code is ready to move to the production environment, it gets continuously deployed to the end customer.

Getting back to our CodePipeline: automation gets really important, and once we use the service, it:

  • lets us monitor the processes in real-time which is super important and helpful as it saves lots of time.
  • also, it ensures a consistent release process as deploying a server is also a time-consuming task and this helps to automate that too.
  • not to forget the speed-up in delivery it provides, while improving quality.
  • at any point in time, the pipeline history can be looked into (all the processes that have happened). We get a constant update on what is happening at each stage, failures can be detected easily, and edits to that part can be made without issues.

Let’s now talk a bit about the architecture. Basically, there are developers who work on various pieces of code, with continuous fixes and changes that need to be uploaded. For that we have services like CodeCommit, which gives us a source management system to take care of repositories: it connects with Git and manages it. On a change, it goes to the source stage, where the developer commits those changes. Then it goes into the build stage, where the source code compiles and gets tested, and then to the staging phase, where it is deployed and final testing takes place. After a manual check, the code gets deployed to a public server, where the general public can use it. If there are any changes that need to be made, feedback can be readily taken from the users and goes back to the developer, from where the cycle continues.

When the developer commits and the change goes to source, the data is stored as objects in a container called S3 (Simple Storage Service). If a change takes place, the data gets fetched from the storage container (i.e. S3), the changes are built, and a copy is maintained as a zip. The S3 bucket should generally be in the same region as our pipeline; multiple buckets can also be added in different regions. To carry all this out, we have services like CodeDeploy, CodeBuild, and CodeCommit in AWS.
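The architecture above — a source stage feeding a build stage, with artifacts parked in S3 — maps directly onto a pipeline definition. Here is a minimal sketch of what such a definition might look like; the pipeline, repository, bucket, and role names are placeholders, not anything from this article:

```json
{
  "pipeline": {
    "name": "my-demo-pipeline",
    "roleArn": "arn:aws:iam::111111111111:role/CodePipelineServiceRole",
    "artifactStore": { "type": "S3", "location": "my-pipeline-artifact-bucket" },
    "stages": [
      {
        "name": "Source",
        "actions": [{
          "name": "FetchSource",
          "actionTypeId": { "category": "Source", "owner": "AWS", "provider": "CodeCommit", "version": "1" },
          "configuration": { "RepositoryName": "my-demo-repo", "BranchName": "main" },
          "outputArtifacts": [{ "name": "SourceOutput" }]
        }]
      },
      {
        "name": "Build",
        "actions": [{
          "name": "CompileAndTest",
          "actionTypeId": { "category": "Build", "owner": "AWS", "provider": "CodeBuild", "version": "1" },
          "configuration": { "ProjectName": "my-demo-build" },
          "inputArtifacts": [{ "name": "SourceOutput" }],
          "outputArtifacts": [{ "name": "BuildOutput" }]
        }]
      }
    ]
  }
}
```

Notice how the stages hand artifacts to each other by name, and how the artifact store is exactly the S3 bucket discussed above.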

— AWS CodeCommit

So when we talk about moving our code to a central place, continuously committing it, and getting, let’s say, the most optimal copy — CodeCommit helps us manage all our repositories in a much better way. It connects with Git, which in itself is a central place to store everything, from where we can push and pull anytime based on our needs, work on a copy, and submit it back to the central server. CodeCommit basically lets us work with Git without worrying about multiple things: it takes care of authorization, pulling, and much more. From a broader perspective, it is a secure, scalable, and managed Git source control service. It has no repository size limit, and it inherits the scalability, availability, and durability of Amazon S3. We can say it is a version control service provided by Amazon that we can use to store and manage assets, including documentation, source code, or any kind of file. It’s more like an infrastructure that houses all the assets. Coming to the benefits, these include:

  • storing any code securely at any point in time.
  • ensures easy collaborative work under let’s say a security group.
  • easily scalable.
  • makes integration with third-party tools and stuff really easy.
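Because CodeCommit speaks plain Git, wiring a local repository to it is ordinary Git configuration. A rough sketch, where the region and repository name are placeholders (the `aws codecommit credential-helper` line is the usual way HTTPS auth is wired up, and it assumes the AWS CLI is installed — so treat this as a sketch, not a tested end-to-end flow):

```shell
# Create a local repository and point it at a (hypothetical) CodeCommit remote.
mkdir demo-repo
git init demo-repo
# Let the AWS CLI supply short-lived credentials for HTTPS pushes/pulls.
git -C demo-repo config credential.helper '!aws codecommit credential-helper $@'
git -C demo-repo config credential.UseHttpPath true
git -C demo-repo remote add origin https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-demo-repo
git -C demo-repo remote -v   # after this, `git push origin main` would publish commits
```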

— AWS CodeBuild

As the name suggests, CodeBuild helps the user automate the building process, where the code gets compiled and tested, while making sure that artifacts and copies of the code are maintained in S3.

It’s about rapidly building and testing code without the factors that slow the build process down: a fully managed continuous integration service that compiles code, runs tests, and handles artifacts with ease. It is also super easy to set up, with the ability to use either a custom environment or one of the preconfigured environments. In a broader sense, CodeBuild is a fully managed build service that produces software packages, so we don’t have to provision and manage our own build servers. It scales continuously and can process multiple builds concurrently, which can be a huge plus for large-scale companies. Another big plus is that clients can provide a custom build environment via a Docker image: the application designer can define a custom build environment using a Docker container image, which incorporates an operating system, programming language, and the necessary tools. It charges only for the compute minutes you actually use, and it integrates with CodePipeline and Jenkins. The next question that comes to mind is: how does this whole process work?

So it first downloads the source code, then executes the commands configured in the buildspec inside containers that are temporary — new ones are created for every build. It streams the build output to the console and CloudWatch Logs, and finally uploads the generated artifacts to the S3 bucket.
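The commands mentioned above live in a buildspec file, usually `buildspec.yml` at the repository root. A minimal sketch for a hypothetical Java/Maven project — the runtime choice, Maven layout, and artifact name are assumptions for illustration, not from this article:

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      java: corretto11        # pick one of the preconfigured runtimes
  build:
    commands:
      - mvn package           # compile the code and run the unit tests
artifacts:
  files:
    - target/my-app.jar       # uploaded to the pipeline's S3 artifact bucket
```

Each phase’s commands run in order inside the temporary build container, and anything listed under `artifacts` is what ends up in S3.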

Now, what exactly is meant by building our code?

The answer is that building code refers to languages that require compiled binaries, like .NET, Go, Java, and Scala, or the iOS languages (Swift, Objective-C). Some languages don’t require building; these are interpreted languages like Node.js, Python, Ruby, and PHP, which can be deployed directly.

With AWS CodeBuild, the AWS CLI (an AWS tool that lets a developer control Amazon’s public cloud services from the command line), the AWS Management Console, software development kits, and programming interfaces work together to show detailed information about every build, including the commit ID, start and end time, status, and much more. The AWS Key Management Service is also used to encrypt the build artifacts.

AWS Key Management Service -

It is used to encrypt data, and its main purpose is to store and manage all the encryption keys. To understand this better, let’s talk about why we encrypt data in the first place. Say the database server gets hacked; the hacker now has full access to the clients’ sensitive data. If it is stored as plain text, the hacker can do whatever he wants with it and exploit the data, but if the data is encrypted, he will have a tough time decrypting it even after the hack. In server-side encryption we let AWS manage the keys for us, which basically uses KMS. The keys that KMS manages are called Customer Master Keys.

AWS makes sure that these encryption keys never leave the hardware appliance, so to access them, the KMS APIs are used.

— AWS CodeDeploy

As we know, deployment is not an easy task. If a user is working on a project, he/she is already taking care of many tasks, like managing the code and working with the database, and if on top of all this they have to set up the server and look into the whole deployment process, it gets really hard. So CodeDeploy is a deployment service that automates application deployment on different platforms. We have the option of putting the application on an EC2 instance, which means it runs on the cloud platform; we can also get it working on-premises; then there is the Lambda option, for functions that serve specific purposes; and Amazon ECS, a container service that lets us create containers that can run on any platform that has a container runtime or an agent that helps run a container. What we mean is that CodeDeploy can automate deploying an application to all of these platforms. We can deploy our code, serverless Lambda functions, web configuration files, executables and packages, and also scripts. The code we have goes through multiple revisions, because we are talking about continuous integration and deployment; these revisions are our applications, which we host on a particular platform. The platform is nothing but instances, and these instances can be located by applications and deployment configurations via key-value pairs or details. So, moving on, how do we deploy? Through the deployment configuration, which covers what to deploy, how to deploy, and finally where to deploy.

Now let’s talk about the components that CodeDeploy has to offer. Firstly, the compute platform: the infrastructure that hosts the application, in this case web servers or maybe virtual machines (EC2 / container services). Secondly, deployment types and groups: when we have multiple instances or virtual machines running for an application, these handle the settings and configuration during deployment — things like load balancing, and understanding what we need to deploy and how we need to deploy it. Thirdly, IAM and service roles: since we are deploying the application to some node, we give the application some kind of access; IAM specifies those roles, and the service role tells CodeDeploy what it may do on our behalf and how.
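For an EC2 or on-premises deployment, the "what and how to deploy" is captured in an AppSpec file (`appspec.yml`) bundled with each revision. A rough sketch — the source/destination paths and script names here are purely illustrative:

```yaml
version: 0.0
os: linux
files:                          # what to deploy: files from the revision
  - source: /build
    destination: /var/www/my-app
hooks:                          # how to deploy: scripts run at lifecycle events
  BeforeInstall:
    - location: scripts/stop_server.sh
      timeout: 60
  AfterInstall:
    - location: scripts/start_server.sh
      timeout: 60
```

The CodeDeploy agent on each instance in the deployment group reads this file, copies the listed files into place, and runs the hook scripts at the named lifecycle events.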
