DevOps (a clipped compound of development and operations) is a culture, movement, or practice that emphasizes collaboration and communication between software developers and other information-technology (IT) professionals while automating the process of software delivery and infrastructure changes. It aims to establish a culture and environment where building, testing, and releasing software can happen rapidly, frequently, and more reliably.


C is for Culture: We need to develop an agile, collaborative culture to practice DevOps
A is for Automation: Automate every step of software development and deployment
M is for Measurement: We can’t improve what we don’t measure. We can measure performance, services, infrastructure, and even business metrics.
S is for Sharing: We need shared goals and practices. Everyone needs to work together toward the same goal.


Golden Image: Packer can build golden images for different build targets such as VirtualBox, VMware, AWS, Google Cloud, and Microsoft Azure.

A Packer template is a JSON file that defines one or more builds by configuring the various components of Packer.
Builders are the Packer components that build an image for a specific platform. Each builder takes a source image appropriate to that platform.
Provisioners are the Packer components that install and configure software inside a running machine before it is turned into a static image. They do the major work of making the image contain useful software; shell scripts are a common example.
Post-Processors are the Packer components that take the result of a builder or of another post-processor and process it to create a new artifact. Examples are compress, which compresses artifacts, and vagrant, which turns a build result into a Vagrant box.
Artifacts are the results of a single build, usually a set of IDs or files that represent a machine image. Every builder produces a single artifact, and a build is a single task that eventually produces an image.
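Putting those components together, a minimal Packer template might look like the sketch below. The region, AMI ID, and script name are illustrative placeholders, not values from these notes; a post-processors section would follow the same pattern.

```json
{
  "builders": [{
    "type": "amazon-ebs",
    "region": "us-east-1",
    "source_ami": "ami-0123456789abcdef0",
    "instance_type": "t2.micro",
    "ssh_username": "ubuntu",
    "ami_name": "golden-image-{{timestamp}}"
  }],
  "provisioners": [{
    "type": "shell",
    "script": "install-app.sh"
  }]
}
```

Running `packer build template.json` runs the builder, applies the shell provisioner, and emits the artifact (here, an AMI ID).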


Vagrant is an open-source software product for building and maintaining portable virtual development environments. The core idea behind its creation is that environment maintenance becomes increasingly difficult in a large project with multiple technology stacks. Vagrant manages all the necessary configuration for developers, avoiding unnecessary maintenance and setup time and increasing development productivity.
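As a sketch, a minimal Vagrantfile that gives every developer the same base box and provisioning might look like this (the box name and bootstrap script are illustrative):

```ruby
# Vagrantfile: one shared definition of the development environment
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"                   # same base box for everyone
  config.vm.network "forwarded_port", guest: 80, host: 8080
  config.vm.provision "shell", path: "bootstrap.sh"   # illustrative setup script
end
```

With this file checked into the repository, `vagrant up` builds the identical VM on every workstation.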

In most cases there are six environments that we have to maintain:
1. Local environment (developer’s workstation)
2. Development environment (sandbox)
Developers usually work on personal branches here.
3. Integration environment (Continuous Integration (CI) build target)
This is where the app gets built and tested and changes are merged into the main working branch.
4. Test/QA environment (for functional, performance, QA, and UAT testing)
5. Staging environment
A mirror of production.
6. Production environment


Continuous Integration (CI)

An automated process for merging changes into the existing code base, then building the result and running tests.
– Maintain a code repository
– Automate the build
– Test the build
– Commit your changes often
– Build each commit
– Fix bugs right away
– Test in a production clone environment

Continuous Integration Tools:
– Jenkins (open source, written in Java)
– Travis CI
– CircleCI
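With a hosted CI tool, the practices above reduce to a small configuration file in the repository. For example, a minimal Travis CI configuration might look like this (the dependency file and test command are illustrative):

```yaml
# .travis.yml - build and test every commit automatically
language: python
python:
  - "3.5"
install:
  - pip install -r requirements.txt   # illustrative dependency file
script:
  - pytest                            # fail the build if any test fails
```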


Unit tests: Tests written alongside code, to test the behavior of individual units such as functions or classes.
Regression tests: Tests written as part of debugging, which verify that a bug is fixed. Kept in the test suite to ensure that the bug is not reintroduced.
Smoke testing: Preliminary test of a system just after build, to make sure it runs at all – for instance doesn’t crash on boot.
System Integration testing: Tests of a whole system, including dependencies such as databases or APIs, under a test load.
Automated Acceptance testing: Scripted tests that verify that user-facing features work as planned.
Manual QA testing: Approval process integrated with Continuous Deployment.
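The difference between the first two kinds is mostly intent, not mechanics. As a language-neutral sketch in shell (the `add` function is hypothetical), a unit test checks a unit against an expected value, and a regression test pins down a previously fixed bug:

```shell
#!/bin/sh
# A tiny "unit": adds two integers.
add() {
  echo $(( $1 + $2 ))
}

# Unit test: verify the unit's behavior in isolation.
[ "$(add 2 3)" = "5" ] && echo "unit test passed"

# Regression test: suppose negative inputs were once broken;
# this test stays in the suite so the bug cannot be reintroduced.
[ "$(add -1 1)" = "0" ] && echo "regression test passed"
```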

Continuous Delivery (CD)

Automating the release pipeline all the way down to the software release, so that every build that passes its tests is ready to be deployed.


Jenkins is an open-source continuous integration software tool written in the Java programming language for testing and reporting on isolated changes in a larger code base in real time. The software enables developers to find and solve defects in a code base rapidly and to automate testing of their builds.
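Jenkins builds are commonly described in a Jenkinsfile checked into the repository. A minimal declarative-pipeline sketch (stage names and commands are illustrative, not from these notes):

```groovy
// Jenkinsfile: one stage per CI step, run on every commit
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh 'make build' }   // illustrative build command
        }
        stage('Test') {
            steps { sh 'make test' }    // a failing test fails the build
        }
    }
}
```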


Monitoring Data Sources:

1. External probing (Test queries)
2. Application level stats (queries per second, latency)
3. Environment Stats (JVM memory profile)
4. Host or Container Stats (load average, disk errors)
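Host-level stats like load average are cheap to collect; on Linux the kernel exposes them directly under /proc. A minimal sketch:

```shell
#!/bin/sh
# Read the 1-minute load average straight from the kernel (Linux only).
load=$(cut -d' ' -f1 /proc/loadavg)
echo "1-minute load average: $load"
```

A monitoring agent does essentially this on a timer and ships the sample to a time-series store such as Graphite or InfluxDB.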

Monitoring Data Products:

1. Alerting
2. Performance Analysis
3. Capacity Prediction
4. Growth Measurement
5. Debugging metrics

Monitoring Systems:

  • Nagios
  • Grafana
  • OpenTSDB
  • InfluxDB
  • Graphite


Other DevOps Tools:

  • Chef
  • Ansible
  • Docker


Docker: “Build, Ship, and Run Any App, Anywhere”

It allows you to package up an application or service with all of its dependencies into a standardized unit. This unit is typically called a Docker image. Everything you need to run that service is included: the code, runtime, system libraries, and anything else you would install on the system to make it run if you weren’t using Docker. All of those components are layered on top of each other. Docker images are meant to be run, and since everything is packaged together, an image always runs the same way regardless of where it is run.

  • Packages a service into a standardized unit
  • Everything is included to make it run
  • Runs the same way on multiple machines
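That standardized unit is described by a Dockerfile, where each instruction adds one of the layers mentioned above. A minimal sketch (the base image, files, and entry point are illustrative):

```dockerfile
# Dockerfile: code + runtime + system libraries packaged as one image
FROM python:3.5-slim                  # runtime layer (illustrative base image)
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # dependency layer
COPY . .                              # application code layer
CMD ["python", "app.py"]              # illustrative entry point
```

`docker build -t myapp .` produces the image; `docker run myapp` then creates a container from it.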

Docker Image vs Docker Container

  • What we previously described is a Docker image. When we run a Docker image, it creates a Docker container.
  • Docker Image is like a class and Docker Container is the instance of the class.
  • A single Docker image can be run multiple times and each time it is run, it creates a unique Docker Container that runs on its own.

Virtual Machine vs Docker Container

Screen Shot 2016-09-23 at 9.53.18 AM.png

A Virtual Machine allows you to isolate resources in a similar fashion to a Docker container, but a few differences make VMs much less efficient. VMs require an entire guest OS for each app that you want to isolate; a VM can take many seconds to boot, and each one can be gigabytes in size. Containers share the host OS kernel, and isolation is done using cgroups (control groups: a Linux kernel feature that limits, accounts for, and isolates resource usage) and other Linux kernel features. Docker is very lightweight: a container typically starts in milliseconds, and running a container takes very little additional disk space, since disk space is used only once, for the Docker image, and shared by its containers.

  • Open source – Based on open standards
  • Secure – Containers are isolated

Benefits of using Docker

  1. Scale up quickly – All needed packages and libraries are baked into the image
  2. Expand your development team painlessly – No need to spend days setting up a new developer’s environment
  3. Use whatever technology fits best – Since everything is isolated, each service can use its own stack
  4. Cross-environment consistency – It runs the same way in different environments
  5. Docker is a framework – It acts as an abstraction layer between the application and the infrastructure it runs on


Docker Compose

Compose is a tool for defining and running multi-container Docker applications. With Compose, you use a Compose file to configure your application’s services. Then, using a single command, you create and start all the services from your configuration.
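A minimal Compose file for a two-service application might look like this (service names, ports, and images are illustrative):

```yaml
# docker-compose.yml: a web app plus its database as one application
version: "2"
services:
  web:
    build: .                # build the web image from the local Dockerfile
    ports:
      - "8000:8000"
    depends_on:
      - db
  db:
    image: postgres:9.5     # illustrative database image
```

A single `docker-compose up` then builds, creates, and starts both services together.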