
In this article we are going to explore Git plumbing operations such as cat-file, write-tree, commit-tree, update-ref and more. These are the low-level instructions that the common Git commands use under the hood, and exploring them gives a much better understanding of Git’s inner workings.

You might in fact be familiar with the add, commit, checkout, merge, etc. Git commands; under the hood, however, Git uses these so-called ‘plumbing’ operations. The commands just mentioned can be seen as high-level abstractions and combinations of low-level instructions. In this article, we are first…
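To give a taste of what is coming, here is a minimal sketch of how a handful of plumbing commands compose into a commit; the file name and branch are placeholders, not taken from the article:

    blob=$(echo 'hello' | git hash-object -w --stdin)            # store a blob, get its SHA-1
    git update-index --add --cacheinfo 100644 "$blob" hello.txt  # stage it in the index
    tree=$(git write-tree)                                       # snapshot the index as a tree
    commit=$(echo 'initial commit' | git commit-tree "$tree")    # wrap the tree in a commit
    git update-ref refs/heads/main "$commit"                     # point the branch at it
    git cat-file -p "$commit"                                    # inspect the new commit object

This is roughly what git add followed by git commit does for you.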


Technologies used: Terraform, Bitbucket, Docker, AWS, Jenkins

In this final part of our journey, we are finally going to implement the CI/CD pipeline using a Jenkinsfile. As mentioned in the first part of the tutorial, this pipeline will consist of several stages which will be triggered once a push is made to the Bitbucket repo. We will also set up the webhook that lets Bitbucket tell Jenkins to start a new build on each push.
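As a preview of the webhook step, the hook can also be registered through the Bitbucket Cloud REST API rather than the UI; a hypothetical sketch in which the workspace, repo slug and Jenkins URL are placeholders:

    # Register a push webhook pointing at the Jenkins Bitbucket plugin endpoint
    curl -X POST \
      -u "$BITBUCKET_USER:$BITBUCKET_APP_PASSWORD" \
      -H 'Content-Type: application/json' \
      -d '{"description": "Trigger Jenkins on push", "url": "http://my-jenkins.example.com/bitbucket-hook/", "active": true, "events": ["repo:push"]}' \
      https://api.bitbucket.org/2.0/repositories/my-workspace/my-repo/hooks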

Part 1 (here)→ Set up the project by downloading and looking at the Web App which will be used to test our infrastructure and pipeline. …


Technologies used: Terraform, Bitbucket, Docker, AWS, Jenkins

In this fifth part of the tutorial, we are going to implement the user data of both the Web App instance and the Jenkins instance. For the Jenkins instance, this will require creating various scripts which will be uploaded to the appropriate S3 bucket and then pulled down and run from the user data, in order to avoid the 16k-character limit. For more information on this topic, you can check this article.
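The upload side, for instance, can be a single sync of the scripts directory to the bucket (the bucket name below is a placeholder):

    # Push the bootstrap scripts to S3; the user data will pull them back down at boot
    aws s3 sync ./scripts/ s3://my-jenkins-userdata-bucket/scripts/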

Part 1 (here)→ Set up the project by downloading and looking at the Web App which will be used to test our infrastructure and pipeline. …


Technologies used: Terraform, Bitbucket, Docker, AWS, Jenkins

In this fourth part, we are going to complete the construction of our AWS infrastructure. What remains to be created (a sketch of the Secrets Manager step follows the list):

  • S3 Buckets for logs and to store the Jenkins user data scripts (which we are going to implement in the fifth part)
  • ECR Repository where the Docker images will be uploaded
  • IAM Policies to give the EC2 instances the right access to other AWS resources
  • Secrets Manager Secret to store our Bitbucket SSH keys (which we are going to create) and which will be needed by the Jenkins instance to pull down the repository at the start of the…
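As a hypothetical sketch of that last step (key and secret names are placeholders), the key pair can be generated and the private half stored in Secrets Manager from the CLI:

    # Generate the Bitbucket SSH key pair and store the private key as a secret
    ssh-keygen -t ed25519 -f bitbucket_key -N '' -C 'jenkins@bitbucket'
    aws secretsmanager create-secret \
      --name jenkins/bitbucket-ssh-key \
      --secret-string file://bitbucket_key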


Technologies used: Terraform, Bitbucket, Docker, AWS, Jenkins

In this third part of our tutorial we are going to set up the AWS infrastructure using Terraform. We are going to create all the elements needed to secure our instances while still letting them communicate with the outside world.
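For reference, this is the loop we will keep running after each change to the Terraform code:

    terraform init      # download providers and configure the backend (first run only)
    terraform plan      # preview the changes against the current AWS state
    terraform apply     # create or update the infrastructure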

Part 1 (here)→ Set up the project by downloading and looking at the Web App which will be used to test our infrastructure and pipeline. Here we also create & test suitable Dockerfiles for our project and upload everything to Bitbucket.

Part 2 (here)→ Set up Slack and create a Bot which will be used by Jenkins to send…


Technologies used: Terraform, Bitbucket, Docker, AWS, Jenkins

In this second part we are going to set up Slack so that it is ready to be plugged into the pipeline we will create in the next parts. We are going to create an account, a workspace and an app (a bot), and mess around a bit to get some confidence with the Slack API.
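As an example of that messing around, here is a quick smoke test of the bot token against the Slack Web API; the channel name is a placeholder:

    # Post a test message through chat.postMessage using the bot token
    curl -X POST https://slack.com/api/chat.postMessage \
      -H "Authorization: Bearer $SLACK_BOT_TOKEN" \
      -H 'Content-Type: application/json' \
      -d '{"channel": "#jenkins-notifications", "text": "Hello from the pipeline bot!"}'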

Part 1 (here)→ Set up the project by downloading and looking at the Web App which will be used to test our infrastructure and pipeline. Here we also create & test suitable Dockerfiles for our project and upload everything to Bitbucket.

Part 2 (current article)…


Technologies used: Terraform, Bitbucket, Docker, AWS, Jenkins

Hello guys and welcome to this tutorial in which I will be guiding you through the creation of a complete CI/CD pipeline governed by Jenkins, with all the infrastructure on AWS.

Let me first lay down a summary of what we are going to build and the different steps we’ll be taking.

Part 1 (current article) → Set up the project by downloading and looking at the Web App which will be used to test our infrastructure and pipeline. Here we also create & test suitable Dockerfiles for our project (a quick local smoke test is sketched after this list) and upload everything to Bitbucket.

Part 2 (here)→…
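The Dockerfile step from Part 1 can be smoke-tested locally before anything is pushed to Bitbucket; a minimal sketch in which the image tag and port are assumptions:

    # Build the image from the project's Dockerfile and run it locally
    docker build -t webapp:local .
    docker run --rm -p 3000:3000 webapp:local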


In this brief article, I would like to explain a workaround for the 16k-character limit on the user data of an EC2 instance.

It is pretty straightforward, and the bottom line is: upload your configuration files to an S3 bucket, then fetch and run them from the instance’s user data.

Setup

I assume you already have an AWS infrastructure managed by your Terraform code, with an EC2 instance into which you want to put a long user data configuration.

The situation might be something like the following:
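In essence, an aws_instance whose inline user data has outgrown the limit. The fix shrinks the user data to a few lines that pull the real scripts down from S3 and execute them; a hypothetical sketch with placeholder bucket and paths:

    #!/bin/bash
    # Keep user data tiny: fetch the real configuration from S3 and run it
    aws s3 cp s3://my-config-bucket/scripts/ /opt/bootstrap/ --recursive
    chmod +x /opt/bootstrap/*.sh
    for script in /opt/bootstrap/*.sh; do
        "$script"
    done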


Initial Stage

From the Downloaded files, we can see that the website is using MongoDB as a database, in particular, we can see that in challenge/routes/index.js there is:

So we can see that the parameters username and password passed in the form are not sanitized before querying MongoDB, meaning that we can exploit this using NoSQL injection techniques.

We can first try to gain unauthorized access by passing an object in the password field with the following content:

  • Using Content-Type: application/x-www-form-urlencoded:
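A hypothetical version of that request (the URL and field names are assumptions); it relies on the extended qs syntax, where bracketed keys are parsed into objects:

    # password becomes the object { "$ne": "x" }, which matches any user
    # whose password is not the literal string "x"
    curl -X POST http://target/login \
      -H 'Content-Type: application/x-www-form-urlencoded' \
      --data 'username=admin&password[$ne]=x'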

Initial Stage

By inspecting the source files, let’s first note that the flag filename is not simply flag but is generated randomly, starting from flag and then appending 5 random characters, as we can see in entrypoint.sh:
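A hypothetical reconstruction of the relevant line (not the challenge’s actual code, just the behaviour described above):

    # Rename the flag, appending 5 random alphanumeric characters to its name
    mv /flag "/flag$(tr -dc 'a-zA-Z0-9' < /dev/urandom | head -c 5)"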

Kevin De Notariis

Theoretical Physicist and Software Developer
