
mu - A full-stack DevOps on AWS framework

  •    Go

Amazon ECS (EC2 Container Service) provides an excellent platform for deploying microservices as containers. The challenge, however, is that there is a significant learning curve for microservice developers to deploy their applications efficiently. Specifically, they must learn to use CloudFormation to orchestrate the management of ECS, ECR, EC2, ELB, VPC, and IAM resources. Additionally, tools like CodeBuild and CodePipeline must be mastered to create a continuous delivery pipeline for their microservices. To address these challenges, this tool was created to simplify the declaration and administration of the AWS resources necessary to support microservices. Similar to how the Serverless Framework improved the developer experience of Lambda and API Gateway, this tool makes it easier for developers to use ECS as a microservices platform.

cloudformation_templates - AWS - CloudFormation Templates

  •    Shell

This repository contains a collection of general and specific Amazon Web Services CloudFormation template examples. The basic design is a layered approach, so there is less repeated content across the templates; you build a custom environment by picking the solution templates you wish to use. In other words, you won't see a VPC created over and over throughout the templates. You simply use the VPC template, then move on to the next piece you would like to create. The general design also leans toward not having to refactor a template to fit your account or environment: by supplying the parameters from the console or CLI, you should be able to use a template without editing it. The templates generally output all the information you may need for another template, so be sure to examine the Outputs tab after creating each stack.
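
As a minimal illustration of that layered approach (a Python/boto3 sketch, not part of the repository), the snippet below reads the Outputs of an already-created VPC stack and feeds them into the parameters of the next template; the stack names, template URL, and output/parameter keys are hypothetical.

    # Sketch: chain one layer's Outputs into the next layer's Parameters.
    import boto3

    cfn = boto3.client("cloudformation")

    # Read the Outputs of the already-created VPC stack (name is hypothetical).
    vpc_stack = cfn.describe_stacks(StackName="my-vpc")["Stacks"][0]
    outputs = {o["OutputKey"]: o["OutputValue"] for o in vpc_stack.get("Outputs", [])}

    # Feed those values into the next layer without editing the template itself.
    cfn.create_stack(
        StackName="my-app-layer",
        TemplateURL="https://s3.amazonaws.com/my-bucket/app-layer.yml",  # hypothetical
        Parameters=[
            {"ParameterKey": "VpcId", "ParameterValue": outputs["VPC"]},            # keys illustrative
            {"ParameterKey": "SubnetIds", "ParameterValue": outputs["PublicSubnets"]},
        ],
        Capabilities=["CAPABILITY_NAMED_IAM"],
    )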

phantom-lambda-template - The bare minimum for a phantomjs app to run on Amazon Lambda.

  •    JavaScript

This is a reference implementation of running PhantomJS on AWS Lambda, deployed with AWS CodePipeline. PhantomJS needs to be compiled for the OS it will run on, which can be painful. That could be worked around with a build server, but committing to a build server was hard to justify for a project this small. With AWS CodeBuild, this has become trivial, so the project uses AWS Developer Tools end to end. AWS published a great walkthrough, Automating Deployment of Lambda-based Applications; this repository does its best to automate that walkthrough so it is simple and repeatable.




ecs-blue-green-deployment - Reference architecture for doing blue green deployments on ECS.

  •    Python

This reference architecture accompanies the blog post on blue/green deployments on ECS. It creates a continuous delivery pipeline using AWS CloudFormation templates. The templates create resources using Amazon's Code* services to build and deploy containers onto an ECS cluster as long-running services. The architecture also includes a manual approval step, facilitated by a Lambda function that discovers and swaps target group rules between two target groups, promoting the green version to production and demoting the blue version to staging. This example uses the AWS Command Line Interface to run Step 3 below.
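
As a rough sketch of the swap idea (Python/boto3, not this repository's actual Lambda), the snippet below exchanges the default target groups of a "live" and a "staging" listener on one Application Load Balancer; the ARN and port numbers are hypothetical.

    # Sketch: promote green to production, demote blue to staging,
    # by swapping the target groups behind two ALB listeners.
    import boto3

    elbv2 = boto3.client("elbv2")

    def swap_target_groups(load_balancer_arn, live_port=80, staging_port=8080):
        listeners = elbv2.describe_listeners(LoadBalancerArn=load_balancer_arn)["Listeners"]
        by_port = {l["Port"]: l for l in listeners}
        live, staging = by_port[live_port], by_port[staging_port]

        live_tg = live["DefaultActions"][0]["TargetGroupArn"]
        staging_tg = staging["DefaultActions"][0]["TargetGroupArn"]

        # Point the live listener at the (green) staging target group, and vice versa.
        elbv2.modify_listener(ListenerArn=live["ListenerArn"],
                              DefaultActions=[{"Type": "forward", "TargetGroupArn": staging_tg}])
        elbv2.modify_listener(ListenerArn=staging["ListenerArn"],
                              DefaultActions=[{"Type": "forward", "TargetGroupArn": live_tg}])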

devops-essentials - Source code samples for DevOps Essentials on AWS Complete Video Course

  •    HTML

AWS CodePipeline (along with other AWS Developer Tools such as AWS CodeCommit, AWS CodeBuild, and AWS CodeDeploy) is a fully managed service for orchestrating continuous delivery. In DevOps Essentials on AWS Complete Video Course, you'll learn how to create continuous delivery pipelines using AWS services and tools. If you're a software or DevOps-focused engineer or architect interested in learning how to use AWS Developer Tools to create a full-lifecycle software delivery solution, it's the course for you. The focus of the course is on deployment pipeline architectures and their implementations rather than on software architectures. To launch the first solution (a static website hosted on S3), you'll need to specify a unique S3 bucket name for the website bucket that will be created, along with a GitHub token. Review and ensure you've configured the Prerequisites before launching the stack below.

buildpipeline - AWS-powered serverless build, test and deploy pipeline ft. multiple environments

  •    TypeScript

This project demonstrates an AWS-powered serverless build, test and deploy pipeline featuring multiple environments. The /src directory contains a React/TypeScript/Webpack-powered web app that is served from S3 with CloudFront as a CDN and Route 53 for DNS. The /infrastructure directory contains all infrastructure and deployment steps defined as code (Terraform and bash scripts). CodeBuild and CodePipeline take care of building, testing and deploying the project, and all build logs are stored in CloudWatch. CodePipeline accesses GitHub using an access token. When using CodeBuild to build, test and deploy each project, information about the build environment must be provided. A build environment represents a combination of operating system, programming language runtime, and tools that CodeBuild uses to build, test and deploy: in other words, a Docker image. I maintain build environments for the programming languages and tools I use frequently, e.g. docker-node-terraform-aws. The build commands and related settings must also be specified in a buildspec declaration (YAML format) stored at the root level of the project, e.g. buildspec-test.yml. Because a buildspec declaration must be valid YAML, its spacing is important; if the number of spaces is invalid, builds might fail immediately. A YAML validator can be used to test whether a buildspec declaration is valid YAML. See AWS CodeBuild Concepts and Build Phase Transitions for further information.
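
In that spirit, a minimal local check might look like the Python/PyYAML sketch below (not part of this TypeScript repository): it confirms a buildspec file parses as YAML before a build is triggered. The filename and key checks are only illustrative.

    # Sketch: fail fast if a buildspec file is not valid YAML.
    import sys
    import yaml

    def check_buildspec(path="buildspec-test.yml"):
        try:
            with open(path) as f:
                spec = yaml.safe_load(f) or {}
        except yaml.YAMLError as err:
            sys.exit(f"{path} is not valid YAML: {err}")
        # A buildspec is a mapping with at least a version and a phases section.
        for key in ("version", "phases"):
            if key not in spec:
                print(f"warning: {path} has no top-level '{key}' key")
        print(f"{path} parsed OK")

    if __name__ == "__main__":
        check_buildspec(sys.argv[1] if len(sys.argv) > 1 else "buildspec-test.yml")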


awsloft-terraform-ci - Digging into DevOps with AWS and Terraform

  •    HCL

During our session at the AWS Loft in Munich on November 22nd we presented “How to deploy AWS Infrastructure in a true DevOps fashion” using Terraform, AWS CodePipeline and AWS CodeBuild. As a best practice, Cloudreach suggests deploying infrastructure as code using AWS CloudFormation, the AWS CLI or our tool Sceptre. For the purpose of this article we focus on Terraform in order to evaluate a different tool.

ecs-pipeline - :cloud: :whale: :zap: :rocket: Create environment and deployment pipelines to ECS Fargate with CodePipeline, CodeBuild and Github using Terraform

  •    HCL

Edit the variables.tf file to customize application preferences such as the GitHub account, repo and owner, load balancer ports, and cluster preferences. Edit the GitHub preferences in the same file to specify details such as repo, owner or organization, and branches.

terraform-aws-ecs-codepipeline - Terraform Module for CI/CD with AWS Code Pipeline and Code Build for ECS https://cloudposse

  •    HCL

Terraform Module for CI/CD with AWS Code Pipeline using GitHub webhook triggers and Code Build for ECS. This project is part of our comprehensive "SweetOps" approach towards DevOps.

serverless-sinatra-sample - Demo code for running Ruby Sinatra on AWS Lambda

  •    Ruby

This sample code helps get you started with a simple Sinatra web app deployed on AWS Lambda. It is tested with Ruby 2.5.x and Bundler 1.17.x. These directions assume you already have Ruby 2.5.x and the AWS CLI installed and configured. Please fork the repo and create an access token if you want to create a CodePipeline to deploy the app; the pipeline-cfn.yaml template can be used to automate the process.

es2017-lambda-boilerplate - AWS Lambda boilerplate for Node

  •    JavaScript

Note: As of April 2018, AWS has announced support for Lambda functions on Node.js 8.10, which already supports all of the ES2016-ES2018 features added by this boilerplate (and more). You can, however, still use this boilerplate on Node.js 8.10 to make use of the unit testing features. This is a boilerplate for AWS Lambda Node.js 6.10.0 functions that allows you to use the latest JavaScript ES2016, ES2017 and ES2018 features. The boilerplate also lets you test your function in a Docker container (thanks to docker-lambda) and includes common configurations for CI/CD, for both Travis CI and AWS CodeBuild + AWS CloudFormation.

sagemaker-pipeline - Sagemaker pipeline for AWS Summit New York

  •    Python

This is a sample solution using a SageMaker pipeline. An implementation like this could be useful for any organization trying to automate its use of machine learning: inference is easy, since the model can simply be queried through an endpoint to receive its output; tests can be performed automatically for QA; and ML code can be quickly updated to match changing needs. The CloudFormation template uses AWS::CloudFormation::Interface to set parameter group metadata.
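
As a minimal sketch of the "query the endpoint" step (Python/boto3, not taken from this repository), the snippet below invokes a deployed SageMaker endpoint; the endpoint name and CSV payload are hypothetical and depend on the trained model.

    # Sketch: send one feature row to a deployed endpoint and read the prediction.
    import boto3

    runtime = boto3.client("sagemaker-runtime")

    response = runtime.invoke_endpoint(
        EndpointName="sagemaker-pipeline-endpoint",  # hypothetical endpoint name
        ContentType="text/csv",
        Body="5.1,3.5,1.4,0.2",  # payload format depends on the model
    )
    prediction = response["Body"].read().decode("utf-8")
    print(prediction)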

terraform-aws-cicd - Terraform Module for CI/CD with AWS Code Pipeline and Code Build

  •    HCL

The module supports two flows. GitHub -> S3 (build artifact) -> Elastic Beanstalk (running application stack): the module gets the code from a GitHub repository (public or private), builds it by executing the buildspec.yml file from the repository, pushes the built artifact to an S3 bucket, and deploys the artifact to Elastic Beanstalk running one of the supported stacks (e.g. Java, Go, Node, IIS, Python, Ruby). GitHub -> ECR (Docker image) -> Elastic Beanstalk (running Docker stack): the module gets the code from a GitHub repository, builds a Docker image from it by executing the buildspec.yml and Dockerfile from the repository, pushes the image to an ECR repository, and deploys it to Elastic Beanstalk running a Docker stack.

terraform-aws-jenkins - Terraform module to build Docker image with Jenkins, save it to an ECR repo, and deploy to Elastic Beanstalk running Docker stack

  •    HCL

terraform-aws-jenkins is a Terraform module to build a Docker image with Jenkins, save it to an ECR repo, and deploy it to Elastic Beanstalk running a Docker stack. This is an enterprise-ready, scalable and highly available architecture, and the CI/CD pattern used to build and deploy Jenkins.

codepipeline-datadog-events - Tool for monitoring AWS CodePipeline status and pushing events to Datadog, Slack and Cloudwatch

  •    TypeScript

Monitor your AWS CodePipelines with CloudWatch, Datadog and Slack. This project was created as an internal tool to get information about succeeded and failed steps in CodePipelines deployed across almost all regions available in Amazon Web Services. Because CloudWatch does not provide any metrics for this service, we had to create something of our own. The initial idea was to subscribe to CloudWatch Events and push everything to Datadog Events, but we then added an integration for Slack and a scheduled Lambda that pushes a custom metric to CloudWatch.
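
As a rough illustration of that idea (a Python sketch, not this repository's TypeScript code), the handler below receives a CodePipeline state-change event from CloudWatch Events and records it as a custom CloudWatch metric; the metric namespace and name are hypothetical.

    # Sketch: turn CodePipeline state-change events into a custom CloudWatch metric.
    import boto3

    cloudwatch = boto3.client("cloudwatch")

    def handler(event, context):
        detail = event.get("detail", {})
        pipeline = detail.get("pipeline", "unknown")
        state = detail.get("state", "UNKNOWN")  # e.g. SUCCEEDED or FAILED

        cloudwatch.put_metric_data(
            Namespace="Custom/CodePipeline",  # hypothetical namespace
            MetricData=[{
                "MetricName": "PipelineExecution",  # hypothetical metric name
                "Dimensions": [
                    {"Name": "Pipeline", "Value": pipeline},
                    {"Name": "State", "Value": state},
                ],
                "Value": 1,
                "Unit": "Count",
            }],
        )
        return {"pipeline": pipeline, "state": state}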

cfn-nag-pipeline - Lambda function to run cfn_nag in CodePipeline

  •    Ruby

A Lambda function to run cfn_nag as an action in CodePipeline. To install, navigate to the cfn-nag-pipeline application in the AWS Serverless Application Repository (SAR) console and click Deploy.





