Standalone Windows .EXE command line utility for Amazon S3 & EC2

A Windows command-line utility for Amazon's S3 & EC2 web services that requires no installation: it is a single .EXE file with no DLLs and requires only .NET 2.0 or Mono, so it will work on a plain-vanilla Windows 2003 installation.

http://s3.codeplex.com/

Related Projects

C# Library and Code for Amazon S3

  •    

An advanced C# library for interfacing with the Amazon S3 system. Among its powerful features are full support for data streaming (no need to load data into memory before sending to S3), data encryption, thread safety, and live statistics. Perform multiple simultaneous up...

s3cmd - Official s3cmd repo -- Command line tool for managing Amazon S3 and CloudFront services

  •    Python

S3cmd requires Python 2.6 or newer. Python 3+ is also supported starting with S3cmd version 2. S3cmd (s3cmd) is a free command line tool and client for uploading, retrieving and managing data in Amazon S3 and other cloud storage service providers that use the S3 protocol, such as Google Cloud Storage or DreamHost DreamObjects. It is best suited for power users who are familiar with command line programs. It is also ideal for batch scripts and automated backup to S3, triggered from cron, etc.
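
For the cron-driven backup use case the description mentions, here is a minimal Python sketch that wraps s3cmd for a scheduled job. It assumes s3cmd is installed and already configured (for example via s3cmd --configure); the local directory and bucket name below are hypothetical.

    import subprocess
    import sys

    # Hypothetical local directory and destination bucket; adjust to taste.
    LOCAL_DIR = "/var/backups/nightly/"
    BUCKET = "s3://example-backup-bucket/nightly/"

    def main():
        # `s3cmd sync` only uploads files that are new or changed, which
        # keeps repeated cron runs cheap.
        result = subprocess.run(
            ["s3cmd", "sync", "--no-progress", LOCAL_DIR, BUCKET],
            capture_output=True, text=True,
        )
        if result.returncode != 0:
            sys.stderr.write(result.stderr)
        sys.exit(result.returncode)

    if __name__ == "__main__":
        main()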

s3gof3r - Fast, concurrent, streaming access to Amazon S3, including gof3r, a CLI

  •    Go

s3gof3r provides fast, parallelized, pipelined streaming access to Amazon S3. It includes a command-line interface, gof3r. The project's benchmarks were performed on an m1.xlarge EC2 instance with a virtualized 1 Gigabit Ethernet interface; see Amazon EC2 Instance Details for more information.

aws - Easy command line access to Amazon EC2, S3, SQS, ELB, and SDB (new!)

  •    Perl

Easy command line access to Amazon EC2, S3, SQS, ELB, and SDB (new!)

Amazon S3 Library for Lucene.Net

  •    

A library with all the needed classes to run Lucene.Net off the cloud (on Amazon S3 storage).

Stratosphere

  •    

A Mono-compatible .NET/C# library with a set of primitives for working with table, queue, and block containers, with corresponding implementations for Amazon SimpleDB, SQS, and S3. It also includes local-machine (file system and SQLite) implementations to enable debugging and testing.

AWS for .NET Sample (Amazon EC2, S3, SQS, DynamoDB)

  •    

An Amazon Web Services (AWS) sample for .NET (C#) with ASP.NET MVC and Web API, including S3, DynamoDB, Elastic Beanstalk, and SQS.

lambda-refarch-imagerecognition - The Image Recognition and Processing Backend reference architecture demonstrates how to use AWS Step Functions to orchestrate a serverless processing workflow using AWS Lambda, Amazon S3, Amazon DynamoDB and Amazon Rekognition

  •    Javascript

The Image Recognition and Processing Backend demonstrates how to use AWS Step Functions (https://aws.amazon.com/step-functions/) to orchestrate a serverless processing workflow using AWS Lambda, Amazon S3, Amazon DynamoDB and Amazon Rekognition. This workflow processes photos uploaded to Amazon S3 and extracts metadata from the image such as geolocation, size/format, time, etc. It then uses image recognition to tag objects in the photo. In parallel, it also produces a thumbnail of the photo. This repository contains sample code for all the Lambda functions depicted in the repository's architecture diagram, as well as an AWS CloudFormation template for creating the functions and related resources. There is also a test web app that you can run locally to interact with the backend.
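
The object-tagging step relies on Amazon Rekognition. The reference architecture's Lambda functions are written in JavaScript; purely as an illustration, the following is a minimal Python sketch of the label-detection call using boto3, with hypothetical bucket and key names.

    import boto3

    # Illustrative only; the reference architecture implements this step in
    # JavaScript. The bucket and key used below are hypothetical.
    rekognition = boto3.client("rekognition", region_name="us-east-1")

    def detect_labels(bucket, key, max_labels=10, min_confidence=75.0):
        """Ask Rekognition which objects it can see in an image stored in S3."""
        response = rekognition.detect_labels(
            Image={"S3Object": {"Bucket": bucket, "Name": key}},
            MaxLabels=max_labels,
            MinConfidence=min_confidence,
        )
        return [(label["Name"], label["Confidence"]) for label in response["Labels"]]

    if __name__ == "__main__":
        for name, confidence in detect_labels("example-photo-bucket", "uploads/photo.jpg"):
            print(f"{name}: {confidence:.1f}%")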

serverless-image-resizing - CloudFormation template to resize images on-the-fly using Amazon API Gateway, AWS Lambda, and Amazon S3

  •    Javascript

Resizes images on the fly using Amazon S3, AWS Lambda, and Amazon API Gateway. Using a conventional URL structure and S3 static website hosting with redirection rules, requests for resized images are redirected to a Lambda function via API Gateway, which resizes the image, uploads it to S3, and redirects the requester to the resized image. The next request for the resized image is then served from S3 directly. Use the Amazon Linux Docker container image to build the package on your local system: this repo includes a Makefile that will download Amazon Linux, install Node.js and developer tools, and build the extensions using Docker. Run make all.
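
The Lambda function in this repo is written in Node.js; the following is only a rough Python sketch of the same resize-and-redirect idea, using boto3 and Pillow, with hypothetical bucket, key format, and environment variable names.

    import io
    import os

    import boto3
    from PIL import Image  # Pillow

    # Hypothetical configuration; the real repo implements this in Node.js.
    s3 = boto3.client("s3")
    BUCKET = os.environ.get("BUCKET", "example-image-bucket")
    URL_PREFIX = os.environ.get("URL", "http://example-image-bucket.s3-website-us-east-1.amazonaws.com")

    def handler(event, context):
        # Expect a key shaped like "300x200/path/to/image.jpg".
        key = event["queryStringParameters"]["key"]
        size, original_key = key.split("/", 1)
        width, height = (int(n) for n in size.split("x"))

        # Fetch the original, scale it down, and write the result back to S3
        # under the requested key so the next request is served directly.
        original = s3.get_object(Bucket=BUCKET, Key=original_key)["Body"].read()
        image = Image.open(io.BytesIO(original))
        image.thumbnail((width, height))
        buffer = io.BytesIO()
        image.save(buffer, format="JPEG")
        s3.put_object(Bucket=BUCKET, Key=key, Body=buffer.getvalue(),
                      ContentType="image/jpeg")

        # Redirect the caller to the newly created object.
        return {"statusCode": 301,
                "headers": {"Location": f"{URL_PREFIX}/{key}"},
                "body": ""}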

Amazon S3 Provider for the Microsoft Sync Framework

  •    

Allows files to be synchronised to and from an Amazon S3 bucket using the Microsoft Sync Framework 2.0. Developed in C# against the Sync Framework 2.0 RTM SDK.

boto3 - AWS SDK for Python

  •    Python

Boto3 is the Amazon Web Services (AWS) Software Development Kit (SDK) for Python, which allows Python developers to write software that makes use of services like Amazon S3 and Amazon EC2. You can find the latest, most up-to-date documentation at Read the Docs, including a list of services that are supported. To see only those features which have been released, check out the stable docs.
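
As a quick taste of the SDK, the snippet below uploads a file to S3, lists a bucket, and enumerates EC2 instances. It assumes AWS credentials are already configured (environment variables, ~/.aws/credentials, or an instance role); the bucket, file, and region names are placeholders.

    import boto3

    # Credentials are assumed to be configured already; names are placeholders.
    s3 = boto3.client("s3", region_name="us-east-1")
    ec2 = boto3.client("ec2", region_name="us-east-1")

    # S3: upload a file, then list the bucket's contents.
    s3.upload_file("report.csv", "example-bucket", "reports/report.csv")
    for obj in s3.list_objects_v2(Bucket="example-bucket").get("Contents", []):
        print(obj["Key"], obj["Size"])

    # EC2: print each instance ID and its current state.
    for reservation in ec2.describe_instances()["Reservations"]:
        for instance in reservation["Instances"]:
            print(instance["InstanceId"], instance["State"]["Name"])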

aws-cognito-angular-quickstart - An AngularV4-based QuickStart web app utilizing Amazon Cognito, S3, and DynamoDB (serverless architecture)

  •    TypeScript

This sample application can be deployed to either Elastic Beanstalk or S3. S3 will host this application as a static site, while Elastic Beanstalk gives you the capability of adding backend operations to the application. Note that createResources.sh requires your AWS CLI to be configured for JSON output.

yas3fs - YAS3FS (Yet Another S3-backed File System) is a Filesystem in Userspace (FUSE) interface to Amazon S3

  •    Python

YAS3FS (Yet Another S3-backed File System) is a Filesystem in Userspace (FUSE) interface to Amazon S3. It was inspired by s3fs but rewritten from scratch to implement a distributed cache synchronized by Amazon SNS notifications. A web console is provided to easily monitor the nodes of a cluster through the YAS3FS Console project. This is a personal project. No relation whatsoever exists between this project and my employer.
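
Because yas3fs exposes the bucket through FUSE, applications read and write it with ordinary file I/O once it is mounted; nothing S3-specific appears in the application code. The short sketch below assumes the bucket has already been mounted at a hypothetical mount point.

    import os

    # Hypothetical mount point; assumes the bucket has already been mounted
    # there with yas3fs. Once mounted, the bucket behaves like a directory,
    # so plain file operations go through the FUSE layer to S3.
    MOUNT = "/mnt/media"

    with open(os.path.join(MOUNT, "notes.txt"), "w") as f:
        f.write("stored in S3 via the yas3fs cache\n")

    print(os.listdir(MOUNT))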

Flajaxian S3 Amazon Service Uploader

  •    CSharp

Flajaxian S3 Amazon Service Uploader is an adapter for the Flajaxian FileUploader .NET web control, designed for asynchronous upload of multiple files at the same time to an Amazon S3 service, without a page postback and with a progress bar indicating the current upload progress...

DotNetNuke Amazon S3 Folder Integration

  •    

A set of providers (data and authorization) that provide Amazon S3 folder services to a DotNetNuke installation in an integrated and seamless manner. This package allows existing modules to utilize externally-persisted S3 files without any modification.

SQL Server S3 Backup

  •    

A SQL Server command-line backup tool that will perform full, differential, and transaction log backups, zip them up, and then upload them to an Amazon S3 bucket.

fake-s3 - A lightweight server clone of Amazon S3 that simulates most of the commands supported by S3 with minimal dependencies

  •    Ruby

Fake S3 is a lightweight server that responds to the same API as Amazon S3. It is extremely useful for testing S3 integrations in a sandbox environment without actually making calls to Amazon, which not only requires a network connection but also costs money with every use.
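
A typical test setup points an ordinary S3 client at the local Fake S3 endpoint instead of Amazon. The sketch below does that with boto3; the port and dummy credentials are assumptions and should match however you started the server.

    import boto3

    # Assumes a Fake S3 server is already running locally, e.g. on port 4567.
    # The endpoint, region, and dummy credentials here are assumptions.
    s3 = boto3.client(
        "s3",
        endpoint_url="http://localhost:4567",
        region_name="us-east-1",
        aws_access_key_id="fake",
        aws_secret_access_key="fake",
    )

    # These calls hit the local server rather than Amazon, so tests need no
    # network access and incur no cost.
    s3.create_bucket(Bucket="test-bucket")
    s3.put_object(Bucket="test-bucket", Key="hello.txt", Body=b"hello")
    print(s3.get_object(Bucket="test-bucket", Key="hello.txt")["Body"].read())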