S3tools - Python command-line ftp and rsync-like clients for Amazon S3 storage service

S3cmd is a free command line tool and client for uploading, retrieving and managing data in Amazon S3 and other cloud storage service providers that use the S3 protocol, such as Google Cloud Storage or DreamHost DreamObjects. It is best suited for power users who are familiar with command line programs. It is also ideal for batch scripts and automated backup to S3, triggered from cron, etc.
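For instance, a nightly backup job might wrap s3cmd in a small script invoked from cron; the sketch below (Python, via subprocess) uses a hypothetical bucket name and source directory.

    #!/usr/bin/env python
    # Minimal sketch of a cron-driven backup with s3cmd; the bucket name and
    # source directory below are hypothetical placeholders.
    import subprocess

    SOURCE_DIR = "/var/backups/"            # local directory to back up (placeholder)
    DEST_BUCKET = "s3://example-backups/"   # destination bucket (placeholder)

    # "s3cmd sync" uploads only new or changed files, rsync-style;
    # --delete-removed also deletes remote copies of files removed locally.
    subprocess.run(["s3cmd", "sync", "--delete-removed", SOURCE_DIR, DEST_BUCKET],
                   check=True)

Saved as an executable script, this can be scheduled from a crontab entry so the backup runs unattended.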

https://s3tools.org/s3cmd
https://github.com/s3tools/s3cmd
http://sourceforge.net/projects/s3tools
http://code.google.com/p/s3tools

Related Projects

s3cmd - Official s3cmd repo -- Command line tool for managing Amazon S3 and CloudFront services

  •    Python

S3cmd requires Python 2.6 or newer. Python 3+ is also supported starting with S3cmd version 2.

Standalone Windows .EXE command line utility for Amazon S3 & EC2

  •    

A Windows command-line utility for Amazon's S3 & EC2 web services that requires no installation, ships as a single .EXE file with no DLLs, and requires only .NET 2.0 or Mono, so it will work on a plain-vanilla Windows 2003 installation.

S3 - A simple helper for easily uploading files to Amazon S3 from Meteor

  •    CoffeeScript

S3 provides a simple way to upload files to the Amazon S3 service with a progress bar. This is useful for uploading images and files that you want accessible to the public. S3 is built on Knox and AWS-SDK; both modules are made available on the server after installing this package. S3 now uploads directly from the client to Amazon, so client files never touch your server.

aws - Easy command line access to Amazon EC2, S3, SQS, ELB, and SDB (new!)

  •    Perl

Easy command line access to Amazon EC2, S3, SQS, ELB, and SDB (new!)

SQL Server S3 Backup

  •    

SQL Server command line backup tool that will perform full, differential and transaction log backups, zip them up and then upload to an Amazon S3 bucket.

s3gof3r - Fast, concurrent, streaming access to Amazon S3, including gof3r, a CLI

  •    Go

s3gof3r provides fast, parallelized, pipelined streaming access to Amazon S3 and includes a command-line interface, gof3r. The project's published benchmarks were run on an m1.xlarge EC2 instance with a virtualized 1 Gigabit Ethernet interface; see Amazon EC2 Instance Details for more information.

mc - Minio Client is a replacement for ls, cp, mkdir, diff and rsync commands for filesystems and object storage

  •    Go

Minio Client (mc) provides a modern alternative to UNIX commands like ls, cat, cp, mirror, diff, find, etc. It supports filesystems and Amazon S3-compatible cloud storage services (AWS Signature v2 and v4). Cloud storage endpoints are configured with the mc config command.

s4cmd - Super S3 command line tool

  •    Python

S4cmd is a command-line utility for accessing Amazon S3, inspired by s3cmd. We have used s3cmd heavily for a number of scripted, data-intensive applications. However, as the need for a variety of small improvements arose, we created our own implementation, s4cmd. It is intended as an alternative to s3cmd with enhanced performance and better handling of large files, along with a number of additional features and fixes that we have found useful.

transfer.sh - Easy and fast file sharing from the command-line

  •    Go

Easy and fast file sharing from the command line. This repository contains the server with everything you need to create your own instance. Transfer.sh currently supports the s3 (Amazon S3) and gdrive (Google Drive) providers, as well as the local file system (local).
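As a rough illustration of the upload interface, a file can be shared with a single HTTP PUT; the sketch below uses Python's requests library and assumes a self-hosted instance at a placeholder URL.

    # Minimal sketch: share a file through a transfer.sh instance with one HTTP PUT.
    # The instance URL is a placeholder for your own deployment.
    import requests

    INSTANCE = "https://transfer.example.com"   # hypothetical self-hosted instance

    with open("report.pdf", "rb") as f:
        resp = requests.put(f"{INSTANCE}/report.pdf", data=f)

    resp.raise_for_status()
    print(resp.text.strip())   # the server replies with the download URL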

node-s3-client - high level amazon s3 client for node.js

  •    Javascript

See also the companion CLI tool, s3-cli, which is meant to be a drop-in replacement for s3cmd. This module contains a reference to the aws-sdk module; it is a valid use case to use both this module and the lower-level aws-sdk module in tandem.

S3 tools

  •    PHP

Open-source tools to access Amazon S3 file storage. Subprojects: s3cmd, unix-like tools to manipulate stored files from the command line; s3browser, a PHP interface for viewing stored files in a browser; and s3fuse, a driver to mount the S3 storage locally.

megacmd - A command-line client for mega.co.nz storage service

  •    Go

I'm (@ncw) doing light maintenance on megacmd, but mainly on the underlying go-mega library, which I'm using to provide a Mega backend for rclone. Mega (mega.co.nz) is an excellent free storage service which provides 50 GB of free storage space and has a web-based user interface to upload and download files. Megacmd is a command-line tool for transferring files and directories between local directories and the Mega service. Its features are similar to those of the s3cmd utility, which is used to transfer files to Amazon S3.

serverless-image-resizing - CloudFormation template to resize images on-the-fly using Amazon API Gateway, AWS Lambda, and Amazon S3

  •    Javascript

Resizes images on the fly using Amazon S3, AWS Lambda, and Amazon API Gateway. Using a conventional URL structure and S3 static website hosting with redirection rules, requests for resized images are redirected to a Lambda function via API Gateway, which resizes the image, uploads it to S3, and redirects the requester to the resized image. The next request for the resized image is then served directly from S3. Use the Amazon Linux Docker container image to build the package on your local system; the repo includes a Makefile that downloads Amazon Linux, installs Node.js and developer tools, and builds the extensions using Docker. Run make all.
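As a rough sketch of the redirection piece (the repository's CloudFormation template configures this for you), the S3 website configuration below sends requests for missing (404) objects to a resize endpoint; the bucket name, API Gateway hostname, and key prefix are hypothetical placeholders.

    # Sketch of the S3 static-website routing rule used by this kind of architecture:
    # when a resized image does not yet exist (404), the request is redirected
    # to the resize API. Bucket, hostname, and prefix are hypothetical placeholders.
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_website(
        Bucket="example-resized-images",
        WebsiteConfiguration={
            "IndexDocument": {"Suffix": "index.html"},
            "RoutingRules": [
                {
                    "Condition": {"HttpErrorCodeReturnedEquals": "404"},
                    "Redirect": {
                        "Protocol": "https",
                        "HostName": "abc123.execute-api.us-east-1.amazonaws.com",
                        "ReplaceKeyPrefixWith": "prod/resize?key=",
                        "HttpRedirectCode": "307",
                    },
                }
            ],
        },
    )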

S3-Uploads - The WordPress Plugin to Store Uploads on Amazon S3

  •    PHP

S3-Uploads is a WordPress plugin for storing uploads on Amazon S3. It aims to be a lightweight "drop-in" for storing uploads on Amazon S3 instead of the local filesystem. It's focused on providing a highly robust S3 interface with no "bells and whistles", WP-Admin UI, or much else. It comes with some helpful WP-CLI commands for generating IAM users, listing files on S3, and migrating your existing library to S3.

C# Library and Code for Amazon S3

  •    

An advanced C# library for interfacing with the Amazon S3 system. Among its powerful features: full support for data streaming (no need to load data into memory before sending to S3), data encryption, thread safety and live statistics, and multiple simultaneous uploads.

S3 - Node.js implementation of a server handling the Amazon S3 protocol

  •    Javascript

CloudServer (formerly S3 Server) is an open-source Amazon S3-compatible object storage server that is part of Zenko, Scality's Open Source Multi-Cloud Data Controller. CloudServer provides a single AWS S3 API interface to access multiple backend data stores, whether on-premises or in the public cloud.

fake-s3 - A lightweight server clone of Amazon S3 that simulates most of the commands supported by S3 with minimal dependencies

  •    Ruby

Fake S3 is a lightweight server that responds to the same API as Amazon S3. It is extremely useful for testing S3 code in a sandbox environment without actually making calls to Amazon, which not only requires a network connection but also costs money with every use.
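For example, test code can be pointed at a local Fake S3 process instead of AWS by overriding the endpoint; the sketch below uses boto3 and assumes the server is listening on localhost:4567 (the port shown in Fake S3's examples), with dummy credentials since nothing is verified.

    # Sketch: exercise S3 code against a local Fake S3 instance instead of AWS.
    # Port and credentials are placeholders; Fake S3 does not verify credentials.
    import boto3
    from botocore.client import Config

    s3 = boto3.client(
        "s3",
        endpoint_url="http://localhost:4567",
        aws_access_key_id="fake",
        aws_secret_access_key="fake",
        region_name="us-east-1",
        config=Config(s3={"addressing_style": "path"}),  # avoid bucket.localhost DNS lookups
    )

    s3.create_bucket(Bucket="test-bucket")
    s3.put_object(Bucket="test-bucket", Key="hello.txt", Body=b"hello from the sandbox")
    print(s3.get_object(Bucket="test-bucket", Key="hello.txt")["Body"].read())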

s3-parallel-put - Parallel uploads to Amazon AWS S3

  •    Python

s3-parallel-put speeds the uploading of many small keys to Amazon AWS S3 by executing multiple PUTs in parallel. The program reads your credentials from the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.

lambda-refarch-imagerecognition - The Image Recognition and Processing Backend reference architecture demonstrates how to use AWS Step Functions to orchestrate a serverless processing workflow using AWS Lambda, Amazon S3, Amazon DynamoDB and Amazon Rekognition

  •    Javascript

The Image Recognition and Processing Backend demonstrates how to use AWS Step Functions (https://aws.amazon.com/step-functions/) to orchestrate a serverless processing workflow using AWS Lambda, Amazon S3, Amazon DynamoDB and Amazon Rekognition. This workflow processes photos uploaded to Amazon S3 and extracts metadata from each image, such as geolocation, size/format and time, then uses image recognition to tag objects in the photo; in parallel, it also produces a thumbnail of the photo. The repository contains sample code for all the Lambda functions in the project's architecture diagram, as well as an AWS CloudFormation template for creating the functions and related resources. There is also a test web app that you can run locally to interact with the backend.