node-s3-client - high level amazon s3 client for node.js


See also the companion CLI tool, s3-cli, which is meant to be a drop-in replacement for s3cmd. This module contains a reference to the aws-sdk module; it is a valid use case to use both this module and the lower-level aws-sdk module in tandem.
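A minimal upload sketch based on the client's documented API; the credentials, bucket, and file names below are placeholders:

```js
var s3 = require('s3');

// Credentials and names here are placeholders.
var client = s3.createClient({
  maxAsyncS3: 20,
  s3Options: {
    accessKeyId: 'YOUR_KEY',
    secretAccessKey: 'YOUR_SECRET',
  },
});

var uploader = client.uploadFile({
  localFile: 'some/local/file',
  s3Params: {
    Bucket: 'my-bucket',
    Key: 'some/remote/file',
  },
});

uploader.on('error', function (err) {
  console.error('unable to upload:', err.stack);
});
uploader.on('progress', function () {
  console.log('progress', uploader.progressAmount, uploader.progressTotal);
});
uploader.on('end', function () {
  console.log('done uploading');
});
```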

https://github.com/andrewrk/node-s3-client

Dependencies:

aws-sdk : ~2.4.9
fd-slicer : ~1.0.0
findit2 : ~2.2.3
graceful-fs : ~4.1.4
mime : ~1.2.11
mkdirp : ~0.5.0
pend : ~1.2.0
rimraf : ~2.2.8
streamsink : ~1.2.0


Related Projects

s3-upload-stream - A Node.js module for streaming data to Amazon S3 via the multipart upload API

  •    Javascript

A pipeable write stream which uploads to Amazon S3 using the multipart file upload API. NOTE: This module is deprecated after the 2.1.0 release of the AWS SDK on Dec 9, 2014, which added S3.upload(). I highly recommend switching away from this module and using the official method supported by AWS.
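For reference, the officially supported replacement looks roughly like this; the bucket and key names are placeholders and credentials are assumed to be configured in the environment:

```js
var AWS = require('aws-sdk');
var fs = require('fs');

var s3 = new AWS.S3();
// S3.upload() accepts a readable stream as Body and handles multipart uploads internally.
var body = fs.createReadStream('big-file.zip');

s3.upload({ Bucket: 'my-bucket', Key: 'uploads/big-file.zip', Body: body }, function (err, data) {
  if (err) return console.error('upload failed:', err);
  console.log('uploaded to', data.Location);
});
```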

lambda-refarch-imagerecognition - The Image Recognition and Processing Backend reference architecture demonstrates how to use AWS Step Functions to orchestrate a serverless processing workflow using AWS Lambda, Amazon S3, Amazon DynamoDB and Amazon Rekognition

  •    Javascript

The Image Recognition and Processing Backend demonstrates how to use AWS Step Functions (https://aws.amazon.com/step-functions/) to orchestrate a serverless processing workflow using AWS Lambda, Amazon S3, Amazon DynamoDB and Amazon Rekognition. This workflow processes photos uploaded to Amazon S3 and extracts metadata from the image such as geolocation, size/format, time, etc. It then uses image recognition to tag objects in the photo. In parallel, it also produces a thumbnail of the photo. This repository contains sample code for all the Lambda functions in the workflow as well as an AWS CloudFormation template for creating the functions and related resources. There is also a test web app that you can run locally to interact with the backend.

s3-parallel-put - Parallel uploads to Amazon AWS S3

  •    Python

s3-parallel-put speeds the uploading of many small keys to Amazon AWS S3 by executing multiple PUTs in parallel. The program reads your credentials from the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.

Amazon S3 Provider for the Microsoft Sync Framework

  •    C#

Allows files to be synchronised to and from an Amazon S3 bucket using the Microsoft Sync Framework 2.0. Developed in C# against the Sync Framework 2.0 RTM SDK.


s3-plugin-webpack - Uploads files to S3 after the build completes

  •    Javascript

I notice a lot of people are setting the directory option when the files are part of their build. Please don't set directory if you're uploading your build. Using the directory option reads the files after compilation to upload instead of from the build process. include and exclude rules behave similarly to Webpack's loader options. In addition to a RegExp you can pass a function which will be called with the path as its first argument. Returning a truthy value will match the rule. You can also pass an Array of rules, all of which must pass for the file to be included or excluded.
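A sketch of how those rules might be wired into a webpack config, assuming the plugin is installed from npm as webpack-s3-plugin; the bucket name and rules are illustrative:

```js
var S3Plugin = require('webpack-s3-plugin');

module.exports = {
  // ...the rest of your webpack config...
  plugins: [
    new S3Plugin({
      // include/exclude accept a RegExp, a function receiving the path, or an array of such rules
      include: /.*\.(js|css)$/,
      exclude: function (path) { return path.indexOf('.map') !== -1; },
      s3Options: { region: 'us-east-1' },
      s3UploadOptions: { Bucket: 'my-assets-bucket' },
    }),
  ],
};
```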

multer-s3 - multer storage engine for amazon s3

  •    Javascript

Streaming multer storage engine for AWS S3. This project is mostly an integration piece for existing code samples from Multer's storage engine documentation, with s3fs substituted for the file system. Existing solutions I found required buffering the multipart uploads into the actual filesystem, which is difficult to scale.
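A minimal Express setup following the pattern from the module's documentation; the bucket name and key logic are placeholders:

```js
var aws = require('aws-sdk');
var express = require('express');
var multer = require('multer');
var multerS3 = require('multer-s3');

var s3 = new aws.S3();
var app = express();

var upload = multer({
  storage: multerS3({
    s3: s3,
    bucket: 'some-bucket',
    metadata: function (req, file, cb) { cb(null, { fieldName: file.fieldname }); },
    // Object key is generated per upload; here just a timestamp as a placeholder.
    key: function (req, file, cb) { cb(null, Date.now().toString()); },
  }),
});

app.post('/upload', upload.array('photos', 3), function (req, res) {
  res.send('Successfully uploaded ' + req.files.length + ' files!');
});
```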

DotNetNuke Amazon S3 Folder Integration

  •    

A set of providers (data and authorization) that provide Amazon S3 folder services to a DotNetNuke installation in an integrated and seamless manner. This package allows existing modules to utilize externally-persisted S3 files without any modification.

S3 - A simple helper for easily uploading files to Amazon S3 from Meteor

  •    CoffeeScript

S3 provides a simple way to upload files to the Amazon S3 service with a progress bar. This is useful for uploading images and files that you want accessible to the public. S3 is built on Knox and AWS-SDK. Both modules are made available on the server after installing this package. S3 now uploads directly from the client to Amazon. Client files will not touch your server.

bucket-stream - Find interesting Amazon S3 Buckets by watching certificate transparency logs.

  •    Python

Find interesting Amazon S3 Buckets by watching certificate transparency logs. This tool simply listens to various certificate transparency logs (via certstream) and attempts to find public S3 buckets from permutations of the certificates domain name.

Quickwit - Fast and cost-efficient distributed search engine for immutable data

  •    Rust

Quickwit is a distributed search engine built from the ground up to offer cost-efficiency and high reliability. Built by mere mortals for mere mortals, Quickwit's architecture is kept as simple as possible. Quickwit is written in Rust and built on top of the mighty tantivy library. It is designed to index big datasets straight from object storage like AWS S3 in a stateless manner.

Minio - Open source object storage server compatible with Amazon S3 APIs

  •    Go

Minio is an object storage server compatible with the Amazon S3 cloud storage service. It is best suited for storing unstructured data such as photos, videos, log files, backups and container/VM images. The size of an object can range from a few KBs to a maximum of 5 TB. The Minio server is light enough to be bundled with the application stack, similar to NodeJS, Redis and MySQL.

cloudserver - Zenko CloudServer, an open-source Node

  •    Javascript

CloudServer (formerly S3 Server) is an open-source Amazon S3-compatible object storage server that is part of Zenko, Scality's Open Source Multi-Cloud Data Controller. CloudServer provides a single AWS S3 API interface to access multiple backend data stores, both on-premise and in the public cloud.

serverless-image-resizing - CloudFormation template to resize images on-the-fly using Amazon API Gateway, AWS Lambda, and Amazon S3

  •    Javascript

Resizes images on the fly using Amazon S3, AWS Lambda, and Amazon API Gateway. Using a conventional URL structure and S3 static website hosting with redirection rules, requests for resized images are redirected to a Lambda function via API Gateway which will resize the image, upload it to S3, and redirect the requestor to the resized image. The next request for the resized image will be served from S3 directly. Use the Amazon Linux Docker container image to build the package on your local system. This repo includes a Makefile that will download Amazon Linux, install Node.js and developer tools, and build the extensions using Docker. Run make all.
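A simplified sketch of the Lambda side of that flow, assuming the image processing is done with the sharp library and that BUCKET and URL are supplied as environment variables (both names are illustrative):

```js
const AWS = require('aws-sdk');
const sharp = require('sharp');

const s3 = new AWS.S3();

exports.handler = (event, context, callback) => {
  // Expects a key like "300x300/photo.jpg" passed through from the redirect.
  const key = event.queryStringParameters.key;
  const match = key.match(/(\d+)x(\d+)\/(.*)/);
  const width = parseInt(match[1], 10);
  const height = parseInt(match[2], 10);
  const originalKey = match[3];

  s3.getObject({ Bucket: process.env.BUCKET, Key: originalKey }).promise()
    .then(data => sharp(data.Body).resize(width, height).toFormat('png').toBuffer())
    .then(buffer => s3.putObject({
      Body: buffer,
      Bucket: process.env.BUCKET,
      ContentType: 'image/png',
      Key: key,
    }).promise())
    // Redirect the client to the freshly created object so future requests hit S3 directly.
    .then(() => callback(null, {
      statusCode: '301',
      headers: { location: `${process.env.URL}/${key}` },
      body: '',
    }))
    .catch(err => callback(err));
};
```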

s3cmd - Official s3cmd repo -- Command line tool for managing Amazon S3 and CloudFront services

  •    Python

S3cmd requires Python 2.6 or newer. Python 3+ is also supported starting with S3cmd version 2. S3cmd (s3cmd) is a free command line tool and client for uploading, retrieving and managing data in Amazon S3 and other cloud storage service providers that use the S3 protocol, such as Google Cloud Storage or DreamHost DreamObjects. It is best suited for power users who are familiar with command line programs. It is also ideal for batch scripts and automated backup to S3, triggered from cron, etc.

S3-Uploads - The WordPress Plugin to Store Uploads on Amazon S3

  •    PHP

S3-Uploads is a WordPress plugin for storing uploads on Amazon S3 instead of the local filesystem, aiming to be a lightweight "drop-in" solution. It's focused on providing a highly robust S3 interface with no "bells and whistles", WP-Admin UI or much else. It comes with some helpful WP-CLI commands for generating IAM users, listing files on S3 and migrating your existing library to S3.

C# Library and Code for Amazon S3

  •    C#

An advanced C# library for interfacing with the Amazon S3 system. Among its powerful features are: full support for data streaming (no need to load data into memory before sending to S3); data encryption; thread safety and live statistics. Perform multiple simultaneous up...

S3 - Node.js implementation of a server handling the Amazon S3 protocol

  •    Javascript

CloudServer (formerly S3 Server) is an open-source Amazon S3-compatible object storage server that is part of Zenko, Scality's Open Source Multi-Cloud Data Controller. CloudServer provides a single AWS S3 API interface to access multiple backend data stores, both on-premise and in the public cloud.

fake-s3 - A lightweight server clone of Amazon S3 that simulates most of the commands supported by S3 with minimal dependencies

  •    Ruby

Fake S3 is a lightweight server that responds to the same API of Amazon S3. It is extremely useful for testing of S3 in a sandbox environment without actually making calls to Amazon, which not only requires a network connection, but also costs money with every use.
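A sketch of pointing the AWS SDK for JavaScript at a local Fake S3 instance, assuming the server was started on port 4567 (for example with `fakes3 -r /tmp/fakes3_root -p 4567`); the credentials and bucket name are placeholders:

```js
var AWS = require('aws-sdk');

var s3 = new AWS.S3({
  accessKeyId: 'fake',
  secretAccessKey: 'fake',
  endpoint: 'http://localhost:4567',
  s3ForcePathStyle: true,   // use path-style URLs so bucket names aren't treated as subdomains
  sslEnabled: false,
});

s3.createBucket({ Bucket: 'test-bucket' }, function (err) {
  if (err) return console.error(err);
  s3.putObject({ Bucket: 'test-bucket', Key: 'hello.txt', Body: 'hello' }, console.log);
});
```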





