through2 - Tiny wrapper around Node streams2 Transform to avoid explicit subclassing noise

  •    Javascript

Inspired by Dominic Tarr's through, in that it's so much easier to make a stream out of a function than it is to set up the prototype chain properly: through(function (chunk) { ... }). Note: as of 2.x.x this module uses Streams3 instead of Streams2. To continue using a Streams2 version, run npm install through2@0 to fetch the latest 0.x.x release. For more information about Streams2 vs Streams3 and related recommendations, see the article "Why I don't use Node's core 'stream' module".
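
A minimal usage sketch (assuming npm install through2); the callback-style transform follows the pattern described above, and the chunk handling here is purely illustrative:

    var through2 = require('through2')

    // Upper-case every chunk that flows through, then signal completion
    var upperCase = through2(function (chunk, enc, callback) {
      this.push(chunk.toString().toUpperCase())
      callback()
    })

    process.stdin.pipe(upperCase).pipe(process.stdout)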

object-stream-map - Perform a map on a stream of objects

  •    Javascript

Perform a map on a stream of objects. Let's say you are consuming an array of objects in the form of a stream. This node module lets you return a named property for each object in the stream, in effect performing a map over the entire array.
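
The module's own API is not quoted above, so the following is only a sketch of the same idea using Node's core stream.Transform in object mode; the pluck helper and the property name are hypothetical:

    var Transform = require('stream').Transform

    // Map each object in the stream to one named property of that object
    function pluck (property) {
      return new Transform({
        objectMode: true,
        transform: function (obj, enc, callback) {
          callback(null, obj[property])
        }
      })
    }

    // someObjectStream.pipe(pluck('name')).pipe(destination)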

stream-template - An ES6 tagged template literal tag that can interpolate Node.JS streams

  •    Javascript

An ES6/ES2015 tagged template literal tag that can interpolate Node.JS streams, strings and Promises resolving to either (or arrays of any of those), producing a Node.JS stream. This allows you to join several streams together, with bits in between, without having to buffer anything. Written by Thomas Parslow (almostobsolete.net and tomparslow.co.uk) for IORad (iorad.com) and released with their kind permission.
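
A hedged usage sketch; the file names are hypothetical, but the tagged-template call and stream interpolation follow the description above:

    var StreamTemplate = require('stream-template')
    var fs = require('fs')

    var header = fs.createReadStream('header.html')
    var body = fs.createReadStream('body.html')

    // Interpolate two file streams into one output stream without buffering
    var page = StreamTemplate`<html>${header}<hr>${body}</html>`
    page.pipe(process.stdout)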

through2-concurrent - Simple Node.JS Transform stream that processes chunks concurrently

  •    Javascript

A simple way to create a Node.JS Transform stream which processes in parallel. You can limit the concurrency (the default is 16), and order is not preserved (so chunks/objects can end up in a different order from the one they started in if the transform functions take different amounts of time). Built using through2, it has the same API with the addition of a maxConcurrency option.
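
A hedged sketch of the through2-style API with the maxConcurrency option mentioned above; lookupRecord is a hypothetical async helper:

    var through2Concurrent = require('through2-concurrent')

    // Process up to 8 objects at a time; output order is not preserved
    var enrich = through2Concurrent.obj(
      { maxConcurrency: 8 },
      function (record, enc, callback) {
        lookupRecord(record.id, function (err, details) {
          callback(err, details)
        })
      })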

tar-stream - tar-stream is a streaming tar parser and generator.

  •    Javascript

tar-stream is a streaming tar parser and generator and nothing else. It is streams2 and operates purely on streams, which means you can easily extract/parse tarballs without ever hitting the file system. Note that you still need to gunzip your data if you have a .tar.gz. We recommend using gunzip-maybe in conjunction with this.
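
A hedged extraction sketch: gunzip first (as the note above suggests; gunzip-maybe could stand in for zlib here), then stream each tar entry without touching the file system:

    var tar = require('tar-stream')
    var zlib = require('zlib')
    var fs = require('fs')

    var extract = tar.extract()

    extract.on('entry', function (header, stream, next) {
      console.log('entry:', header.name, header.size)
      stream.on('end', next)
      stream.resume() // drain the entry; a real consumer would pipe it somewhere
    })

    fs.createReadStream('archive.tar.gz')
      .pipe(zlib.createGunzip())
      .pipe(extract)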

content-addressable-blob-store - Streamable content addressable blob object store that is streams2 and implements the blob store interface

  •    Javascript

Streamable content addressable blob object store that is streams2 and implements the blob store interface on top of the fs module. Conforms to the abstract-blob-store API and passes its test suite.
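
A sketch against the abstract-blob-store interface mentioned above; the constructor option name is an assumption:

    var blobs = require('content-addressable-blob-store')

    var store = blobs({ path: './blobs' }) // directory option is assumed

    // Write a blob; the resulting key is the content hash
    var ws = store.createWriteStream(function (err, metadata) {
      if (err) throw err
      store.createReadStream(metadata.key).pipe(process.stdout)
    })

    ws.end('hello world')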

duplexify - Turn a writable and readable stream into a streams2 duplex stream with support for async initialization and streams1/streams2 input

  •    Javascript

Turn a writable and readable stream into a single streams2 duplex stream. If you call setReadable or setWritable multiple times it will unregister the previous readable/writable stream. To disable the readable or writable part, call setReadable or setWritable with null.
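
A minimal sketch of the async-initialization case: hand out the duplex stream immediately and attach the real ends later with setWritable/setReadable:

    var duplexify = require('duplexify')

    var dup = duplexify()

    setTimeout(function () {
      dup.setWritable(process.stdout) // writes to dup end up here
      dup.setReadable(process.stdin)  // reads from dup come from here
    }, 100)

    dup.write('hello\n') // buffered until the writable end is attached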

size-limit-stream - a through stream that destroys itself if an overall size limit for the combined stream throughput is exceeded

  •    Javascript

A through stream that destroys itself if an overall size limit for the combined stream throughput is exceeded. Useful for e.g. limiting HTTP upload size.
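
A hedged sketch of the HTTP upload use case; the factory signature (a byte limit argument) and the error behaviour on destroy are assumptions:

    var sizeLimitStream = require('size-limit-stream')
    var fs = require('fs')
    var http = require('http')

    http.createServer(function (req, res) {
      var limited = sizeLimitStream(1024 * 1024) // assumed: 1 MB limit
      limited.on('error', function () {          // assumed: destroy surfaces as an error
        res.statusCode = 413
        res.end('upload too large')
      })
      req.pipe(limited).pipe(fs.createWriteStream('/tmp/upload'))
    }).listen(8080)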

bellhop - A node.js module that exposes streams for doing Pubsub and RPC.

  •    Javascript

A node.js module that exposes streams for doing Pubsub and RPC. Benchmark code can be found in bench/.

block-stream2 - transform input into equally-sized chunks as output

  •    Javascript

Create a new transform stream b that outputs chunks of length size or opts.size. When opts.zeroPadding is false, the last chunk is not zero-padded.
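
A short sketch of re-chunking a byte stream into fixed-size blocks using the size and zeroPadding options described above; the input file name is hypothetical:

    var block = require('block-stream2')
    var fs = require('fs')

    // Emit 512-byte chunks; leave the final partial chunk un-padded
    fs.createReadStream('input.bin')
      .pipe(block({ size: 512, zeroPadding: false }))
      .pipe(process.stdout)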

csv2 - A Node Streams2 CSV parser

  •    Javascript

Will parse an input character stream and pass on an array for each line of CSV data. The main "feature" not currently supported is newlines within quoted strings; newlines are treated strictly as row separators.
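
A hedged sketch: pipe a character stream in and receive one array per CSV line, as described above:

    var csv2 = require('csv2')

    process.stdin
      .pipe(csv2())
      .on('data', function (row) {
        console.log(row) // e.g. [ 'alice', '42' ]
      })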

readable-wrap - upgrade streams1 to streams2 streams as a standalone module

  •    Javascript

This module provides a wrap function based on Readable().wrap() from node core, but as a standalone module. Use this module if you don't want to wait for a patch to land in node core that fixes falsey objectMode values in wrapped readable streams.
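
A sketch assuming the module exports a single wrap function, per the description above; the streams1 source is hypothetical:

    var wrap = require('readable-wrap')

    var legacy = getSomeStreams1Source() // hypothetical old-style streams1 emitter
    var modern = wrap(legacy)            // now a proper streams2 readable

    modern.on('readable', function () {
      var chunk
      while ((chunk = modern.read()) !== null) {
        console.log('got %d bytes', chunk.length)
      }
    })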

sculpt - Manipulate streams.

  •    Javascript

A collection of Node.js transform stream utilities for simple data manipulation. Install with npm install sculpt --save.

node-stream-sink - Collect all data piped to this stream when it closes

  •    Javascript

reduplexer - reduplexer(writable, readable, options)

  •    Javascript

Takes a Writable stream and a Readable stream and makes them appear as a single Duplex stream. Heavily inspired by duplexer, but using Streams2 via a bundled readable-stream. It is assumed that the two streams are connected to each other in some way.
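
A sketch following the reduplexer(writable, readable, options) signature in the title: wrap a child process's stdin and stdout, two streams that are already connected to each other, as one Duplex:

    var reduplexer = require('reduplexer')
    var spawn = require('child_process').spawn

    var child = spawn('cat')
    var duplex = reduplexer(child.stdin, child.stdout)

    duplex.write('hello\n')
    duplex.pipe(process.stdout)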

digest-stream - Simple node.js stream that calculates a crypto digest and length of the data passing through

  •    Javascript

Provide the digest algorithm, optional input encoding, digest encoding, and a listener function when you construct the stream. The listener will be called with the resultant digest and the length of the stream just before end is emitted. Since this uses the node.js crypto package, refer to http://nodejs.org/api/crypto.html for the specific options available.
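
A hedged sketch of the constructor described above (algorithm, digest encoding, listener); the exact argument order is an assumption, and the file names are hypothetical:

    var digestStream = require('digest-stream')
    var fs = require('fs')

    // Listener receives the digest and the byte length just before 'end'
    var hashThrough = digestStream('sha1', 'hex', function (digest, length) {
      console.log('sha1: %s (%d bytes)', digest, length)
    })

    fs.createReadStream('input.txt')
      .pipe(hashThrough)
      .pipe(fs.createWriteStream('copy.txt'))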

pass-stream - pass-through node.js stream which can filter/adapt and pause data

  •    Javascript

pass-stream is a pass-through stream which allows transform fns for easily filtering or adapting the data that flows through the stream. To add transform/filter functionality you may provide a writeFn and/or endFn which allows you to tap into the write and end processing.
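
A hedged sketch of the writeFn/endFn hooks described above; the callback signatures shown are assumptions rather than the documented API:

    var passStream = require('pass-stream')

    function writeFn (data, encoding, cb) {
      this.push(data.toString().trim()) // adapt each chunk as it flows through
      cb()
    }

    function endFn (cb) {
      this.push('\n-- done --\n')       // append a trailer before end
      cb()
    }

    process.stdin.pipe(passStream(writeFn, endFn)).pipe(process.stdout)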

redis-rstream - redis-rstream is a node.js redis read stream

  •    Javascript

redis-rstream is a node.js redis read stream (streams2) which streams binary or utf8 data in chunks from a redis key using an existing redis client. Tested with the mranney/node_redis client. You will also need the redis client (npm install redis) or another compatible library. You can also optionally install hiredis along with redis for additional performance.
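
A sketch of reading a redis key as a stream with an existing client; the factory signature (client, key) is assumed from the description, and the key and file name are hypothetical:

    var redis = require('redis')
    var redisRStream = require('redis-rstream')
    var fs = require('fs')

    var client = redis.createClient()

    // Stream the value stored at 'upload:1' to a local file in chunks
    redisRStream(client, 'upload:1')
      .pipe(fs.createWriteStream('download.bin'))
      .on('finish', function () { client.quit() })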

redis-wstream - redis-wstream is a node.js redis write stream

  •    Javascript

You will also need the redis client (npm install redis) or another compatible library. You can also optionally install hiredis along with redis for additional performance. Construct a write stream instance by passing in the client and the key to save the stream to. Pipe to this instance, and when end is emitted, the stream has been saved to redis.
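
A sketch of the "pass in client and key" construction described above; the factory name and signature are assumptions, and the key and file name are hypothetical:

    var redis = require('redis')
    var redisWStream = require('redis-wstream')
    var fs = require('fs')

    var client = redis.createClient()

    // Save a file's contents under the key 'upload:1'
    fs.createReadStream('upload.bin')
      .pipe(redisWStream(client, 'upload:1'))
      .on('finish', function () { client.quit() })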
