unbzip2-stream - streaming unbzip2 implementation in pure JavaScript for Node and browsers


streaming bzip2 decompressor in pure JS for Node and browserify. When browserified, the stream emits instances of feross/buffer instead of raw Uint8Arrays to provide a consistent API across browsers and Node.

https://github.com/regular/unbzip2-stream

Dependencies:

through : ^2.3.6
buffer : ^3.0.1


Related Projects

smart_open - Utils for streaming large files (S3, HDFS, gzip, bz2...)

  •    Python

There are a few optional keyword arguments that are useful only for S3 access; they are passed through to boto.s3_connect() as keyword arguments. The S3 reader supports gzipped content, as long as the key clearly names a gzipped file (e.g. ends with ".gz").

Stream-2-Stream

  •    Java

Stream-2-Stream's multicast+ is the next generation of streaming. Multicast+ is more efficient and requires less bandwidth than direct streaming (e.g. shoutcast/icecast). Stream-2-Stream stations can be set up without paying a fortune for bandwidth.

SharpCompress - a fully native C# library for RAR, 7Zip, Zip, Tar, GZip, BZip2

  •    CSharp

SharpCompress is a compression library for .NET/Mono/Silverlight/WP7 that can unrar, un7zip, unzip, untar, unbzip2 and ungzip with forward-only reading and file random access APIs. Write support for zip/tar/bzip2/gzip is implemented. The major feature is support for non-seek...

streamDM - Stream Data Mining Library for Spark Streaming

  •    Scala

streamDM is open source software for mining big data streams using Spark Streaming, started at Huawei Noah's Ark Lab and licensed under the Apache Software License v2.0. Big data stream learning is more challenging than batch or offline learning, since the data may not keep the same distribution over the lifetime of the stream. Moreover, each example in a stream can be processed only once, or must be summarized with a small memory footprint, and the learning algorithms must be very efficient.

automi - A stream API for Go (alpha)

  •    Go

Automi abstracts away (not too far away) the gnarly details of using Go channels to create pipelined and staged processes. It exposes a higher-level API to compose and integrate streams of data over Go channels for processing. This is still alpha work; the API is evolving and changing rapidly with each commit (beware). Nevertheless, the core concepts have been bolted onto the API. An example in the project's README shows how Automi can compose a multi-stage pipeline to process a stream of data from a CSV file. The code implements stream processing based on the pipeline patterns. What is clearly absent, however, is the low-level channel communication code to coordinate and synchronize goroutines: the programmer gets a clean surface to express business code without the noisy channel infrastructure. Under the covers, however, Automi uses patterns similar to the pipeline patterns to create safe, concurrent structures that execute the processing of the data stream.


node-trumpet - parse and transform streaming html using css selectors

  •    Javascript

Create a new trumpet stream. This stream is readable and writable. Pipe an html stream into tr and get back a transformed html stream. Parse errors are emitted by tr in an 'error' event.

peerflix-server - Streaming torrent client for Node.js with web ui.

  •    Javascript

Streaming torrent client for node.js with web ui. Based on torrent-stream, inspired by peerflix.

Maxwell's daemon - A mysql-to-json kafka producer

  •    Java

This is Maxwell's daemon, an application that reads MySQL binlogs and writes row updates to Kafka as JSON. Maxwell has a low operational bar and produces a consistent, easy to ingest stream of updates. It allows you to easily "bolt on" some of the benefits of stream processing systems without going through your entire code base to add (unreliable) instrumentation points.

snappydata - SnappyData - The Spark Database. Stream, Transact, Analyze, Predict in one cluster

  •    Scala

Apache Spark is a general purpose parallel computational engine for analytics at scale. At its core, it has a batch design center and is capable of working with disparate data sources. While this provides rich unified access to data, it can also be quite inefficient and expensive: analytic processing requires massive data sets to be repeatedly copied and data to be reformatted to suit Spark. In many cases, it ultimately fails to deliver the promise of interactive analytic performance. For instance, each time an aggregation is run on a large Cassandra table, the entire table must be streamed into Spark to do the aggregation. Caching within Spark is immutable and results in stale insight.

At SnappyData, we take a very different approach. SnappyData fuses a low latency, highly available in-memory transactional database (GemFireXD) into Spark with shared memory management and optimizations. Data in the highly available in-memory store is laid out using the same columnar format as Spark (Tungsten). All query engine operators are significantly more optimized through better vectorization and code generation. The net effect is an order-of-magnitude performance improvement compared to native Spark caching, and more than two orders of magnitude better Spark performance when working with external data sources.

Pravega - Streaming as a new software defined storage primitive

  •    Java

Pravega is an open source distributed storage service implementing Streams. It offers Stream as the main primitive for the foundation of reliable storage systems: a high-performance, durable, elastic, and unlimited append-only byte stream with strict ordering and consistency.

stream-parser - ⚡ PHP7 / Laravel Multi-format Streaming Parser

  •    PHP

DOM loading: loads the whole document, making it easy to navigate and parse, and as such provides maximum flexibility for developers. Streaming: iterates through the document, acting like a cursor that stops at each element in its way, thus avoiding memory overload.

torrent-stream - The low level streaming torrent engine that peerflix uses

  •    Javascript

torrent-stream is a node module that allows you to access files inside a torrent as node streams. By default no files are downloaded unless you create a stream to them. If you want to fetch a file without creating a stream, use the file.select and file.deselect methods.

socket.io-stream - Stream for Socket.IO

  •    Javascript

This is the module for bidirectional binary data transfer with Stream API through Socket.IO. If you are not familiar with Stream API, be sure to check out the docs. I also recommend checking out the awesome Stream Handbook.

streaming-benchmarks - Benchmarks for Low Latency (Streaming) solutions including Apache Storm, Apache Spark, Apache Flink,

  •    Java

Code licensed under the Apache 2.0 license; see the LICENSE file for terms. At Yahoo we adopted Apache Storm as our stream processing platform of choice. But that was in 2012, and the landscape has changed significantly since then. Because of this we really want to know what Storm is good at, where it needs improvement compared to other systems, and what its limitations are, so we can recommend the best tool for the job to our customers. We started looking for stream processing benchmarks to use for this evaluation, but all of them ended up lacking in several fundamental areas: primarily, they did not test anything close to a real-world use case. So we decided to write a simple one. This is the first round of these tests. The tool here is not polished, and covers only three tools and one specific use case. We hope to expand it in the future in terms of the tools tested, the variety of processing tested, and the metrics gathered.

Apache Flink - Platform for Scalable Batch and Stream Data Processing

  •    Java

Apache Flink is an open source platform for scalable batch and stream data processing. Flink’s core is a streaming dataflow engine that provides data distribution, communication, and fault tolerance for distributed computations over data streams.

awesome-streaming - a curated list of awesome streaming frameworks, applications, etc

  •    

A curated list of awesome streaming (stream processing) frameworks, applications, readings and other resources. Inspired by other awesome projects.

download - Download and extract files

  •    Javascript

See download-cli for the command-line version. Returns both a Promise&lt;Buffer&gt; and a Duplex stream with additional events.

through2 - Tiny wrapper around Node streams2 Transform to avoid explicit subclassing noise

  •    Javascript

Inspired by Dominic Tarr's through, in that it's so much easier to make a stream out of a function than it is to set up the prototype chain properly: through(function (chunk) { ... }). Note: as of 2.x.x this module uses Streams3 instead of Streams2. To continue using a Streams2 version, run npm install through2@0 to fetch the latest 0.x.x release. For more on Streams2 vs Streams3 and related recommendations, see the article "Why I don't use Node's core 'stream' module".

jstorm - Enterprise Stream Process Engine

  •    Java

Alibaba JStorm is a fast and stable enterprise stream processing engine. It runs programs up to 4x faster than Apache Storm, and makes it easy to switch from record mode to mini-batch mode. It is not only a stream processing engine: it aims to be a single solution for real-time requirements, a whole real-time ecosystem.

streamjs - Lazy Object Streaming Pipeline for JavaScript

  •    Javascript

Stream.js is a lightweight (2.6 KB minified and gzipped), intensely tested (700+ assertions, 97% coverage) functional programming library for operating on collections of in-memory data. It requires EcmaScript 5+, has built-in support for ES6 features, and works in all current browsers, Node.js and Java 8 Nashorn. Before explaining how Stream.js works in detail, here are a few real-world code samples.