Fast and powerful CSV parser for the browser that supports web workers and streaming large files. Converts CSV to JSON and JSON to CSV.
Tags: csv csv-parser parser parse parsing delimited text data auto-detect comma tab pipe file filereader stream worker workers thread threading multi-threaded jquery-plugin

TypeSerializer - Another library you might find interesting: a serializer/deserializer designed to produce cleaner code using decorators (can be used with both Angular and Node.js).
Tags: angular angular2 ng ng2 typescript pipes pipe ng5 angular5 angularjs filters library angular2-pipes ng2-pipes angular-filters angular-pipes

Runs a sequence of gulp tasks in the specified order. This function is designed for the situation where you have defined a run order but choose not to, or cannot, use dependencies. If you are hiring developers, you can support this project and future open source work by checking out our company, Qualified.io.
Tags: gulpfriendly pipe sequence gulp orchestrator

Easy way to create a Stream that is both readable and writable. This function is the basis for most of the synchronous streams in event-stream.
Tags: stream streams user-streams pipe

This package is a mirror of the Streams2 and Streams3 implementations in Node core. Full documentation may be found on the Node.js website.
Tags: readable stream pipe

ipt (pronounced iPipeTo) introduces the missing CLI interactive workflow. It takes any kind of list as input and uses it to build an interactive interface that lets you select an element from it. Stop manually dragging your mouse around to copy output data from a terminal: with the ipt workflow you can pipe data from a command and select what to copy to the clipboard from a convenient visual menu.
Tags: cli command-line unix bash pipe workflow stdout nodejs vim ipipeto js interactive list inquirer terminal cli-app ipt menu

Four operations are performed to arrive at the desired data set, and they are written in a natural order: the same as the order of execution. Also, no temporary variables are needed. If yet another operation is required, it is straightforward to add to the sequence of operations wherever it may be needed. If you are new to magrittr, the best place to start is the pipes chapter in R for Data Science.
Tags: r pipe

pump is a small Node module that pipes streams together and destroys all of them if one of them closes. When using the standard source.pipe(dest), source will not be destroyed if dest emits close or an error, and you cannot provide a callback to tell when the pipe has finished.
Tags: streams pipe destroy callback

Distributed Named Pipes (or: dnpipes) are essentially a distributed version of Unix named pipes, comparable to, for example, SQS in AWS or the Service Bus in Azure. Conceptually, we're dealing with a bunch of distributed processes (dpN). These distributed processes may be long-running (such as dp0 or dp5) or batch-oriented, for example dp3 or dp6. There are a number of situations where you want these distributed processes to communicate, very similar to what IPC enables you to do on a single machine. dnpipes are a simple mechanism to facilitate IPC between distributed processes. What follows is an interface specification as well as a reference implementation for dnpipes.
Tags: distributed-systems pipe ipc named-pipes kafka

node-bcat features auto scrolling (which can be enabled or disabled), ANSI-to-HTML coloring (--ansi), and behavior and color customization. This module uses rc to manage its configuration, so in addition to command-line arguments you may save your favorite configuration in .bcatrc.
Tags: nodejs node-js pipe browser cat bcat

Returns a duplex stream that will be compressed with gzip, deflate, or no compression depending on the accept-encoding headers sent. oppressor will emulate calls to http.ServerResponse methods like writeHead() so that modules like filed, which expect to be piped directly to the response object, will work.
Tags: gzip compress deflate http negotiator pipe stream request response

A pipeable write stream which uploads to Amazon S3 using the multipart file upload API. NOTE: This module is deprecated after the 2.1.0 release of the AWS SDK on Dec 9, 2014, which added S3.upload(). I highly recommend switching away from this module and using the official method supported by AWS.
Tags: aws s3 upload pipe stream

A package to build multi-stage concurrent workflows with a centralized logging output. The package can be used to define and execute CI/CD tasks (either sequential or concurrent). A tool with similar goals is Jenkins Pipeline; compared to it, however, this package has fewer constructs, since the logic is specified in code rather than in a Jenkinsfile.
Tags: jenkins-pipeline pipeline workflow ci ci-cd jenkins pipe concurrent

Papa Parse is a powerful CSV (delimited text) parser that gracefully handles large files and malformed input.
Tags: jquery-plugin csv parse parsing parser delimited text data auto-detect comma tab pipe file filereader stream

Angular 2+ pipeline for filtering arrays.
Tags: filter-array pipe array-filter angular-2 filter angular angular2 angular4

Bytes go in, but they don't come out (when muted).
Tags: mute stream pipe

Plain functions for a more functional, Deku-style approach to creating stateless React components, with functional goodies such as compose, memoize, etc. for free.
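A compose helper of the kind mentioned can be sketched in a few lines. This is a generic illustration, not this library's implementation:

```javascript
// Right-to-left function composition: compose(f, g)(x) === f(g(x)).
const compose = (...fns) => (x) => fns.reduceRight((acc, fn) => fn(acc), x);

// pipe is the left-to-right twin: pipe(f, g)(x) === g(f(x)).
const pipe = (...fns) => (x) => fns.reduce((acc, fn) => fn(acc), x);

const inc = (n) => n + 1;
const double = (n) => n * 2;

compose(double, inc)(3); // double(inc(3)) -> 8
pipe(double, inc)(3);    // inc(double(3)) -> 7
```

With helpers like these, a stateless component is just a plain function, and behaviors are layered on by composition rather than by classes or mixins.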
Tags: functional react promises state components functions compose pipe identity memoize curry javascript-framework mvc framework

Tiny Node.js module to intercept, modify and/or ignore chunks of data and events in any Readable-compatible stream before they are processed by other stream consumers (e.g. via pipe()). It is particularly useful for dealing with net/http/fs streams.
Tags: stream intercept interceptor control modify capture sniff sniffer spy events emitter readable pipe http transform

It pipes stdin to a tempfile and spawns the chosen app with the tempfile path as the first argument. Similar to process substitution in ZSH/Bash, but cross-platform and without its limitations.
Tags: cli-app cli bin stdio stdin tmp temp shell pipe app file path process substitution