Miller is like awk, sed, cut, join, and sort for name-indexed data such as CSV, TSV, and tabular JSON. With Miller, you get to use named fields without needing to count positional indices, while working with familiar formats such as CSV, TSV, JSON, and positionally indexed text.
http://johnkerl.org/miller/doc
The following is a list of text-based file formats and command-line tools for manipulating each, including delimiter-separated values such as CSV and TSV.
clis list csv tsv delimited-files text-files json yaml toml xml html ini structured-text conversion configuration-file
Jackson is one of the best JSON parsers for Java. More than that, Jackson is a suite of data-processing tools for Java (and the JVM platform), including the flagship streaming JSON parser/generator library, a matching data-binding library (POJOs to and from JSON), and additional data-format modules for processing data encoded in Avro, BSON, CBOR, CSV, Smile, (Java) Properties, Protobuf, XML, or YAML, as well as a large set of datatype modules supporting widely used data types such as Guava, Joda, PCollections, and many more.
json json-parser serialization
This repo is an ESRI toolbox and tool(s) that exports ESRI feature classes to open data formats: CSV, JSON, SQLite, and GeoJSON. Much of the data in government coffers is contained in spatial databases, and a large percentage of government spatial data is created and managed using ESRI software. While the common interchange format, the ESRI Shapefile, is easily exported and imported by many other software packages, the Shapefile format is not natively part of the web ecosystem. Moreover, ESRI software does not provide an export of its generic 'feature class' (shapefile, file geodatabase, and personal geodatabase) to the most common open data file formats: CSV, JSON, and/or GeoJSON. Finally, while open-source tools easily transform ESRI shapefiles to open data formats, most government geospatial infrastructures have only ESRI tools. Without the basic export feature presented here, the lion's share of government spatial data users cannot export their data to the most common open data formats.
A simple library for exporting tabular data to Excel-friendly XML, CSV, or TSV. It supports streaming exported data to a file or directly to the browser as a download so it is suitable for exporting large datasets (you won't run out of memory). See the test/ directory for more examples.
csv-parser can convert CSV into JSON at a rate of around 90,000 rows per second (performance varies with the data; try bench.js with your own). The data emitted is a normalized JSON object, with each header used as a property name of the object.
csv parser fast json
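To make the emitted row objects concrete, here is a minimal sketch of streaming a file through csv-parser (the data.csv path and its columns are placeholders):

```ts
import fs from "node:fs";
import csv from "csv-parser";

const rows: Record<string, string>[] = [];

fs.createReadStream("data.csv")        // hypothetical input file
  .pipe(csv())                         // each CSV row becomes an object keyed by the header names
  .on("data", (row) => rows.push(row))
  .on("end", () => console.log(`parsed ${rows.length} rows`));
```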
Similar to the FASTA/Q formats in bioinformatics, CSV/TSV are basic and ubiquitous file formats in both bioinformatics and data science. People usually use spreadsheet software such as MS Excel to process tabular data; however, that is all clicking and typing, which cannot be automated and is time-consuming to repeat, especially when we want to apply similar operations to different datasets or for different purposes.
csv tsv cross-platform toolkit bioinformatics command-line tool
Here is a free online CSV-to-JSON conversion service utilizing the latest csvtojson module. csvtojson has released version 2.0.0.
csv csvtojson csvparser nodejs csv-data csv-row csv-columns csv-stream parse csv-parser parse-csv json csv-to-json csv-convert tojson convert-csv-to-json csv-json
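For reference, a minimal sketch of the csvtojson v2 API converting an in-memory CSV string (the sample rows are invented):

```ts
import csv from "csvtojson";

const input = "name,age\nAda,36\nGrace,85\n";

// fromString() resolves to an array of objects keyed by the header row
csv()
  .fromString(input)
  .then((rows) => console.log(rows)); // [{ name: "Ada", age: "36" }, { name: "Grace", age: "85" }]
```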
Reckon automagically converts CSV files for use with the command-line accounting tool Ledger. It also helps you to select the correct accounts associated with the CSV data using Bayesian machine learning. First, log in to your bank and export your transaction data as a CSV file.
This project adds support for legacy formats like CSV and TXT (CSV export) to the data service output and allows the $format=txt query. By default, WCF Data Services supports Atom and JSON responses; however, legacy systems do not understand Atom or JSON but do understand CSV and TXT f...
wcf wcf-data-services
Define importers that load tabular data from spreadsheets or CSV files into any ActiveRecord-like ORM. Define classes that you instruct on how to import data into data models.
tabular-data spreadsheet csv-files activerecord orm data-import importer
The F# Data library (FSharp.Data.dll) implements everything you need to access data in your F# applications and scripts. It implements F# type providers for working with structured file formats (CSV, HTML, JSON, and XML) and for accessing World Bank data. It also includes helpers for parsing CSV, HTML, and JSON files and for sending HTTP requests. We're open to contributions from anyone. If you want to help out but don't know where to start, you can take one of the Up-For-Grabs issues, or help to improve the documentation.
fsharp html csv json xml worldbank http typeprovider data
Mapshaper is software for editing Shapefile, GeoJSON, TopoJSON, CSV, and several other data formats, written in JavaScript. The mapshaper command-line program supports essential map-making tasks like simplifying shapes, editing attribute data, clipping, erasing, dissolving, filtering, and more.
geojson topojson shapefile gis csv svg cartography simplification topology editing map-editor
Fast and powerful CSV parser for the browser that supports web workers and streaming large files. Converts CSV to JSON and JSON to CSV.
csv csv-parser parser parse parsing delimited text data auto-detect comma tab pipe file filereader stream worker workers thread threading multi-threaded jquery-plugin
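A small sketch of the CSV-to-JSON and JSON-to-CSV round trip, assuming Papa Parse's parse and unparse entry points (the sample data is invented):

```ts
import Papa from "papaparse";

// CSV -> JSON: with header enabled, every row becomes an object keyed by column name
const parsed = Papa.parse<Record<string, string>>("name,role\nAda,engineer\n", { header: true });
console.log(parsed.data); // [{ name: "Ada", role: "engineer" }]

// JSON -> CSV
const csvText = Papa.unparse([{ name: "Grace", role: "admiral" }]);
console.log(csvText);
```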
These lists are the result of merging data from two sources: the Wikipedia ISO 3166-1 article for alpha and numeric country codes, and the UN Statistics site for countries' regional and sub-regional codes. In addition to countries, they include dependent territories. The International Organization for Standardization (ISO) site provides partial data (capitalised and sometimes stripped of non-Latin ornamentation), but sells the complete data set as a Microsoft Access 2003 database. Other sites give you the numeric and character codes, but there appeared to be no sites that included the associated UN-maintained regional codes in their data sets. I scraped the already publicly available data from the two websites above to produce some ready-to-use, complete data sets that will hopefully save time for anyone with similar needs.
region-codes countries iso csv json xml dataset data
The JavaScript data transformation and analysis toolkit inspired by Pandas and LINQ. Implemented in TypeScript; usable from JavaScript (ES5+) or TypeScript.
data-wrangling data-forge data data-analysis nodejs pandas visualization data-visualization data-management data-manipulation data-munging data-cleaning data-cleansing csv json data-science data-clensing
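A minimal sketch of the DataFrame-style workflow, assuming data-forge's fromCSV, parseInts, where, and toArray entry points (the column names and values are invented):

```ts
import * as dataForge from "data-forge";

const csvText = "name,score\nAda,90\nGrace,85\n";

// Parse the CSV into a DataFrame, then filter and materialize the rows
const highScores = dataForge
  .fromCSV(csvText)
  .parseInts("score")                // convert the score column from string to number
  .where((row) => row.score >= 90)   // LINQ-style filter
  .toArray();

console.log(highScores); // [{ name: "Ada", score: 90 }]
```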
The goal of readr is to provide a fast and friendly way to read rectangular data (like csv, tsv, and fwf). It is designed to flexibly parse many types of data found in the wild, while still cleanly failing when data unexpectedly changes. If you are new to readr, the best place to start is the data import chapter in R for Data Science. To accurately read a rectangular dataset with readr you combine two pieces: a function that parses the overall file, and a column specification. The column specification describes how each column should be converted from a character vector to the most appropriate data type; in most cases it's not necessary because readr will guess it for you automatically.
r csv fwf parsing
Tad is a desktop application for viewing and analyzing tabular data such as CSV files. The easiest way to install Tad is to use a pre-packaged binary release. See the Tad landing page for information on the latest release and a download link.
desktop-application pivots tabular-data csv pivot-tables data-science data-analysis
PowerShell is a cross-platform automation and configuration tool/framework that works well with your existing tools and is optimized for dealing with structured data (e.g. JSON, CSV, XML, etc.), REST APIs, and object models. It includes a command-line shell, an associated scripting language and a framework for processing cmdlets.
powershell shell command-line netcore
This gem enables your ActiveRecord models to use the PostgreSQL COPY command to import/export data in CSV format. If you need to transfer data between a PostgreSQL database and CSV files, the PostgreSQL native CSV parser will give you greater performance than using Ruby CSV + INSERT commands. I have not found time to make accurate benchmarks, but in the scenario for which I developed the gem I have seen a four-fold performance gain. This gem was written with the Rails framework in mind; I think it could work with Active Record alone, but in this README I will assume that you are using Rails.
Datalib is a JavaScript data utility library. It provides facilities for data loading, type inference, common statistics, and string templates. While datalib was created to power Vega and related projects, it is also a standalone library useful for data-driven JavaScript applications on both the client (web browser) and server (e.g., node.js). For documentation, see the datalib API Reference.
data table statistics parse csv tsv json utility
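A rough sketch of datalib's CSV parsing with type inference, assuming the read, mean, and summary helpers described in its API reference (datalib ships no TypeScript type definitions, so it is required as an untyped module; the sample data is invented):

```ts
// datalib has no bundled type definitions, so require it as an untyped module
const dl: any = require("datalib");

const csvText = "item,price\npen,1.5\nbook,12\n";

// read() with parse: 'auto' infers column types, so price becomes a number
const data = dl.read(csvText, { type: "csv", parse: "auto" });

const prices = data.map((d: any) => d.price);
console.log(dl.mean(prices));  // 6.75
console.log(dl.summary(data)); // per-field profiles (type, counts, min/max, ...)
```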