
vaex - Out-of-Core hybrid Apache Arrow/NumPy DataFrame for Python, ML, visualize and explore big tabular data at a billion rows per second 🚀

  •    Python

Vaex is a high-performance Python library for lazy out-of-core DataFrames (similar to Pandas), used to visualize and explore big tabular datasets. It calculates statistics such as mean, sum, count, and standard deviation on an N-dimensional grid at more than a billion (10^9) samples/rows per second. Visualization is done using histograms, density plots, and 3D volume rendering, allowing interactive exploration of big data. Vaex uses memory mapping, a zero-memory-copy policy, and lazy computations for best performance (no memory wasted). HDF5 and Apache Arrow are supported.
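
The listing includes no code, but the typical workflow looks roughly like the Python sketch below (assuming a local HDF5 file named big.hdf5 with numeric columns x and y; the file and column names are placeholders):

    import vaex

    df = vaex.open("big.hdf5")            # memory-maps the file; nothing is loaded yet
    df["r"] = (df.x**2 + df.y**2)**0.5    # virtual column, evaluated lazily

    # Statistics on an N-dimensional grid; data is only streamed through at this point.
    counts = df.count(binby=[df.x, df.y], shape=(128, 128))
    mean_r = df.mean(df.r, binby=df.x, limits=[-10, 10], shape=64)
    print(counts.shape, mean_r.shape)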

vaex - Lazy Out-of-Core DataFrames for Python, visualize and explore big tabular data at a billion rows per second

  •    Python

Vaex is a Python library for out-of-core DataFrames (similar to Pandas), used to visualize and explore big tabular datasets. It can calculate statistics such as mean, sum, count, and standard deviation on an N-dimensional grid at up to a billion (10^9) objects/rows per second. Visualization is done using histograms, density plots, and 3D volume rendering, allowing interactive exploration of big data. Vaex uses memory mapping, a zero-memory-copy policy, and lazy computations for best performance (no memory wasted).
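
Filtering is also lazy: a selection is a lightweight mask over the memory-mapped data rather than a copy. A minimal sketch, again assuming a file big.hdf5 with a column x:

    import vaex

    df = vaex.open("big.hdf5")
    positive = df[df.x > 0]                # lazy filter; no data is copied
    print(positive.count(), df.x.mean())   # statistics stream over the mapped file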

awips2 - Weather forecasting display and analysis package developed by NWS/Raytheon, released as open source software by Unidata

  •    Java

AWIPS (formerly known as AWIPS II or AWIPS2) is a meteorological display and analysis package developed by the National Weather Service and Raytheon for operational forecasting. AWIPS is a Java application consisting of a data-rendering client (CAVE, which runs on Red Hat/CentOS Linux, macOS, and Windows), and a backend data server (EDEX, which runs on x86_64 Red Hat/CentOS 6 and 7). Unidata AWIPS source code and binaries (RPMs) are considered to be in the public domain, meaning there are no restrictions on any download, modification, or distribution in any form (original or modified). Unidata AWIPS contains no proprietary content and is therefore not subject to export controls, as stated in the Master Rights licensing file.




geobipy

  •    Python

This package uses a Bayesian formulation and Markov chain Monte Carlo sampling methods to derive posterior distributions of subsurface and measured data properties. The current implementation is applied to time- and frequency-domain electromagnetic data, though application to other data types is well within scope. Two data types are currently implemented: frequency-domain electromagnetic data and time-domain electromagnetic data. The package ships with a frequency-domain forward modeller but not a time-domain one; see the section "Installing the time domain forward modeller" for more information.
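
As a generic illustration of the sampling technique named here (not geobipy's own API), a random-walk Metropolis-Hastings sampler draws from a posterior built from a likelihood and a prior; the data, prior, and proposal scale below are synthetic placeholders:

    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=0.5, size=50)      # synthetic "measured data"

    def log_posterior(theta):
        log_prior = -0.5 * (theta / 10.0) ** 2                     # wide Gaussian prior
        log_like = -0.5 * np.sum((data - theta) ** 2 / 0.5 ** 2)   # Gaussian likelihood
        return log_prior + log_like

    theta, samples = 0.0, []
    for _ in range(5000):
        proposal = theta + rng.normal(scale=0.2)                   # random-walk proposal
        if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
            theta = proposal                                       # accept; otherwise keep current state
        samples.append(theta)

    print("posterior mean ~", np.mean(samples[1000:]))             # discard burn-in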

HDF5Kit - HDF5 for iOS and OS X

  •    Swift

This is a Swift wrapper for the HDF5 file format. HDF5 is used in the scientific community for managing large volumes of data. The objective is to make it easy to read and write HDF5 files from Swift, including in playgrounds. The project README includes an example showing how to open an existing HDF5 file and write data to an existing dataset.

matio - MATLAB MAT File I/O Library

  •    C

Matio is an open-source C library for reading and writing binary MATLAB MAT files. The library is designed for programs and libraries that do not have access to, or do not want to rely on, MATLAB's shared libraries. You can contact Christopher Hulbert by email at chulbe2lsu@users.sourceforge.net or Thomas Beutlich at t-beu@users.sourceforge.net.

loomR - An R-based interface for loom files

  •    R

A tutorial for loomR can be found here. A full function and method reference can be found here.


pyh5md - Read and write H5MD files

  •    Python

pyh5md is a library to easily read and write H5MD files. H5MD is a file format specification based on HDF5 for storing molecular data. pyh5md is built on top of h5py, the HDF5 for Python library by Andrew Collette. pyh5md used to define a complex class structure; since version 1.0.0.dev1, a light subclassing of h5py's classes is used instead.
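
Because pyh5md is a thin layer over h5py, the H5MD layout itself can be sketched with plain h5py (an illustration of the file structure, not pyh5md's helper API; group names, shapes, and chunk sizes are assumptions). A time-dependent element is a group holding value, step, and time datasets:

    import h5py

    with h5py.File("walk.h5md", "w") as f:
        f.create_group("h5md").attrs["version"] = [1, 1]          # spec version metadata
        pos = f.create_group("particles/atoms/position")          # one element of a particles group
        pos.create_dataset("value", shape=(0, 16, 3), maxshape=(None, 16, 3),
                           chunks=(1, 16, 3), dtype="f8")          # appendable per-step positions
        pos.create_dataset("step", shape=(0,), maxshape=(None,), chunks=(1,), dtype="i8")
        pos.create_dataset("time", shape=(0,), maxshape=(None,), chunks=(1,), dtype="f8")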

oo_hdf5_fortran - Object-oriented, clean, simple HDF5 modern Fortran 2018 interface

  •    Fortran

Straightforward single-file/module access to HDF5. Abstracts away the messy parts of HDF5 so that you can read/write various types and ranks of data, as well as character (string) variables and attributes, with a single command. If you'd like higher-rank arrays, let us know via a GitHub Issue.

pibayer - Acquire RAW Bayer-masked images with Raspberry Pi camera (before demosaicking) in Python

  •    Python

Acquire raw Bayer-masked images with the Raspberry Pi camera (before demosaicking). Writes HDF5, NetCDF, or TIFF compressed image stacks. The exposure time (in seconds) must be set manually, to avoid mistakes in experiments.

hsds - Cloud-native, service based access to HDF data

  •    Python

HSDS is a web service that provides a REST-based interface to HDF5 data stores. Data can be stored in either a POSIX file system or in object-based storage such as AWS S3, Azure Blob Storage, or OpenIO (openio.io). HSDS can be run on a single machine using Docker, or on a cluster using Kubernetes (or AKS on Microsoft Azure). The commercial offering based on this code is known as Kita™; more info at https://www.hdfgroup.org/solutions/hdf-kita/. Note: passwords can (and, for production use, should) be modified by changing values in hsds/admin/config/password.txt and rebuilding the Docker image. Alternatively, an external identity provider such as Azure Active Directory or KeyCloak can be used; see docs/azure_ad_setup.md for Azure AD setup instructions or docs/keycloak_setup.md for KeyCloak.
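
Because HSDS speaks the HDF REST API, the companion h5pyd client can be used much like h5py. A minimal sketch; the endpoint, credentials, domain path, and dataset name are placeholders for a specific deployment:

    import h5pyd

    f = h5pyd.File("/home/test_user1/example.h5", "r",
                   endpoint="http://localhost:5101",
                   username="test_user1", password="test")
    dset = f["dset1"]                 # behaves like an h5py dataset
    print(dset.shape, dset[:10])      # slices are fetched from the service on demand
    f.close()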

h5netcdf - Pythonic interface to netCDF4 via h5py

  •    Python

A Python interface for the netCDF4 file format that reads and writes local or remote HDF5 files directly via h5py or h5pyd, without relying on the Unidata netCDF library. h5netcdf has two APIs: a new API and a legacy API. Both interfaces currently reproduce most of the features of the netCDF interface, with the notable exception of support for operations that rename or delete existing objects. We simply haven't gotten around to implementing this yet; patches would be very welcome.
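
A minimal round trip with the new API might look like the sketch below (the file name, dimension, and variable names are arbitrary):

    import numpy as np
    import h5netcdf

    with h5netcdf.File("example.nc", "w") as f:
        f.dimensions = {"x": 5}                              # declare dimensions first
        v = f.create_variable("temperature", ("x",), float)  # variable over dimension x
        v[:] = np.arange(5.0)
        v.attrs["units"] = "K"

    with h5netcdf.File("example.nc", "r") as f:
        print(f["temperature"][:], f["temperature"].attrs["units"])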

antaresRead - Import, manipulate and explore the results of an Antares simulation

  •    R

Finally, you can download a cheat sheet that summarizes, on a single page, how to use the package: https://github.com/rte-antares-rpackage/antaresRead/raw/master/cheat_sheet/antares_cheat_sheet_en.pdf. The package also lets you select an Antares simulation interactively.

serving-runtime - Exposes a serialized machine learning model through a HTTP API.

  •    Java

Exposes a serialized machine learning model through an HTTP API, written in Java. The purpose of this project is to provide a generic HTTP API on top of serialized machine learning models.

netcdf-java - The Unidata netcdf-java library

  •    Java

Looking for the 6.x line of development? See branch 6.x. Looking for the 5.x line of development? See branch maint-5.x. Version 4.6 is no longer supported outside of the context of the THREDDS Data Server (TDS). If you are looking for that codebase, it can be found at https://github.com/Unidata/thredds/tree/4.6.x. The netCDF Java library provides an interface for scientific data access. It can be used to read scientific data from a variety of file formats including netCDF, HDF, GRIB, and BUFR. By itself, the netCDF-Java library can only write netCDF-3 files. It can write netCDF-4 files by using JNA to call the netCDF-C library. The library implements Unidata's Common Data Model (CDM) to provide data geolocation capabilities.

h5fortran - Lightweight HDF5 polymorphic Fortran: h5write() h5read()

  •    Fortran

For NetCDF4, see nc4fortran. h5fortran is designed for "serial" HDF5 read/write; we don't yet implement the interface for "parallel" HDF5. It uses Fortran submodules for a clean template structure. This easy-to-use, thin, object-oriented modern Fortran library abstracts away the messy parts of HDF5 so that you can read/write various types and ranks of data with a single command. In distinction from other high-level HDF5 interfaces, h5fortran works to deduplicate code, using polymorphism wherever feasible, and is backed by an extensive test suite.

arkimet - A set of tools to organize, archive and distribute data files.

  •    C

Arkimet is a set of tools to organize, archive and distribute data files. It currently supports data in GRIB, BUFR, HDF5 and VM2 formats. Arkimet manages a set of datasets, each of which contains homogeneous data stored in segments. It exploits the commonalities between the data in a dataset to implement a fast, powerful and space-efficient indexing system.

xtensor-io - xtensor plugin to read and write images, audio files, numpy (compressed) npz and HDF5

  •    C++

xtensor-io wraps the OpenImageIO, libsndfile, zlib, HighFive, and blosc libraries. xtensor-io is a header-only library. We provide a package for the mamba (or conda) package manager.

jsfive - A pure javascript HDF5 reader

  •    Javascript

jsfive only reads HDF5 files, taking an ArrayBuffer representation of the file as input. See the live demo: the center pane shows a collapsible folder structure and the right panel shows data and attributes.





