spark - .NET for Apache® Spark™ makes Apache Spark™ easily accessible to .NET developers.

  •    CSharp

.NET for Apache Spark provides high-performance APIs for using Apache Spark from C# and F#. With these .NET APIs, you can access the most popular DataFrame and SparkSQL aspects of Apache Spark for working with structured data, and Spark Structured Streaming for working with streaming data. .NET for Apache Spark is compliant with .NET Standard - a formal specification of .NET APIs that are common across .NET implementations. This means you can use .NET for Apache Spark anywhere you write .NET code, allowing you to reuse all the knowledge, skills, code, and libraries you already have as a .NET developer.

azure-databricks-client - Client library for Azure Databricks

  •    CSharp

The Azure Databricks Client Library allows you to automate your Azure Databricks environment through the Azure Databricks REST API. The implementation of this library is based on REST API version 2.0.
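For context, here is a minimal sketch of what the library wraps: a direct call to the Databricks REST API 2.0 from Python. The workspace URL and token below are illustrative placeholders, and the request is only constructed, not sent.

```python
# Sketch of a raw Databricks REST API 2.0 call; the client library wraps
# requests like this one. URL and token are placeholders, not real values.
import urllib.request

workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
token = "dapiXXXXXXXXXXXXXXXX"                                        # placeholder PAT

req = urllib.request.Request(
    url=f"{workspace_url}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {token}"},
    method="GET",
)
# urllib.request.urlopen(req) would return a JSON document listing clusters;
# here we only show how the authenticated request is shaped.
print(req.full_url, req.get_method())
```
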

terraform-provider-databricks - Terraform Databricks provider

  •    Go

If you wish to work on the provider, you'll first need Go installed on your machine (version 1.9+ is required). You'll also need to correctly set up a GOPATH and add $GOPATH/bin to your $PATH. To compile the provider, run make build. This builds the provider and puts the provider binary in the $GOPATH/bin directory.
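The workspace setup above might look like the following; the paths are illustrative and assume a conventional Go workspace with Go 1.9+ and make already installed.

```shell
# Illustrative Go workspace setup for building the provider from source.
export GOPATH="$HOME/go"
export PATH="$PATH:$GOPATH/bin"

# Then, inside a checkout of the provider repository:
#   make build    # compiles the provider and drops the binary in $GOPATH/bin
```
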

batcomputer - A working example of DevOps & operationalisation applied to Machine Learning and AI

  •    Python

Project Batcomputer is a working example of DevOps applied to machine learning and the field of AI. 💬 Why "Project Batcomputer"? The main model trained and used as the foundation of the project is based on crime data and predictions of the outcomes of crimes (convictions, etc.). The Batman Batcomputer seemed like a fun way to make using such a prediction model more interesting.

azdo-databricks - A set of Build and Release tasks for Building, Deploying and Testing Databricks notebooks

  •    TypeScript

This extension brings a set of tasks for you to operationalize the build, test, and deployment of Databricks Jobs and Notebooks. To run this set of tasks in your build/release pipeline, you first need to explicitly set a Python version. To do so, use this task as the first task in your pipeline.
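In an Azure Pipelines YAML definition, pinning the Python version first might look like the fragment below; the version number is illustrative.

```yaml
# Pin a Python version as the first step so the Databricks tasks can run.
steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.8'   # illustrative version
```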

delta-rs - A native Rust library for Delta Lake, with bindings into Python and Ruby.

  •    Rust

A native interface to Delta Lake. This library provides low-level access to Delta tables in Rust, which can be used with data processing frameworks like datafusion, ballista, rust-dataframe, vega, etc. It also provides bindings to other higher-level languages such as Python, Ruby, or Golang.
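As a sketch of the Python bindings, reading an existing Delta table via the `deltalake` package might look like this; the table path is illustrative, and the function is only defined here, not run against a real table.

```python
# Sketch of delta-rs's Python bindings (pip install deltalake).
# The table path passed in is illustrative; an existing Delta table is assumed.
def show_table_info(table_path):
    from deltalake import DeltaTable

    dt = DeltaTable(table_path)
    print("version:", dt.version())   # current table version
    print("files:", dt.files())       # underlying Parquet data files
    return dt.to_pyarrow_table()      # load the data as a PyArrow table

# show_table_info("./data/events")    # requires an existing Delta table on disk
```
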

