
beam - A distributed knowledge graph store

  •    Go

There is a blog post that serves as a good introduction to Beam. Beam is a distributed knowledge graph store, sometimes called an RDF store or a triple store. Knowledge graphs are suitable for modeling data that is highly interconnected by many types of relationships, like encyclopedic information about the world. A knowledge graph store enables rich queries on its data, which can be used to power real-time interfaces, to complement machine learning applications, and to make sense of new, unstructured information in the context of existing knowledge.
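The triple model behind a store like Beam can be illustrated with a toy in-memory store (a sketch for illustration only, not Beam's implementation; all names here are invented):

```python
class TripleStore:
    """A toy triple store: facts are (subject, predicate, object) tuples."""

    def __init__(self):
        self.triples = set()

    def add(self, s, p, o):
        self.triples.add((s, p, o))

    def match(self, s=None, p=None, o=None):
        """Pattern query: None acts as a wildcard in any position."""
        return [t for t in self.triples
                if (s is None or t[0] == s)
                and (p is None or t[1] == p)
                and (o is None or t[2] == o)]

store = TripleStore()
store.add("Ada_Lovelace", "bornIn", "London")
store.add("London", "capitalOf", "England")

# Everything we know with "London" as the subject:
london_facts = store.match(s="London")
```

Real stores add indexes over each position and a query language (such as SPARQL) on top, but the data model is exactly this set of triples.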

rdfstore-js - JS RDF store with SPARQL support

  •    Javascript

Many features present in the 0.8.x versions have been removed in 0.9.x. Some of them will be added back in future versions; others, like the MongoDB backend, have been discarded. Please read this README carefully to find the current set of features. rdfstore-js is a pure JavaScript implementation of an RDF graph store with support for the SPARQL query and data manipulation language.

kibi - Kibi is a friendly Kibana fork, kept in sync, which adds support for joins across indexes and external sources, a tabbed navigation interface, and more

  •    Javascript

Kibi extends Kibana 5.5.2 with data intelligence features. The core feature of Kibi is the ability to join and filter data from multiple Elasticsearch indexes and from SQL/NoSQL data sources ("external queries"). In addition, Kibi provides UI features and visualizations such as dashboard groups, tabs, cross-entity relational navigation buttons, an enhanced search results table, analytical aggregators, HTML templates on query results, and much more.

Web-Karma - Information Integration Tool

  •    Java

A Karma tutorial is available at https://github.com/szeke/karma-tcdl-tutorial; also check out our DIG web site, where we use Karma extensively to process more than 90M web pages. Karma is an information integration tool that enables users to quickly and easily integrate data from a variety of sources, including databases, spreadsheets, delimited text files, XML, JSON, KML, and Web APIs. Users integrate information by modeling it according to an ontology of their choice using a graphical user interface that automates much of the process. Karma learns to recognize the mapping of data to ontology classes and then uses the ontology to propose a model that ties these classes together. Users then interact with the system to adjust the automatically generated model. During this process, users can transform the data as needed to normalize data expressed in different formats and to restructure it. Once the model is complete, users can publish the integrated data as RDF or store it in a database.

graph-notebook - Library extending Jupyter notebooks to integrate with Apache TinkerPop and RDF SPARQL

  •    Jupyter

The graph notebook provides an easy way to interact with graph databases using Jupyter notebooks. Using this open-source Python package, you can connect to any graph database that supports the Apache TinkerPop, openCypher, or RDF SPARQL graph models. These databases could be running locally on your desktop or in the cloud. Graph databases can be used to explore a variety of use cases, including knowledge graphs and identity graphs. We encourage others to contribute configurations they find useful; there is an additional-databases folder where more information can be found.

RDFSharp - Start playing with RDF!

  •    C#

RDFSharp is a lightweight library designed to ease the creation of .NET applications based on the RDF and Semantic Web data model.

easyrdf - EasyRdf is a PHP library designed to make it easy to consume and produce RDF.

  •    PHP

EasyRdf is a PHP library designed to make it easy to consume and produce RDF. It was designed for use in mixed teams of experienced and inexperienced RDF developers. It is written in object-oriented PHP and has been tested extensively using PHPUnit. After parsing, EasyRdf builds a graph of PHP objects that can then be walked to extract the data to be placed on the page. Dump methods are available to inspect what data is available during development.

database - Blazegraph High Performance Graph Database

  •    Java

Blazegraph™ DB is our ultra-high-performance graph database supporting Blueprints and RDF/SPARQL APIs. It supports up to 50 billion edges on a single machine and has a High Availability and scale-out architecture. It is in production use by customers such as EMC, Syapse, the Wikidata Query Service, the British Museum, and many others. GPU acceleration and High Availability (HA) are available in the Enterprise edition. War, jar, deb, rpm, and tar.gz deployment artifacts are provided.

rasqal - Redland Rasqal RDF Query Library

  •    C

Redland Rasqal RDF Query Library


Leaflet.dbpediaLayer - A Leaflet layer with Wikipedia POIs from DBpedia

  •    Javascript

Leaflet.dbpediaLayer is an easy-to-use plugin for adding a layer with POIs from Wikipedia. It does so by querying the SPARQL endpoint at DBpedia. Check out the demo.

ocaml-rdf - OCaml library to manipulate RDF graphs; implements SPARQL

  •    HTML

OCaml library to manipulate RDF graphs; implements SPARQL

graph-pattern-learner - Evolutionary Graph Pattern Learner that learns SPARQL queries for a given set of source-target-pairs from an endpoint

  •    Python

In this repository you will find the code for a graph pattern learner. Given a list of source-target pairs and a SPARQL endpoint, it will try to learn SPARQL patterns. Given a source, the learned patterns will try to lead you to the right target. As you can immediately see, associations do not follow a single pattern only. Our algorithm is designed to deal with this: it will try to learn several patterns which, in combination, model your input list of source-target pairs. If your list of source-target pairs is less complicated, the algorithm will happily terminate earlier.

sparqlwrapper - A wrapper for a remote SPARQL endpoint

  •    Python

SPARQLWrapper is a simple Python wrapper around a SPARQL service that lets you execute your queries remotely. It helps in creating the query invocation and, optionally, converting the result into a more manageable format.
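At bottom, a remote SPARQL query is an HTTP request following the SPARQL 1.1 Protocol; a dependency-free sketch of the GET form that a wrapper like this automates (the endpoint URL and query are just placeholders):

```python
from urllib.parse import urlencode

def sparql_get_url(endpoint, query):
    """Build the GET request URL for a SPARQL query, per the SPARQL 1.1
    Protocol: the query text goes URL-encoded into the 'query' parameter."""
    return endpoint + "?" + urlencode({"query": query})

url = sparql_get_url(
    "https://dbpedia.org/sparql",
    "SELECT ?s WHERE { ?s a ?o } LIMIT 5",
)
```

A wrapper additionally handles content negotiation (asking for JSON or XML results) and parses the response into native data structures, which is where most of the convenience lies.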

SPARQL.js - A parser for the SPARQL query language in JavaScript

  •    Yacc

The SPARQL 1.1 Query Language allows querying data sources of RDF triples. SPARQL.js translates SPARQL into JSON and back, so you can parse and build SPARQL queries in your JavaScript applications. It fully supports the SPARQL 1.1 specification, including property paths, federation, and updates.

Sessel - Document RDFizer for CouchDB

  •    Javascript

Sessel is a CouchApp for CouchDB that generates RDF triples from JSON documents; these can in turn be exported to various serialization formats or queried through a SPARQL endpoint. A graphical export interface can be accessed at http://<your_host>/<your_db>/_design/sessel/export.html.

node-sparql-client - A simple SPARQL client for node.js

  •    Javascript

A simple SPARQL client written for Node.js (with compatibility for Apache Fuseki). Since version 0.2.0, it is possible to add options regarding the formatting of the results, for example when executing a query to retrieve all books and their genres.

QuitStore - Quads in Git - Distributed Version Control for RDF Knowledge Bases

  •    Javascript

This project runs a SPARQL endpoint for Update and Select queries and enables versioning with Git for each named graph. Adjust config.ttl: make sure you set the correct path to your Git repository ("../store"), the URI of your graph (<http://example.org/>), and the name of the file holding this graph ("example.nq").

sgvizler2 - Sgvizler2 is a javascript wrapper for easy visualisation of SPARQL result sets (and a jQuery plugin)

  •    Javascript

This project is a reboot, in TypeScript, of the Sgvizler project by Martin G. Skjæveland. Generate your chart with the SPARQL editor and read the docs.


  •    PHP

A very simple SparqlClient for PHP. Thanks to the contributors.
