sling - SLING - A natural language frame semantics parser

  •    C++

SLING is a parser for annotating text with frame-semantic annotations. It is trained on an annotated corpus using TensorFlow and DRAGNN. The parser is a general transition-based frame semantic parser that uses bi-directional LSTMs for input encoding and a Transition Based Recurrent Unit (TBRU) for output decoding. The model is trained jointly, uses only the text tokens as input, and its transition system is designed to output frame graphs directly without any intervening symbolic representation.
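
For illustration only, here is a minimal PyTorch sketch of the overall shape of such a parser: a bi-directional LSTM encodes the tokens, and a recurrent unit predicts one transition per step. This is not SLING's actual C++/TensorFlow implementation, and the SHIFT/EVOKE/STOP inventory below is a simplified stand-in for SLING's real transition system.

```python
# Minimal sketch of a transition-based parser in the spirit of SLING:
# a bi-directional LSTM encodes the tokens, and a recurrent decoder
# predicts one transition per step until STOP. The transition set here
# is a simplified stand-in, not SLING's actual transition system.
import torch
import torch.nn as nn

TRANSITIONS = ["SHIFT", "EVOKE", "STOP"]  # hypothetical, reduced inventory

class TinyFrameParser(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden, bidirectional=True, batch_first=True)
        # TBRU-like decoder: a recurrent cell whose state carries the parse history.
        self.decoder = nn.GRUCell(2 * hidden, hidden)
        self.classify = nn.Linear(hidden, len(TRANSITIONS))

    def forward(self, token_ids):
        # token_ids: (1, seq_len) tensor of token indices
        enc, _ = self.encoder(self.embed(token_ids))   # (1, seq_len, 2*hidden)
        state = torch.zeros(1, self.decoder.hidden_size)
        actions, pos = [], 0
        while pos < enc.size(1):
            state = self.decoder(enc[:, pos, :], state)
            action = TRANSITIONS[self.classify(state).argmax(dim=-1).item()]
            actions.append(action)
            if action == "STOP":
                break
            pos += 1                                    # greedy, untrained decoding
        return actions

parser = TinyFrameParser(vocab_size=1000)
print(parser(torch.tensor([[5, 42, 7, 13]])))           # e.g. ['SHIFT', 'EVOKE', ...]
```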

opencog - A framework for integrated Artificial Intelligence & Artificial General Intelligence (AGI)

  •    Scheme

OpenCog is a framework for developing AI systems, especially appropriate for integrative multi-algorithm systems and artificial general intelligence systems. Though much work remains to be done, it currently contains a functional core framework and a number of cognitive agents at varying levels of completion, some already displaying interesting and useful functionalities alone and in combination. With the exception of MOSES and the CogServer, all of these components are in active development: half-baked, poorly documented, mis-designed, subject to experimentation, and generally in need of love and attention. This is where experimentation and integration are taking place, and, like any laboratory, things are a bit fluid and chaotic.

ludwig - Ludwig is a toolbox built on top of TensorFlow that allows you to train and test deep learning models without the need to write code

  •    Python

Ludwig is a toolbox built on top of TensorFlow that allows you to train and test deep learning models without the need to write code. All you need to provide is a CSV file containing your data, a list of columns to use as inputs, and a list of columns to use as outputs; Ludwig will do the rest. Simple commands can be used to train models both locally and in a distributed way, and to use them to predict on new data.
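
As a rough illustration, this is a hedged sketch of the equivalent workflow through Ludwig's Python API. The column names, file paths, and config keys are placeholders, and argument names (config vs. model_definition, dataset vs. data_csv) have changed between Ludwig versions.

```python
# Hedged sketch of Ludwig's Python API, the programmatic equivalent of the
# CLI commands; column names and file paths are placeholders, and exact
# argument names and return values differ between Ludwig versions.
from ludwig.api import LudwigModel

config = {
    "input_features": [{"name": "review_text", "type": "text"}],
    "output_features": [{"name": "sentiment", "type": "category"}],
}

model = LudwigModel(config)
results = model.train(dataset="reviews.csv")        # train from a CSV file
predictions = model.predict(dataset="new_reviews.csv")  # predict on new data
```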

transformers - 🤗 Transformers: State-of-the-art Natural Language Processing for PyTorch, TensorFlow, and JAX

  •    Python

🤗 Transformers provides thousands of pretrained models to perform tasks on texts such as classification, information extraction, question answering, summarization, translation, text generation, and more in over 100 languages. Its aim is to make cutting-edge NLP easier to use for everyone. 🤗 Transformers provides APIs to quickly download and use those pretrained models on a given text, fine-tune them on your own datasets, and then share them with the community on our model hub. At the same time, each Python module defining an architecture is fully standalone and can be modified to enable quick research experiments.
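
For example, the high-level pipeline API downloads a pretrained model and applies it to text in a few lines; the task name and input sentence below are only illustrative.

```python
# Download a default pretrained model for a task and run it on a sentence.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")   # fetches a default pretrained model
print(classifier("Transformers makes state-of-the-art NLP easy to use."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```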

bert - TensorFlow code and pre-trained models for BERT

  •    Python

This is a release of 24 smaller BERT models (English only, uncased, trained with WordPiece masking) referenced in "Well-Read Students Learn Better: On the Importance of Pre-training Compact Models". We have shown that the standard BERT recipe (including model architecture and training objective) is effective on a wide range of model sizes, beyond BERT-Base and BERT-Large. The smaller BERT models are intended for environments with restricted computational resources. They can be fine-tuned in the same manner as the original BERT models. However, they are most effective in the context of knowledge distillation, where the fine-tuning labels are produced by a larger and more accurate teacher.
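
As an illustration of the distillation setup mentioned above (not the training code in this repository), a minimal PyTorch sketch of a distillation loss that mixes temperature-softened teacher targets with the hard labels might look like this:

```python
# Minimal sketch of a knowledge-distillation objective: the student is trained
# to match temperature-softened teacher probabilities in addition to the hard
# labels. Illustrative only; tensor shapes and hyperparameters are placeholders.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    # Soft-target term: KL divergence between temperature-scaled distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-target term: ordinary cross-entropy against the gold labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

student_logits = torch.randn(8, 3, requires_grad=True)   # e.g. 8 examples, 3 classes
teacher_logits = torch.randn(8, 3)
labels = torch.randint(0, 3, (8,))
print(distillation_loss(student_logits, teacher_logits, labels))
```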

autonlp - 🤗 AutoNLP: train state-of-the-art natural language processing models and deploy them in a scalable environment automatically

  •    Python

You can install the AutoNLP Python package via pip. Note that you will need Python >= 3.7 for AutoNLP to work properly. Please take a look at the AutoNLP documentation for a list of supported tasks and languages.

spark-nlp - Natural Language Understanding Library for Apache Spark.

  •    Jupyter

John Snow Labs Spark-NLP is a natural language processing library built on top of Apache Spark ML. It provides simple, performant, and accurate NLP annotations for machine learning pipelines that scale easily in a distributed environment. The library has been uploaded to the spark-packages repository at https://spark-packages.org/package/JohnSnowLabs/spark-nlp.
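
As a hedged sketch of typical usage from Python, a pretrained pipeline can be started and applied in a few lines; the pipeline name ("explain_document_dl") and the exact output keys vary across Spark NLP versions and pipelines.

```python
# Hedged sketch of running a Spark NLP pretrained pipeline from Python;
# pipeline name and output keys depend on the Spark NLP version used.
import sparknlp
from sparknlp.pretrained import PretrainedPipeline

spark = sparknlp.start()                      # starts a Spark session with Spark NLP loaded
pipeline = PretrainedPipeline("explain_document_dl", lang="en")
result = pipeline.annotate("John Snow Labs builds NLP libraries on Apache Spark.")
print(result["entities"], result["pos"])      # available keys depend on the chosen pipeline
```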

Hackernews-NLU - Use Swift to interpret unstructured data from Hacker News

  •    Swift

Hackernews-NLU is a sample application that uses the Watson Natural Language Understanding service to analyze the contents of trending news articles on Hacker News, reporting the concepts, entities, categories, keywords, sentiment, emotion, etc. found in each article. Clicking on the button below creates an IBM Code DevOps Toolchain and deploys this application to IBM Code. The manifest.yml file [included in the repo] is parsed to obtain the name of the application, configuration details, and the list of services that should be provisioned. For further details on the structure of the manifest.yml file, see the Cloud Foundry documentation.
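
As a rough illustration of the kind of NLU request involved (the sample app itself is written in Swift), here is a Python sketch using the ibm-watson SDK; the API key, service URL, and article URL are placeholders.

```python
# Hedged Python sketch of a Watson NLU analyze call that requests the feature
# set described above (concepts, entities, categories, keywords, sentiment,
# emotion). Credentials and the article URL are placeholders.
from ibm_watson import NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import (
    Features, ConceptsOptions, EntitiesOptions, CategoriesOptions,
    KeywordsOptions, SentimentOptions, EmotionOptions,
)
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

nlu = NaturalLanguageUnderstandingV1(
    version="2021-08-01",
    authenticator=IAMAuthenticator("YOUR_API_KEY"),
)
nlu.set_service_url("YOUR_NLU_SERVICE_URL")

response = nlu.analyze(
    url="https://example.com/trending-article",   # placeholder article URL
    features=Features(
        concepts=ConceptsOptions(limit=5),
        entities=EntitiesOptions(limit=5),
        categories=CategoriesOptions(),
        keywords=KeywordsOptions(limit=5),
        sentiment=SentimentOptions(),
        emotion=EmotionOptions(),
    ),
).get_result()
print(response["sentiment"], response["keywords"])
```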

watson-document-classifier - Augment IBM Watson Natural Language Understanding APIs with a configurable mechanism for text classification, uses IBM Data Science Experience (DSX)

  •    Jupyter

Read this in other languages: Korean (한국어). In this developer journey we will use Jupyter notebooks in IBM Data Science Experience (DSX) to augment IBM Watson Natural Language Understanding API output through a configurable mechanism for text classification.

gsoc2018-3gm - 💫 Automated codification of Greek Legislation with NLP

  •    Python

Welcome to the Government Gazette text mining, cross-linking, and codification project (3gm for short), which applies Natural Language Processing methods and practices to Greek legislation. The project aims to provide the most recent version of each law, i.e., an automated codex built via NLP methods and practices.

Arch-Data-Science - Archlinux PKGBUILDs for Data Science, Machine Learning, Deep Learning, NLP and Computer Vision

  •    Shell

Welcome to my repo for building Data Science, Machine Learning, Computer Vision, Natural Language Processing, and Deep Learning packages from source. My Data Science environment runs in an LXC container, so TensorFlow's build system, Bazel, must be built with its auto-sandboxing disabled.

watson-second-opinion - Get a second opinion on Amazon products by analyzing product reviews with Watson Natural Language Understanding

  •    JavaScript

This is the code pattern for https://2ndopinion.mybluemix.net/. In this Code Pattern, we will create a Node.js app that takes the reviews from an online shopping website, Amazon, and feeds them into the Watson Natural Language Understanding service. The reviews will be stored in a Cloudant database. The Watson Natural Language Understanding service will show the overall sentiment of the reviews. The sample application will do all the reading of reviews for you and will give an overall insight into them. The Code Pattern can be useful to developers who are looking into processing multiple documents with Watson Natural Language Understanding.
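
As a hedged sketch of the core analysis step (in Python rather than the app's Node.js, with placeholder credentials and review texts, and leaving out the Amazon scraping and Cloudant storage), aggregating document sentiment over a list of reviews might look like this:

```python
# Hedged sketch: send review texts to Watson NLU and average the document
# sentiment scores. API key, service URL, and review texts are placeholders.
from ibm_watson import NaturalLanguageUnderstandingV1
from ibm_watson.natural_language_understanding_v1 import Features, SentimentOptions
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

nlu = NaturalLanguageUnderstandingV1(version="2021-08-01",
                                     authenticator=IAMAuthenticator("YOUR_API_KEY"))
nlu.set_service_url("YOUR_NLU_SERVICE_URL")

reviews = ["Great product, works as advertised.", "Stopped working after a week."]
scores = []
for text in reviews:
    result = nlu.analyze(text=text,
                         features=Features(sentiment=SentimentOptions())).get_result()
    scores.append(result["sentiment"]["document"]["score"])  # -1 (negative) .. +1 (positive)

print("overall sentiment:", sum(scores) / len(scores))
```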

ConvAI-baseline - ConvAI baseline solution

  •    Python

Python packages are installed by the setup.sh script. Setup will also download the Docker images, models, and data files, so you do not need to download any of these yourself.

deep-nlp-seminars - Materials for deep NLP course

  •    Jupyter

Also, please do not add your name to your homework, since we try to keep the review process anonymous. Please register your project here.

intent_classifier

  •    Python

Try it here. This repo contains code for training and inference for intent classification, implemented as a shallow-and-wide Convolutional Neural Network [1].
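
As an illustration of that architecture (not the repository's actual configuration), a minimal Keras sketch of a shallow-and-wide CNN intent classifier might look like this: parallel 1-D convolutions with different kernel widths over word embeddings, max-pooled, concatenated, and fed to a softmax. Vocabulary size, sequence length, and layer sizes below are placeholders.

```python
# Minimal sketch of a shallow-and-wide CNN for intent classification; all
# sizes are placeholders, not the settings used in this repository.
from tensorflow.keras import layers, models

VOCAB_SIZE, SEQ_LEN, EMB_DIM, NUM_INTENTS = 10000, 30, 100, 7

inputs = layers.Input(shape=(SEQ_LEN,), dtype="int32")
emb = layers.Embedding(VOCAB_SIZE, EMB_DIM)(inputs)

branches = []
for width in (2, 3, 4):                                     # "wide": several kernel widths
    conv = layers.Conv1D(64, width, activation="relu")(emb)  # "shallow": a single conv layer
    branches.append(layers.GlobalMaxPooling1D()(conv))

merged = layers.concatenate(branches)
outputs = layers.Dense(NUM_INTENTS, activation="softmax")(layers.Dropout(0.5)(merged))

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
```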