
primus - ⚡ Primus, the creator god of the transformers & an abstraction layer for real-time to prevent module lock-in

  •    JavaScript

Primus, the creator god of transformers, but now also known as a universal wrapper for real-time frameworks. There are a lot of real-time frameworks available for Node.js, and they all have different opinions on how real-time should be done. Primus provides a common low-level interface to communicate in real time using various real-time frameworks. If you deploy your application behind a reverse proxy (Nginx, HAProxy, etc.), you might need to add WebSocket-specific settings to its configuration files. If you intend to use WebSockets, please ensure that these settings have been added. There are some example configuration files available in the observing/balancerbattle repository.
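As a rough sketch of what that common interface looks like in practice, a minimal Primus server might be set up as follows (TypeScript/CommonJS; the "websockets" transformer and the spark-based connection API are taken from the Primus README, so treat the details as assumptions rather than a definitive implementation):

```ts
// Minimal Primus server sketch: one low-level API regardless of which
// real-time framework is selected via the `transformer` option.
import * as http from "http";
// Primus does not ship its own TypeScript typings, so require() is used here.
const Primus = require("primus");

const server = http.createServer();
const primus = new Primus(server, { transformer: "websockets" });

primus.on("connection", (spark: any) => {
  // A "spark" is Primus' abstraction over a single real-time connection.
  spark.write("hello from the server");
  spark.on("data", (data: unknown) => console.log("received:", data));
});

server.listen(8080);
```

Swapping the transformer option to another supported framework is intended to leave the application code above unchanged, which is how Primus prevents module lock-in.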

TransmogrifAI - TransmogrifAI (pronounced trăns-mŏgˈrə-fī) is an AutoML library for building modular, reusable, strongly typed machine learning workflows on Spark with minimal hand tuning

  •    Scala

TransmogrifAI (pronounced trăns-mŏgˈrə-fī) is an AutoML library written in Scala that runs on top of Spark. It was developed with a focus on accelerating machine learning developer productivity through machine learning automation, and an API that enforces compile-time type safety, modularity, and reuse. Through automation, it achieves accuracies close to hand-tuned models with an almost 100x reduction in time.

nlp-architect - A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks

  •    Python

NLP Architect is an open-source Python library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing and Natural Language Understanding neural networks. The library is designed to be flexible and easy to extend, to allow easy and rapid integration of NLP models into applications, and to showcase optimized models.

mleap - MLeap: Deploy Spark Pipelines to Production

  •    Scala

Deploying machine learning data pipelines and algorithms should not be a time-consuming or difficult task. MLeap allows data scientists and engineers to deploy machine learning pipelines from Spark and Scikit-learn to a portable format and execution engine. Documentation is available at mleap-docs.combust.ml.

long-range-arena - Long Range Arena for Benchmarking Efficient Transformers

  •    Python

Long Range Arena is an effort toward systematic evaluation of efficient transformer models. The project aims to establish benchmark tasks/datasets with which transformer-based models can be evaluated in a systematic way, assessing their generalization power, computational efficiency, memory footprint, etc. Long Range Arena also implements different variants of Transformer models in JAX, using Flax.

node-question-answering - Fast and production-ready question answering in Node.js

  •    TypeScript

This library can run models in SavedModel and TFJS formats locally, as well as remote models thanks to TensorFlow Serving. The following example will automatically download the default DistilBERT model in SavedModel format if it is not already present, along with the required vocabulary/tokenizer files. It will then run the model and return the answer to the question.
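A minimal sketch of that example in TypeScript, adapted from the project's documented usage (the npm package name question-answering and the QAClient.fromOptions()/predict() calls are assumptions based on the README and may have changed since):

```ts
// Sketch: answer a question against a short context with the default
// DistilBERT SavedModel, which is downloaded on first use.
import { QAClient } from "question-answering";

async function main(): Promise<void> {
  const context = `
    Super Bowl 50 was an American football game to determine the champion of the
    National Football League (NFL) for the 2015 season. The Denver Broncos defeated
    the Carolina Panthers 24-10 to earn their third Super Bowl title.
  `;
  const question = "Who won Super Bowl 50?";

  const qaClient = await QAClient.fromOptions(); // downloads model + vocabulary if missing
  const answer = await qaClient.predict(question, context);

  console.log(answer); // e.g. { text: "Denver Broncos", score: ... }
}

main().catch(console.error);
```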

tflite-android-transformers - DistilBERT / GPT-2 for on-device inference thanks to TensorFlow Lite with Android demo apps

  •    Java

Convert Transformers models imported from the 🤗 Transformers library and use them on Android. You can also check out our swift-coreml-transformers repo if you're looking for Transformers on iOS. The demo app showcases the DistilBERT model (97% of BERT's performance on GLUE) fine-tuned for question answering on the SQuAD dataset, and provides 48 passages from the dataset for users to choose from.


lightning-transformers - Flexible interface for high-performance research using SOTA Transformers leveraging Pytorch Lightning, Transformers, and Hydra

  •    Python

Lightning Transformers offers a flexible interface for training and fine-tuning SOTA Transformer models using the PyTorch Lightning Trainer, leveraging PyTorch Lightning, Transformers, and Hydra.

layouts - Wraps templates with layouts

  •    JavaScript

Wraps templates with layouts. Layouts can use other layouts and be nested to any depth. This can be used 100% standalone to wrap any kind of file with banner, header, or footer content. Use it for Markdown, HTML, Handlebars views, Lo-Dash templates, etc. Layouts can also be vinyl files. Please consider following this project's author, Brian Woodward, and starring the project to show your ❤️ and support.

Transformers - A JavaScript-based, component-oriented development framework. If you want to build projects the way you stack building blocks, give the Transformers framework a try.

  •    JavaScript

A JavaScript-based, component-oriented development framework. If you want to build projects the way you stack building blocks, give the Transformers framework a try.

ts-tools - TypeScript Tools for Node.js

  •    TypeScript

Several packages to assist TypeScript developers working with Node.js. @ts-tools/node provides TypeScript support for Node.js and allows running .ts/.tsx files directly from source.

parcel-plugin-typescript - 🚨 Enhanced TypeScript support for Parcel

  •    TypeScript

Enhanced TypeScript integration for Parcel. While Parcel has built-in transpiling support for TypeScript, this plugin provides additional features.

deepconsensus - DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data

  •    Python

DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data. During installation, you can ignore errors regarding google-nucleus, such as ERROR: Failed building wheel for google-nucleus.

transfuser - [CVPR'21] Multi-Modal Fusion Transformer for End-to-End Autonomous Driving

  •    Python

The training data is generated using leaderboard/team_code/auto_pilot.py in 8 CARLA towns and 14 weather conditions. The route and scenario files to be used for data generation are provided in leaderboard/data. Instructions for setting up Docker are available here. Pull the docker image of CARLA 0.9.10.1 with docker pull carlasim/carla:0.9.10.1.

scenic - Scenic: A Jax Library for Computer Vision and Beyond

  •    Python

Scenic is a codebase with a focus on research around attention-based models for computer vision. Scenic has been successfully used to develop classification, segmentation, and detection models for multiple modalities including images, video, audio, and multimodal combinations of them. More precisely, Scenic is (i) a set of shared lightweight libraries solving tasks commonly encountered when training large-scale (i.e., multi-device, multi-host) vision models; and (ii) a number of projects containing fully fleshed-out, problem-specific training and evaluation loops using these libraries.

robustness-vit - Contains code for the paper "Vision Transformers are Robust Learners".

  •    Jupyter

This repository contains the code for the paper Vision Transformers are Robust Learners by Sayak Paul* and Pin-Yu Chen*. *Equal contribution.

awesome-huggingface - 🤗 A list of wonderful open-source projects & applications integrated with Hugging Face libraries

  •    

This is a list of some wonderful open-source projects & applications integrated with Hugging Face libraries. First-party cool stuff made with ❤️ by 🤗 Hugging Face.
