Primus, the creator god of the Transformers, but now also known as a universal wrapper for real-time frameworks. There are many real-time frameworks available for Node.js, and they all have different opinions on how real-time communication should be done. Primus provides a common low-level interface for communicating in real time on top of these frameworks.

If you deploy your application behind a reverse proxy (Nginx, HAProxy, etc.) you might need to add WebSocket-specific settings to its configuration files. If you intend to use WebSockets, please ensure that these settings have been added. Example configuration files are available in the observing/balancerbattle repository.
real-time websocket framework sockjs browserchannel polling http nodejs node abstraction engine.io comet streaming pubsub pub sub ajax xhr faye io primus prumus realtime socket socket.io sockets spark transformer transformers websockets ws uws

TransmogrifAI (pronounced trăns-mŏgˈrə-fī) is an AutoML library written in Scala that runs on top of Spark. It was developed with a focus on accelerating machine learning developer productivity through machine learning automation, and an API that enforces compile-time type-safety, modularity, and reuse. Through automation, it achieves accuracies close to hand-tuned models with an almost 100x reduction in time.
ml automl transformations estimators dsl pipelines machine-learning salesforce einstein features feature-engineering spark sparkml ai automated-machine-learning transmogrification transmogrify structured-data transformers

Provides an implementation of today's most used tokenizers, with a focus on performance and versatility.
nlp natural-language-processing transformers gpt language-model bert natural-language-understanding

NLP Architect is an open-source Python library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing and Natural Language Understanding neural networks. It is designed to be flexible and easy to extend, to allow easy and rapid integration of NLP models in applications, and to showcase optimized models.
nlp deep-learning tensorflow nlu transformers pytorch deeplearning quantization bert dynet

Deploying machine learning data pipelines and algorithms should not be a time-consuming or difficult task. MLeap allows data scientists and engineers to deploy machine learning pipelines from Spark and Scikit-learn to a portable format and execution engine. Documentation is available at mleap-docs.combust.ml.
scikit-learn spark data-pipelines transformers tensorflow

Long-Range Arena is an effort toward systematic evaluation of efficient transformer models. The project aims to establish benchmark tasks/datasets for evaluating transformer-based models in a systematic way, assessing their generalization power, computational efficiency, memory footprint, etc. Long-Range Arena also implements different variants of Transformer models in JAX, using Flax.
nlp deep-learning transformers attention flax jax

It can run models in SavedModel and TFJS formats locally, as well as remote models thanks to TensorFlow Serving. The following example will automatically download the default DistilBERT model in SavedModel format if not already present, along with the required vocabulary / tokenizer files. It will then run the model and return the answer to the question.
nodejs nlp typescript tensorflow transformers question-answering bert distilbert

Convert Transformers models imported from the 🤗 Transformers library and use them on Android. You can also check out our swift-coreml-transformers repo if you're looking for Transformers on iOS. Demo of the DistilBERT model (97% of BERT's performance on GLUE) fine-tuned for question answering on the SQuAD dataset. It provides 48 passages from the dataset for users to choose from.
android nlp tensorflow transformers tensorflow-lite

Lightning Transformers offers a flexible interface for training and fine-tuning SOTA Transformer models using the PyTorch Lightning Trainer, leveraging PyTorch Lightning, Transformers, and Hydra for high-performance research.
transformers pytorch hydra pytorch-lightning

Wraps templates with layouts. Layouts can use other layouts and be nested to any depth. This can be used 100% standalone to wrap any kind of file with banner, header, or footer content. Use it for Markdown, HTML, Handlebars views, Lo-Dash templates, etc. Layouts can also be vinyl files. Please consider following this project's author, Brian Woodward, and consider starring the project to show your ❤️ and support.
layout layouts nested-layouts vinyl vinyl-files templates handlebars express consolidate ejs jade liquid atpl coffee dot dust eco ect engine engines haml haml-coffee hamljs handlebars-layouts hbs hogan hogan-js jazz jqtpl liquor lodash mote mustache nest nested nunjucks page qejs ractive stack stacked swig template templayed tmpl toffee transformers underscore walrus whiskers wrap

A component-based development framework built on JavaScript. If you want to build projects the way you assemble building blocks, give the Transformers framework a try.
framework transformers component

A Java library for exporting Spark models to the Java ecosystem.
spark mllib export machine-learning-library machine-learning-algorithms machine-learning apache-spark data-pipelines transformers

Several packages to assist TypeScript developers working with Node.js. @ts-tools/node - TypeScript support for Node.js. Allows running .ts/.tsx files directly from source.
typescript transpilation transformers webpack loader tools nodejs

Enhanced TypeScript integration for Parcel. While Parcel has built-in transpiling support for TypeScript, this plugin provides additional features.
typescript parcel-plugin type-checker code-quality lint transformers

DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences (PacBio) Circular Consensus Sequencing (CCS) data. You can ignore errors regarding google-nucleus installation, such as ERROR: Failed building wheel for google-nucleus.
bioinformatics deep-learning transformers long-read-sequencing

The training data is generated using leaderboard/team_code/auto_pilot.py in 8 CARLA towns and 14 weather conditions. The routes and scenario files to be used for data generation are provided at leaderboard/data. Instructions for setting up Docker are available here. Pull the CARLA 0.9.10.1 Docker image: docker pull carlasim/carla:0.9.10.1.
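The Docker step above can be scripted along these lines (the image tag comes from the text; the run flags are illustrative assumptions, not the project's exact command):

```shell
# Pull the CARLA 0.9.10.1 image, as given in the text.
docker pull carlasim/carla:0.9.10.1

# Illustrative only: start the CARLA server in a container. Exact flags
# (GPU access, ports, rendering mode) depend on your setup and the
# project's own documentation.
docker run --rm -it --gpus all -p 2000-2002:2000-2002 \
  carlasim/carla:0.9.10.1 ./CarlaUE4.sh -opengl
```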
transformers autonomous-driving sensor-fusion imitation-learning

Scenic is a codebase with a focus on research around attention-based models for computer vision. Scenic has been successfully used to develop classification, segmentation, and detection models for multiple modalities, including images, video, audio, and multimodal combinations of them. More precisely, Scenic is (i) a set of shared lightweight libraries solving commonly encountered tasks when training large-scale (i.e. multi-device, multi-host) vision models; and (ii) a number of projects containing fully fleshed-out problem-specific training and evaluation loops using these libraries.
research computer-vision deep-learning transformers attention jax vision-transformer

Backprop makes it simple to use, finetune, and deploy state-of-the-art ML models. Solve a variety of tasks with pre-trained models or finetune them in one line for your own tasks.
nlp natural-language-processing text-classification transformers question-answering image-classification transfer-learning language-model bert fine-tuning multilingual-models

This repository contains the code for the paper Vision Transformers are Robust Learners by Sayak Paul* and Pin-Yu Chen*. *Equal contribution.
computer-vision tensorflow transformers pytorch robustness self-attention jax

This is a list of some wonderful open-source projects & applications integrated with Hugging Face libraries. First-party cool stuff made with ❤️ by 🤗 Hugging Face.
nlp machine-learning awesome transformers awesome-list huggingface