
primus - :zap: Primus, the creator god of the transformers & an abstraction layer for real-time to prevent module lock-in

  •    Javascript

Primus, the creator god of the Transformers, but now also known as a universal wrapper for real-time frameworks. There are a lot of real-time frameworks available for Node.js, and they all have different opinions on how real-time should be done. Primus provides a common low-level interface for communicating in real time using any of these frameworks. If you deploy your application behind a reverse proxy (Nginx, HAProxy, etc.), you might need to add WebSocket-specific settings to its configuration files. If you intend to use WebSockets, please ensure that these settings have been added. There are some example configuration files available in the observing/balancerbattle repository.

happypack - Happiness in the form of faster webpack build times.

  •    Javascript

HappyPack makes initial webpack builds faster by transforming files in parallel. It provides both a plugin and a loader to do its job, so you must use both to enable it.

ViewpagerTransition - viewpager with parallax pages, together with vertical sliding (or click) and activity transition

  •    Java

Sliding pages to the left or right can, as we know, be implemented with ViewPager, and fortunately ViewPager's PageTransformer is open to customization: the project's CustPagerTransformer takes care of all the parallax effects. Within each ViewPager fragment item, vertical sliding is an independent module realized with ViewDragHelper. For the activity transition part, Android 5.0 and above makes it easy to transition to another activity.




transformers - 🤗Transformers: State-of-the-art Natural Language Processing for Pytorch, TensorFlow, and JAX

  •    Python

🤗 Transformers provides thousands of pretrained models to perform tasks on texts such as classification, information extraction, question answering, summarization, translation, text generation and more in over 100 languages. Its aim is to make cutting-edge NLP easier to use for everyone. 🤗 Transformers provides APIs to quickly download and use those pretrained models on a given text, fine-tune them on your own datasets and then share them with the community on our model hub. At the same time, each Python module defining an architecture is fully standalone and can be modified to enable quick research experiments.
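
As a quick illustration of that API, here is a minimal sketch using the pipeline helper; the default model downloaded for each task is chosen by the library and may change between releases.

    # Minimal sketch of the 🤗 Transformers pipeline API.
    # pip install transformers
    from transformers import pipeline

    # Downloads a default pretrained model for the task on first use.
    classifier = pipeline("sentiment-analysis")
    print(classifier("We are very happy to show you the 🤗 Transformers library."))
    # e.g. [{'label': 'POSITIVE', 'score': 0.9998}]

    # The same helper covers other tasks, such as translation:
    translator = pipeline("translation_en_to_fr")
    print(translator("Transformers provides thousands of pretrained models."))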

mmocr - OpenMMLab Text Detection, Recognition and Understanding Toolbox

  •    Python

MMOCR is an open-source toolbox based on PyTorch and mmdetection for text detection, text recognition, and the corresponding downstream tasks including key information extraction. It is part of the OpenMMLab project. The main branch works with PyTorch 1.5+.
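
As a sketch of what inference looks like, assuming the 0.x-era MMOCR convenience wrapper (class, config, and argument names changed between releases, so verify them against the docs for your version):

    # Hedged sketch: end-to-end OCR with MMOCR's convenience wrapper.
    # Names follow the 0.x-era API and are assumptions, not guarantees.
    from mmocr.utils.ocr import MMOCR

    # Pair a text detector with a recognizer; pretrained weights for the
    # named configs are downloaded automatically.
    ocr = MMOCR(det='PANet_IC15', recog='SAR')

    # Run detection followed by recognition on a single image.
    results = ocr.readtext('demo/demo_text_ocr.jpg', print_result=True)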


jstransform - A simple utility for pluggable JS syntax transforms using the esprima parser.

  •    Javascript

A simple utility for pluggable JS syntax transforms using the esprima parser. Note: If you're looking for a library for writing new greenfield JS transformations, consider looking at Babel or Recast instead of jstransform. We are still supporting jstransform (and intend to for a little while), but longer term we would like to direct efforts toward other open source projects that do a far better job of supporting a multi-pass JS transformation pipeline. This is important when attempting to apply many transformations to a source file. jstransform does a single pass resulting in performance benefits, but the tradeoff is that many transformations are much harder to write.

pytorch-openai-transformer-lm - A PyTorch implementation of OpenAI's finetuned transformer language model with a script to import the weights pre-trained by OpenAI

  •    Python

This is a PyTorch implementation of the TensorFlow code provided with OpenAI's paper "Improving Language Understanding by Generative Pre-Training" by Alec Radford, Karthik Narasimhan, Tim Salimans and Ilya Sutskever. The implementation includes a script that loads the weights pre-trained by the authors with the TensorFlow implementation into the PyTorch model.
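
A hedged sketch of how that import script is typically driven, assuming the repository's model_pytorch module and helper names as given in its README (treat them as assumptions and verify against the repo):

    # Sketch: load OpenAI's TensorFlow-trained weights into the PyTorch
    # model. Module and helper names are taken from the repository README.
    from model_pytorch import (TransformerModel, DEFAULT_CONFIG,
                               load_openai_pretrained_model)

    args = DEFAULT_CONFIG
    model = TransformerModel(args)

    # Copies the pretrained TensorFlow checkpoint weights into the model.
    load_openai_pretrained_model(model)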

question_generation - Neural question generation using transformers

  •    Jupyter

Question generation is the task of automatically generating questions from a text paragraph. The most straightforward approach is answer-aware question generation: the model is presented with an answer and the passage and asked to generate a question for that answer, taking the passage context into account. While there are many papers on the QG task, it is still not as mainstream as QA. One reason is that most of the earlier papers use complicated models and processing pipelines and provide no pre-trained models. A few recent papers, specifically UniLM and ProphetNet, have SOTA pre-trained weights available for QG, but their usage seems quite complicated. This project is intended as an open-source study of question generation with pre-trained transformers (specifically seq2seq models) using straightforward end-to-end methods without complicated pipelines. The goal is to provide simplified data processing and training scripts and easy-to-use pipelines for inference.
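
Inference is exposed through a 🤗-style pipeline; a minimal sketch, assuming the pipelines module that ships with the repository (not part of the transformers package):

    # Sketch of the project's question-generation pipeline; `pipelines`
    # is the module in this repository, not part of transformers itself.
    from pipelines import pipeline

    # Loads a pretrained answer-aware question-generation model.
    nlp = pipeline("question-generation")

    # The pipeline extracts candidate answers from the passage, then
    # generates a question for each answer given the passage context.
    print(nlp("42 is the answer to life, the universe and everything."))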

android-viewpager-transformers - A collection of view pager transformers

  •    Java

A collection of ViewPager transformers. This repo is a fork of daimajia's project, republished to Maven Central with some additional Javadoc.

laravel-responder - A Laravel Fractal package for building API responses, giving you the power of Fractal with Laravel's elegance

  •    PHP

Laravel Responder is a package for building API responses, integrating Fractal into Laravel and Lumen. It can transform your data using transformers, create and serialize success and error responses, handle exceptions, and assist you with testing your responses. Version 3.0 has been released with many bug fixes, tons of new features, and some breaking changes; make sure to check out the changelog for an overview of everything fresh. The Creating Transformers and Transforming Data sections of the documentation have also been rewritten for clarity and to account for the new changes.

allRank - allRank is a framework for training learning-to-rank neural models based on PyTorch.

  •    Python

allRank provides an easy and flexible way to experiment with various LTR neural network models and loss functions. It is easy to add a custom loss and to configure the model and the training procedure. We hope that allRank will facilitate both research in neural LTR and its industrial applications. To help you get started, we provide a run_example.sh script which generates dummy ranking data in libsvm format and trains a Transformer model on the data using the provided example config.json file. Once you run the script, the dummy data can be found in the dummy_data directory and the results of the experiment in the test_run directory. Docker is required to run the example.

laravel5-jsonapi - Laravel 5 JSON API Transformer Package

  •    PHP

For the sake of having a real-life example, this configuration will guide you through setting up 7 endpoints for two resources, Employees and Orders. Both the Employees and Orders resources will be Eloquent models, related to one another.

Transformer - Easy Attributed String Creator

  •    Javascript

The main idea of this project is to provide an online tool for visually adding formatting to a text and getting back Swift and/or Objective-C code that reproduces that formatting.

fast-xml-parser - Validate XML, Parse XML to JS/JSON and vice versa, or parse XML to Nimn rapidly without C/C++ based libraries and no callback

  •    Javascript

This project welcomes contributors. If you have a feature you'd like to see implemented or a bug you'd like fixed, the best and fastest way to make that happen is to implement it and submit a PR. Basic knowledge of JS is sufficient; feel free to ask for any guidance. To use it from the CLI, install it globally with the -g option.

bigbird - Transformers for Longer Sequences

  •    Python

Not an official Google product. BigBird is a sparse-attention-based transformer that extends Transformer-based models, such as BERT, to much longer sequences. Moreover, BigBird comes with a theoretical understanding of which capabilities of a complete transformer the sparse model can handle.
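
The research code in this repository is TensorFlow-based; for a quick look at block-sparse attention, here is a minimal sketch using the separate Hugging Face transformers port of BigBird (an alternative to this repo, not part of it):

    # Sketch using the Hugging Face port of BigBird (not this repository)
    # to encode a long input with block-sparse attention.
    from transformers import BigBirdTokenizer, BigBirdModel

    tokenizer = BigBirdTokenizer.from_pretrained("google/bigbird-roberta-base")
    # attention_type="block_sparse" selects the sparse pattern that lets
    # the model scale to sequences of up to 4096 tokens.
    model = BigBirdModel.from_pretrained("google/bigbird-roberta-base",
                                         attention_type="block_sparse")

    inputs = tokenizer("A very long document ...", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)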







