Primus, the creator god of the Transformers, but now also known as a universal wrapper for real-time frameworks. There are a lot of real-time frameworks available for Node.js, and they all have different opinions on how real-time should be done. Primus provides a common low-level interface for communicating in real-time using various real-time frameworks.

If you deploy your application behind a reverse proxy (Nginx, HAProxy, etc.) you might need to add WebSocket-specific settings to its configuration files. If you intend to use WebSockets, please ensure that these settings have been added. There are some example configuration files available in the observing/balancerbattle repository.
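The WebSocket-specific proxy settings that note refers to typically look like the following. This is a minimal Nginx sketch, not taken from the balancerbattle examples; the location path and upstream address are placeholder assumptions.

```nginx
# Minimal WebSocket proxying for Nginx (illustrative; the path and
# upstream are placeholders, adapt them to your deployment).
location /primus {
    proxy_pass http://127.0.0.1:8080;   # your Node.js/Primus server
    proxy_http_version 1.1;             # required for the Upgrade handshake
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;
}
```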
real-time websocket framework sockjs browserchannel polling http nodejs node abstraction engine.io comet streaming pubsub pub sub ajax xhr faye io primus prumus realtime socket socket.io sockets spark transformer transformers websockets ws uws

HappyPack makes initial webpack builds faster by transforming files in parallel. It provides both a plugin and a loader to do its job, so you must use both to enable it.
webpack build-tool performance plugin fast speed compilation transformer loader happiness happy

Sliding pages to the left or right can be implemented with ViewPager, and fortunately ViewPager's PageTransformer can be customized, so a custom CustPagerTransformer takes care of all the parallax effects. Within each ViewPager fragment item, the vertical slide is an independent module realized with ViewDragHelper. As for the activity transition, Android (5.0 and above) makes it easy to transition to another activity.
viewpager transition parallax transformer

Visual analysis and diagnostic tools to facilitate machine learning model selection. (Image by Quatro Cinco, used with permission, Flickr Creative Commons.)
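As a sketch of the visualizer-wraps-estimator workflow behind those tags, here is a hedged Yellowbrick example assuming its ResidualsPlot API and a scikit-learn dataset; check the docs for the exact interface of your installed version.

```python
# Sketch: a Yellowbrick visualizer wrapping a scikit-learn estimator
# to diagnose regression residuals for model selection.
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from yellowbrick.regressor import ResidualsPlot

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

viz = ResidualsPlot(Ridge())   # the visualizer wraps the estimator
viz.fit(X_train, y_train)      # fits the underlying model
viz.score(X_test, y_test)      # computes residuals on held-out data
viz.show()                     # renders the matplotlib diagnostic plot
```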
machine-learning visual-analysis model-selection visualization scikit-learn visualizer matplotlib estimator residuals transformer advantage anaconda

MMOCR is an open-source toolbox based on PyTorch and mmdetection for text detection, text recognition, and the corresponding downstream tasks including key information extraction. It is part of the OpenMMLab project. The main branch works with PyTorch 1.5+.
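A hedged sketch of running detection plus recognition with MMOCR's high-level helper; the module path and the model names (a PANet detector paired with a SAR recognizer, matching the tags) vary across releases, so treat them as assumptions and consult the docs for your installed version.

```python
# Sketch: end-to-end text detection + recognition with MMOCR.
# Module path and model names are assumptions; they differ by release.
from mmocr.utils.ocr import MMOCR

ocr = MMOCR(det='PANet_IC15', recog='SAR')  # detector/recognizer pair
results = ocr.readtext('demo_text.jpg')     # boxes plus recognized text
print(results)
```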
ocr deep-learning pytorch transformer db text-recognition pan text-detection sar maskrcnn crnn dbnet psenet panet key-information-extraction sdmg-r textsnake robustscanner segmentation-based-text-recognition fcenet

🤗 Transformers provides thousands of pretrained models to perform tasks on texts such as classification, information extraction, question answering, summarization, translation, text generation and more, in over 100 languages. Its aim is to make cutting-edge NLP easier to use for everyone. 🤗 Transformers provides APIs to quickly download and use those pretrained models on a given text, fine-tune them on your own datasets, and then share them with the community on our model hub. At the same time, each Python module defining an architecture is fully standalone and can be modified to enable quick research experiments.
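The quickest way to see the download-and-use API in action is the pipeline helper, which pulls a default pretrained checkpoint on first use:

```python
# Download and run a pretrained model on a given text via the
# pipeline API (fetches a default checkpoint on first use).
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
print(classifier("Transformers makes cutting-edge NLP easy to use."))
# -> [{'label': 'POSITIVE', 'score': 0.99...}]
```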
nlp natural-language-processing tensorflow pytorch transformer speech-recognition seq2seq flax gpt pretrained-models language-models natural-language-generation nlp-library language-model bert natural-language-understanding jax xlnet pytorch-transformers model-hub

Introductory, advanced, and specialty deep learning courses, academic case studies, industry practice case studies, a deep learning knowledge encyclopedia, and an interview question bank: the courses, cases, and knowledge of deep learning and AI.
nlp video reinforcement-learning detection cnn transformer gan dqn classification rnn sarsa segmentation recommender-system bert pose dssm tinybert dynabert

An NLP library with awesome pre-trained Transformer models and an easy-to-use interface, supporting a wide range of NLP tasks from research to industrial applications.
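As a hedged illustration of that easy-to-use interface, assuming PaddleNLP's Taskflow helper (task names and availability depend on the installed version):

```python
# Sketch: one-line access to a pretrained NLP task via Taskflow.
# The task name is an assumption; it depends on your PaddleNLP version.
from paddlenlp import Taskflow

senta = Taskflow("sentiment_analysis")
print(senta("这个产品用起来很流畅"))  # "This product is smooth to use" -> label + score
```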
nlp dataset transformer seq2seq pretrained-models embedding bert ernie paddlenlp

"Computer Vision in Action: Algorithms and Applications" (《计算机视觉实战演练：算法与应用》): a Chinese e-book with source code and a reader community (continuously updated ...). 📘 Online e-book: https://charmve.github.io/computer-vision-in-action/ (project homepage).
machine-learning tutorial books computer-vision deep-learning neural-network notebook jupyter-notebook handbook pytorch transformer ipynb deep-learning-tutorial computer-vision-algorithms colab-notebook in-action charmve

WeChat AI open-sourced TurboTransformers. TurboTransformers has been applied to multiple online BERT service scenarios in Tencent: for example, it brings 1.88x acceleration to the WeChat FAQ service, 2.11x acceleration to the public cloud sentiment analysis service, and 13.6x acceleration to the QQ recommendation system. It has also been used to build services such as chitchat, search, and recommendation.
nlp gpu decoder machine-translation inference pytorch transformer albert bert roberta gpt2 huggingface-transformers

The samples decoded from each level are stored in {name}/level_{level}. You can also view the samples as an HTML page with the aligned lyrics under {name}/level_{level}/index.html. Run python -m http.server and open the HTML through the server to see the lyrics animate as the song plays. A summary of all sampling data, including zs, x, labels and sampling_kwargs, is stored in {name}/level_{level}/data.pth.tar. The hps are for a V100 GPU with 16 GB of GPU memory. The 1b_lyrics, 5b, and 5b_lyrics top-level priors take up 3.8 GB, 10.3 GB, and 11.5 GB, respectively. The peak memory usage to store the transformer key/value cache is about 400 MB for 1b_lyrics and 1 GB for 5b_lyrics per sample. If you run into CUDA OOM issues, try 1b_lyrics, or decrease max_batch_size in sample.py and --n_samples in the script call.
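Since the per-level summary lives in data.pth.tar, a quick way to inspect a finished run is to load it with torch; the directory name below is a placeholder for {name}, and the keys follow the description above.

```python
# Sketch: inspect the sampling summary Jukebox stores per level.
# 'my_song' is a placeholder for {name}; the keys follow the
# description above (zs, x, labels, sampling_kwargs).
import torch

data = torch.load('my_song/level_2/data.pth.tar', map_location='cpu')
print(list(data.keys()))              # zs, x, labels, sampling_kwargs
print([z.shape for z in data['zs']])  # discrete codes at each level
```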
audio music paper pytorch transformer generative-model vq-vae

A simple utility for pluggable JS syntax transforms using the esprima parser. Note: if you're looking for a library for writing new greenfield JS transformations, consider Babel or Recast instead of jstransform. We are still supporting jstransform (and intend to for a little while), but longer term we would like to direct efforts toward other open source projects that do a far better job of supporting a multi-pass JS transformation pipeline. This matters when applying many transformations to a source file: jstransform does a single pass, which brings performance benefits, but the tradeoff is that many transformations are much harder to write.
transformer compiler syntax visitor

This is a PyTorch implementation of the TensorFlow code provided with OpenAI's paper "Improving Language Understanding by Generative Pre-Training" by Alec Radford, Karthik Narasimhan, Tim Salimans and Ilya Sutskever. The implementation includes a script that loads into the PyTorch model the weights pre-trained by the authors with the TensorFlow implementation.
neural-networks pytorch openai language-model transformer

Question generation is the task of automatically generating questions from a text paragraph. The most straightforward approach is answer-aware question generation, in which the model is presented with the answer and the passage and asked to generate a question for that answer, taking the passage context into account. While there are many papers on the QG task, it is still not as mainstream as QA. One reason is that most of the earlier papers use complicated models/processing pipelines and have no pre-trained models available. A few recent papers, specifically UniLM and ProphetNet, have SOTA pre-trained weights available for QG, but their usage seems quite complicated. This project is an open-source study of question generation with pre-trained transformers (specifically seq2seq models) using straightforward end-to-end methods without overly complicated pipelines. The goal is to provide simplified data processing and training scripts and easy-to-use pipelines for inference.
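A minimal sketch of the answer-aware setup: the answer span is marked inside the passage and a seq2seq transformer generates a question for it. The checkpoint name and the highlight-token convention below are illustrative assumptions, not the project's documented interface.

```python
# Sketch of answer-aware question generation with a seq2seq model.
# The checkpoint is a hypothetical placeholder; a real QG-finetuned
# model would define the highlight tokens used below.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "some-org/t5-qg-checkpoint"  # hypothetical checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Highlight the answer span so the model knows what to ask about.
text = "generate question: The Eiffel Tower was completed in <hl> 1889 <hl> in Paris."
inputs = tokenizer(text, return_tensors="pt")
out = model.generate(**inputs, max_length=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```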
nlp natural-language-processing deep-learning transformer natural-language-generation nlg question-generation t5

[Figure 1: Performance of SegFormer-B0 to SegFormer-B5.] SegFormer: Simple and Efficient Design for Semantic Segmentation with Transformers. Enze Xie, Wenhai Wang, Zhiding Yu, Anima Anandkumar, Jose M. Alvarez, and Ping Luo. Technical Report 2021.
transformer semantic-segmentation cityscapes ade20k

A collection of ViewPager transformers. This repo is a fork of daimajia's, uploaded to Maven Central with some additional Javadoc.
android transformer animation viewpager android-viewpager-transformers pager-transformers

Laravel Responder is a package for building API responses, integrating Fractal into Laravel and Lumen. It can transform your data using transformers, create and serialize success and error responses, handle exceptions, and assist you with testing your responses. Version 3.0 has been released with many bug fixes, tons of new features, and some breaking changes. Make sure to check out the changelog for an overview of everything fresh. The Creating Transformers and Transforming Data sections of the documentation have also been rewritten for clarity and to account for the new changes.
laravel lumen fractal responder api transformer

allRank provides an easy and flexible way to experiment with various LTR neural network models and loss functions. It is easy to add a custom loss and to configure the model and the training procedure. We hope that allRank will facilitate both research in neural LTR and its industrial applications. To help you get started, we provide a run_example.sh script, which generates dummy ranking data in libsvm format and trains a Transformer model on it using the provided example config.json file. Once you run the script, the dummy data can be found in the dummy_data directory and the results of the experiment in the test_run directory. Docker is required to run the example.
machine-learning information-retrieval deep-learning pytorch transformer ranking learning-to-rank ndcg click-model

For the sake of having a real-life example, this configuration will guide you through setting up seven endpoints for two resources, Employees and Orders. Both resources will be Eloquent models, related to one another.
json-api lumen transformer laravel laravel5 jsonapi api microservice microservices json php7