
transferlearning-tutorial - LaTeX source for the Transfer Learning Tutorial (《迁移学习简明手册》)

  •    TeX

Jindong Wang et al. Transfer Learning Tutorial (《迁移学习简明手册》). 2018.

big_transfer - Official repository for the "Big Transfer (BiT): General Visual Representation Learning" paper

  •    Python

Update 18/06/2021: We release new high-performing BiT-R50x1 models, which were distilled from BiT-M-R152x2; see this section. More details are in our paper "Knowledge distillation: A good teacher is patient and consistent". Update 08/02/2021: We also release ALL BiT-M models fine-tuned on ALL 19 VTAB-1k datasets; see below.

naacl_transfer_learning_tutorial - Repository of code for the tutorial on Transfer Learning in NLP held at NAACL 2019 in Minneapolis, MN, USA

  •    Python

The tutorial was given on June 2 at NAACL 2019 in Minneapolis, MN, USA by Sebastian Ruder, Matthew Peters, Swabha Swayamdipta and Thomas Wolf. Here is the webpage of NAACL tutorials for more information.




transfer-learning-conv-ai - 🦄 State-of-the-Art Conversational AI with Transfer Learning

  •    Python

This repo contains the code accompanying the blog post 🦄 How to build a State-of-the-Art Conversational AI with Transfer Learning. It is a clean, commented code base with training and testing scripts that can be used to train a dialog agent by transfer learning from an OpenAI GPT or GPT-2 Transformer language model.
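The core transfer-learning step is fine-tuning a pretrained GPT/GPT-2 language model on dialog text. Below is a minimal, hedged sketch using Hugging Face's transformers; the repository's actual training script adds special tokens, persona inputs, and distributed training on top of this, so the snippet only illustrates the basic idea.

    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Illustrative dialog text; the repo itself trains on the PersonaChat dataset.
    dialog = "User: How does transfer learning help chatbots?\nBot: It reuses a pretrained language model."
    inputs = tokenizer(dialog, return_tensors="pt")

    optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

    # Causal language-modelling step: the labels are the input ids themselves,
    # so the model learns to predict the next token of the dialog.
    outputs = model(**inputs, labels=inputs["input_ids"])
    outputs.loss.backward()
    optimizer.step()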

Haystack - Build a natural language interface for your data

  •    Python

Haystack is an end-to-end framework that enables you to build powerful and production-ready pipelines for different search use cases. Whether you want to perform question answering or semantic document search, you can use the state-of-the-art NLP models in Haystack to provide unique search experiences and allow your users to query in natural language. Haystack is built in a modular fashion, so you can combine the best technology from other open-source projects such as Hugging Face's Transformers, Elasticsearch, or Milvus.
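As a concrete illustration, here is a minimal extractive question-answering pipeline. Class names and module paths assume the farm-haystack 1.x API and may differ in other releases; the reader checkpoint is just one public example.

    from haystack.document_stores import InMemoryDocumentStore
    from haystack.nodes import TfidfRetriever, FARMReader
    from haystack.pipelines import ExtractiveQAPipeline

    # Index a couple of toy documents in memory.
    document_store = InMemoryDocumentStore()
    document_store.write_documents([
        {"content": "Haystack lets you build question answering pipelines over your own documents."},
        {"content": "Transfer learning reuses pretrained language models for downstream NLP tasks."},
    ])

    retriever = TfidfRetriever(document_store=document_store)              # sparse candidate retrieval
    reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2")  # extractive QA model

    pipeline = ExtractiveQAPipeline(reader=reader, retriever=retriever)
    prediction = pipeline.run(
        query="What does Haystack let you build?",
        params={"Retriever": {"top_k": 2}, "Reader": {"top_k": 1}},
    )
    print(prediction["answers"][0].answer)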

EasyTransfer - EasyTransfer is designed to make the development of transfer learning in NLP applications easier

  •    Python

The literature has witnessed the success of applying deep transfer learning (TL) to many real-world NLP applications, yet building an easy-to-use TL toolkit remains difficult. To bridge this gap, EasyTransfer is designed to let users apply deep TL to NLP applications with ease. It was developed at Alibaba in early 2017, has been used across the major business units of the Alibaba Group, and has achieved very good results in more than 20 business scenarios. It supports a mainstream pre-trained ModelZoo, including pre-trained language models (PLMs) and multi-modal models on the PAI platform, integrates SOTA models for mainstream NLP applications in its AppZoo, and supports knowledge distillation for PLMs. EasyTransfer makes it convenient to quickly start model training, evaluation, offline prediction, and online deployment, and provides rich APIs to make the development of NLP and transfer-learning applications easier.

spacy-transformers - 🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy

  •    Python

This package provides spaCy components and architectures to use transformer models via Hugging Face's transformers in spaCy. The result is convenient access to state-of-the-art transformer architectures, such as BERT, GPT-2, XLNet, etc. This release requires spaCy v3. For the previous version of this library, see the v0.6.x branch.
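A small usage sketch, assuming spaCy v3 with spacy-transformers installed and the en_core_web_trf pipeline downloaded (python -m spacy download en_core_web_trf); the trf_data extension name follows the v1.x API.

    import spacy

    nlp = spacy.load("en_core_web_trf")   # transformer-backed spaCy v3 pipeline
    doc = nlp("Hugging Face released new transformer models for transfer learning in NLP.")

    # Downstream components (tagger, parser, NER) consume the shared transformer features.
    for ent in doc.ents:
        print(ent.text, ent.label_)

    # The raw transformer output is exposed on the Doc via the trf_data extension.
    print(len(doc._.trf_data.tensors), doc._.trf_data.tensors[0].shape)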


hub - A library for transfer learning by reusing parts of TensorFlow models.

  •    Python

TensorFlow Hub is a library to foster the publication, discovery, and consumption of reusable parts of machine learning models. In particular, it provides modules, which are pre-trained pieces of TensorFlow models that can be reused on new tasks. If you'd like to contribute to TensorFlow Hub, be sure to review the contribution guidelines. This project adheres to TensorFlow's code of conduct. By participating, you are expected to uphold this code.
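A typical transfer-learning sketch with the TF2/Keras API: reuse a published image feature-vector module as a frozen backbone and train only a new classification head. The module handle is one illustrative choice from tfhub.dev, and the 10-class head is arbitrary.

    import tensorflow as tf
    import tensorflow_hub as hub

    feature_extractor = hub.KerasLayer(
        "https://tfhub.dev/google/imagenet/mobilenet_v2_100_224/feature_vector/5",
        input_shape=(224, 224, 3),
        trainable=False,                  # keep the pretrained weights frozen
    )

    model = tf.keras.Sequential([
        feature_extractor,
        tf.keras.layers.Dense(10, activation="softmax"),   # new task-specific head
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()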

awesome-transfer-learning - Best transfer learning and domain adaptation resources (papers, tutorials, datasets, etc.)

  •    

A list of awesome papers and cool resources on transfer learning, domain adaptation, and domain-to-domain translation in general! As you will notice, this list is currently mostly focused on domain adaptation (DA), but don't hesitate to suggest resources in other subfields of transfer learning. I accept pull requests. Papers are ordered by theme and, within each theme, by publication date (submission date for arXiv papers). If the network or algorithm is given a name in a paper, it is written in bold before the paper's title.

Xvision - Chest X-ray image analysis using deep learning!

  •    Python

Chest X-ray image analysis using deep learning, exploiting deep transfer learning with TensorFlow. The maxpool-5 layer of a pretrained VGGNet-16 (deep convolutional neural network) model is used as the feature extractor, and a 2-layer deep neural network with an SGD optimizer and batch normalization is then trained on top of it to classify normal vs. nodular chest X-ray images.
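A hedged Keras sketch of that setup: an ImageNet-pretrained VGG16 truncated at its final max-pooling layer (block5_pool, i.e. maxpool-5) serves as a frozen feature extractor, with a small two-layer head trained on top for the binary normal-vs-nodular decision. Layer sizes and hyperparameters are illustrative, not the repository's exact values.

    import tensorflow as tf
    from tensorflow.keras.applications import VGG16

    # VGG16 without its classifier head; the output is the block5_pool feature map.
    backbone = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
    backbone.trainable = False            # use VGG16 purely as a feature extractor

    model = tf.keras.Sequential([
        backbone,                                        # 7x7x512 features
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # normal vs nodular
    ])
    model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=1e-3),
                  loss="binary_crossentropy",
                  metrics=["accuracy"])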

finetuner - Finetuning any DNN for better embedding on neural search tasks

  •    Python

Finetuner allows one to tune the weights of any deep neural network for better embeddings on search tasks. It accompanies Jina to deliver the last mile of performance for domain-specific neural search applications. 🎛 Designed for finetuning: a human-in-the-loop deep learning tool for leveling up your pretrained models in domain-specific neural search applications.

drivebot - RL for driving a rover around

  •    Python

drivebot has two ROS-specific components that need to be built: load map and add three bots...

snca.pytorch - Improving Generalization via Scalable Neighborhood Component Analysis

  •    Python

This repo contains the PyTorch implementation for the ECCV 2018 paper. We use deep networks to learn feature representations optimized for nearest neighbor classifiers, which can generalize better to new object categories. This project is a re-investigation of Neighborhood Component Analysis (NCA) with recent techniques to make it scalable to deep networks and large-scale datasets. Much of the code is extended from the previous unsupervised learning project; please refer to that repo for more details.
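For intuition, here is a compact PyTorch sketch of the NCA objective the paper scales up: each embedding is compared with every other example, and the loss maximises the probability mass assigned to neighbours of the same class. The cosine-similarity formulation and temperature are illustrative; the repository adds further machinery (e.g. a feature memory) to make this tractable on large datasets.

    import torch
    import torch.nn.functional as F

    def nca_loss(embeddings: torch.Tensor, labels: torch.Tensor, temperature: float = 0.05) -> torch.Tensor:
        """embeddings: (N, D) features, labels: (N,) integer class ids."""
        embeddings = F.normalize(embeddings, dim=1)
        sim = embeddings @ embeddings.t() / temperature          # pairwise similarities
        eye = torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
        sim = sim.masked_fill(eye, float("-inf"))                # an example is not its own neighbour
        log_p = F.log_softmax(sim, dim=1)                        # p_ij: prob. of picking j as i's neighbour
        same_class = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~eye
        pos_logprob = torch.logsumexp(log_p.masked_fill(~same_class, float("-inf")), dim=1)
        has_pos = same_class.any(dim=1)                          # anchors with at least one positive
        return -pos_logprob[has_pos].mean()

    # Example usage with random features.
    feats = torch.randn(32, 128, requires_grad=True)
    labels = torch.randint(0, 4, (32,))
    loss = nca_loss(feats, labels)
    loss.backward()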

transfer-mxnet - transfer learning written in mxnet

  •    Python

Unsupervised transfer learning for image classification, written in MXNet. Note that this repo covers only unsupervised transfer learning for image classification.

graph_distillation - Graph Distillation for Action Detection

  •    Python

Please note that this is not an officially supported Google product. In this work, we propose a method termed graph distillation that incorporates rich privileged information from a large-scale multi-modal dataset in the source domain, and improves the learning in the target domain where training data and modalities are scarce.

mk-tfjs - Play MK.js with TensorFlow.js

  •    JavaScript

Source code for my article "Playing Mortal Kombat with TensorFlow.js. Transfer learning and data augmentation". You can find the post here and MK.js here.





