Downloads
Conservative Uncertainty Estimation By Fitting Prior Networks
April 2020
Code accompanying "Conservative Uncertainty Estimation By Fitting Prior Networks" - ICLR 2020
VL-BERT
April 2020
VL-BERT is a simple yet powerful pre-trainable generic representation for visual-linguistic tasks. It is pre-trained on a massive-scale caption dataset and a text-only corpus, and can be fine-tuned for various downstream visual-linguistic tasks, such as Visual Commonsense Reasoning, Visual Question Answering…
RaCT
April 2020
This repository implements Ranking-Critical Training (RaCT) for Collaborative Filtering, accepted at the International Conference on Learning Representations (ICLR), 2020. By using an actor-critic architecture to fine-tune a differentiable collaborative filtering model, we can improve the performance of a variety of MLE-based…
BERT-nmt
April 2020
BERT-fused NMT is a new algorithm that first uses BERT to extract representations for an input sequence, then fuses those representations with each layer of the NMT model's encoder and decoder through attention mechanisms.
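The fusion step described above can be sketched in a few lines: an encoder layer's hidden states attend over fixed BERT representations, and the result is combined with the usual self-attention output. The shapes, the plain scaled dot-product attention, and the equal-weight 0.5/0.5 fusion are illustrative assumptions, not the repository's exact layer.

```python
import numpy as np

def attention(q, k, v):
    """Scaled dot-product attention over one sequence."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def fused_encoder_layer(h, bert_repr):
    """Fuse standard self-attention with attention over BERT features."""
    self_out = attention(h, h, h)                    # standard self-attention
    bert_out = attention(h, bert_repr, bert_repr)    # queries attend to BERT outputs
    return 0.5 * (self_out + bert_out)               # simple average as the fusion

h = np.random.default_rng(0).normal(size=(4, 8))     # 4 target tokens, dim 8
bert = np.random.default_rng(1).normal(size=(6, 8))  # BERT sequence may differ in length
out = fused_encoder_layer(h, bert)                   # shape (4, 8)
```

Note that the BERT sequence can have a different length from the encoder's, since it only serves as keys and values; the output keeps the encoder's shape.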
KG-A2C
April 2020
KG-A2C is a reinforcement learning agent that builds a dynamic knowledge graph while exploring and generates natural language actions from a template-based action space, outperforming all current agents on a wide set of text-based games.
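A template-based action space like the one mentioned above can be illustrated by filling action templates with entities the agent has recorded in its knowledge graph. The templates, entities, and `OBJ` placeholder below are made up for this sketch and are not KG-A2C's actual representation.

```python
import itertools

def expand_templates(templates, entities):
    """Return every action obtained by filling each OBJ blank with an entity."""
    actions = []
    for tmpl in templates:
        n_blanks = tmpl.count("OBJ")
        for combo in itertools.product(entities, repeat=n_blanks):
            action = tmpl
            for ent in combo:
                action = action.replace("OBJ", ent, 1)  # fill blanks left to right
            actions.append(action)
    return actions

templates = ["open OBJ", "put OBJ in OBJ"]
entities = ["chest", "key"]          # e.g. objects currently in the knowledge graph
actions = expand_templates(templates, entities)
# yields 6 candidate actions, e.g. "open chest", "put key in chest", ...
```

Enumerating templates times entity combinations keeps the action space tractable compared with generating free-form text token by token.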
FreeLB
April 2020
FreeLB is an adversarial training approach for improving transformer-based language models on Natural Language Understanding tasks. It accumulates gradients during the ascent steps and updates the parameters with the accumulated gradients, which is approximately equivalent to enlarging the batch…
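The accumulate-then-update scheme can be sketched as follows: each ascent step perturbs the input in the direction that increases the loss while the parameter gradient is accumulated rather than applied, and a single descent step uses the averaged gradient. A toy linear model with squared loss stands in for the transformer, and names like `adv_steps` and `adv_lr` are illustrative, not the repository's API.

```python
import numpy as np

def loss_and_grads(w, x, y):
    """Squared-error loss; returns loss, grad wrt params w, grad wrt input x."""
    err = w @ x - y
    loss = 0.5 * err ** 2
    return loss, err * x, err * w   # dL/dw, dL/dx

def freelb_step(w, x, y, adv_steps=3, adv_lr=0.1, lr=0.05, eps=0.5):
    delta = np.zeros_like(x)         # perturbation on the input embedding
    grad_accum = np.zeros_like(w)    # parameter gradients accumulated over ascent steps
    for _ in range(adv_steps):
        _, grad_w, grad_x = loss_and_grads(w, x + delta, y)
        grad_accum += grad_w                   # accumulate instead of updating now
        delta += adv_lr * grad_x               # gradient *ascent* on the perturbation
        delta = np.clip(delta, -eps, eps)      # keep delta in the epsilon-ball
    # one descent update with the averaged gradient (~ a larger virtual batch)
    return w - lr * grad_accum / adv_steps

w = np.array([0.0, 0.0])
x = np.array([1.0, 2.0])
w_new = freelb_step(w, x, y=1.0)
```

Averaging over the ascent steps is what makes the update look like a larger batch: each perturbed input acts as an extra "virtual" training example around the original one.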