Download
Orca-2-13B
January 2024
Orca 2 is a fine-tuned version of LLAMA-2. It is built for research purposes only and provides single-turn responses in tasks such as reasoning over user-given data, reading comprehension, math problem solving, and text summarization. The model…
Orca-2-7B
January 2024
Orca 2 is a fine-tuned version of LLAMA-2. It is built for research purposes only and provides single-turn responses in tasks such as reasoning over user-given data, reading comprehension, math problem solving, and text summarization. The model…
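For context, here is a minimal single-turn inference sketch using the Hugging Face transformers library. The model IDs match the public releases, but the ChatML-style prompt template and the generation settings are assumptions that should be checked against the model card.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Orca-2-13b"  # the smaller release is "microsoft/Orca-2-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# ChatML-style single-turn prompt (assumed; verify against the model card).
system = "You are a cautious assistant. You carefully follow instructions."
user = "A train travels 120 km in 1.5 hours. What is its average speed?"
prompt = (
    f"<|im_start|>system\n{system}<|im_end|>\n"
    f"<|im_start|>user\n{user}<|im_end|>\n"
    f"<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens, not the prompt.
answer = tokenizer.decode(
    output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(answer)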
WALNUT
June 2022
This repository contains the baseline code for the paper published at NAACL 2022: "WALNUT: A Benchmark on Weakly Supervised Learning for Natural Language Understanding". A detailed description of the datasets and methods can be found in the manuscript here.
KID: Knowledge Infused Decoding
March 2022
Knowledge Infused Decoding (KID) is a decoding algorithm that infuses knowledge (from Wikipedia) into each decoding step of text generation.
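As a rough, hypothetical illustration of the idea (not the paper's exact algorithm), the sketch below nudges each decoding step toward tokens that appear in an externally retrieved knowledge passage; the function name and the bias scheme are illustrative only.

import torch

def knowledge_biased_decode(model, tokenizer, prompt, knowledge, steps=50, alpha=2.0):
    """Greedy decoding that rewards tokens occurring in a retrieved passage."""
    ids = tokenizer(prompt, return_tensors="pt").input_ids
    # Token ids present in the external knowledge (e.g. a Wikipedia snippet).
    knowledge_ids = list(set(tokenizer(knowledge).input_ids))
    for _ in range(steps):
        with torch.no_grad():
            logits = model(ids).logits[0, -1]   # next-token logits
        bias = torch.zeros_like(logits)
        bias[knowledge_ids] = alpha             # boost knowledge tokens
        next_id = torch.argmax(logits + bias)   # biased greedy step
        ids = torch.cat([ids, next_id.view(1, 1)], dim=-1)
    return tokenizer.decode(ids[0])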
LiST (Lite Self-Training)
October 2021
We present LiST, a new method for efficient fine-tuning of large pre-trained language models (PLMs) in few-shot learning settings. LiST significantly improves over recent methods that adopt prompt-based fine-tuning by using two key techniques. The first is the use of…
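To make the prompt fine-tuning setting concrete, here is a hedged sketch of cloze-style classification with a masked language model. The template and the "great"/"terrible" verbalizers are hypothetical, and in LiST the model would additionally be tuned on the few labeled examples rather than scored zero-shot as here.

import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForMaskedLM.from_pretrained("roberta-base")

# Hypothetical template and label words ("verbalizers") for sentiment.
verbalizers = {"positive": " great", "negative": " terrible"}
text = "The movie was surprisingly moving."
prompt = f"{text} It was {tokenizer.mask_token}."

inputs = tokenizer(prompt, return_tensors="pt")
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
with torch.no_grad():
    logits = model(**inputs).logits[0, mask_pos]

# Score each label by its verbalizer token's logit at the mask position.
scores = {
    label: logits[tokenizer.convert_tokens_to_ids(tokenizer.tokenize(word))[0]].item()
    for label, word in verbalizers.items()
}
print(max(scores, key=scores.get))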
Meta Self-training for Few-shot Neural Sequence Labeling [Code]
October 2021
This is the implementation of the paper Meta Self-training for Few-shot Neural Sequence Labeling. MetaST is short for meta-learning for self-training.
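A schematic of the underlying teacher-student self-training loop is sketched below. The interfaces (fit, predict_with_confidence) are hypothetical, and the paper's distinguishing contribution, meta-learned re-weighting of noisy pseudo-labels, is omitted for brevity.

def self_train(teacher, student, labeled, unlabeled, rounds=3, tau=0.9):
    """Iterative teacher-student self-training with confidence filtering."""
    teacher.fit(labeled)
    for _ in range(rounds):
        pseudo = []
        for x in unlabeled:
            y, confidence = teacher.predict_with_confidence(x)
            if confidence >= tau:          # keep only confident pseudo-labels
                pseudo.append((x, y))
        student.fit(labeled + pseudo)      # retrain on gold + pseudo data
        teacher = student                  # the student becomes the next teacher
    return student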
Multi-source Weak Social Supervision for Fake News Detection (MWSS)
May 2021
This repository contains code for fake news detection with Multi-source Weak Social Supervision (MWSS), published at ECML-PKDD 2020. Social media has greatly enabled people to participate in online activities at an unprecedented rate. However, this unrestricted access also exacerbates the…
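As a purely illustrative sketch (the released code learns how to weight the weak sources during training rather than fixing weights by hand, which is not shown here), several weak social-supervision signals can be combined into one noisy label like this:

def aggregate_weak_labels(sources, weights):
    """sources: labels in {0, 1, None}; None means the source abstains."""
    score, total = 0.0, 0.0
    for label, w in zip(sources, weights):
        if label is not None:
            score += w * label
            total += w
    return None if total == 0 else int(score / total >= 0.5)

# Three hypothetical weak sources (e.g. user credibility, sentiment, bias cues).
print(aggregate_weak_labels([1, 0, 1], [0.5, 0.2, 0.3]))  # -> 1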
Meta Representation Transformation for Low-resource Cross-Lingual Learning [Code]
May 2021
This is a source code release for research published at NAACL 2021. Paper Title: MetaXL: Meta Representation Transformation for Low-resource Cross-Lingual Learning. Paper Abstract: The combination of multilingual pre-trained representations and cross-lingual transfer learning is one of the most…
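A simplified sketch of the representation-transformation component is shown below. The layer sizes and residual design are assumptions, and MetaXL's actual contribution, learning this network's parameters with a meta-objective, is not shown.

import torch
import torch.nn as nn

class RepresentationTransform(nn.Module):
    """Small residual network that reshapes source-language representations."""
    def __init__(self, hidden=768, bottleneck=192):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(hidden, bottleneck),  # compress
            nn.Tanh(),
            nn.Linear(bottleneck, hidden),  # project back
        )

    def forward(self, h):
        return h + self.net(h)              # residual transformation

# e.g. transform a batch of hidden states from a multilingual encoder:
h = torch.randn(8, 768)
print(RepresentationTransform()(h).shape)   # torch.Size([8, 768])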
Self-training with Weak Supervision [Code]
April 2021
State-of-the-art deep neural networks require large-scale labeled training data that is often either expensive to obtain or not available for many tasks. Weak supervision in the form of domain-specific rules has been shown to be useful in such settings to…
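To illustrate weak supervision with domain-specific rules (a hedged sketch, not this repository's pipeline), rule functions can pseudo-label whatever unlabeled data they cover, after which a model can be self-trained on the noisy labels; both rules below are hypothetical.

def rule_positive(text):   # hypothetical domain rule
    return 1 if "excellent" in text.lower() else None

def rule_negative(text):   # hypothetical domain rule
    return 0 if "refund" in text.lower() else None

RULES = [rule_positive, rule_negative]

def weak_label(text):
    """Majority vote over the rules that fire; None if no rule covers the text."""
    votes = [y for rule in RULES if (y := rule(text)) is not None]
    return None if not votes else round(sum(votes) / len(votes))

unlabeled = ["Excellent product!", "I want a refund.", "Arrived on time."]
pseudo = [(x, y) for x in unlabeled if (y := weak_label(x)) is not None]
print(pseudo)  # rule-covered examples get noisy labels; the rest stay unlabeled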