“How Much Information Does a Human Translator Add to the Original” and “Multi-Source Neural Translation”
We ask how much information a human translator adds to an original text, and we provide a bound. We address this question in the context of bilingual text compression: given a source text, how many bits of additional information are required to specify the target text produced by a human translator? We develop new compression algorithms and establish a benchmark task.

We build a multi-source machine translation model and train it to maximize the probability of a target English string given French and German sources. Using the neural encoder-decoder framework, we explore several combination methods and report up to +4.8 BLEU increases on top of a very strong attention-based neural translation model.
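One way to picture the multi-source setup is a combination step that merges the two encoders' outputs before decoding. The sketch below is a minimal, hypothetical illustration in PyTorch, not the talk's actual model: it assumes a single concat-then-project operator, a hidden size of 512, and that the combined state seeds the decoder; the talk explores several combination methods whose exact operators may differ.

```python
import torch
import torch.nn as nn

class ConcatCombiner(nn.Module):
    """Illustrative combiner: merge French and German encoder states into
    one vector for the decoder. A sketch only; the actual combination
    methods in the talk may use different operators."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # Project the concatenated pair back to the decoder's hidden size.
        self.proj = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, h_french: torch.Tensor, h_german: torch.Tensor) -> torch.Tensor:
        # h_french, h_german: (batch, hidden_size) final encoder states.
        return torch.tanh(self.proj(torch.cat([h_french, h_german], dim=-1)))

# Usage: the combined state would replace the single-source encoder state
# that normally initializes the decoder.
combiner = ConcatCombiner(hidden_size=512)
h = combiner(torch.randn(8, 512), torch.randn(8, 512))  # -> (8, 512)
```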
Speaker Details
Barret Zoph is a recent graduate of the University of Southern California and currently works at the Information Sciences Institute.
- Date:
- Speakers:
- Barret Zoph
- Affiliation:
- University of Southern California