Research talk: Towards data-efficient machine learning with meta-learning
At Microsoft Research, we are approaching large-scale AI from many perspectives: not only creating new, bigger models, but also developing unique ways of optimizing AI models from training through deployment. One of the main challenges posed by larger AI models is that they are difficult to deploy affordably and sustainably, and they still struggle to learn new concepts and tasks efficiently. Join Microsoft Researcher Guoqing Zheng for the third of three lightning talks in this series on Efficient and adaptable large-scale AI. See the talks from Microsoft Researchers Subhabrata (Subho) Mukherjee and Yu Cheng to learn more about the work Microsoft is doing to improve the efficiency of computation and data in large-scale AI models.
Learn more about the 2021 Microsoft Research Summit: https://Aka.ms/researchsummit
- Track: Deep Learning & Large-Scale AI
- Date:
- Speaker: Guoqing Zheng, Principal Researcher
- Affiliation: Microsoft Research Redmond
Research talk: Resource-efficient learning for large pretrained models
Speakers: Subhabrata (Subho) Mukherjee

Research talk: Prompt tuning: What works and what's next
Speakers: Danqi Chen

Research talk: NUWA: Neural visual world creation with multimodal pretraining
Speakers: Lei Ji, Chenfei Wu

Research talk: Towards Self-Learning End-to-end Dialog Systems
Speakers: Baolin Peng

Research talk: WebQA: Multihop and multimodal
Speakers: Yonatan Bisk

Research talk: Closing the loop in natural language interfaces to relational databases
Speakers: Dragomir Radev

Roundtable discussion: Beyond language models: Knowledge, multiple modalities, and more
Speakers: Yonatan Bisk, Daniel McDuff, Dragomir Radev