Projects
Algorithm to generate complex LLM prompts from scratch
Given a task in the form of a basic description and its training examples, prompt optimization is the problem of synthesizing the given information into a text prompt for a large language…
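A minimal sketch of how such a search over prompts might be framed, assuming a simple hill-climbing loop over candidate prompts scored against the training examples; `call_llm`, `score`, and `mutate` are hypothetical stand-ins and not the project's actual algorithm.

```python
import random

# Hypothetical stand-in for an LLM call; a real system would query an actual model.
def call_llm(prompt: str, task_input: str) -> str:
    return "model output for: " + task_input

def score(prompt: str, examples: list[tuple[str, str]]) -> float:
    """Fraction of training examples the prompted model answers correctly."""
    correct = sum(call_llm(prompt, x) == y for x, y in examples)
    return correct / len(examples)

def mutate(prompt: str) -> str:
    """Hypothetical edit operator: here, just append a randomly chosen instruction."""
    hints = ["Think step by step.", "Answer concisely.", "Use the examples as a guide."]
    return prompt + " " + random.choice(hints)

def optimize_prompt(task_description: str, examples, iterations: int = 20) -> str:
    """Greedy hill-climbing over candidate prompts, scored on the training examples."""
    best = task_description
    best_score = score(best, examples)
    for _ in range(iterations):
        candidate = mutate(best)
        s = score(candidate, examples)
        if s > best_score:
            best, best_score = candidate, s
    return best

if __name__ == "__main__":
    train = [("2+2", "4"), ("3+5", "8")]
    print(optimize_prompt("Solve the arithmetic problem.", train))
```

In practice the edit operator and scoring would be far richer than this sketch, but the framing of prompt synthesis as search over candidates evaluated on training examples is the same.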
At Microsoft Research Lab India, we conduct a variety of healthcare-related research, including smartphone-based low-cost diagnostics, generative AI chatbots to support the healthcare ecosystem, and efforts to promote the mental well-being of employees.
Recent studies have highlighted that more than 70% of patients and their caregivers experience anxiety prior to undergoing an invasive treatment. Additionally, over 80% of them require timely, trustworthy, detailed, and accurate information about their treatment. The provision of such…
With Emphasis on the Global South
The work on accessibility at MSR India has spanned the range from spatial audio with HoloLens to the use of feature phones to reach children with vision impairments and a spectrum of tangible toys…
Design, analysis and interpretability of large language models
Transformers and large language models (LLMs) have had enormous success in recent years. Yet they remain poorly understood, in particular why and how they work. We are trying to answer such questions…
Kahani: Visual Storytelling is a research prototype that allows the user to create visually striking and culturally nuanced images just by describing them in their local languages.
Scaling performance beyond Moore's law
Domain specialization is expected to play a big role in how computer systems evolve in the future. With the end of Moore's law, we are already seeing CPUs, GPUs and domain-specific hardware evolving rapidly. The next decade is therefore expected to see big changes in how we develop, compile and run software. This project focuses on data systems, a class of systems where, as data sizes grow, performance scaling becomes critical.

First, we believe that domain-specific compilers will play a crucial strategic role in helping software leverage the changing hardware landscape. Such compilers will…
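To make the idea of domain-specific compilation for data systems concrete, here is a minimal sketch, not drawn from the project itself: instead of interpreting a query predicate row by row, generate code specialized to that one predicate and compile it once. The query shape and function names are illustrative only.

```python
def interpret_filter(rows, column, op, value):
    """Generic interpreter: re-dispatches on the operator for every row."""
    out = []
    for row in rows:
        v = row[column]
        if op == ">" and v > value:
            out.append(row)
        elif op == "==" and v == value:
            out.append(row)
    return out

def compile_filter(column, op, value):
    """Generate and compile a Python function specialized to this one predicate."""
    src = (
        "def specialized(rows):\n"
        f"    return [r for r in rows if r[{column!r}] {op} {value!r}]\n"
    )
    namespace = {}
    exec(compile(src, "<generated>", "exec"), namespace)
    return namespace["specialized"]

if __name__ == "__main__":
    rows = [{"age": 31}, {"age": 17}, {"age": 45}]
    fast_filter = compile_filter("age", ">", 18)
    assert fast_filter(rows) == interpret_filter(rows, "age", ">", 18)
    print(fast_filter(rows))
```

A production domain-specific compiler would target native or accelerator code rather than Python, but the payoff is the same: the specialized version removes per-row dispatch overhead that a generic engine pays on every record.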
Towards efficient AI/ML deployment
The AI Infrastructure team at Microsoft Research India works on cutting-edge systems optimizations for improving the efficiency of a variety of AI/ML workloads, including an emerging class of workloads, namely serving large language models (LLMs). AI/ML models…
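One widely used serving-efficiency technique, shown here as a minimal sketch rather than a description of the team's systems, is dynamic batching: requests that arrive close together are grouped into one forward pass to improve accelerator utilization. The `run_model_batch` call is a hypothetical stand-in for the model.

```python
import queue
import threading
import time

def run_model_batch(prompts):
    """Hypothetical batched forward pass; a real server would invoke the model here."""
    time.sleep(0.05)  # simulate one batched inference step
    return [f"completion for: {p}" for p in prompts]

def serving_loop(requests: "queue.Queue", max_batch: int = 8, max_wait_s: float = 0.01):
    """Collect up to max_batch requests (waiting at most max_wait_s) per model call."""
    while True:
        batch = [requests.get()]          # block until at least one request arrives
        deadline = time.time() + max_wait_s
        while len(batch) < max_batch and time.time() < deadline:
            try:
                batch.append(requests.get(timeout=max(deadline - time.time(), 0)))
            except queue.Empty:
                break
        outputs = run_model_batch([prompt for prompt, _ in batch])
        for (_, reply), out in zip(batch, outputs):
            reply.put(out)

if __name__ == "__main__":
    q = queue.Queue()
    threading.Thread(target=serving_loop, args=(q,), daemon=True).start()
    reply = queue.Queue()
    q.put(("What is dynamic batching?", reply))
    print(reply.get())
```

The trade-off captured by `max_batch` and `max_wait_s` is the usual one in LLM serving: larger batches raise throughput, while longer waits add latency for the first request in the batch.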
Post-Deployment Configuration Tuning of Services Made Easy
Real-world application deployments have hundreds of inter-dependent configuration parameters, many of which significantly influence performance and efficiency. With today's complex and dynamic services, operators need to continuously monitor and set the right configuration…
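Configuration tuning of this kind is often cast as black-box optimization: sample candidate configurations, measure a service-level objective, and keep the best. The sketch below assumes random search with a hypothetical `measure_latency` probe and made-up parameter names; it is an illustration of the problem setup, not the project's method.

```python
import random

# Hypothetical configuration search space; real deployments have hundreds of parameters.
SEARCH_SPACE = {
    "thread_pool_size": [8, 16, 32, 64],
    "cache_size_mb": [128, 256, 512, 1024],
    "batch_timeout_ms": [1, 5, 10, 20],
}

def measure_latency(config: dict) -> float:
    """Stand-in for deploying the config to a canary and measuring p99 latency (ms)."""
    return (1000 / config["thread_pool_size"]
            + 50_000 / config["cache_size_mb"]
            + config["batch_timeout_ms"]
            + random.uniform(0, 5))  # measurement noise

def tune(trials: int = 30) -> tuple[dict, float]:
    """Random search over the space; real tuners typically use Bayesian optimization."""
    best_config, best_latency = None, float("inf")
    for _ in range(trials):
        config = {k: random.choice(v) for k, v in SEARCH_SPACE.items()}
        latency = measure_latency(config)
        if latency < best_latency:
            best_config, best_latency = config, latency
    return best_config, best_latency

if __name__ == "__main__":
    cfg, p99 = tune()
    print(f"best config: {cfg}, estimated p99: {p99:.1f} ms")
```

The hard parts in practice, and the focus of post-deployment tuning, are that measurements are noisy, parameters interact, and the workload keeps shifting, so the search must run continuously and safely against live services.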