Monday, July 13
Time | Event/Theme | Location |
---|---|---|
8:00–9:00 | Continental Breakfast | |
9:00–10:45 | Opening Plenary Session | Kodiak |
9:00–9:30 | Faculty Summit Introduction and Welcome Harold Javid, Faculty Summit Chair, Microsoft Research, Microsoft External Research | slides; Tony Hey, Corporate Vice President, External Research, Microsoft Research | |
9:30–10:45 | Rethinking Computing Craig Mundie, Chief Research and Strategy Officer, Microsoft | |
10:45–11:00 | Break | |
11:00–12:00 | Break-out Sessions | |
Earth, Energy, and Environment From Farm to Forest: Carbon Implications of Land Use Dennis Baldocchi, University of California, Berkeley; David Lobell, Stanford University; Catharine Van Ingen, Microsoft Research Are agricultural carbon credits a sound investment, helping both to sequester carbon and to raise farmer income levels? Are forest-based offsets equally viable? At-risk forests can absorb 20 percent of the planet’s carbon emissions, while agriculture accounts for approximately 10 percent of global emissions. Thus, land use, whether in the form of deforestation or agriculture, directly affects nearly 30 percent of the exchange of greenhouse gases between terrestrial ecosystems and the atmosphere. This session features presentations (and interactive discussion) by Professor Dennis Baldocchi of the University of California, Berkeley, on the relationship between vegetation and the atmosphere, and by Dr. David Lobell of Stanford University on how technology is critical to studying climatic impacts on agriculture. | Hood | |
Global Outreach Attracting and Retaining Women in Computing: Real Programs for Real Progress | slides Moderator: Jane Prey, Microsoft Research Maureen Biggers, Indiana University; Tracy Camp, Colorado School of Mines; Carla Ellis, Duke University; Gillian Hayes, University of California, Irvine; Rita Powell, University of Pennsylvania A degree or career in computer science remains a less than compelling choice for college-bound girls. The National Center for Women and Information Technology (NCWIT) leverages and connects the efforts of organizations across the United States to increase women’s participation in all areas of information technology, from elementary school to higher education and through industry and academic careers. Leading-edge social science research focuses on education, innovation, climate, and workforce participation. Research is the foundation for NCWIT’s mission. By researching what works and what does not work, NCWIT can develop and distribute practices that will accelerate women’s participation in information technology. Additionally, NCWIT’s Academic Alliance Seed Fund Award, sponsored by Microsoft Research, encourages the widespread application of promising new practices by awarding alliance members funds to develop and implement initiatives in computing and information technology. This session examines the variety of programs initiated through the Academic Alliance and Seed Fund program and provides examples of successful approaches to reform. | Baker | |
Core Computer Science Beyond Search with Data Driven Intelligence Introduction | Evelyne Viegas, Microsoft Research Whither Search? | Rakesh Agrawal, Microsoft Research Bing: User Intent and Decision Engine | Harry Shum, Microsoft These presentations capture the future of search by focusing on how data-driven research can help advance the state of the art in the online world and present a vision for humane computing. This session was held from 11:00 to 12:15. | Cascade | |
Education and Scholarly Communication The Road to Personalized Learning Michael Golden, Microsoft The role of educators and students in education is clear and undisputed. What then is technology’s role? We believe it to be in two dimensions—augment and scale. Microsoft’s vision is to expand the power of education for everyone through personalized learning. Technology must then augment the delivery and experience of personalized learning for a given educator or student; and then it must enable personalized learning to occur for all students—or achieve scale. In this session, Michael Golden discusses Microsoft’s approach to personalized learning, and demonstrates some of the tools that help achieve it. | Rainier | |
12:00–1:15 | Lunch and Brown Bag Sessions | |
Core Computer Science Robots as a Context for Teaching Beginner Programmers: the Conclusion of Three Years’ Research | slides Mark Guzdial, Georgia Institute of Technology IPRE (the Institute for Personal Robots in Education, hosted at Georgia Tech with Bryn Mawr College) phase 1 concludes this summer. This talk reviews IPRE’s progress in contextualized beginner computer science education using personal robots since its foundation in 2006, and looks forward to IPRE’s phase 2. | St. Helens | |
1:15–2:30 | Break-out Sessions | |
Earth, Energy, and Environment Water for a Thirsty World: How Can Information Technology Help? Jeff Dozier, University of California, Santa Barbara | slides Ilya Zaslavsky, University of California, San Diego | slides Water defines our environment. We are a water-dependent species in a world where water is the central actor, changing Earth’s surface and shaping where and how we live. At the same time that populations are growing and water demand is increasing, changes in climate and land use impose broad challenges for the future. In the study of the water environment from the perspectives of the natural, engineering, and social sciences, the overarching question is: How can we protect ecosystems and better manage and predict water availability and quality for future generations, given changes to the water cycle caused by human activities and climate trends? In this session, Professor Dozier and Dr. Zaslavsky lead an interactive discussion of the role of information technology in transforming water science and improving decisions about water management. | Hood | |
Health and Wellbeing Mobile Solutions for Underserved Communities Moderator: Kristin Tolle, Microsoft Research This session focuses on sustainable solutions that will help solve the healthcare crisis in emerging and developing economies. Transformational Improvement in Healthcare | slides David Zar, Washington University in St. Louis Approximately 75 percent of the world population has no access to medical imaging. Most of those people are poor and many live in remote areas, far from modern medical facilities. By taking advantage of the computing capabilities of modern smart phones, real-time ultrasonic imaging may be introduced to these people at very low cost. With commercially available USB-based ultrasound probes and drivers and applications that run on Windows Mobile smart phones, many underserved areas of the world may now have access to modern medical imaging. Basic Mobile Technology for Basic Support | slides Michael Platt, Microsoft There is still much that can be done to support underserved communities with mobile technologies to which they already have access. This presentation examines how the huge number of mobile phones has been utilized as a delivery mechanism for cloud computing to provide dynamic and personalized support to the bottom of the pyramid. Use Smart Phones to Promote Diabetes Self-management for Robust Elderly in China | slides Jiao (Maggie) Ma and Cynthia LeRouge This presentation provides an overview of how User-Centered Design (UCD) is being applied in the design and prototyping of an age- and culturally appropriate, interactive diabetes self-management support system on smart phones, the Chinese Aged Diabetic Assistant (CADA). CADA uses a gaming approach to engage and inspire robust (independent in activities of daily living) elder populations with diabetes in China. | St. Helens | |
1:15–2:45 | Core Computer Science Energy-Efficient Computing: The State of the Art Moderator: Feng Zhao, Microsoft Research Energy Efficiency and Cloud Computing | David Patterson, University of California, Berkeley Ingredients for Building Energy-Efficient Computing Systems: Hardware, Software, and Tools | John D. Davis, Microsoft Research (Updated: Due to a family emergency, John D. Davis replaces Jeffrey Chase.) Power is increasingly becoming a critical performance metric for designing computing systems, from devices and services to large-scale data centers. Two leading researchers, David Patterson from the University of California, Berkeley, and John D. Davis, Microsoft Research, present the latest research on energy-efficient computing for data centers and cloud computing. | Baker
Technical Direction and Strategy at Microsoft – How ThinkWeek and Quests Work | slides Tara Prakriya, Microsoft The Technical Strategy Group (TSG) works to capture and influence business, experience, and technology direction, and opportunities for the company. The goal of this session is to share insight into how Microsoft technical strategy is developed across divisions, the future technology direction of the company, the alignment of business, experience, and technology strategy, how programs like ThinkWeek and Quests are instrumental in this process, and how Microsoft Research engages in these programs. | Cascade | |
Education and Scholarly Communication Next Generation Scholarly Measurement—Deciding What Counts Academic researchers have used various methods for ranking the importance and influence of scholarly journals and their authors, including citation analysis, usage data, and, more recently, social network analysis. This session explores how recent advances in data mining, network analysis, and information theory have led to new methods for evaluating the influence of scholarly periodicals and for understanding the structure of academic research. MESUR: Studying Science from Large-Scale Usage Data Johan Bollen, Los Alamos National Laboratory Science is of significant importance to our society, but we understand very little of the processes that lead to scientific innovation. This presentation provides an overview of our work on large-scale usage data as an early indicator of scientific activity. The MESUR project has in the past two years aggregated a large-scale collection of the usage data recorded by some of the world’s most significant publishers, aggregators, and institutional consortia. The resulting data set has been analyzed to reveal the structural properties of scientific activity in real time. The presentation highlights some of our recent work on producing detailed maps of science that reveal how scientists navigate between online scholarly resources. The results indicate that it may be possible to detect or predict the emergence of innovation from temporal changes in the structure of scientific activity. This work underpins efforts to arrive at a more accurate, proactive evaluation of scientific impact. The Eigenfactor Project | slides Carl Bergstrom, University of Washington Science is a massively parallel human endeavor to explain and predict the nature of the physical world. In science, knowledge is acquired cumulatively and collaboratively, and the principal mode for sharing this knowledge is the institution of scholarly publishing. In science, ideas are built upon ideas, models upon models, and verifications upon prior verifications. This cumulative process of construction leaves behind it a latticework of citations, from which we can reconstruct the geography of scientific thought and retrace the paths along which intellectual activity has proceeded. The Eigenfactor Project aims to use recent advances in network analysis and information theory to develop novel methods for evaluating the influence of scholarly periodicals and for mapping the structure of academic research. | Rainier | |
2:30–2:45 | Break | |
2:45–4:00 | Break-out Sessions | |
Earth, Energy, and Environment Protecting Ocean Resources Expect the Unexpected: Salmon, Water, and Wind | Mark Abbott, Oregon State University Climate Change and the Oceans: Why You Should Care | Ellen Prager, Earth2Ocean Oceans play a crucial role in supporting life on Earth. Oceans provide the primary source of protein for more than 1 billion people, are a leading source for pharmaceuticals, and supply billions of dollars in economic wealth, not to mention the inherent visceral enjoyment of a day at the beach. Unfortunately, pollution, overfishing, and climate change threaten all of this. This session, led by marine scientist Dr. Ellen Prager and Mark Abbott, Dean and Professor of the College of Oceanic and Atmospheric Sciences at Oregon State University, focuses on climate change in particular. Session attendees will participate in a rich discussion of how better technology, including modeling, visualization, translation skills, and decision tools, can help address the critical problems of sea level rise, ocean temperature increase, and ocean acidification. | Hood | |
Health and Wellbeing Computational Challenges of Genome-Wide Association Studies (GWAS) Moderator: Simon Mercer, Microsoft Research This session describes several approaches used in association with GWAS that shorten the time to discovery. Using Genomics to Understand Neurological Disease Bryan Traynor, National Institutes of Health The presentation outlines the tremendous advances that have been made in genomics in the last five years, and demonstrates how we have used these technologies to begin to unravel neurological diseases. The audience should come away with a sense of the potential of genomics, both now and in the very near future. Improving Detection in Large-Scale Genetic Association Studies by Discovering and Accounting for Race, Relatedness, and Other Hidden Relationships | slides Jennifer Listgarten, Microsoft Research The goal of genome-wide association studies is to uncover associations between disease and genetics by looking at genetic markers in large populations of individuals with and without the disease. In the statistical analysis of such studies, the ability to capture and effectively deal with various types of population structure (for example, race structure, family structure, and unknown relatedness) is critical to the discovery of genetic markers of disease. Such structure is known to be a significant confounding factor, leading to loss of power and spurious results when not properly accounted for. However, finding models that automatically account for multiple types of structure, when the presence or nature of this structure is unknown, remains an open area of research. We are investigating the use of statistical models that automatically learn and correct for these hidden factors, even when their presence is not originally known, and without the need to remove subsets of individuals, as is often done. GeneScription: An Information Management System for Enabling Pharmacogenomics and Drug Safety Assurance | slides Michael Kane, Purdue University The presentation describes the rationale, development, and utility of a software system (GeneScription) developed specifically to provide training to healthcare professionals in the field of pharmacogenomics by using an operational model. The audience receives an introduction to pharmacogenomics (which is distinct from the use of genomics for disease prediction), as well as to an emerging clinical arena dependent upon the successful integration of computing and information management, clinical genomics, pharmacotherapeutics, and the issues surrounding patient privacy. | St. Helens | |
Core Computer Science The Microsoft-Intel Universal Parallel Computing Research Centers Universal Parallel Computing Research Center at Illinois | Wen-mei Hwu, University of Illinois at Urbana-Champaign UC Berkeley Par Lab Overview | David Patterson, University of California, Berkeley Microsoft and Intel jointly funded two Universal Parallel Computing Research Centers (UPCRC): one at the University of California, Berkeley, and one at the University of Illinois, Urbana-Champaign. The goal of these centers is to produce innovative research that helps further the adoption and use of multicore parallel computers by developing new techniques for parallel programming and new end-user applications that can exploit these computers. Professors Patterson and Hwu describe the research that is ongoing at each of their institutions. | Baker | |
Education and Scholarly Communication Advances in Tablet Computing: From Research to Application Interactive Classroom for Microsoft Office | Chris Moffatt, Microsoft The Promise of Pen- and Touch-Computing | Andries van Dam, Brown University For years, Microsoft has invested in breakthrough research in the arena of Tablet/Pen Computing. This session focuses on two areas: (1) a summary of the results (including demos) of the three-year program of research conducted by the Pen Computing Center at Brown under the direction of Professor Andries van Dam, and (2) some exciting demos from Microsoft’s Education Product Group showing how many of the advanced technologies that stemmed from these investments have now been incorporated into the forthcoming release of Microsoft Office, highlighting the educational potential of this software-plus-form-factor combination. | Cascade | |
Digital Humanities Research—Computationally Intensive Efforts in eHumanities Digital Humanities is currently a vibrant area for innovative and multi-disciplinary research, involving all of the humanistic disciplines and computer and library sciences. Over the course of the past decade, scholars have shifted focus from generating individual repositories of digital data in various formats (plain text, TEI, XML, and so on) to thinking about how this digital data underpins the creation of new knowledge. Research continues to be shaped by the primary materials, but the fact that these materials are now available in digital form allows Humanities researchers, thanks to the various research tools that have been and continue to be developed, to ask different questions. These new questions will shape disciplines and have the potential to revolutionize and change the nature of understanding. Digital Humanities Research at Trinity College Dublin Digital Humanities: an Historical Perspective | Jane Ohlmeyer, Trinity College Dublin Digital Humanities has a long tradition dating back to the 1940s, when IBM funded a project led by Roberto A. Busa that resulted in machines for the automation of the linguistic analysis of written texts. The emergence of the Web in the early 1990s gave fresh impetus to the field, and from the mid-1990s there were a number of major Digital Humanities projects, notably the Rossetti Archive, completed in 2008, and the Valley of the Shadow project. This presentation explores current issues in digital humanities research and how they have been addressed by the research communities in Europe and, more particularly, in Ireland and at Trinity College Dublin. Text-mining and Humanities Research | slides John Unsworth, University of Illinois, Urbana-Champaign What kinds of research questions can humanities scholars address with text-mining tools? What challenges face those who want to build text-mining software for this audience? What preparation do text collections need for this kind of analysis? Who is actually doing this kind of research, and what have their results been? This presentation addresses these and other questions, based on four years of experience in collaborative, multi-institutional projects aimed at building text-mining tools for the digital humanist. | Rainier | |
4:00–4:15 | Break | |
4:15–5:15 | A Call to Action: How Can Technology Help Protect Environmental Ecosystem Services? | slides Sandy Andelman, Vice President, Executive Director of the Tropical Ecology, Assessment and Monitoring Network at Conservation International; Peter Seligmann, Co-Founder, Chairman, and CEO, Conservation International The environmental plenary session features a conversation with Peter Seligmann, the chief executive officer of Conservation International, and Sandy Andelman, Vice President and Executive Director of the Tropical Ecology, Assessment and Monitoring (TEAM) Network, on the most urgent issues facing the environment and the role that technology can play in the protection of ecosystem services. Drawing from decades of experience, they discuss how climate change will impact the basic services that people depend on, including food, water, culture, and national security, and how the research community can accelerate breakthroughs in the way we understand and address environmental degradation. | Kodiak
5:30–6:00 | Travel to Kirkland | |
6:30–9:00 | Dinner Cruise Around Lake Washington |
Tuesday, July 14
Time | Event/Topic | Location |
---|---|---|
8:00–9:00 | Continental Breakfast | Hood |
9:00–10:00 | Opening Plenary Session Research in the 21st Century | slides Rick Rashid, Senior Vice President, Microsoft Research | Kodiak |
10:00–1:00 | DemoFest | McKinley |
11:45–1:00 | Lunch and Brown Bag Sessions | |
Global Outreach Five Years of Faculty Fellowships: A Retrospective | slides Moderator: Tom McMail, Microsoft Research In 2005, Microsoft Research created a fellowship program for research faculty that was designed as an investment in the development of talent critical to the future progress of the computing disciplines. Now, after five years of activity and with 25 fellows named, this session examines some of the successful researchers and activities enabled by the awards as well as future enhancements envisioned for the program. Needles in a Haystack: Reading Human Evolution in the Human Genome Gill Bejerano, Stanford University (2009 Fellow) The genomes of humans and our closest living species allow us to seek the genomic events that drove the unique evolution of our species. One such quest will be described, highlighting the intimate interplay between computation and experiments that allowed it to bear fruit. Some Vignettes from Learning Theory | slides Robert Kleinberg, Cornell University (2008 Fellow) A great deal of recent research on computational learning theory and its applications focuses on a paradigm called “regret minimization.” Regret-minimizing algorithms solve repeated decision problems (for example, which medical treatment to administer to a patient) and learn from their past mistakes, improving their performance as they gain experience. It is possible to design these algorithms to meet surprisingly strong provable worst-case guarantees, but decision problems “in the wild” often force us to reconsider the assumptions underlying these algorithms and to expand the theory in unexpected ways (a minimal sketch of one such regret-minimizing learner appears after Tuesday’s schedule below). In this discussion, we survey a few recent examples that illustrate how the theory is growing and maturing under the influence of applications from domains such as Web search and advertising. Interactive and Collaborative Data Management in the Cloud | slides Magdalena Balazinska, University of Washington (2007 Fellow) The scientific data management landscape is changing. Improvements in instrumentation and simulation software are giving scientists access to data at an unprecedented scale. This data is increasingly being stored in data centers running thousands of commodity servers. This new environment creates significant data management challenges. In addition to efficient query processing, the magnitude of data and queries calls for new query management techniques such as runtime query control, intra-query fault tolerance, query composition support, and seamless query sharing. In this talk, we present our ongoing research efforts to provide scientists the tools they need to analyze data at these new scales and in these new environments. We also briefly discuss some of the other research projects in our group. | Cascade | |
Core Computer Science Microsoft Cloud Computing Platform | slides Roger Barga, Microsoft Research; Dennis Gannon, Microsoft Research Cloud computing uses data centers to provide on-demand access to services such as data storage and hosted applications that provide scalable Web services and large-scale scientific data analysis. While the architecture of a data center is similar to that of a conventional supercomputer, the two are designed with very different goals. This talk highlights the basic cloud computing system architectures and the application programming models, including general concepts of data center architecture. We examine cloud computing and storage models with a detailed look at the Microsoft Azure cloud computing platform. | Lassen | |
1:00–3:45 | Design Expo The Design Expo is a Microsoft Research forum where the top graduate design institutions showcase their prototype interaction design ideas. Microsoft Research sponsors a semester-long class at leading interdisciplinary design schools and invites the top class projects to present their ideas as part of the Faculty Summit. | Kodiak |
1:00–3:45 | Earth, Energy, and Environment Toward Zero Carbon Energy Production Climate, Energy, and Economy | Mark Abbott, Oregon State University; Mark Z. Jacobson, Stanford University Towards Zero Carbon Energy Production | Michael Totten, Conservation International While the administration of United States President Obama has committed US$1.2 billion to green energy research and development, approximately one thirtieth of the U.S. Department of Energy’s annual budget, both climate change and energy security remain critical problems to solve. How do we avoid investing in energy sources that yield unintended consequences? What if the energy sources that are getting the most attention are between 25 and 1,000 times more polluting than the best available options? The science community is assessing not only the potential for delivering energy for electricity and vehicles from different sources, but also how they affect global warming, human health, energy security, water supply, space requirements, wildlife, water pollution, reliability, and sustainability. Join Stanford University Professor Mark Z. Jacobson, Conservation International Chief Advisor Michael Totten, and Oregon State University Dean and Professor Mark Abbott for an interactive workshop dedicated to using technology to achieve a whole-systems evaluation of competing alternative energy options. | Hood
1:00–2:15 | Break-out Sessions | |
Health and Wellbeing Systems Biology and Transformative Healthcare Moderator: Simon Mercer, Microsoft Research This session investigates the role of computing in the fields of biological sciences and health care. Systems Biology and Biotechnology of Microorganisms: Making Systems Biology Work | slides Sang Yup Lee, Korea Advanced Institute of Science and Technology Systems biology has been changing the paradigm of biological and biotechnological research. It is now possible to perform so-called systems metabolic engineering by integrating metabolic engineering with systems biology. This lecture presents general strategies for systems metabolic engineering and several examples of the production of various bioproducts. Systems metabolic engineering can be considered one of the success stories of systems biology, and will become an essential strategy for developing various microbial processes for the production of chemicals and materials, thus helping us move toward a sustainable bio-based economy. Interpreting Personalized Genetic Information Pathway Association Analysis | Trey Ideker, University of California Although genome-wide association studies (GWAS) are rapidly increasing in number, numerous challenges persist in identifying and explaining the associations between loci and quantitative phenotypes. This project is developing tools to integrate gene association data with protein network information to identify the pathways underlying a patient’s genotype. These methods will elevate the study of gene association to a new study of pathway association. The project is joint work with Richard Karp in the EECS Department at the University of California, Berkeley. Our proposed solution is to explain the associations captured by GWAS in terms of known gene and protein interactions. New technologies have provided a wealth of interaction data ranging from the proteome (protein-protein interaction networks) to the transcriptome (protein-DNA interactions) to the metabolome (metabolic pathways). We will develop computational tools that query these independent networks to identify pathways and sub-networks of interactions underlying the observed set of genome-wide associations. This framework is intended to improve the power of current GWAS by identifying genes in loci with borderline significance that nonetheless have close network proximity to significant genes. Furthermore, it will provide a list of putative physical pathways incorporating the causal genes necessary to affect the phenotype. | St. Helens | |
Core Computer Science Panel: Energy-Efficient Computing: Hype or Science? Moderator: Feng Zhao, Microsoft Research | slides Energy-Efficient Computing: Hype or Science? | Trishul Chilimbi, Microsoft Research Energy-Efficient Computing: Emerging Technologies | Fred Chong, University of California, Santa Barbara Three Observations and Three Lessons from Embedded Systems | Rajesh Gupta, University of California, San Diego Lifting the Energy Veil | Philip Levis, Stanford University Energy-Efficient Computing: Hype or Science? | Chuck Thacker, Microsoft Research This panel provides a forum for lively debate about the directions, challenges, and ideas about building energy-efficient computing systems. The experts examine energy and power issues in hardware and systems design, interconnect and optics, networking fabric, embedded systems, and software design. | Baker | |
Global Outreach Highlights from Asia on eScience | slides Moderator: Lolan Song, Microsoft Research This session presents some of the highlights from eScience research in Asia. Three speakers from universities in the Asia-Pacific region talk about their research work in the environment, bioinformatics, and other areas. The Health-e-Waterways Project – An Exemplar Model for Environmental Monitoring and Resource Management | slides Jane Hunter, University of Queensland Numerous state, national, and international agencies are advocating the need for standardized frameworks and procedures for environmental accounting. The Health-e-Waterways project provides an ideal model for delivering a standardized approach to the aggregation of ecosystem health monitoring data and the generation of dynamic, interactive reports (that link back to the raw data sets). The system combines Microsoft Virtual Earth and Microsoft Silverlight to present environmental reports that not only save agencies significant time and money, but can also be used to guide regional, state, and national environmental policy development. Current work includes linking management action strategies to specific spatio-temporal indicators to identify the extent of impact of management actions and investments—enabling adaptive management strategies based on environmental outcomes. A Semantic and “Kansei” Computing System for Analyzing Global Environments | slides Yasushi Kiyoki, Keio University In the design of multimedia database systems, one of the most important issues is how to search and analyze media data (images, music, video, and documents) according to users’ impressions and contexts. We introduce a “Kansei” and semantic associative search method based on our Mathematical Model of Meaning (MMM). The concept of “Kansei” includes several meanings related to sensitive recognition, such as “impression,” “human senses,” “feelings,” “sensitivity,” “psychological reaction,” and “physiological reaction.” This model realizes “Kansei” processing and semantic associative search for media data according to various contexts. This model is applied to compute semantic correlations between images, music, video, and documents dynamically with a context interpretation mechanism. The main feature of this model is to realize semantic associative search and analysis in the 2000-dimensional orthogonal semantic space with semantic projection functions. This space is created for dynamically computing semantic equivalence or similarity between media data. One of the important applications of MMM is Global Environment-Analysis, which aims to evaluate various influences caused by natural disasters in global environments. We have performed several experiments with a global environment-analysis system based on MMM for natural disasters, especially mud-flow disasters. Those results show the feasibility and effectiveness of our Semantic Computing System with MMM for realizing deep analysis of global environments. Computational Challenges in Analyzing Complex Traits | slides Jun Zhu, Zhejiang University Most important human diseases and economically important animal and plant traits are complex traits controlled by multiple genes with gene-to-gene interaction (epistasis) and gene-to-environment interaction (GE). Detection of polygenes with fixed gene effects and random GE interaction effects is often achieved with a mixed-linear-model approach, a statistical method involving enormous computation of many inverses of an (n×n) matrix.
Genes are located on chromosomes, so a two-dimensional presentation is needed for multiple genes with gene-to-gene interaction. Since genes are expressed differently during developmental stages and across various environments, the graphic presentation of dynamic gene expression is another type of challenge for bio-computation. | Lassen | |
Education and Scholarly Communications Computer Games and Learning: Best Practices Using Games to Teach in Academia and at Microsoft | slides Chris Franz, Microsoft Jennifer Michelstein, Microsoft Ken Perlin, New York University Productivity Games | Ross Smith, Microsoft The Games for Learning Institute is a joint venture with Microsoft Research, New York University, and affiliated New York regional schools. Nine months into its efforts, it has prematurely published its annual report discussing the latest research about how to make great games and how to make great game vehicles for teaching. This talk is complemented by three efforts at Microsoft where product groups are using games to teach the esoteric features of Microsoft software, facilitate learning, and improve software development. See some very cool stuff and learn how to get your kids to love math (as does Ken Perlin) or find out how to use a feature in Microsoft Office Word you have not yet discovered. | Rainier | |
Global Outreach Computer Science Research in Latin America Moderator: Jaime Puente, Microsoft Research Improving Meta-Analysis Based GWAS Through Data Quality Management | slides Raul Ruggia, University of La Republica Defining mappings or indirect relations from genotype to phenotype has long been a challenge for those in the field of biology. The present pace of data generation from the genomic sciences offers unparalleled opportunities in this regard. Prominent examples are Genome-Wide Association Studies (GWAS), which jointly analyze thousands of Single Nucleotide Polymorphisms (SNPs) from chosen populations, looking for associations between a specific disease and a given genomic configuration. However, huge costs and project complexity restrict the application of the GWAS approach. An option to overcome this limitation is to combine different studies, applying the so-called Meta-Analysis approach. Efforts such as the Database of Genotypes and Phenotypes (dbGaP) are intended to provide a uniform repository of such studies. However, retrieving, integrating, and interpreting heterogeneous data sources are daunting tasks. Indeed, most successful meta-analyses rely on sophisticated statistics aided with expert inspection and filtering. This approach is slow, costly, and error-prone (for example, it involves multiple subjective decisions), introducing reproducibility problems. The main goal of our work is to provide a data quality assessment environment for GWAS, which enables a powerful and reliable application of Meta-Analysis. The environment intends to promote this approach by identifying core concepts and elements that would allow model-based, automated, comprehensive, and reproducible data quality management. Furthermore, while Meta-Analysis has been used extensively for combining aggregated data, our approach intends to combine raw data, even from heterogeneous sources. Research at LaFHIS: The Tools and Foundations for Software Engineering Lab at University of Buenos Aires | slides Sebastian Uchitel, University of Buenos Aires The Laboratory on Foundations and Tools for Software Engineering (LaFHIS) within the Department of Computing at the Faculty of Science, University of Buenos Aires, aims to conduct leading-edge research in, and technology transfer of, effective engineering methods, tools, and environments for the development of composite, heterogeneous, and complex software-intensive systems. The group has strong interests in the specification, construction, and verification of software-intensive systems. This talk provides an overview of the research conducted at LaFHIS, which focuses on models and automated analysis. It provides particular insight into the researchers’ work on model checking, scenario-based specifications, partial behavior modeling, and contract validation. This talk also includes descriptions of ongoing collaborative projects with Microsoft Research on model-based testing and program analysis. Advancements of the LACCIR Virtual Institute: 2007–2009 | slides Ignacio Casas, Pontifical Catholic University of Chile With support and sponsorship from Microsoft Research, the Inter-American Development Bank (IADB), and the Organization of American States (OAS), the Latin American and Caribbean Collaborative ICT Research (LACCIR) Virtual Institute was created in May 2007 as a federation of Latin American and Caribbean universities for the advancement of collaborative information and communication technologies (ICT) research applied to the social and economic development of the region.
This presentation provides an account of activities and achievements in terms of regional research projects, graduate student fellowships, collaboration networks, and research indicators to date. | Sonora | |
2:15–2:30 | Break | |
2:30–3:45 | Break-out Sessions | |
Health and Wellbeing Devices, Sensors, and Mobility for Healthcare Moderator: Kristin Tolle, Microsoft Research This session focuses on innovative technologies in the devices, sensors, and mobility space being developed by Microsoft researchers and their external collaborators. Physiological Computing for Human-Computer Interaction and Medical Sensing | slides Desney Tan, Microsoft Research The human body is a complex biological machine and a prolific signal generator. Recent advances in sensing technologies have vastly augmented our ability to decode the signals generated by the body. This talk presents research into utilizing sensors placed on or in the human body in order to create natural and always-available interaction with computers around us. The talk also includes discussion of recent efforts in applying our expertise to build sensors and design experiences centered on medical sensing. Monitoring and Diagnosing Sleep Apnea in the Home | slides Kristin Tolle, Microsoft Research This talk focuses on technology being developed by the Microsoft Research hardware team to help diagnose sleep apnea and other sleep disturbances. For proper diagnosis, patients typically must check into a sleep clinic (hospital) for monitoring. By reinstrumenting many of the sensors used in this controlled environment into a neck cuff, we posit that we can generate accurate predictions of sleep apnea (with multiple data points) in the comfort of a subject’s home. MAUI: Mobile Assistance Using the Internet | slides Victor Bahl, Microsoft Research Seamless augmentation of human cognition requires processing and energy that far outstrip the capabilities of mobile hardware. The CPU, memory, I/O, and energy demands of new world applications greatly exceed the capacity of devices that people are willing to carry or wear for extended periods. On such hardware, improving size, weight, and battery life are higher priorities than enhancing compute power. A mobile device can never be too small, too light, or have too long a battery life! This is not just a temporary limitation of current technology, but is intrinsic to mobility. At any given cost and level of technology, considerations of weight, power, size, and ergonomics will exact a penalty in computational resources. Computation on mobile devices will always be a compromise. Cloud computing suggests an obvious solution: Run the application on a distant high-performance computer or compute cluster and access it over the Internet via a mobile computer. Unfortunately, long WAN latencies hurt the crisp interaction that is so critical for seamless augmentation of human cognition. Humans are acutely sensitive to delay and jitter, and it is very difficult to control these parameters at WAN scale. As latency increases and bandwidth drops, interactive response suffers. This distracts the user and reduces his or her depth of cognitive engagement. | St. Helens | |
Core Computer Science Computational Thinking Enters the Mainstream | slides Moderator: Tom McMail, Microsoft Research The Microsoft Carnegie Mellon Center for Computational Thinking was founded in 2007 to encourage breakthrough research in projects exemplifying this approach to problem solving. This session provides an overview of the investigations conducted at this center over its first two years and presents some interesting possibilities for the future. The Spread of Computational Thinking | slides Peter Lee, Guy Blelloch, Christopher Langmead, Golan Levin, Carnegie Mellon University Every educated person should be able to think computationally. That is the thesis first promoted by Jeannette Wing, which formed the foundation of the Microsoft-supported Center for Computational Thinking. In the same manner that mathematical thinking, global thinking, and so on, are critical for succeeding or even surviving in today’s world, computational thinking addresses problems which would be unsolvable or solved less well without computational advantages and the mindset required to use them most creatively and effectively. As a means for conceptualizing and solving complex problems in a number of domains in both the sciences and humanities, it has received wide attention in the research, teaching, and policy communities. Parallel Thinking Guy Blelloch, Carnegie Mellon University With the advent of multicores, we are riding a third or fourth wave of parallel computing, and, perhaps unlike previous ones, this one will break. Many if not most computer science classes, however, remain case studies in how to push students into thinking sequentially. At the earliest stages, for example, we teach students that taking the dot product of two vectors or merging two lists involves starting at one end and sequentially traversing to the other. In reality, many problems, applications, and even algorithms are inherently parallel (a brief parallel dot-product sketch appears after Tuesday’s schedule below). The languages and models we use, however, push us to describe and conceptualize them sequentially. This talk describes some of the core concepts in parallel algorithms and points out that these ideas transcend any particular model and are thus largely robust against uncertainties in what parallel machines might look like. How programming languages can affect the way we think about algorithms will also be discussed. Ideas from the audience are appreciated. Computational Drug Discovery Christopher Langmead, Carnegie Mellon University We are using Computational Thinking to address the problem of designing drugs that evade resistance. Our approach uses two key abstractions. The first is to model the drug design process as a two-player game. Here, a pharmaceutical company makes a move by introducing a drug against a target molecule. The disease then makes a move by introducing mutations that decrease the binding affinity of the drug, while preserving the biological function of the target. The second abstraction is to model the physics of molecular interactions and the space of possible mutations as a complex probability distribution. This complex distribution is efficiently encoded by using undirected probabilistic graphical models; probabilistic queries are answered by using inference algorithms. This presentation focuses on graphical models used in this work. Music Performance in the Computational Age | slides Roger Dannenberg, Carnegie Mellon University Computing has revolutionized music performance, recording, distribution, and listening. 
To date, most of the revolution has been driven by advances in storage and communication. The next revolution will come from computation, especially interactive real-time systems. We have been exploring how computing can augment musical performance by amateurs and professionals alike. A recent concert featured a 20-piece digital string orchestra playing with a live jazz band. Future work is aimed at interfaces that extend human musical abilities, especially in live performance. Art and Code Golan Levin, Carnegie Mellon University Just as true literacy in English means being able to write as well as read, true literacy in software demands not only knowing how to use commercial software tools, but also knowing how to create new software for oneself and for others. Recently, a new set of visually and musically oriented programming environments (and accompanying pedagogic techniques) has been developed by artists, and for artists. These toolkits—many of which are free, open-source initiatives—have made enormous inroads towards democratizing the education of computational thinking worldwide. With support from the Computational Thinking Center, a conference concerned with “programming environments for artists, young people, and the rest of us” brought together 15 of the key innovators leading significant revolutions in software-arts education, and provided workshops in 11 different arts-programming languages to an extremely diverse new community of creators. | Cascade | |
Education and Scholarly Communications Surface and Multitouch Moving Forward Hrvoje Benko, Microsoft Research; Daniel Wigdor, Microsoft; Andy Wilson, Microsoft Research The Microsoft Surface is being used in some very creative and innovative ways. Discover the potential of this fantastic new platform and see how touch computing can be applied in the future. Microsoft Research and the Microsoft Surface Product Group provide presentations and demos. | Lassen | |
3:45–4:00 | Break | |
4:00–5:15 | Emerging Transformational Changes in Healthcare Computing | slides Michael Gillam, Microsoft Research The foundations for the biggest changes in the future of healthcare are being laid in the field of health information technology today. From the emergence of enterprise computational clouds to the fast-growing area of personally owned digital health records, this session examines the historic medical trends that are defining the most promising areas for success in healthcare computing today. Mr. Feynman Wasn’t Joking | slides Tony Hey, Corporate Vice President, External Research | Kodiak
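
The regret-minimization paradigm mentioned in Robert Kleinberg's learning-theory abstract (Tuesday, 11:45–1:00) can be made concrete with a short sketch. The Python fragment below is a minimal, hypothetical illustration of one standard regret-minimizing learner, the multiplicative-weights (Hedge) rule; it is not taken from the talk, and the two-action loss sequence is invented purely for demonstration.

```python
import math
import random

def hedge(losses, eta=0.1):
    """Multiplicative-weights (Hedge) learner for a repeated decision problem.

    losses: T rounds of per-action losses in [0, 1] (a list of lists).
    Returns the learner's total loss over the T rounds.
    """
    k = len(losses[0])
    weights = [1.0] * k
    total = 0.0
    for round_losses in losses:
        z = sum(weights)
        probs = [w / z for w in weights]
        # Pick an action at random according to the current weights.
        action = random.choices(range(k), weights=probs)[0]
        total += round_losses[action]
        # Shrink each action's weight in proportion to the loss it just incurred.
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, round_losses)]
    return total

# Hypothetical two-action example: the better "treatment" switches halfway through.
T = 1000
losses = [[0.2, 0.7] if t < T // 2 else [0.8, 0.3] for t in range(T)]
algo_loss = hedge(losses)
best_fixed = min(sum(l[a] for l in losses) for a in range(2))
print(f"learner: {algo_loss:.1f}  best fixed action: {best_fixed:.1f}  regret: {algo_loss - best_fixed:.1f}")
```

Against any sequence of bounded losses, a learner of this kind provably keeps its total loss close to that of the best single action in hindsight, which is the flavor of worst-case guarantee the abstract refers to.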
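
Guy Blelloch's "Parallel Thinking" abstract (Tuesday, 2:30–3:45) uses the dot product as an example of a computation that is taught sequentially but is inherently parallel. The sketch below is an assumed illustration rather than material from the talk: it computes the dot product as independent partial sums combined at the end, and the chunking scheme and worker count are arbitrary illustrative choices.

```python
from concurrent.futures import ProcessPoolExecutor
from operator import mul

def chunk_dot(pair):
    """Partial dot product over one chunk of the two vectors."""
    xs, ys = pair
    return sum(map(mul, xs, ys))

def parallel_dot(x, y, workers=4):
    """Dot product computed as independent partial sums, combined at the end.

    Each chunk can be evaluated on a separate core because the partial sums
    are order-independent, unlike the left-to-right textbook loop.
    """
    n = len(x)
    step = (n + workers - 1) // workers
    chunks = [(x[i:i + step], y[i:i + step]) for i in range(0, n, step)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_dot, chunks))

if __name__ == "__main__":
    x = [float(i) for i in range(100_000)]
    y = [2.0] * len(x)
    print(parallel_dot(x, y))  # same value as sum(a * b for a, b in zip(x, y))
```

The point of the example is the decomposition, not the library: because the partial sums commute, they can be computed in any order or on any number of cores and then reduced in a single final step.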