Prototype tablet tricked out with sensors just proves Mom was always right: Posture is important!


The mobility of tablets affords interaction from a wide diversity of postures:

Hunched over a desk with brow furrowed in concentration. On the go with the tablet gripped in one hand, while operating it with the other. Or kicked back on a couch to relax with some good old-fashioned Cat vs. Laser Pointer internet-video action.


This dexterity of situation, task, and mood is a big part of what makes tablets so appealing.

But as the scenarios above illustrate, there’s another crucial point: the nature of the activity changes as we move between these contexts.

And these changes are mirrored by the user’s physical posture—how they sit, how closely they engage with the device, which hand holds and which hand manipulates, all the way down to the fine-grained details of how the user grasps the tablet’s bezel between thumb and forefinger.

That’s where the Posture-Aware Interface comes in: applications can transform in compelling ways by directly sensing how people hold and manipulate their tablets, and how they interact with them using multi-touch and digital pen inputs.

Figure 1 – The Posture-Aware Interface adapts to many nuances of how a person holds, touches, or marks up content on their tablet.

Figure 1 above shows a simple application for annotation and mark-up that responds to various contexts sensed by our system:

  • Grasping the tablet summons Thumb Tools to a grip-centric location nearby, such as at the left (Panel A) or bottom (Panel B) edges of the screen.
  • Putting the tablet flat on a desk reverts to a more standard type of toolbar, at a device-centric position near the upper right (Panel C).
  • Planting the preferred hand on the screen to write automatically orients miniature Palm Tools (Panel D) to a convenient hand-centric location.
  • Or laying the pen down on the screen directly accesses its settings for customization (Panel E).
  • Finally, if you set the fingers of your left hand down on the screen, Fan Tools appear with more options—but note how they’re splayed out to the right, for easy access by the pen (Panel F).
  • But if you take the same action with the right hand, the system directly senses this and splays them out to the left instead (Panel G).

These are all simple adaptations, driven by sensors that detect how the user postures, grips, or reaches for the device. And the interface dynamically morphs in response, with smooth animations that make everything feel fluid and responsive.
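To make that dispatch logic concrete, here is a minimal sketch, in Python, of how sensed posture states might map to the tool placements shown in Figure 1. It is purely illustrative; every type and function name here is hypothetical, not the prototype's actual code.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Grip(Enum):
    NONE = auto()         # tablet lying flat on a desk
    LEFT_EDGE = auto()    # gripped along the left bezel
    BOTTOM_EDGE = auto()  # gripped along the bottom bezel

@dataclass
class Posture:
    grip: Grip
    palm_center: tuple[float, float] | None  # resting palm position, if any

def place_tools(posture: Posture) -> str:
    """Choose a layout for the tool palette from the sensed posture."""
    if posture.palm_center is not None:
        return f"Palm Tools near {posture.palm_center}"  # hand-centric
    if posture.grip is Grip.LEFT_EDGE:
        return "Thumb Tools at left edge"                # grip-centric
    if posture.grip is Grip.BOTTOM_EDGE:
        return "Thumb Tools at bottom edge"
    return "standard toolbar at upper right"             # device-centric

print(place_tools(Posture(Grip.LEFT_EDGE, None)))  # Thumb Tools at left edge
```

The point of the sketch is the priority ordering: hand-centric placement trumps grip-centric placement, which trumps the device-centric default.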

Indeed, by being acutely aware of these fine details of context, the Posture-Aware Interface comes to grips with our half-formed thoughts. This is not mind reading, but when the device anticipates your every move, it can certainly feel that way at times.

But it works because how people hold a device, and reach for the screen, reveals their tacit intentions—much in the way that a poker player might give away an all-too-revealing tell by how convincingly they lay down their bet.

Realizing posture-aware sensing (a.k.a. getting our hands dirty with some cool hardware)

All right then, since it’s not powered by card sharks, how does the Posture-Aware Interface achieve this acuteness of observation?

The system combines three distinct sensing modalities:

  1. the full image sensed by the touchscreen
  2. the tilt and motion of the device
  3. the capacitive field surrounding the device itself

The first element uses the image of your hand resting on the touchscreen to reason about what’s going on. In principle this is just like a preschooler exuberantly mashing their paint-laden fingers onto (digital) construction paper. In practice, modern touchscreen interfaces derive entirely from individual finger-contact events—and in the process lose much of the joy and expressiveness of painting with our entire hands in childhood. But by going back to the drawing board (so to speak), our approach brings aspects like hand contact—simply being able to detect your palm resting on the screen while writing—back into the vocabulary of touchscreen interaction.
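As a rough sketch of the general idea, assuming access to the raw capacitance image (which most shipping touch APIs do not expose), palm detection can be as simple as labeling connected regions of contact and flagging the big ones. The thresholds below are invented for illustration, not the prototype's tuning:

```python
import numpy as np
from scipy import ndimage

def classify_contacts(cap_image: np.ndarray, touch_thresh: float = 0.3,
                      palm_area: int = 60) -> list[dict]:
    """Label connected contact regions in a raw capacitance image and
    flag large blobs as palms rather than fingertips.

    cap_image: 2D array of per-cell capacitance, normalized to [0, 1].
    """
    mask = cap_image > touch_thresh      # cells where the hand is touching
    labels, n = ndimage.label(mask)      # connected-component analysis
    contacts = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        contacts.append({
            "centroid": (xs.mean(), ys.mean()),
            "area": int(xs.size),
            "kind": "palm" if xs.size >= palm_area else "finger",
        })
    return contacts
```

A real system would also track blobs over time and consider shape, not just area, but the contrast with single-point touch events is the essential step.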

The second element uses the accelerometer and gyroscope sensors built into modern devices to understand the angle of the screen, and how it’s moving (or not moving). The established use for such sensors is automatic screen rotation—which just happens to be another innovation introduced by Microsoft Research, some 20 years ago. But in the present project we use the sensors to reason about stationary versus mobile use of tablets, allowing graceful transitions between many different physical postures of engagement with a device. And indeed, we can now use the added context of grip to suppress auto-rotation when you lie down on a couch.
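Here is a minimal sketch of this kind of inference, assuming accelerometer samples in g’s: a hand-held tablet always jitters a little, while one resting on a desk does not, so the variance of recent readings separates the two. The window size and threshold are made-up values for illustration:

```python
import math
from collections import deque

class MotionState:
    """Classify the tablet as resting flat vs. held, from accelerometer data."""

    def __init__(self, window: int = 50):
        self.mags = deque(maxlen=window)  # recent |acceleration| magnitudes

    def update(self, ax: float, ay: float, az: float) -> str:
        self.mags.append(math.sqrt(ax * ax + ay * ay + az * az))
        if len(self.mags) < self.mags.maxlen:
            return "unknown"              # not enough samples yet
        mean = sum(self.mags) / len(self.mags)
        var = sum((m - mean) ** 2 for m in self.mags) / len(self.mags)
        return "stationary" if var < 1e-4 else "in-hand"

def allow_auto_rotate(motion: str, gripped: bool) -> bool:
    # Grip context suppresses rotation, e.g. while reading on a couch.
    return motion == "stationary" and not gripped
```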

The third element is perhaps where the real magic is. We built a custom electrode ring—essentially, a series of about 50 copper tape segments under the bezel of the screen—that can detect the capacitance of an approaching hand.

That’s right, the Posture-Aware Interface can sense and respond to your hand even before it touches down on the screen.

It also can separately detect when your hand actually grasps the screen bezel, so that we can tell exactly where you’re gripping it.

Or it can combine the two modes, such as when you’re gripping it with one hand, and touching the screen with the other. In this case, the sensor can further detect which direction your arm is reaching from because your forearm passes over the electrode ring.
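As an illustration of that last trick, here is a sketch under assumptions about the hardware that go beyond what the post specifies (a ring of 50 electrodes indexed clockwise from the top edge, with a per-electrode resting baseline): the electrode whose capacitance rises most is where the forearm crosses the bezel, and that position gives the reach direction.

```python
import math

NUM_ELECTRODES = 50  # copper segments around the bezel, per the post

def reach_direction(readings: list[float], baseline: list[float]) -> str:
    """Guess which side the reaching arm crosses the bezel from.

    readings/baseline: per-electrode capacitance, indexed clockwise
    starting from the middle of the top edge (an assumed convention).
    """
    deltas = [r - b for r, b in zip(readings, baseline)]
    peak = max(range(NUM_ELECTRODES), key=lambda i: deltas[i])
    angle = 2 * math.pi * peak / NUM_ELECTRODES   # clockwise from top
    return "right" if angle < math.pi else "left"  # right half vs. left half
```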

More formally, this super-sensory get-up is known as a Peripheral Electric Field Sensor, if you really want to impress guests at your next dinner party with some arcane capacitance-sensing terminology. And it’s driven by the custom circuitry illustrated below.

Figure 2 – The Peripheral Electric Field Sensor consists of an electrode ring and custom circuitry that can detect when your hands approach or grip the tablet.

All of this may sound quite fancy, but one of the reasons we chose these three sensor elements is that we believe they are all amenable to practical integration with consumer tablets. Indeed, the first two are already widely available; the innovation here is simply to combine them to gain awareness of the posture of the device relative to the user, and then to leverage these insights to drive contextually appropriate adaptations.

Come and experience it for yourself

The Posture-Aware Interface will be presented in full scientific depth at the ACM CHI 2019 Conference on Human Factors in Computing Systems in Glasgow, where it received an Honorable Mention for Best Paper. Check out “Sensing Posture-Aware Pen + Touch Interaction on Tablets” for further details.

The research was conducted by the EPIC (Extended Perception, Interaction, and Cognition) group at Microsoft Research in Redmond, Washington, a team that innovates at the nexus of hardware, software, and human potential.

Research Intern Yang Zhang, who hails from Carnegie Mellon University, led the project (and exhibited unparalleled hardware wizardry!) during his time at Microsoft Research. Other contributors include Michel Pahud, Christian Holz, Haijun Xia, Gierad Laput, Michael McGuffin, Xiao Tu, Andrew Mittereder, Fei Su, William Buxton, and Ken Hinckley.

Posing thoughts

Overall, our work on the Posture-Aware Interface demonstrates how posture awareness can adapt interaction and morph user interface elements to suit the fine-grained context of use for pen and touch interaction on tablets.

Posture awareness includes the ability to sense grip, the angle of the tablet, the presence and orientation of the palm on the screen while writing or sketching, and the direction of reach during touch. And our work shows how just a few simple sensors can achieve this—enabling tablets to more effectively support both mobile and stationary use, and the many postural nuances in-between.

In the meantime, whether you’re setting ergonomic trends, sitting at your workstation tall and straight enough to make a drill sergeant proud, or in a repose of slothful decadence upon your favorite chaise longue, we’re envisioning a world in which your interface accommodates you—and not the other way around.

So strike a pose and read our paper; we’d love to hear what you think, where you stand, and how the idea of a posture-aware interface sits with you.

However you come at the topic, make your momma proud—and remember not to slouch!
