Praise for Designing Across Senses
“As the web becomes an overlay on our physical reality and everyday objects
come alive with embedded intelligence, we must evolve new ways to interact
with our technology. In this enlightening book, Park and Alderman not only
demystify multimodal design to bring us closer to our machines, but they also
make sense of the startling sensory experiences that keep us human.”
—DAVID PESCOVITZ, RESEARCH DIRECTOR,
INSTITUTE FOR THE FUTURE AND CO-EDITOR,
BOING BOING
“John and Christine have written a UX classic that will be referenced and
pulled on and off the bookshelf for years to come. This book defines and sets
the stage for multimodal design and sensory experiences we are experiencing
today and designing in a not too distant future.”
—KELLY GOTO, CEO AND FOUNDER, GOTOMEDIA
“Technology is breaking away from the screen, and this is the first book that
addresses how voice, virtual reality, and other upcoming interaction models
should be designed. Take a moment to feel this book. When it is in your hands,
you’ll find it insightful. When it’s out of your hands, you’ll miss it dearly.”
—GOLDEN KRISHNA, DESIGN STRATEGIST AND
AUTHOR, THE BEST INTERFACE IS NO INTERFACE
Designing Across Senses
A Multimodal Approach to Product Design
Christine W. Park and John Alderman
Designing Across Senses
by Christine W. Park and John Alderman
Copyright © 2018 Christine Park and John Alderman. All rights reserved.
Printed in the United States of America.
Published by O’Reilly Media, Inc., 1005 Gravenstein
Highway North, Sebastopol, CA 95472.
O’Reilly books may be purchased for educational, business, or sales promotional
use. Online editions are also available for most titles (oreilly.com/safari). For
more information, contact our corporate/institutional sales department: (800)
998-9938 or [email protected].
Acquisitions Editor: Angela Rufino
Cover Designer: Randy Comer
Editor: Angela Rufino
Interior Designers: Ron Bilodeau and Monica Kamsvaag
Production Editor: Melanie Yarbrough
Illustrator: Rebecca Demarest
Proofreader: Amanda Kersey
Compositor: Melanie Yarbrough
Indexer: Lucie Haskins
February 2018: First Edition.
Revision History for the First Edition:
2018-03-07: First release
See http://oreilly.com/catalog/errata.csp?isbn=0636920049500 for release details.
The O’Reilly logo is a registered trademark of O’Reilly Media, Inc. Designing
Across Senses and related trade dress are trademarks of O’Reilly Media, Inc.
Many of the designations used by manufacturers and sellers to distinguish their
products are claimed as trademarks. Where those designations appear in this
book, and O’Reilly Media, Inc., was aware of a trademark claim, the
designations have been printed in caps or initial caps.
Although the publisher and author have used reasonable care in preparing this
book, the information it contains is distributed “as is” and without warranties of
any kind. This book is not intended as legal or financial advice, and not all of the
recommendations may be suitable for your situation. Professional legal and
financial advisors should be consulted, as needed. Neither the publisher nor the
author shall be liable for any costs, expenses, or damages resulting from use of
or reliance on the information contained in this book.
978-1-491-95424-9
[LSI]
Preface
What Is This Book About?
FROM THE KEYBOARD, MOUSE, and touchscreen, to voice-enabled
assistants and virtual reality, we have never had more ways to interact with
technology. Called modes, they allow people to enter input and receive output
from their devices. These inputs and outputs are often designed together in sets
to create cohesive user interfaces (UIs). These modes reflect the way our senses,
cognitive functions, and motor skills also work together in sets called modalities.
Human modalities have existed for far longer than our interface modes, and they
enable us to interact with the physical world. Our devices are only beginning to
catch up to us. We can now jump and move around in our living rooms to play a
game using Microsoft’s motion-tracking peripheral, Kinect. We can ask
Domino’s to deliver a pizza using the Amazon Echo.
We often use several modalities together in our daily activities, and when our
devices can do the same, they are considered multimodal UIs. Most UIs are
already multimodal, but because they’re so familiar we don’t tend to think of
them that way. In fact, almost all designed products and environments are
multimodal. We see a door and knock on it, waiting for it to open or to hear
someone inside ask who it is. We use our fingers to type on a keyboard, and see
characters appear on the screen in front of our eyes. We ask Siri a question and
see the oscilloscope-like waveform that lets us know we are being heard. We
receive a phone call and feel the vibration, hear the ringtone, and see the name of
the person on the screen in front of us. We play a video game and are immersed
in sensory information from the screen, speakers, and the rumble shock
controller in our hands.
Multimodal products blend different interface modes together cohesively. They
allow us to experience technology the same way we experience our everyday
lives: across our senses. Good multimodal design helps us stay focused on what
we are doing. Bad multimodal design distracts us with clumsy or disjointed
interactions and irrelevant information. It pulls us out of our personal experience
in ways that are at best irritating and at worst dangerous.
As technology is incorporated into more contexts and activities in our lives, new
types of interfaces are rapidly emerging. Product designers and their teams are
challenged to blend modalities in new combinations for new products in
emerging categories. They are being asked to add new modalities to the growing
number of devices we use every day. This book provides these teams with an
approach to designing multimodal interactions. It describes the human factors of
multimodal experiences, starting with the senses and how we use them to
interact with both physical and digital information. The book then explores the
opportunities represented by different kinds of multimodal products and the
methodologies for developing and designing them. Following this approach will
develop multimodal experiences for your users. You will be able to deliver
successful products that earn trust, fulfill needs, and inspire delight.
Who Should Read This Book
This book is for designers who are developing or transforming products with
new interface modes, and those who want to. It will extend knowledge, skills,
and process past screen-based approaches, and into the next wave of devices and
technologies. The book also helps teams that are integrating products across
broader physical environments and behaviors. The senses and cognition are the
foundation of all human experience, and understanding them will help blend
physical and digital contexts and activities successfully. Ultimately, it is for
anyone who wants to create better products and services. Our senses are the
gateway to the richness, variety, delight, and meaning in our lives. Knowing how
they work is key to delivering great experiences.
How This Book Is Organized
This book is organized into two parts. Part I covers the human sensory abilities,
how they function, and how we use them to interact with both the physical world
and with technology. It also describes the ways technology fits with human
senses in new interface modes. Part II sets out the flexible process and
methodology of multimodal design. Starting with product definition, it explains
how to identify and assess possibilities for innovation. From there, it describes
the considerations, activities, and deliverables that take a team from concept to
launch. Sprinkled throughout the book are short sections about relevant products
and technologies.
Part I: New Human Factors
Chapter 1 describes how sensing, whether by humans or devices, turns
physical events into useful information. It describes modalities and
multimodalities and how they shape human experience. It describes the
difference between human modalities and device modes and how
together they become interfaces. Finally, it looks at the new human
factors: sensing, understanding, deciding, and acting—important
experiential building blocks for designing any kind of product or
service.
Chapter 2 delves further into the building blocks of experience and how
they relate to more familiar design concepts like affordances and mental
models. The chapter looks at how they are useful for understanding
human experience and how they are applicable to multimodal design.
Chapter 3 looks at how our senses evolved to perceive the diverse types
of matter and energy in the world around us. Designing interfaces
requires an understanding of the user’s senses and the powers,
limitations, characteristics, and expectations that each sense carries.
Chapter 4 is about how cognition is organized by schemas, which lets
us parse and analyze sensory information and then understand and learn
from it.
Chapter 5 is about how our physical form and abilities shape our
interactions, and the considerations they raise in product design.
Chapter 6 digs into modalities and multimodalities: specific patterns of
perception, cognition, and action that enable our behaviors. It introduces
a few rules of thumb for designers creating multimodal interactions
such as respecting cognitive load, supporting focus, maintaining flow,
and allowing feedback and validation.
Part II: Multimodal Design
Chapter 7 explains how to identify opportunities for innovation by
assessing user needs and contexts and reframing current products and
technologies.
Chapter 8 looks at cues, affordances, feedback, feedforward, and
prompts: the palette of multimodal interactions. These elements of
experience describe how people use physical information within an
interaction.
Chapter 9 explores ways to use maps and models to frame
opportunities, contextualize insights, and align project efforts to create
effective multimodal experiences. It builds on existing deliverables like
customer journeys and ecosystem maps, and introduces new ones like
focus models.
Chapter 10 describes the interplay of physical design and technology
capabilities during product development. It emphasizes the need to map
interface modes to the required modalities within user behaviors, and
compare different mappings across products.
Chapter 11 distinguishes layers of context as ecosystems and looks at
how they affect product usage. It includes four types of ecosystems:
information, physical, social, and device.
Chapter 12 encourages designers to think of the entire design process as
prototyping. It describes deliverables that can be used to specify product
characteristics and usage behaviors.
Chapter 13 describes different ways that products can be released and
identifies how teams can minimize risk, maximize learning, and
increase the chances of a successful product.