Designing Across Senses

Book description

Today we have the ability to connect speech, touch, haptic, and gestural interfaces into products that engage several human senses at once. This practical book draws on examples from current designers and devices to describe how these products blend multiple interface modes into a cohesive user experience.

Authors Christine Park and John Alderman explain the basic principles behind multimodal interaction and introduce the tools you need to root your design in the ways our senses shape experience. This book also includes guides on process, design, and deliverables to help your team get started.

The book covers several topics within multimodal design, including:

  • New Human Factors: learn how human sensory abilities allow us to interact with technology and the physical world
  • New Technologies: explore some of the technologies that enable multimodal interactions, products, and capabilities
  • Multimodal Products: examine different categories of products and learn how they deliver sensory-rich experiences
  • Multimodal Design: learn processes and methodologies for multimodal product design, development, and release

Table of contents

  1. Praise for Designing Across Senses
  2. Preface
    1. What Is This Book About?
    2. Who Should Read This Book
      1. How This Book Is Organized
    3. Why Write a Book About Multimodal Design
    4. Acknowledgments
  3. 1. Returning to Our Senses
    1. If a Tree Falls in the Forest…
    2. The Sound of Violence
    3. Experience Is Physical
    4. People Have Modalities
    5. Devices Have Modes
    6. Human Modalities + Device Modes = Interfaces
    7. Physical Information: The New Data
    8. Sensing, Understanding, Deciding, and Acting: The New Human Factors
      1. Sensing
      2. Understanding and Deciding
      3. Acting
    9. Focus: The New Engagement
    10. Multimodality Makes a Wider Range of Human and Product Behaviors Possible
    11. How Multimodality Affects Design
      1. Creating Usability
      2. Creating Delight, Trust, and Love
    12. Multimodal Design Is Cross-Disciplinary
    13. Summary
  4. 2. The Structure of Multimodal Experiences
    1. The Human Slice of Reality: Umwelt and Sensibility
    2. Assembling Multimodal Experiences: Schemas and Models
    3. The Building Blocks of Multimodal Experience
    4. Summary
  5. 3. Sensing
    1. The Three Main Categories of Stimuli
      1. Electromagnetic
      2. Chemical
      3. Mechanical
    2. Defining the Senses: Dimension, Resolution, and Range
    3. Sensory Focus: Selecting, Filtering, and Prioritizing Information
    4. Reflexes
    5. Our Senses and Their Unique Properties
    6. Vision
      1. Visual Interfaces
    7. Hearing
    8. Auditory Interfaces
    9. Touch (Somatosensory or Tactile Abilities)
      1. Haptic Interfaces (Tactile, Proprioceptive, and Vestibular)
    10. Smell (Olfactory Ability)
      1. Olfactory Interfaces
    11. Taste (Gustatory Ability)
      1. Gustatory Interfaces
    12. Sixth Senses and More
      1. Time and Rhythm
      2. Proprioception and the Vestibular System
    13. Summary
  6. 4. Understanding and Deciding
    1. The Foundations of Understanding: Recognition, Knowledge, Skills, and Narratives
    2. Aware and Non-Aware: Fast and Slow Thinking
    3. Agency: Balancing Self-Control and Problem Solving
    4. Motivation, Delight, Learning, and Reward: Creating Happiness
    5. Summary
  7. 5. Acting
    1. About Anthropometrics
      1. The Origin of Anthropometrics
    2. Task Performance
    3. Nonverbal Communication
    4. Precision Versus Strength
    5. Inferring Versus Designating Intent
    6. Summary
  8. 6. Modalities and Multimodalities
    1. Modalities: How We Use Our Senses
      1. Types of Modalities
    2. We Shape Our Modalities, and They Shape Us
    3. Attributes and Abilities of Modalities
    4. Applying Modalities to Design
    5. Multimodalities
    6. Trusted Version and Performance Optimization
      1. Validation
      2. Integration
      3. A Single Prioritized Sense or Many Together?
      4. How Multimodality Shapes Our Activities and Experiences
    7. Attributes and Abilities of Multimodalities
      1. Focus
      2. Flow
      3. Sequence
      4. Simultaneity
      5. Shift
      6. Transition
      7. Substitution
      8. Translation
      9. Proficiency
    8. Common Categories of Multimodalities
      1. Basic abilities
      2. Orientation and scanning
      3. Hand–eye coordination (visuo-haptic integration)
      4. Social interaction
      5. Performance and athletics
      6. Cognition and analysis
    9. Applying Multimodality to Design
      1. Maintaining Focus
      2. Respecting Cognitive Load
      3. Overcoming Barriers with Substitutions and Translations
      4. Shifts, Interruptions, and Flow
      5. Feedback and Validation
      6. Body Language and Physical Engagement
    10. Summary
  9. 7. The Opportunity: Transforming Existing Products and Developing New Ones
    1. Key Applications of IoT: Monitor, Analyze and Decide, Control and Respond
      1. Functional Categories
      2. Monitor
      3. Analyze and Decide
      4. Control and Respond
    2. “Disruptive” Technologies
      1. Removing Sound—And Putting It Back
      2. Mapping Apps Know Who Is in the Driver’s Seat
    3. Beginning Inquiry
      1. Workflow to Identify Opportunities
    4. Summary
  10. 8. The Elements of Multimodal Design
    1. Using Physical Information
    2. Constructing Knowledge, Interactions, and Narratives
    3. Summary
  11. 9. Modeling Modalities and Mapping User Experiences
    1. Behaviors Shared Between Users and Devices
    2. Demanding Contexts and Complex Interactions Call for Alignment
    3. Experience Maps and Models for Multimodality
      1. Different Maps Cover Different Scopes and Details
    4. Key Considerations of Multimodal Design
      1. Modalities and Senses
      2. Focus
      3. Level of Focus and Engagement Depth
      4. Continuity
      5. Sequence
      6. Shifts
      7. Flow and Habits
      8. Interruptions
      9. Substitutions
      10. Specialized Integration
      11. Knowledge and Skill
      12. Key Contexts in Multimodal Experiences
    5. Example Maps and Models
      1. Experience Map: Transitional Flow
      2. Ecosystem Map
      3. Context Map
      4. Focus Model
      5. Storyboards and Keyframing
      6. Update as Needed
    6. Summary
  12. 10. Form Factors and Configurations
    1. Creating Multimodal Properties
    2. Configuring Interface Modes
    3. Mapping Modal Behaviors to Modal Technologies
      1. Vision-Dominant Activities
      2. Immersive Activities: Screen-Based Experiences and VR
      3. Augmented or Auxiliary Activities: Visual Indicators for Peripheral Information
      4. Augmented Reality Versus Augmented Products: Visual Arrays of Control and Choice
      5. Automated Visual Capabilities
      6. Creating Focal Experiences with Audio and Speech
      7. Personal Sound Experiences
      8. Social Experiences: Broadcast
      9. Conversation Experiences: Speech
      10. Creating Haptic Experiences
    4. Summary
  13. 11. Ecosystems
    1. Device Ecosystems
    2. Information Ecosystems
    3. Physical Ecosystems
    4. Social Ecosystems
    5. Specialized Ecosystems
    6. Cloud Architectures: Distributing Resources Through Connectivity
    7. Ecosystem and Architecture: Applying Ecosystem Resources to Multimodal Design
    8. Sensing Experiences: Answering the Door—A Doorbell, Ring, and the August Lock
    9. Understanding and Deciding Experiences: Determining Distance—A Pedometer, Apple Watch, and Lyft App
    10. Acting Experiences: Writing and Drawing—A Pencil, a Tablet, and the Apple Pencil
    11. Summary
  14. 12. Specifying Modalities: States, Flows, Systems, and Prototypes
    1. Introduction: A Prototype Is a Custom Measuring Tool
    2. Practice Makes Perfect
    3. The Media of Multimodal Products: Information and Interactions, Understandings and Behaviors
    4. The Product Development Process for Multimodal Products
    5. Defining Design Requirements
      1. User Goals, Scenarios and Storyboards, and Use Cases
      2. Pseudocode and Swimlane Logic Flows
    6. Specifying Multimodalities
      1. Synchronous and Asynchronous Modes
      2. Parallel and Integrated Modes
      3. Input/Output MAP
    7. Summary
  15. 13. Releasing Multimodal Products: Validation and Learning
    1. Release Is About Process
    2. Alpha Release
      1. Organizing Alpha
      2. Learning from Alpha
    3. Beta Release
      1. Organizing Beta
      2. Learning from Beta
    4. Public Release
      1. Validation and Feedback
    5. The Out-of-the-Box Experience
    6. Summary
  16. A. Further Reading
  17. B. Glossary
  18. Index

Product information

  • Title: Designing Across Senses
  • Author(s): Christine W. Park, John Alderman
  • Release date: March 2018
  • Publisher(s): O'Reilly Media, Inc.
  • ISBN: 9781491954195