Chapter 3: Conceptualizing Interaction

ⓘ This audio and summary are simplified educational interpretations and are not a substitute for the original text.

The central objective of this chapter is to equip designers with the tools needed to define a clear conceptual model before building a physical interface, so that design ideas undergo a crucial reality check and the project scope is defined early. This early conceptualization requires rigorously identifying and scrutinizing underlying assumptions (factors taken for granted) and claims (assertions that need further investigation), which helps design teams reach a common understanding, or common ground.

A conceptual model is a high-level abstraction that describes how a system operates and the fundamental concepts users must grasp to interact with it, including metaphors, objects, attributes, operations, and the relationships and mappings between these components. Interface metaphors are vital elements of this model: they give abstract tasks a familiar structure, exemplified by classic concepts like the desktop (the foundation of the Xerox Star interface) or contemporary designs like the card metaphor popularized in mobile and social media applications.

The chapter defines five core interaction types that inform design choices:

- Instructing: users issue quick, efficient commands, well suited to repetitive tasks.
- Conversing: a two-way dialogue in which the system acts as a dialogue partner, as in Siri or chatbots.
- Manipulating: users leverage their knowledge of the physical world by interacting with digital objects, the basis of the direct manipulation framework.
- Exploring: users interact by moving through virtual environments (e.g., 3D worlds) or sensor-embedded physical spaces.
- Responding: a proactive system initiates the interaction by providing contextual information or alerts, such as a fitness tracker notifying the user of a milestone or Google Lens identifying objects in photos.
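To make the contrast between the first two interaction types concrete, here is a minimal, purely illustrative Python sketch (the names and behavior are hypothetical, not drawn from the chapter): an instructing-style interface maps a single command directly to an immediate action, while a conversing-style interface keeps dialogue state and responds turn by turn.

```python
# Hypothetical sketch contrasting two interaction types: instructing vs. conversing.

def instruct(command: str) -> str:
    """Instructing: one quick command maps to one immediate action (like a CLI)."""
    actions = {"delete": "Item deleted.", "save": "Item saved."}
    return actions.get(command, "Unknown command.")

class ConversingAgent:
    """Conversing: the system acts as a dialogue partner and keeps context."""
    def __init__(self) -> None:
        self.history: list[str] = []  # dialogue state accumulates across turns

    def reply(self, utterance: str) -> str:
        self.history.append(utterance)
        if "weather" in utterance.lower():
            # A dialogue partner may answer with a clarifying question.
            return "Do you want today's forecast or the week's?"
        return f"I heard: {utterance!r}. How can I help?"

# Instructing is fast and stateless; conversing trades speed for flexibility.
print(instruct("save"))                    # -> Item saved.
agent = ConversingAgent()
print(agent.reply("What's the weather?"))  # -> a clarifying follow-up question
```

The design trade-off the chapter describes shows up directly here: the command table is efficient for repetitive tasks, while the dialogue loop supports open-ended requests at the cost of extra turns.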
Beyond models and metaphors, interaction design is informed by a set of broader concepts:

- Paradigms: general, accepted approaches, such as the shift toward ubiquitous computing, Big Data, and the Internet of Things.
- Visions: future scenarios that inspire research and development, from the historical Knowledge Navigator to modern AI narratives.
- Theories: well-substantiated explanations of human-computer interaction phenomena.
- Models: simplifications of specific aspects of behavior, such as Don Norman's models of cognitive processing.
- Frameworks: specific advice, principles, or concepts for analysis or design, exemplified by Norman's framework on the necessary alignment between the Designer's Model, the System Image, and the User's Model.

Finally, the increasing autonomy afforded by AI and context-aware systems raises a significant design dilemma: how much control should users retain over the technology?