Chapter 9: Data Analysis, Interpretation & Presentation


Data gathered from methods such as interviews, questionnaires, and observation can yield both quantitative data (numerical measures such as age or time spent on a task) and qualitative data (descriptions, quotes, and images). Initial steps for any data set, especially large volumes, include data cleansing to check for anomalies, followed by collating and transcribing raw information into analyzable formats, such as spreadsheets.

Basic quantitative analysis uses numerical methods to ascertain magnitude, drawing on measures like percentages and the three averages: the mean (the common average), the median (the middle value in a ranked list), and the mode (the most frequently occurring value). Careful presentation of quantitative results using graphical representations, such as scatter diagrams, is vital for identifying patterns and outliers (values significantly different from the majority).

Qualitative analysis investigates the nature of something, often represented as themes and patterns, and can be approached inductively (themes emerge from the data, as in thematic analysis) or deductively (pre-existing concepts are used to categorize data elements). Collaborative techniques such as building an affinity diagram help organize diverse ideas and insights into a hierarchical structure of common themes. Another focused qualitative method is critical incident analysis, which isolates pivotal events, both desirable and undesirable, for detailed investigation to inform design and interpretation.
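The basic quantitative measures mentioned above (mean, median, mode, and a simple outlier check) can be sketched with Python's standard library. The task-completion times and the two-standard-deviation rule of thumb are illustrative assumptions, not from the chapter:

```python
# A minimal sketch of basic quantitative analysis: the three averages and a
# simple outlier check. The task-completion times below are hypothetical.
from statistics import mean, median, mode, stdev

times = [12, 15, 15, 18, 20, 15, 95]  # seconds spent on a task

avg = mean(times)      # arithmetic mean: sum divided by count
mid = median(times)    # middle value of the ranked list
common = mode(times)   # most frequently occurring value

# Flag values more than two standard deviations from the mean as candidate
# outliers (one common rule of thumb; visual inspection of a scatter
# diagram is another way to spot them).
spread = stdev(times)
outliers = [t for t in times if abs(t - avg) > 2 * spread]

print(f"mean={avg:.1f}, median={mid}, mode={common}, outliers={outliers}")
```

Note how the single extreme value (95) pulls the mean well above the median and mode, which is why reporting only one kind of average can misrepresent the data.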
More specialized analytic frameworks are employed depending on the required level of granularity. Conversation Analysis focuses on the fine details of spoken interaction, including turn-taking and pauses; Discourse Analysis centers on the underlying meaning conveyed through language and context; Content Analysis systematically classifies data into categories and studies the frequency of their occurrence, often paired with sentiment analysis; Interaction Analysis uses video recordings to inductively study verbal and nonverbal interactions between people and artifacts; and Grounded Theory develops a theoretical framework that is systematically derived, or grounded, in the empirical data through iterative cycles of open, axial, and selective coding.

Finally, systems-based frameworks, such as Socio-technical Systems Theory and the Distributed Cognition of Teamwork (DiCoT), analyze organizational effectiveness and efficiency at a macro level by modeling information flow, physical structure, artifacts, and social structures. Tools such as Excel, NVivo, Dedoose, and SPSS support these activities. Ultimately, presenting findings requires selecting an appropriate style, such as structured notations or stories (narratives), ensuring that conclusions are supported by evidence, and avoiding claims that over-generalize or misrepresent the data.
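As a minimal sketch of the deductive, frequency-counting side of content analysis, the snippet below classifies interview quotes into pre-defined categories by keyword matching and tallies how often each category occurs. The categories, keywords, and quotes are all invented for illustration; a real project would typically use a dedicated tool such as NVivo or Dedoose:

```python
# A toy deductive content analysis: assign each data element (here, a quote)
# to pre-existing categories and count category frequencies.
from collections import Counter

# Hypothetical coding scheme: category -> indicator keywords.
categories = {
    "navigation": ["menu", "back button", "lost"],
    "performance": ["slow", "loading", "lag"],
    "aesthetics": ["color", "font", "layout"],
}

# Hypothetical transcribed quotes from user interviews.
quotes = [
    "The menu was confusing and I got lost",
    "Pages were slow and kept loading forever",
    "I liked the layout but the font was tiny",
    "The back button did nothing",
]

counts = Counter()
for quote in quotes:
    text = quote.lower()
    for category, keywords in categories.items():
        if any(kw in text for kw in keywords):
            counts[category] += 1

print(counts.most_common())
```

The resulting frequencies (here, navigation issues mentioned most often) are the kind of quantitative summary of qualitative data that content analysis produces.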