User experience evaluation

User experience evaluation (UXE) or user experience assessment (UXA) refers to a collection of methods, skills and tools used to uncover how a person perceives a system (product, service, non-commercial item, or a combination of them) before, during and after interacting with it. Assessing user experience is non-trivial, since user experience is subjective, context-dependent and dynamic over time.[1] For a UXA study to be successful, the researcher has to select the right dimensions, constructs and methods, and target the research to the specific area of interest, such as games, transportation or mobile devices.

Dimensions

There are many different dimensions to consider when choosing the best assessment approach. For example, laboratory experiments may work well for studying a specific aspect of user experience, whereas holistic user experience is best studied over a longer period of time with real users in a natural environment.

Constructs

In all cases, however, there are certain aspects of user experience that researchers are interested in (measures), and certain procedures and techniques used for collecting the data (methods). There are many possible measures; a number of high-level constructs of user experience can serve as the basis for defining them, for example:

  1. Utility: Does the user perceive the functions in the system as useful and fit for the purpose?
  2. Usability: Does the user feel that it is easy and efficient to get things done with the system?
  3. Aesthetics:[2] Does the user see the system as visually attractive? Does it feel pleasurable in hand?
  4. Identification: Can the user identify with the product? Do they feel they look good when using it?
  5. Stimulation: Does the system give the user inspiration or "wow" experiences?
  6. Value: Is the system important to the user? What value does it hold for them?

To properly evaluate user experience, metrics and other factors surrounding a study need to be taken into account, for example (a minimal computation sketch follows this list):

  • Data (metrics): objective measurements, such as the time taken to complete a task.
  • Scale (metrics): rating-scale indicators of effectiveness, efficiency and satisfaction.
  • Other factors: conditions of use, the surrounding environment and other human factors.
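
As a rough illustration of how such metrics come together, the following Python sketch computes effectiveness (task completion rate), efficiency (time on task) and a satisfaction rating from hypothetical test data. The task name, record fields and the 1-5 satisfaction scale are assumptions made for the example, not part of any particular standard.

```python
# Minimal sketch: common usability metrics from hypothetical task-based test data.
from statistics import mean

# Each record: one participant attempting one task (values are invented).
sessions = [
    {"task": "checkout", "completed": True,  "seconds": 74,  "satisfaction": 4},
    {"task": "checkout", "completed": False, "seconds": 180, "satisfaction": 2},
    {"task": "checkout", "completed": True,  "seconds": 91,  "satisfaction": 5},
]

# Effectiveness: share of attempts that ended in task success.
effectiveness = sum(s["completed"] for s in sessions) / len(sessions)

# Efficiency: mean time on task for successful attempts only.
efficiency = mean(s["seconds"] for s in sessions if s["completed"])

# Satisfaction: mean rating on the (assumed) 1-5 post-task scale.
satisfaction = mean(s["satisfaction"] for s in sessions)

print(f"Effectiveness: {effectiveness:.0%}")
print(f"Efficiency (mean time on task): {efficiency:.0f} s")
print(f"Satisfaction (1-5): {satisfaction:.1f}")
```

In practice such figures would be reported per task and compared across design iterations or against benchmarks, alongside the contextual factors listed above.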

Methods

An individual method can collect data about a set of specific constructs of user experience. For instance, usability testing is used to collect data about the usability construct.[3] Methods also differ in whether they measure a momentary or episodic experience (i.e., assessing how a person feels about a specific interaction episode or after executing a task) or an experience over time, also known as a longitudinal experience. UXA methods can be classified into three categories: implicit, explicit and creative methods.

Implicit methods

Implicit methods of UX research focus not only on what users say, but also on what they cannot express verbally. Many tools can assist with implicit evaluation, in particular for gathering implicit or objective data. When available, UX researchers use state-of-the-art equipment to uncover these otherwise hidden aspects of the experience.

Examples of implicit evaluation methods and tools:

Explicit methods

Explicit methods of UX research explore what the user is consciously aware of by getting them to reflect on their own feelings or thoughts, and by gathering their views and opinions. Important explicit methods include usability testing and emotion assessment.

Emotion assessment

When investigating momentary user experiences, researchers can evaluate the level of positive affect, negative affect, joy, surprise, frustration, etc. The measures for emotions are bound to the methods used for emotion assessment, but typical emotion measures include valence and arousal. Objective emotion data can be collected through psychophysiological measurements or by observing expressed emotions. Subjective emotion data can be collected using self-report methods, which can be verbal or non-verbal.

 
[Image: The Geneva Emotion Wheel]

Examples of emotion assessment methods:

  • Psychophysiological emotion measurements, which aim to identify emotions from physiological changes in the muscles (e.g. of the face), pupils, skin, heart, brain, etc.
  • Observation of expressed emotions
  • Think aloud protocol can be used for reporting emotions (real-time verbal self-report)
  • Positive and Negative Affect Schedule (PANAS) (retrospective verbal self-report; a scoring sketch follows this list)
  • Geneva emotion wheel[4] (retrospective verbal self-report)
  • Photographic Affect Meter (PAM)[5]
  • Emotion slider[6] (continuous non-verbal self-report)
  • Sensual evaluation instrument (SEI)[7] (snapshot non-verbal self-report)
  • PrEmo, a new version of EmoCards for assessing emotion[8] (snapshot non-verbal self-report)
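
Because PANAS appears in the list above, here is a minimal scoring sketch. The 20 items and the sum-based scoring (10 positive-affect and 10 negative-affect items rated 1-5, giving subscale scores from 10 to 50) follow the commonly published form of the instrument; the example responses are invented.

```python
# Minimal sketch: scoring the Positive and Negative Affect Schedule (PANAS).
# Each subscale is the sum of its 10 items rated 1-5 (range 10-50).

PA_ITEMS = ["interested", "excited", "strong", "enthusiastic", "proud",
            "alert", "inspired", "determined", "attentive", "active"]
NA_ITEMS = ["distressed", "upset", "guilty", "scared", "hostile",
            "irritable", "ashamed", "nervous", "jittery", "afraid"]

def score_panas(ratings: dict) -> tuple:
    """Return (positive_affect, negative_affect) sums from item ratings (1-5)."""
    pa = sum(ratings[item] for item in PA_ITEMS)
    na = sum(ratings[item] for item in NA_ITEMS)
    return pa, na

# Hypothetical responses from one participant after a test session.
responses = {item: 4 for item in PA_ITEMS} | {item: 2 for item in NA_ITEMS}
pa, na = score_panas(responses)
print(f"Positive affect: {pa}/50, negative affect: {na}/50")
```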

Creative methods

Equally important to implicit and explicit methods are creative methods, which the user researcher can use to bring together the design team's view with the target market's dreams, aspirations and ideas of an optimal design. These activities are more open-ended and allow people either to co-create with the engineers/designers or to use their imagination to express their ideal system.

Examples of creative assessment methods

Longitudinal

In contrast to identifying a momentary emotion, longitudinal UXA investigates how a person feels about a system as a whole, after using it for a while.

Examples of longitudinal UXA methods (excluding traditional usability methods):

  • Diary methods[9] for self-reporting experiences during field studies
  • Experience sampling method (ESM)[10] for self-reporting during field studies (a scheduling sketch follows this list)
  • Day reconstruction method (DRM)[11] – story-telling to reveal the meaningful experiences during field studies
  • AttrakDiff[12] questionnaire for overall UX evaluation
  • User experience questionnaire (UEQ) (available in several language versions)[13]
  • Ladder interviews – e.g. to find out attitudes or values behind behaviour or experience
  • Holistic user experience (HUX)[14] identifying the relevant product factors for holistic user experience
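
As referenced in the ESM entry above, the sketch below shows one way to generate a signal-contingent sampling schedule: prompts are placed at semi-random times within a waking window, with a minimum gap between prompts. The window, number of prompts and minimum gap are illustrative assumptions rather than prescribed values.

```python
# Minimal sketch: a signal-contingent experience sampling (ESM) schedule.
import random
from datetime import date, datetime, time, timedelta

def daily_prompts(day: date, n: int = 6,
                  start: time = time(9, 0), end: time = time(21, 0),
                  min_gap: timedelta = timedelta(minutes=45)) -> list:
    """Return n prompt times within [start, end] separated by at least min_gap."""
    window_start = datetime.combine(day, start)
    window_end = datetime.combine(day, end)
    while True:  # resample until the minimum-gap constraint is satisfied
        seconds = sorted(random.uniform(0, (window_end - window_start).total_seconds())
                         for _ in range(n))
        times = [window_start + timedelta(seconds=s) for s in seconds]
        if all(b - a >= min_gap for a, b in zip(times, times[1:])):
            return times

for t in daily_prompts(date.today()):
    print(t.strftime("%H:%M"))
```

A real ESM deployment would deliver these prompts via phone notifications or SMS and log the self-reports alongside the timestamps for later analysis.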

Areas of UXA research

Transportation

Automobiles have come a long way since their beginnings in the late 19th century, and electronics have been one of the major contributors to their safety and convenience. With advances in technology and electronics, car manufacturers have been able to offer a wide variety of services and conveniences: from electronic fuel injection to the global positioning systems now standard in many cars, the auto industry has changed the way people travel from place to place. Understanding how people interact with vehicles today, what contributes to a great driving experience, what their current relationship with the car is, and what place it has in their lives is key to the development of these technologies. This information informs user-centered design practices that generate cohesive, predictive and desirable designs.

Once specific design concepts and ideas are on the table, UXA researchers further explore how people react to them in terms of desirability, findability, usefulness, credibility, accessibility, usability and human-factors metrics. Outcomes of this work include user requirements, concept validation and design guidelines.[15] Researchers have investigated questions such as: could an in-vehicle infotainment (IVI) system with a speech-evoked personality change your relationship with your car?;[16] could an in-car system support unwinding after work?;[17] could in-car solutions address the special needs of children as passengers and assist the parents with the task of driving?;[18] and many others. Additionally, workshops and gatherings of researchers around the world take place to discuss current evaluation techniques and to advance experience research in transportation. An important professional venue for this work is AutomotiveUI, the International Conference on Automotive User Interfaces and Interactive Vehicular Applications.

UXA methods for transportation

As with other UXA work, the choice of method depends largely on the desired outcome and on where the project is in its design cycle. Methods are therefore selected to best suit the research problem, which most often calls for a combination of implicit, explicit and creative approaches. Some methods include:

  • Interviews: both structured and unstructured
  • Diary studies[19]
  • Workload assessment questionnaires (e.g. DALI, the Driving Activity Load Index, adapted from NASA-TLX; a scoring sketch follows this list)
  • Subjective assessment of interface questionnaires (e.g. SASSI, the Subjective Assessment of Speech System Interfaces[20]), which can lead to design guidelines for speech interfaces[15]
  • Experience probing (prototypes, storytelling, storyboards)
  • Co-design activities
  • Observations (e.g. coding for frustration, delight and other non-verbal cues)
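
As noted in the workload item above, here is a minimal Python sketch of "raw" NASA-TLX scoring, which DALI adapts to driving by substituting driving-specific factors. Raw TLX is simply the unweighted mean of the six subscale ratings (0-100); the ratings below are invented, and the weighted, pairwise-comparison variant is omitted.

```python
# Minimal sketch: raw (unweighted) NASA-TLX workload score.
from statistics import mean

TLX_DIMENSIONS = ["mental_demand", "physical_demand", "temporal_demand",
                  "performance", "effort", "frustration"]

def raw_tlx(ratings: dict) -> float:
    """Unweighted mean of the six 0-100 subscale ratings."""
    return mean(ratings[d] for d in TLX_DIMENSIONS)

# Hypothetical ratings from one driver after using an in-car interface.
ratings = {"mental_demand": 65, "physical_demand": 20, "temporal_demand": 55,
           "performance": 30, "effort": 60, "frustration": 45}
print(f"Raw TLX workload: {raw_tlx(ratings):.1f}/100")
```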

Video games

A relatively new pursuit in video game play-testing is UX and usability research. An increasing number of companies, including some of the world's biggest publishers, have begun outsourcing UX evaluation or opening their own in-house labs.[21][22][23] Researchers use a variety of HCI and psychological techniques to examine the effectiveness of the user experience of games during the design process.[24]

There are also some companies starting to use biometrics to measure the relationship between in-game events and the player's emotions and feelings (the UX), such as Player Research and Serco ExperienceLab in the UK,[25][26] and Valve, Electronic Arts, BoltPeters, and VMC Labs in the US and Canada.[27][28][29][30] The interest in this area comes from both academia and industry, sometimes enabling collaborative work.[31][32] Game UX work has been featured at professional venues, such as the Game Developers Conference (GDC).[33][34]

Web design

User experience evaluation has become common practice in web design, especially within organizations implementing user-centered design practices. Through user testing, the user experience is constantly evaluated throughout the whole product design life-cycle.

References

  1. ^ Law, E., Roto, V., Hassenzahl, M., Vermeeren, A., Kort, J.: Understanding, Scoping and Defining User Experience: A Survey Approach. In Proceedings of Human Factors in Computing Systems conference, CHI'09. 4–9 April 2009, Boston, MA, USA (2009)
  2. ^ Moshagen, M. & Thielsch, M. T. (2010). Facets of visual aesthetics. In: International Journal of Human-Computer Studies, 68 (10), 689–709.
  3. ^ Pelt, Mason (23 May 2016). "Stop overthinking UX and try the coffee shop test". venturebeat.com.
  4. ^ Baenziger, T., Tran, V. and Scherer, K.R. (2005). The Emotion Wheel: A Tool for the Verbal Report of Emotional Reactions. Poster presented at the conference of the International Society for Research on Emotions, Bari, Italy.
  5. ^ Pollak, J. P., Adams, P., & Gay, G. (2011). PAM: a photographic affect meter for frequent, in situ measurement of affect. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 725–734). ACM.
  6. ^ Laurans, G., Desmet, P.M.A., & Hekkert, P.P.M. (2009). The emotion slider: a self-report device for the continuous measurement of emotion. Proceedings of the 2009 International Conference on Affective Computing and Intelligent Interaction. Amsterdam, the Netherlands.
  7. ^ Isbister, K., Höök, K., Sharp, M., and Laaksolahti, J. 2006. The sensual evaluation instrument: developing an affective evaluation tool. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (Montréal, Québec, Canada, 22–27 April 2006). CHI '06. ACM, New York, NY, 1163–1172
  8. ^ Desmet, P.M.A., Overbeeke, C.J., Tax, S.J.E.T. (2001). Designing products with added emotional value: development and application of an approach for research through design. The Design Journal, 4(1), 32–47.
  9. ^ Bolger, N., Davis, A., & Rafaeli, E. (2003). Diary methods: Capturing life as it is lived. Annual Review of Psychology, 54, 579–616.
  10. ^ Csikszentmihalyi M, Larson R. (1987). Validity and reliability of the Experience-Sampling Method. Journal of Nervous and Mental Disease. Sep 1987;175(9):526–536.
  11. ^ Kahneman, D., Krueger, A., Schkade, D., Schwarz, N., and Stone, A. (2004). A Survey Method for Characterizing Daily Life Experience: The Day Reconstruction Method. Science. 306:5702, pp. 1776–780.
  12. ^ Hassenzahl, M., Burmester, M., & Koller, F. (2003). AttrakDiff: Ein Fragebogen zur Messung wahrgenommener hedonischer und pragmatischer Qualität. In J.Ziegler & G. Szwillus (Eds.), Mensch & Computer 2003. Interaktion in Bewegung (pp. 187–196). Stuttgart, Leipzig: B.G. Teubner.
  13. ^ Laugwitz, B., Schrepp, M. & Held, T. (2008). Construction and evaluation of a user experience questionnaire. In: Holzinger, A. (Ed.): USAB 2008, LNCS 5298, S. 63-76.
  14. ^ Toussaint, C., Ulrich, S., Toussaint, M. (2012). HUX - Measuring Holistic User Experience. In German UPA e.V., Usability Professionals 2012 - Tagungsband (pp. 90-94).
  15. ^ a b Areti Goulati and Dalila Szostak. 2011. User experience in speech recognition of navigation devices: an assessment. In Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services (MobileHCI '11). ACM, New York, NY, USA, 517-520. DOI=10.1145/2037373.2037451
  16. ^ Jennifer Healey and Dalila Szostak. 2013. Relating to speech evoked car personalities. In CHI '13 Extended Abstracts on Human Factors in Computing Systems (CHI EA '13). ACM, New York, NY, USA, 1653-1658. DOI=10.1145/2468356.2468652
  17. ^ Zoë Terken, Roy Haex, Luuk Beursgens, Elvira Arslanova, Maria Vrachni, Jacques Terken, and Dalila Szostak. 2013. Unwinding after work: an in-car mood induction system for semi-autonomous driving. In Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '13). ACM, New York, NY, USA, 246-249. DOI=10.1145/2516540.2516571
  18. ^ Liang Hiah, Tatiana Sidorenkova, Lilia Perez Romero, Yu-Fang Teh, Ferdy van Varik, Jacques Terken, and Dalila Szostak. 2013. Engaging children in cars through a robot companion. In Proceedings of the 12th International Conference on Interaction Design and Children (IDC '13). ACM, New York, NY, USA, 384-387. DOI=10.1145/2485760.2485815
  19. ^ Lallemand, C. (2012) Dear Diary: Using Diaries to Study User Experience
  20. ^ Kate S. Hone and Robert Graham. (2000). Towards a tool for the Subjective Assessment of Speech System Interfaces (SASSI). Nat. Lang. Eng. 6, 3-4 (September 2000), 287-303. DOI=10.1017/S1351324900002497.
  21. ^ Halo 3: How Microsoft Labs Invented a New Science of Play. Wired.com. Retrieved on 21 October 2011.
  22. ^ Bolt, Nate. (22 January 2009) Researching Video Games the UX Way – Boxes and Arrows: The design behind the design. Boxes and Arrows. Retrieved on 21 October 2011.
  23. ^ THQ Chooses The Guildhall at SMU to House New Usability Lab | games industry | MCV. Mcvuk.com. Retrieved on 21 October 2011.
  24. ^ Hong, T. (2008) Shoot to Thrill: Bio-Sensory Reactions to 3D Shooting Games, Game Developer Magazine, October
  25. ^ GamesIndustry.biz. Player Research. Retrieved on 16 March 2013
  26. ^ Game usability testing. PlayableGames. Retrieved on 21 October 2011.
  27. ^ Valve. Valvesoftware.com. Retrieved on 21 October 2011.
  28. ^ EA Games – Electronic Arts Archived 22 May 2012 at the Wayback Machine. Ea.com. Retrieved on 21 October 2011.
  29. ^ VMC Consulting – Tailored Solutions for Your Business. Vmc.com. Retrieved on 21 October 2011.
  30. ^ Bolt | Peters | Research, design, and products. Boltpeters.com. Retrieved on 21 October 2011.
  31. ^ Nacke, L., Ambinder, M., Canossa, A., Mandryk, R., Stach, T. (2009). "Game Metrics and Biometrics: The Future of Player Experience Research" Panel at Future Play 2009
  32. ^ 8–9 April 2010, Seminar Presentation at Games Research Methods Seminar, "Using physiological measures in conjunction with other UX approaches for better understanding of the player's gameplay experiences", University of Tampere, Finland
  33. ^ Ambinder, M. (2011) Biofeedback in Gameplay: How Valve Measures Physiology to Enhance Gaming Experience. Game Developers Conference 2011
  34. ^ Zammitto, V. (2011) The Science of Play Testing: EA's Methods for User Research. Game Developers Conference 2011