Jennifer Posada HCC 746 Affective Human Computer Interaction
Framework of product experience (Desmet & Hekkert, 2007)
Researchers Desmet and Hekkert (2007) published the paper “Framework of product experience” to contribute to the literature on the recent focus in user-centered design on the affective experience of the user as it relates to their interaction with a product. They offer a framework of product experience developed to help make comparisons between experiential concepts (Desmet & Hekkert, 2007). Their framework involves three components of product experience: aesthetic experience, experience of meaning, and emotional experience.
Impact and Relevance
By developing a framework, Desmet and Hekkert (2007) hope to influence experience design research approaches in ways that may support new “design directions”. Their work builds on past literature that emphasized the connection between affect and physiology and introduced the concept of ‘core affect’ by combining the “affect dimension with physiological arousal into a circular two-dimensional model” (Desmet & Hekkert, 2007). The framework also supports past research on the importance of considering affect in design and on how affect is an integral part of how we experience our world (Desmet & Hekkert, 2007). In this paper, they look at the relationships between the three levels of product experience, including the patterns between the meaning we give stimuli and emotions as well as the patterns between aesthetics and emotion.
At the time of publication, this field of study “attracted the attention in various disciplines involved in product research, such as marketing, consumer research, ergonomics, economics, and engineering” (Desmet & Hekkert, 2007), and it is certainly relevant today given the nature of marketing and the use of technology to attempt to elicit affective responses in users.
Alternative Approaches and What the authors missed
I thought that the researchers did well in supporting their proposed framework with previous literature and providing reasoning for each of the components of product experience. Aesthetic experience as it relates to concepts from social psychology and the effects of attractiveness on the “valuation of other personality attributes” is an interesting connection and one that I noticed in the other papers as well. The authors looked at the use of concepts such as affect and experience and their relevance to product experience. Aesthetic experience is also explained as the capacity to perceive a stimulus as pleasant. Experience of meaning is discussed as involving cognitive processes through which we recognize the “personal and symbolic significance of products” (Desmet & Hekkert, 2007). Finally, they look at aesthetics and emotions, where the experience can give rise to emotions and where users are motivated to seek out pleasurable experiences.
The framework helped identify patterns to look out for in various product experiences in order to better understand their complexity. Desmet and Hekkert (2007) also discuss how cultural and individual differences impact product experience. This work contributes to the area of HCI that we have researched for our project in this class on culture, color, and affect, and how perceptions may differ based on certain values.
Three levels of product emotion (Desmet, 2010)
Desmet (2010) introduces a way to design products that is driven by emotion and draws on the processes that occur during an emotional response to a product. The paper offers three levels of product emotions based on the notion that we react emotionally to stimuli depending on how we appraise them. It is also based on the idea that products generate different classes of emotional stimuli: stimuli relating to the product and its qualities, stimuli presented while doing an activity with the product, and finally self-focused stimuli concerning the effects of using the product (Desmet, 2010). The author states that the framework can be used for designs that aim for “three levels of emotional product appeal” (Desmet, 2010).
Impact and Relevance
Contributions to this area of research are made in the hope of clarifying methods for designing with emotional product appeal. This is even more relevant today, given the use of emotions to persuade users to adopt a product.
Developing another approach to emotion-driven design can help support more user-centered designs by considering the basic appraisal processes we use when interacting with our environment.
It is also relevant due to the situational meanings surrounding products and their use. Specifically, self-focus and the influence of product use on identity seem most relevant today, with consumers or users being drawn to products that are socially just and sustainable, for example, or to products used by a public figure with whom they really identify.
Alternative Approaches and What the authors missed
Again, I felt that this research was pretty clear in identifying why this approach is relevant to how we already experience emotions in other settings. They discussed how feelings are a key factor in emotions, as well as how we tend to adapt our responses depending on the context (Desmet, 2010). I thought the addition of the desire for a future ideal emotional state as a motivator was interesting because it emphasizes the importance of relationships in affect as well, which can be useful when thinking about designing affective agents.
The paper also presents three main emotional forces for how we appraise situations. The first is usefulness, which looks at whether the product supports or harms the user’s goals. The second is pleasantness, the extent to which the design of the product elicits pleasure or pain. The third is rightfulness, which looks at the extent to which the user’s standards for a product are met or not.
The situational meaning of the product is also emphasized with three focuses: product focus, activity focus, and self-focus (Desmet, 2010). Product focus involves the perception of the qualities of the product and how that can elicit emotion. Activity focus looks at how the product helps or hinders an activity, and while not directly related to emotion it is indirectly related because the product is used to achieve a goal. Finally, self-focus looks at the influence of the product on the user’s identity.
The three levels of product experience and their nine associated product-evoked emotions related well to Norman’s model of emotional design, in my opinion. It seems to describe the same processes with different terms. For example, product focus would relate to the processing that occurs at the visceral level, where “physical features dominate” (Kleinsmith, Slide 15, 2022). Activity focus could relate to the behavioral level of processing, which looks at usability and the ability to fulfill a need or goal (Kleinsmith, Slide 15, 2022). Finally, the self-focus situation relates well to the reflective level, where the focus is on the “message, culture, and meaning of the product” (Kleinsmith, Slide 18, 2022).
What is beautiful is usable (Tractinsky et al., 2000)
Tractinsky et al. (2000) conducted a study to see whether users’ initial appraisals of the relationship between usability and aesthetics persist after using a system for a certain period of time. Specifically, they wanted to see whether perceived aesthetics influenced these perceptions or whether it was the actual usability of the system. They present their findings and how they contribute to emerging research at the time emphasizing the relationship between aesthetics and user experience with a product. Tractinsky et al. (2000) refer to the “tension between function and form in HCI” at the time, when usability was emphasized over aesthetics.
Impact and Relevance
This paper contributes data showing a correlation between aesthetics and perceived usability. The impact, in my opinion and mirrored by later work, was that aesthetics surpassed usability as a priority in design, resulting in designs that are aesthetically pleasing but less usable. Zara’s website is a great example of the impact research like this may have had on design: it looks great, but it is complicated to use and not as satisfying as it could be if it helped the user achieve their goals with minimal obstacles. It may also be why so many people are loyal to brands like Apple because of aesthetics, when Android devices may offer more functionality to the user and ultimately be more usable.
Alternative Approaches and What the authors missed
The study involved 132 Industrial Engineering students who participated for extra credit. 67% of participants were male, with an average age of 25. The participants had no prior exposure to information about aesthetic considerations for design. The study looked at subjective valuations of the UI, including aesthetics, usability, and amount of information, both pre- and post-experiment, with the addition of a user satisfaction measure in the post-experiment evaluation. Aesthetics was manipulated at three levels (low, medium, and high) and usability at two levels (low and high). The procedure involved users receiving a 4-digit PIN for the ATM and performing 11 tasks such as account balance inquiry (3x), withdrawal (4x), checking balance and withdrawal (2x), and depositing money (2x). The experiment was conducted in three stages, with the layout displayed three different times, and users were asked to rate it on the three variables being measured. The researchers randomized the order in which layouts were presented to participants as well as how participants were assigned to usability levels. I thought this was good for avoiding an order effect, where participants might perceive one design as more aesthetic than another based on the order of presentation.
A limitation I thought about was that most participants were men (67%). The authors did not address this, suggest that future research was needed, or acknowledge that it was even an issue. It speaks to how the field may have been lacking in diversity at the time of the study. It would also be interesting to see this studied with students from a non-tech background to see if there were any differences in effects.
The findings from this study emphasize the need to study aesthetics as it relates to human-computer interaction, because the results showed that correlations between perceived usability and aesthetics persisted throughout the experiment. The actual usability of the system, surprisingly, did not affect post-experimental perceptions of the system’s usability. I thought this, again, relates to how Apple products and similar ones can be appraised as better and more usable than they actually are. These findings, along with later studies, also likely contributed to the development of persuasive tech and the use of aesthetics or perceived aesthetics in coercive tactics to gain something from users.
Desmet, P., & Hekkert, P. (2007). Framework of product experience. International Journal of Design, 1(1).
Desmet, P. M. A. (2010, March). Three levels of product emotion. In Proceedings of the International Conference on Kansei Engineering and Emotion Research (pp. 236-246).
Kleinsmith, A. (2022). Week 14 lecture: Emotion design + persuasion [Lecture slides].
Interpersonal Synchrony (Delaherche et al., 2012)
Researchers Delaherche et al. (2012) discuss behavioral synchrony and how its importance across various disciplines led to the goal of establishing a set definition of synchrony. A set definition allows for a better understanding of synchrony, which is already viewed as “a complex phenomenon” because of the need to perceive and understand social and communicative signals and adapt continuously to changes in those signals.
In this paper the researchers focus on clarifying synchrony and its role in early childhood and adulthood. This clarification would help support the design of algorithms used in HCI because more information could be gained about components related to synchrony such as “rapport building” and “cooperation efficiency” (Delaherche et al., 2012). They refer to previous definitions of synchrony and methods used to measure it in an attempt to develop a better definition. The methods they mention involve testing surrogate datasets, created by offsetting time, time-shuffling, and pairing mismatched partners, to set a baseline for judging coordination and synchrony. They address the limitations of these methods by questioning whether they are better suited for analyzing a database than for equipping a machine with these skills. They provide suggestions for improvements in measuring synchrony.
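The surrogate-dataset logic the survey describes can be sketched in a few lines: compute a synchrony score for the real dyad, then compare it against scores for time-offset surrogates. This is only an illustrative sketch under assumed inputs, not the survey’s actual method; the score function, signal lengths, and smoothing below are all invented for the example.

```python
import numpy as np

def sync_score(a, b):
    """Toy synchrony score: absolute Pearson correlation of two signals."""
    return abs(np.corrcoef(a, b)[0, 1])

def surrogate_baseline(a, b, n_surrogates=200, seed=1):
    """Chance-level synchrony estimated by time-offsetting one partner.

    Rolling b by a random offset destroys the real temporal alignment
    while preserving each signal's internal structure, which is the idea
    behind the time-offset surrogate datasets the survey reviews.
    """
    rng = np.random.default_rng(seed)
    return np.array([sync_score(a, np.roll(b, int(rng.integers(1, len(b)))))
                     for _ in range(n_surrogates)])

# Hypothetical dyad: partner B's behavior trace closely follows partner A's.
rng = np.random.default_rng(0)
raw = rng.standard_normal(600)
a = np.convolve(raw, np.ones(20) / 20, mode="valid")  # smoothed "behavior" trace
b = a + 0.05 * rng.standard_normal(len(a))            # partner B tracks partner A

observed = sync_score(a, b)
baseline = surrogate_baseline(a, b)
# The dyad counts as synchronous if the observed score beats the surrogates.
p_value = float(np.mean(baseline >= observed))
```

The same comparison works with time-shuffled or mismatched-partner surrogates; only the construction of the baseline changes.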
Impact and Relevance
Delaherche et al. (2012) discuss the role of synchrony in early development, specifically in language learning and social connection, and how relevant this research can be for building affective agents. In fact, synchrony is already being researched in fields relating to machine learning, robotics, and social signal processing to support the goal of building robots that learn through synchronized exchanges the same way infants do.
They also discuss how studying synchrony may also provide information that is useful when looking at psychiatric conditions that affect social abilities.
Alternative Approaches and What the authors missed
This paper was written in 2012, so the literature available on synchrony may have changed since then. Today, there is likely more data on behavioral synchrony, but there may need to be a focus on how the pandemic has affected it. For example, the paper refers to how “there is a limited window of time for the other partner to produce a coordinated behavior” (Delaherche et al., 2012). It would be interesting to see whether there are any differences in behavioral synchrony when you have only interacted with people online for a long period of time and suddenly start interacting in person again. I am especially curious to see if there is a difference in online settings, since within online interactions the window of time to produce a coordinated behavior is larger than it is in person.
Delaherche et al. (2012) discuss future research directions and considerations when looking at behavioral synchrony in the fields of developmental robotics, social robotics, and clinical studies.
The authors raise a few questions about synchrony and future areas that could be explored in depth, such as the appropriateness of breaking down behavior “into small units” (Delaherche et al., 2012). I thought this was interesting because I also questioned this. Even if we were to operationalize it, how could one ensure that it would apply across different cultures that, for example, may speak at different rates? Or, as the researchers mention, there is also the possibility it may change throughout the interaction or by context. It is something to consider given that they referred to extensive studies on synchrony and timing but are suggesting that breaking down synchrony by time may not be as meaningful as was believed. It is also a noteworthy contribution that may suggest other factors should receive more attention when researching and designing for behavioral synchrony.
They also refer to a study finding that the perception of coordination “was more unanimous when coordination was high or very low” and discuss how judges in the study could not reliably rate dyads with medium coordination. They question how synchrony and its dimensions should be determined, and whether it is continuous or discrete. They use this study as an example to show how dimensions of behavioral synchrony may lean more toward a discrete, synchronous vs. non-synchronous distinction (Delaherche et al., 2012). Considering this may be important when designing within human-computer interaction contexts: if a robot’s coordination is not too extreme on either end of the spectrum, the user may still perceive the interaction as synchronous. This reminded me of how some robots may wait too long before responding, making the coordination seem very low, but if the pause before a response were “just right,” so to speak, users might be more likely to view the interaction as synchronous.
Delaherche et al. (2012) also refer to prospects in developmental robotics surrounding behavioral synchrony and approaches for designing it into robots. The first approach they reference involves milestones for future research in developmental robotics relating to social learning theory. The sequence involves the robot learning how to use “non-verbal social cues” to learn language and skills which, when reinforced by its human partner, will allow it to achieve a “synchronized exchange,” because the robot will associate where the user looks with the information it extracts from the user’s speech (Delaherche et al., 2012).
They also refer to another proposed robotic architecture, based on a study that uses a similar “monkey see, monkey do” approach to learning and measured the robot’s ability to gauge the “degree of synchrony” with participants and adapt its behavior accordingly. In that study, behavioral synchrony in arm movement allowed the robot to learn right-left associations.
Future contexts in which this could be studied include participants with autism or depression, as the authors mention, since these are contexts where social abilities can be explored further for therapeutic or research purposes. I would personally be interested in seeing how this could be done to coach social skill deficits, or even as a refresher for interacting with people after the pandemic.
Exploring Skin Conductance Synchronisation (Slovak et al., 2014)
Researchers Slovak et al. (2014) wrote this paper to contribute to existing literature on behavioral synchrony, specifically the synchronization of electrodermal activity (EDA), and to study its correlation with empathy. Their work also provides insight into potential uses of EDA synchrony in human-computer interaction. Consensus on the definition of synchrony was, at the time, unclear and varied throughout the literature. However, the literature Slovak et al. (2014) reviewed seemed to present a common theme: context was a strong influence on any observed differences in synchrony, but it was still unclear why context mattered. Another theme was that most of the data focused on individual biodata in isolation rather than in a real-world context. Their paper presents a study the researchers designed to test whether there were any synchrony patterns between pairs in a conversation in a “quasi-naturalistic” setting, and the implications of the results for HCI.
Impact and Relevance
Behavioral synchrony ties into psychology and social neuroscience, and work has been done to research the relationship that biosignal and nonverbal signal synchrony has with interactions. Slovak et al. (2014) also refer to its relevance to work within HCI that studies empathy and methods for designing technology for use in autism research. They refer to empathy as an area of research that still needed to be addressed in affective computing and social signal processing.
Alternative Approaches and What the authors missed
The study used mixed methods to collect data, with a qualitative analysis of patterns seen in conversations from 20 pairs of friends in a “quasi-naturalistic” environment, where they discussed topics that could elicit empathy to a certain extent. Participants were 23 males and 17 females, with both single-gender and mixed-gender pairings. The researchers measured EDA using skin sensors and recorded the conversations with audio and video setups. Participants were asked to discuss a meaningful topic that they could relate to with a partner, with 5 minutes to think about the topic before beginning the actual conversation. This was the natural phase of the study, meant to give insight into natural synchrony patterns. They also included a treatment called the “ignored period,” which involved one participant ignoring the other during their monologue on the topic. After debriefing and interviewing the participants together, they processed the data using an algorithm that calculated the “value of moment-by-moment physiological concordances and physiological synchrony within a single session” (Slovak et al., 2014).
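A minimal sketch of what a “moment-by-moment” concordance measure might look like is a windowed correlation of local slopes. This is an illustrative approach, not Slovak et al.’s actual algorithm; the trace names, window sizes, and synthetic signals are all assumptions made for the example.

```python
import numpy as np

def windowed_concordance(sig_a, sig_b, window=30, step=5):
    """Correlate the local slopes of two EDA traces, window by window.

    Correlating slopes (first differences) rather than raw levels captures
    whether both signals are rising or falling together at each moment.
    """
    da, db = np.diff(sig_a), np.diff(sig_b)
    scores = []
    for start in range(0, len(da) - window + 1, step):
        wa, wb = da[start:start + window], db[start:start + window]
        if wa.std() > 0 and wb.std() > 0:  # skip flat windows (correlation undefined)
            scores.append(np.corrcoef(wa, wb)[0, 1])
    return np.array(scores)

# Hypothetical traces: both partners' EDA follows a shared arousal drift.
rng = np.random.default_rng(42)
shared = np.cumsum(rng.standard_normal(300)) * 0.01    # shared arousal drift
eda_a = 2.0 + shared + 0.005 * rng.standard_normal(300)
eda_b = 2.0 + shared + 0.005 * rng.standard_normal(300)

scores = windowed_concordance(eda_a, eda_b)
overall = float(scores.mean())   # higher values suggest more concordant moments
```

Keeping the per-window scores, rather than only the session mean, is what makes a measure like this usable for spotting the consistent vs. fluctuating synchrony the authors describe.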
The conversations were held at a pub to simulate a real-world environment; however, the researchers explicitly stated that they made sure participants did not drink alcohol. While I understand that alcohol may be a confounding variable, I thought it would be interesting to have a test group that did drink, simply because that is likely what people are doing at a pub. An alternative could be to use a school common space or something of that nature instead of a pub, where it is still a natural setting in which these real-world interactions can occur. I also felt that future research might attempt to include some Latinx participants if possible, for the sake of representing all demographics; fliers were posted around campus and at the pub, but maybe Latinx students do not frequent the pub they used. I also thought the ignore phase of this experiment could be omitted, or the researchers could ensure that participants do not discuss topics that are too sensitive in this phase, because being ignored may be triggering for the participant, even with debriefing at the end.
Slovak et al. (2014) make a few contributions to HCI including a hypothesis for EDA synchrony as emotional reactivity, discussions about the implications for HCI in everyday contexts, and finally an alternative approach to analyzing interpersonal judgments based on the data they collected.
The key focus of their findings was analyzing if and how observed changes in the interactions correlated with changes in synchrony. They found that high EDA synchrony corresponded with high emotional engagement for both participants during the conversations. Slovak et al. (2014) define emotional engagement as the act of “attending to each other in a focused way”. Their findings also showed that consistent synchrony was seen not only in intimate and vulnerable conversations, but also in everyday conversation topics, as long as emotional engagement was present. For conversations that the participants did not perceive as engaging, the findings showed fluctuating synchrony, with the example of the polite discussion about the participants’ dislike of coconuts; while funny, it was evident that they were just talking about it for the sake of talking. Other findings showed that emotional engagement and the associated consistency of synchrony occurred especially in moments when the speaker was becoming angry or confused and tried asking direct questions of the ignoring participant. This made me wonder if the types of questions asked can increase emotional engagement. For example, direct questions such as “are you listening?” or “do you even care about what I’m talking about?” could elicit some type of physiological response that is not apparent outwardly. I thought this tied well to a study referenced by Delaherche et al. (2012) in which participants showed increases in muscular activity over the cheek region when facing happy facial expressions. What was interesting about that study was that this activity was so subtle it could not be seen with the naked eye, so I wonder if the consistent synchrony seen in the ignoring-participant conversations is tied to this phenomenon.
The research presented by the authors above offers interesting and exciting applications for HCI research. As someone who studied psychology in undergrad, with extensive courses in developmental psychology, it was exciting to see social learning theories applied to other disciplines.
Delaherche, E., Chetouani, M., Mahdhaoui, A., Saint-Georges, C., Viaux, S., & Cohen, D. (2012). Interpersonal synchrony: A survey of evaluation methods across disciplines. IEEE Transactions on Affective Computing, 3(3), 349-365.
Slovák, P., Tennent, P., Reeves, S., & Fitzpatrick, G. (2014, October). Exploring skin conductance synchronisation in everyday interactions. In Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational (pp. 511-520).
Reflective Informatics: Conceptual Dimensions for Designing Technologies of Reflection (Baumer, 2015)
Baumer (2015) contributes to existing research on reflection and provides “concepts and unifying concerns” that can be applied to settings such as health, education, and personal informatics. The purpose of this work is to provide common terminology around reflection so that designers of reflective technology have a common framework to refer to. The author refers to previous work on this topic and draws from each source in order to present commonalities for future designers to reference.
Impact and Relevance
This work is especially relevant as we gain access to more of our data through devices and apps like Fitbit and others similar to it. It also allows future researchers to gain insight into what is meant when “talking about reflection” (Baumer, 2015).
The literature review could be pared down, specifically the philosophy portion, where points could be made more concisely, especially since the language was a bit heavy at some points. I think the author was very thorough in their research to support the commonalities they presented. I do not think they missed anything, and I feel that they addressed their goal relatively well; again, I think it could have been more concise in some areas.
This contributes to the gap in the literature on frameworks for design considerations for reflective technology. It also offers commonalities by which technologies can be designed to “promote, foster, encourage, or support reflection” (Baumer, 2015). These commonalities are breakdown, inquiry, transformation, and evaluation. Breakdown refers to the breaking down of a certain function, where “the system renders that expectation inaccurate” and the user reflects on their expectation of a certain response; an example given is an email being delayed when sent, which “alleviates the onus to respond quickly” and allows the user to reflect. Inquiry considerations in design may involve supporting inquiry by showing the user what they already know but encouraging them to reevaluate the data through a new lens. This reminds me of 23andMe, where you get new “insights” about your DNA such as cultural facts about certain ancestry. The last two commonalities, transformation and evaluation, are, according to Baumer (2015), likely the most challenging for designers. Transformation considerations in design could involve allowing different approaches to making sense of data as the user experiences transformation within themselves; however, doing so may still be difficult. Evaluation design considerations could involve determining how different interventions may affect reflection.
Understanding Self-Reflection: How People Reflect on Personal Data Through Visual Data Exploration (Choe et al., 2017)
Researchers Choe et al. (2017) share their research on self-reflection in HCI with the goal of supporting “self-trackers in reflecting on their data”. They pose two research questions: what is the process of self-reflection, and what are the outcomes of self-reflection? They discuss prior research that differs in how it defines reflection, such as the stage-based model of personal informatics, as well as work that distinguishes two types of reflection: “reflection in action”, through real-time feedback, and “reflection on action”, through aggregated feedback (Choe et al., 2017). They conclude that people reflect in different ways depending on which “self-tracking phase” they are in, as well as when and how feedback is given. They then designed a web-based application called Visual Self that allowed for tracking of personal data, with features such as “timeline visualization, overview, comparison tab, and maps”, and conducted think-aloud sessions where they observed users’ use of the app. The researchers discuss the results of this research, how it contributes to the field of HCI, and future directions for research.
Impact and Relevance
The impact of using technology that allows people to ask questions and explore data is that there are more “opportunities to foster deep self-reflection” (Choe et al., 2017). This ties back to Baumer (2015) and their reference to how reflection could be used in contexts of oppressors and the oppressed to foster potential change. It also has promising opportunities to be used in education or healthcare. Again, since more applications are now available that allow for collection of personal data, design considerations for features that allow self-reflection over this data may be needed.
The researchers used a professional recruiting agency for participants; however, I noticed that only 3 participants were female. I think that if an agency is used, effort should be made to ensure representation across genders to avoid missing any emerging differences or themes. The inclusion criteria required that participants had been “regularly tracking personal data for the past two months or longer” and were already using technology such as Fitbit, Aria, etc. I found this interesting because, while it makes sense to use people who are already accustomed to this type of technology, it would be interesting to see how participants unfamiliar with it would reflect on their data. They also conducted a pre-study questionnaire asking about demographic information. I thought it was interesting that most occupations seemed to require at least a college education, with the exception of the student and the postal worker. I wondered if this sample is truly representative and think future research should make the effort to recruit more diverse participants.
The results from this research were interesting, with some contradicting expectations. For example, Quantified Selfers, “despite their extensive data tracking experience”, did not have much history of reflecting on that data beyond feedback given in real time (Choe et al., 2017). This was interesting because, if experience in data tracking does not always translate to experience with reflection, then the recruitment criteria they used could be omitted in future research. Other findings include that participants enjoyed being able to import data from both current and older devices. I thought this was important for future research and for HCI because we are constantly upgrading our devices, and losing or moving data from previous devices can be a concern. Therefore, I agree with the researchers that an important design consideration would be for each upgrade to support continuity and give users the means to easily transfer their data. I thought this was important as well because designs could also give users autonomy over their data in the event that they want to switch to a different platform; this way they can still import their data into a new system.
Some insight-gaining patterns Choe et al. (2017) found involved the use of visual data exploration and paths on a map acting as memory triggers. Visual data exploration involved users looking at peaks in the graphs to compare different time periods, which allowed participants to look at past behaviors and reflect on why the peaks occurred. My thought was that this could potentially be harmful if someone sees multiple peaks, because they may feel as though they are not making as much progress as they had hoped; this also relates to the “paralysis analysis” referred to by Baumer (2015), where users fixate on reflection and do not come to any “meaningful resolution”. One concrete example: someone with depression or bipolar disorder may see peaks often and may not come to a meaningful or constructive conclusion because they ruminate over the data and its perceived extremeness.
The researchers also found that "paths on a map" helped users remember events that occurred long ago. I really liked this and thought of ways it could help someone with dementia, although you would need to consider whether dementia would impede the user from remembering how to use the reflection application, defeating the purpose. The feature could also work to identify, for example, whether one commute route is more stressful than another. Ultimately, these findings relate specifically to their first research question about the process of self-reflection and revisiting past experiences, and they informed the researchers about features for future iterations that support revisiting past experiences and reflecting on them.
Another pattern they noticed answered their second research question about the outcomes of self-reflection: users would look at a certain time period and explain what happened in the past, such as moments when they were not "taking care of" themselves. Ultimately, this "prompted people to come up with a new interesting question to ask, leading them to visually explore their data to look for an answer," which relates to Baumer's dimension of inquiry; future research may look at supporting questions or offering suggestions for inquiry so users can look within their data for an answer.
Finally, another interesting finding that relates to previous work such as AffectAura is that participants felt it was "unnecessary" to view data daily because changes did not occur that often, reflecting the sentiment that long-term data visualization is most valued.
Both papers give a look at how future applications should design opportunities for users to reflect on their own data, and at methods to support reflection within different contexts rather than focusing on a single context, since methods of reflection, and the feature preferences users have for achieving it, will likely vary.
Baumer, E. P. (2015, April). Reflective informatics: conceptual dimensions for designing technologies of reflection. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (pp. 585-594).
Choe, E. K., Lee, B., Zhu, H., Riche, N. H., & Baur, D. (2017, May). Understanding self-reflection: how people reflect on personal data through visual data exploration. In Proceedings of the 11th EAI International Conference on Pervasive Computing Technologies for Healthcare (pp. 173-182).
Designing Mental Health Technologies that Support the Social Ecosystem of College Students (Lattie et al., 2020)
Researchers Lattie et al. (2020) studied ways to design mental health technology in their paper "Designing Mental Health Technologies that Support the Social Ecosystem of College Students". The purpose of the study was to find solutions to best meet the increased demand for mental health services by specifying how best to design tools for this purpose. They highlight the importance of understanding the contexts in which mental health needs arise so that the technology best serves the user. I thought this was interesting because it gives more information about how influential the environment can be; for example, more support may be desired in a home setting than in a school setting.
The use of digital mental health tools is a possible solution with benefits such as providing services “promptly, efficiently, and cost-effectively” (Lattie et al., 2020). In this study they look at user recommendations for a digital mental health tool design and the influence social considerations have on the likelihood that the user will want to engage with and learn about the tool as well as recommend it to other peers.
Impact and Relevance
According to the researchers, previous work shows there have been "increased reports of mental health problems" in the past 10 years, where college counseling centers have difficulty supporting students who are already in the vulnerable and difficult phase of emerging adulthood (Lattie et al., 2020, p. 235).
Past literature on this phase of development shows that influence from peers affects how interested students are in seeking mental health support; therefore, this research is relevant because it looks at how peers influence design preferences for digital support tools.
The researchers used IRB-approved methods involving co-design workshops and individual interviews to determine desired features for an ideal digital mental health tool. There were 20 college students and 10 college counseling center staff who acted as participants. The participants were screened using a patient health questionnaire, asked to self-report levels of anxiety using the GAD-7, and asked their opinions on college mental health care. After they were interviewed, the researchers presented them with a mock-up of a proposed mental health tool and asked what design preferences they had and what they would like to learn about its functionality.
What the authors missed
Some limitations of the work done by Lattie et al. (2020) involve their sampling methods. Their recruitment relied on students opening the emails advertising the study. Students who are struggling with mental health or are worried about stigma may be less likely to participate, but they would probably benefit the most from a tool like this and offer unique insight into the needs for such a system. The researchers touch on this limitation by mentioning more serious health conditions such as schizophrenia, but I also thought of COVID and the importance of considering it as a health condition, which was not mentioned. Granted, this was published in April 2020, so it is unclear whether the pandemic had already begun during the study. Future studies should include COVID as a health condition to see if it uniquely affects mental health needs.
Another limitation of this study is that a large proportion of the participants were female. While this gives us detailed information about that demographic, it would be interesting to see if there are differences in design preferences among other genders. That would also be more inclusive and would give further nuance to how we understand design preferences.
Their study contributes to existing research in HCI that looks at student mental health and can provide more insight for existing technologies that have not been tested in real-world contexts. Findings from this study showed the "predominant role of known peers, and the ancillary role of unknown peers and non-peers" in which digital health tools are desirable to students, as well as the role peers play in how students learn about these tools (Lattie et al., 2020). These results also provided insight into how students "seek support from different social groups according to nuanced differences in the types of support they seek within specific contexts" (Lattie et al., 2020, p. 2). College allows for increased proximity to all types of friend groups, and living in this context involves "highly embedded social relationships," as the researchers put it, which makes me wonder: if this app is tailored to college students, will they be left with a tool that does not translate to life after college? This seems especially likely since people tend to socialize less and work more than they did in college, and are no longer in close proximity to their peers.
However, the researchers also mention the influence of unknown peers and, possibly because of increased presence online, a "strong desire for connection broadly" (Lattie et al., 2020, p. 7). I think this is an interesting desire, and it may be a recent development, especially since we have access to vast numbers of opinions and people to "connect" with. This is depicted nicely in a quote from a participant: "just knowing that a lot of other people in the United States or in the world are going through whatever you are going through, this is a supportive thing" (Participant #8). It did make me wonder whether this could exaggerate mental health issues if not used properly, similar to the "doomscrolling" done during the early days of the pandemic.
The type of peers a student had, and whether those peers were supportive or unsupportive, was found to influence whether that student sought mental health support, with many participants reporting fear of social rejection as a common barrier. Recommendations were raised to give access to services discreetly, such as with "text-based information resources" (Lattie et al., 2020, p. 9).
Aside from how peers influenced mental health support, the researchers found that peers also played a role in preferences toward certain technologies. They found that some peers encouraged the use of popular meditation applications, which ultimately led to accountability toward using the digital health tool.
Another interesting finding was that the omnipresence of personal technology like social media was "occasionally viewed as contributing to isolating people," which begs the question of whether digital health tools will do the same. In other words, will a user feel like they are still alone because they do not have a real therapist listening to them and reassuring them that they are not alone in their struggle, and that others have gone through the same thing? I would think a future design could use AI to compile the experiences of others to draw from when responding to the user and their issues, similar to how a real therapist draws on training and experience with past clients. This may also increase trust in the tool and its ability to provide quality care.
Recommendations for design functions include being able to use the tool to "build a community". This stood out to me because self-care has been heavily emphasized during the pandemic, and the general consensus seems to be that social connections with a community are more important than we thought, and that self-care alone may not be enough. I also thought about how mental health treatment could be seen as a learning process, as with cognitive behavioral therapy, where you learn methods to cope with issues. Since social factors play a role in learning, they may also play an important role in how digital mental health tools are used. Other recommendations requested the ability to see events occurring at school, which could support this desire to connect and build a community.
Some participants expressed the need to find a counselor who is a "person of color or culturally competent," which contributes to HCI research on the importance of considering culture in design preferences and in how information is presented to users. The culture in question does not have to be racially or ethnically specific; it could also relate to differences in age-group cultures or cultural differences along other dimensions.
Overall, their contributions give excellent insight into the role of socialization and how it affects design preferences for mental health tools for college students.
Sharing Biosignals (Liu et al., 2019)
Researchers Liu et al. (2019) study opportunities for systems to support the sharing of biosignals for "mood-centric" interactions. They look at how sharing biosignals can give insight into emerging communication patterns and into how people understand their own and other people's biosignals (Liu et al., 2019, p. 3).
Impact and Relevance
With the increased use of smartwatches and other wearables and the popularity of personal monitoring features, research in this area can impact how we design future personal monitors and use biosignals to communicate with others. Connecting with others remotely with the use of biosignals may be impactful during this time where we are not able to interact with loved ones and provide another channel of communication that is more meaningful than phone or video calls.
The researchers conducted a two-week field study with 20 pairs who used an app they called Animo. This smartwatch app was built for the Fitbit, due to its compatibility with iOS and Android, and allowed users to represent mood using vector graphics called animos. The animo mood meanings were based on the "valence-arousal circumplex" and, given smartwatch constraints, were simplified to focus on heart rate as a way to measure valence and arousal. My thought was that limiting valence to positive and negative may result in a loss of nuance in how responses can be interpreted, similar to the Affective Diary. The vector graphics used different presentation characteristics such as color, shape, and motion to present mood. The researchers chose to change colors to "encourage questioning of interpretation of Animo" (Liu et al., 2019).
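Since the paper does not spell out Animo's exact model, here is a minimal hypothetical sketch of the general idea only: arousal inferred from how far heart rate rises above a calm baseline, valence collapsed to a positive/negative flag, and the two together selecting a quadrant of the circumplex. The function name, the 10 bpm cutoff, and the quadrant labels are my own assumptions, not Animo's implementation.

```python
def circumplex_quadrant(heart_rate, baseline, valence_positive):
    """Map a heart-rate reading to a crude circumplex quadrant.

    Hypothetical simplification: arousal is 'high' when heart rate
    exceeds the calm baseline by more than 10 bpm; valence is a
    binary flag. An illustration of the concept, not Animo's model.
    """
    high_arousal = heart_rate > baseline + 10
    if high_arousal:
        return "excited" if valence_positive else "tense"
    return "calm" if valence_positive else "sad"

# Baseline of 70 bpm measured during a calming task (made-up numbers)
print(circumplex_quadrant(88, 70, True))   # high arousal, positive -> excited
print(circumplex_quadrant(72, 70, False))  # low arousal, negative -> sad
```

Writing the mapping out this way makes the nuance-loss concern concrete: every reading is forced into one of four labels, so "angry" and "nervous" collapse into the same quadrant.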
Prior to testing the app, participants were onboarded by taking a questionnaire, and their baseline heart rate was measured during a calming task. They were taught how to use the app and left to use it for one week, after which they were given a mid-study survey to measure their initial thoughts on Animo. After they finished the second week of the study, they completed an offboarding process, which involved returning the watch and participating in individual interviews. Here the researchers showed users their five most sent and received animos, along with sketches of future mock-ups for Animo.
What the authors missed
The use of mental health professionals to incorporate enhanced mood detection could be beneficial for future work in this area. Other limitations raised by the researchers were that many of their participants worked at tech companies, so findings may not generalize to non-tech-savvy people. I thought this was important to consider and would add culture as another consideration for how people communicate, since people may have different communication styles that lead them to use biosignals differently.
The researchers also propose response as a new dimension for determining whether simple or complex “rich” biosignals are more appropriate responses.
Liu et al. (2019) found that sharing biosignals gives users social cues about their own emotions and those of others, which is similar to previous work done with AffectAura (McDuff et al., 2012). After using Animo, participants found that connection and communication were easier and more convenient because the app offered reminders that made their partner more "salient". This is likely because it represented the connection to their partners and allowed for communication when it was difficult to use other means (Liu et al., 2019, p. 8). What stood out, and could be another important contribution, is that one participant said "the thing on my wrist was him," which shows that there is indeed some type of meaning placed on this method of communication.
They also found participants were able to create new understanding about emotional states because they felt they could get new information about their partner's emotional state, which adds to existing research that supports status awareness through sharing biosignals (Liu et al., 2019). This consequently clarified ambiguity in a partner's feelings as well, which may be helpful in mental health applications such as social anxiety, where users can see their partner's biosignals and be reassured that the partner is not secretly angry or upset at them.
Interestingly, another finding was that context was important for interpreting color meanings, since red may mean excited or angry depending on the context. I thought this was an important contribution to work on color theory, which describes how color meanings rely heavily on context.
Future work could address the cognitive effort needed when there is limited context, since participants felt the app required more effort than the researchers anticipated. They found that seeing only one state at a time made it hard to track animo states, so they typically based their interpretation on one dimension rather than both.
Researchers also found that intimacy influenced whom users wanted to use this technology with, likely because of the vulnerability involved in sharing heart rate; one participant mentioned they would rather use it with their best friend than with a recent partner. I thought this tied well with the previous paper on digital mental health tools and the feelings of vulnerability around showing deeper emotions. Another promising indication of the utility of sharing biosignals was that animos triggered conversation when users saw one that made them want to transition to a heavier conversation. This could make it easier for people to share that they are struggling and provide a way to get to deeper conversations with less discomfort.
This study is an important contribution to research on future uses for biosignal based communication as well as the important design considerations needed to improve how mood representations are interpreted.
Lattie, E. G., Kornfield, R., Ringland, K. E., Zhang, R., Winquist, N., & Reddy, M. (2020, April). Designing mental health technologies that support the social ecosystem of college students. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-15).
Liu, F., Esparza, M., Pavlovskaia, M., Kaufman, G., Dabbish, L., & Monroy-Hernández, A. (2019). Animo: Sharing biosignals on a smartwatch for lightweight social connection. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 3(1), 1-19.
McDuff, D., Karlson, A., Kapoor, A., Roseway, A., & Czerwinski, M. (2012). AffectAura: an intelligent system for emotional memory. In Proceedings of the SIGCHI conference on human factors in computing systems (pp. 849-858).
Ståhl et al. (2009) conducted research that expanded on the concept of the embodiment of emotions, which essentially considers "bodily experiences" an "integral" part of how we experience the world. Specifically, the purpose of the study was to design a system around the idea that "recollection and perhaps re-living of past bodily experiences" influences personal reflection on emotions (Ståhl et al., 2009). The researchers created a system they called the Affective Diary, an affective system that serves as a metaphor for a physical diary one would write in and use to reflect on experiences. It was designed according to research that supports the relationship between affect and bodily experiences, and to measure these bodily experiences, the researchers operationalized the concept by using biosensors to collect data. They also offer a term for this data: "affective body memorabilia" (Ståhl et al., 2009). To use the Affective Diary system, the participant uses an app and puts on body sensors that measure pulse and skin reactivity to indicate levels of "movement and arousal" (Ståhl et al., 2009). They then use a tablet to log activities such as texts, pictures, and the presence of other phones in the area via Bluetooth tracking, along with the data collected from the biosensors (Ståhl et al., 2009).
The researchers conducted an "exploratory end-user study" with 4 users over several weeks of use and found that the users were "able to identify with the body memorabilia," which allowed for recollection and reflection on their past (Ståhl et al., 2009). Interestingly, the researchers found that 2 users were able to find "patterns" in the body memorabilia data and even "attempted to alter their own behaviors" (Ståhl et al., 2009). This study contributes to the existing literature on designing systems that treat affect as an interaction rather than as simple discrete information processing.
I think this could be an interesting system to use in applications supporting Alzheimer's and dementia patients, especially in the early-onset population. An application for showing users how to identify emotions may also be useful for therapeutic purposes in psychotherapy practices. The point the researchers raised about "leading a user to learn something they did not want to know" when using the Affective Diary is also important, because ideally systems should be designed to improve the user's life rather than have a negative influence. This may be an issue for patients with PTSD as well, because it could trigger flashbacks of an event if they recall why their body memorabilia figure looked a certain way.
While the findings were interesting, future research could use a larger sample, making sure to recruit participants from different cultures where color and emotions may be experienced differently, especially since users interpret their figures by looking at the color and shape of the body memorabilia figures (Ståhl et al., 2009). I thought it was interesting that the use of figures rather than graphs allowed users to identify with the data better, and I agreed with the researchers' point about the need to increase the variety of abstract portrayals of data to appeal to more users (Ståhl et al., 2009).
The use of biosensors that allow for accurate measurement could be useful in future research to avoid the influence of confounding variables like weather and the clothes the user wears. The researchers also pointed out that "large individual differences" can occur, so including questions about sweating and heart rate as things to look out for in participants could help mitigate this (Ståhl et al., 2009).
Researchers Sundström et al. (2007) created a mobile emotional messaging system named eMoto that allowed users to compose messages using "emotional-signaling gestures," incorporating colors and animations to express emotional content. When designing the system, the researchers considered that it needed to be "engaging physically, intellectually, and socially" and resonate with users' "real world experience," as Ståhl et al. (2009) put it. This is important in order for users to become involved in an "affective loop experience"; therefore, the researchers used "cultural probes," which involved giving users materials to report on personal and subjective experiences themselves, to gain insight into what those real-world experiences entail (Sundström et al., 2007). They also used a method called "experience clips," where a spectator obtains data on how the system is used, as well as a technology probe that tells the user the system is unfinished and allows for observation of how the user fills in the gaps over time (Sundström et al., 2007). I thought these methods were great approaches to collecting information about experiences, and they provide important information about how context influences how emotions are expressed.
Their work also mirrors the Affective Diary's use of color and other modalities to express emotions, because the design of eMoto allowed "expressing affect through various modalities, such as colour, shape, animation, characteristics in body posture, sound and haptics" (Sundström et al., 2007). I thought the use of color to express emotions was an interesting recurring theme across our previous readings, and I wonder whether it still applies in other cultural contexts.
This study contributes to the field of HCI by reinforcing the importance of designing for affect as an interaction. Their design moves beyond the view of emotion as a "singular state that exists within one person" when in reality it is a "process between the two friends communicating" (Sundström et al., 2007). Sundström et al. (2007) found that lab studies alone would not be enough to achieve the understanding necessary to design for experiences of emotions, and they emphasize the importance of using methods that encourage reflection on the user's emotional experiences. I think this is a great application of participatory design practices and agree that it would be important for future research.
Ståhl, A., Höök, K., Svensson, M., Taylor, A. S., & Combetto, M. (2009). Experiencing the affective diary. Personal and Ubiquitous Computing, 13(5), 365-378. https://doi.org/10.1007/s00779-008-0202-7
Sundström, P., Ståhl, A., & Höök, K. (2007). In situ informants exploring an emotional mobile messaging system in their everyday practice. International Journal of Human-Computer Studies, 65(4), 388-403. https://doi.org/10.1016/j.ijhcs.2006.11.013
The ability for interactive agents to support humans in negative emotional states was studied by researchers such as Klein et al. (2002) to see whether designs made specifically to support the users were effective in managing emotional states. Their research found users interacted “significantly longer” with the system designed to demonstrate “active listening, empathy, and sympathy” as compared to the control conditions where the agent either ignored the user’s emotions or only allowed the user to report issues (Klein et al., 2002). The impact of these findings could be important for the design of future interactive agents, especially as technology becomes more ingrained in our society allowing for more situations where users become frustrated with systems they interact with. Klein et al. (2002) also reference previous research on how humans already seek ways to manage their emotions with the use of technology whether it is playing music or seeking social interactions, but the issue remains that they have to actively seek this. An interactive agent, in this case, could prove to be helpful by providing users with a playlist based on the emotional input that is given by the user or immediately starting a call with a loved one or therapist.
When it comes to designing a supportive interactive agent, the authors reference strategies based on previous research that supports this goal. An important strategy mentioned, in my opinion, was the ability of the system to "allow for repair if the feedback is judged wrong," in the event that the user has trouble articulating their emotions, especially in a highly emotional state (Klein et al., 2002, p. 125). This may also make the interaction feel less artificial and allow it to flow more like an interaction with a human. It relates to the research the authors reference on the tendency for users to interact with machines "as if they were other people," as well as to respond to computer personalities in the same way (Klein et al., 2002, p. 124).
Isbister and Nass (2000) also contributed to research into how humans respond to systems designed with personalities by conducting an experiment in which users were paired with computer characters. In their study, they found that users tended to prefer interactive agents with personalities that were "complementary, rather than similar, with their own" (Isbister & Nass, 2000). This contributes to the field of affective HCI because the findings contrasted with previous research on the subject. I find it interesting too, given the general belief that we tend to like people or personalities similar to our own. Another relevant contribution is the idea presented by Isbister and Nass (2000) of using an introversion-extraversion scale when looking at social interactions, but dominance-submissiveness scales when looking at joint control and control shifts. This adds nuance to how personality can be understood and then applied to HCI practices, especially since the researchers chose the introversion-extraversion dimension because of its importance in research about nonverbal cues (Isbister & Nass, 2000).
Overall, the researchers contributed useful knowledge about how interactive agents can be designed to provide a “likeable” character and ultimately support users whenever they experience negative emotional states during their interactions with technology.
Isbister, K., & Nass, C. (2000). Consistency of personality in interactive characters: Verbal cues, non-verbal cues, and user characteristics. International Journal of Human-Computer Studies, 53(2), 251–267. https://doi.org/10.1006/ijhc.2000.0368
Klein, J., Moon, Y., & Picard, R. W. (2002). This computer responds to user frustration: Theory, design, and results. Interacting with Computers, 14(2), 119–140. https://doi.org/10.1016/s0953-5438(01)00053-4
Ethical considerations have been a point of discussion in the area of facial recognition technology. While it has been argued that a lot of good can come from emotion-oriented systems, the means by which they become emotion-oriented often involve collecting facial recognition data, which raises the question of whether these means are an invasion of privacy. A recent increase in emotion-oriented technology makes these considerations relevant, given the possibility that such systems may become even more intertwined with our daily lives.
Hernandez et al. (2021) present guidelines for "assessing and minimizing risks of emotion recognition applications" as a way to address ethical concerns about different applications currently in use. They describe how the "lowered barriers for entry" and the "gaps in regulating the usage of this technology" also make it necessary to consider how we use these tools (Hernandez et al., 2021).
The authors discussed applications of this technology, such as in job recruitment, noting that such systems lack "contextual understanding," which may disadvantage the user (Hernandez et al., 2021). I thought this was important to note, especially since not everyone presents the same facial expressions, or even any obvious expressions, as in the case of certain disorders, which the authors could have elaborated on more. They also mentioned the importance of considering "human computer collaboration" and argue that it is important to give any predictions to "an expert operator" so that a person can make the final decision and override the system's limitations (Hernandez et al., 2021). The authors also include the guideline of "transparency," categorized under "informed consent," which addresses the concern users have over how emotion recognition data will be used. The guidelines presented by the authors are a useful contribution to the field of affective HCI because they give people in the field ethical approaches to developing this technology.
The author Cowie (2012) takes a "balanced view" of ethical considerations for emotion-oriented technology and presents the case that some applications "present no ethical problems of note, and others that do." One example is that of "frivolous applications," which are especially relevant in today's society given the prevalence of adding "indicators of emotion to electronic messages," such as smileys, which are generally recognizable (Cowie, 2012). Cowie brings up an important point about "computer anxiety" and its role in disadvantaging users, suggesting that this be addressed with interfaces that are "less demanding" and able to respond to anxiety (Cowie, 2012). The author also emphasizes the importance of avoiding "misrepresenting scientific understanding" by disclosing that developers do not have a specialized understanding of a given "aspect of humanity" (Cowie, 2012). Cowie (2012) argued that demonstrations of emotion-oriented technology can mislead users into believing the creators know more about humanity than they do. I thought that was an important contribution because it acknowledges the importance of transparency in all areas of developing this type of technology, whether in research or in its applications.
Both readings provided a deeper look into applications of this type of technology as well as how approaches to ethical issues are being researched and designed.
Cowie, R. (2012). The good our field can hope to do, the harm it should avoid. IEEE Transactions on Affective Computing, 3(4), 410-423.
Hernandez, J., Lovejoy, J., McDuff, D., Suh, J., O’Brien, T., Sethumadhavan, A., … & Czerwinski, M. (2021, September). Guidelines for Assessing and Minimizing Risks of Emotion Recognition Applications. In 2021 9th International Conference on Affective Computing and Intelligent Interaction (ACII) (pp. 1-8). IEEE.
Affective computing as an emerging field has been interesting to witness, especially in today's society, where wearables let you see your heart rate and even pinpoint exactly when it went through the roof during a stressful event that day. Rosalind Picard (1997) discusses the idea that emotions comprise both physical and cognitive elements. She claims that while there is a connection between physiological responses and emotions, it is important to note that certain physiological responses can occur that are “similar to those in an emotional state” without actually corresponding to an emotion, such as an increased heart rate during exercise (Picard, 1997). I thought this was an important contribution to affective HCI because it raises the question of the context in which a physiological response occurs and the need to design computers to consider that context when assessing what emotion the user is displaying. Brave and Nass (2002) also reference Picard's emphasis on the importance of emotion in HCI and believe that in order to design interfaces that “elicit desired affective states,” designers need to know the causes of emotions and moods, as well as the differences between them.
The reading by McDuff and Czerwinski (2018) expands on the implications and considerations designers need to weigh when designing computers to recognize emotions. The authors take the same position as those previously mentioned, discussing how “systems that respond to social and emotional cues can be more engaging and trusted,” which they believe can be important in automating psychological assessments and evaluations (McDuff & Czerwinski, 2018). I think this can be an important contribution to the field of HCI, especially as a solution to the “demand effect” the authors mention, where behavior changes in response to “cues as to what constitutes appropriate behavior” (McDuff & Czerwinski, 2018). However, a limitation may be that patients, unsure whether the system will exercise discretion if their behavior suggests mental instability, might still change their behavior as a precaution. This also ties into the point brought up by Brave and Nass (2002) about assessing a user's response to an interface and the importance of considering “the biasing effects of a user mood”: they state that “a person in a good mood tends to view everything in a positive light,” with the opposite occurring if they are in a bad mood (Brave & Nass, 2002).
Another interesting aspect of emotional sensing was the use of linguistic style matching to build rapport with the person you are interacting with (McDuff & Czerwinski, 2018). This is applied in HCI through software such as LIWC, which “enables automatic extraction of linguistic style features” that can be used to create an emotional bond with the user. I think the authors made an important point when they mentioned the limitations these systems may have, because they may not capture the “full complexity of human language” (McDuff & Czerwinski, 2018). They also touch on the need for systems to synthesize tone of voice, which ties well with Picard's (1997) discussion of vocal inflection as an important component of recognizing emotion. McDuff and Czerwinski (2018) note that it requires “thousands of lines of dialogue to be recorded” to create a realistic appraisal of emotion, and they suggest machine learning may simplify this task in the future. I believe this is also an important contribution to the field of HCI, as it provides methods to capture nonverbal cues for use in affect recognition.
Each of the authors contributed discussions that give the reader insight into how nuanced emotional processes are, as well as into possible applications in HCI and the ethical implications of creating these technologies.
Brave, S., & Nass, C. (2002). Emotion in Human-Computer Interaction. In The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications. https://doi.org/10.1201/b10368-6
McDuff, D., & Czerwinski, M. (2018). Designing Emotionally Sentient Agents. Communications of the ACM, 61(12), 74-83. https://doi.org/10.1145/3186591
Picard, R. W. (1997). Affective Computing. MIT Press.