EmojiGrid

Source: Wikipedia, the free encyclopedia.

The EmojiGrid is an affective self-report tool consisting of a rectangular grid labelled with emoji. The facial expressions of the emoji labels vary from disliking via neutral to liking along the x-axis, and gradually increase in intensity along the y-axis. To report their affective appraisal of a given stimulus, users mark the location inside the grid that best represents their impression. The EmojiGrid can be used either as a paper-based or as a computer-based response tool. The images needed to implement the EmojiGrid are freely available from the OSF repository.
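The mark a user places inside the grid can be converted directly into a pair of scores. As a minimal sketch (the 1–9 output range and the top-left pixel origin are illustrative assumptions, not prescribed by the EmojiGrid itself):

```python
def grid_to_scores(x_px, y_px, grid_w, grid_h, scale_min=1.0, scale_max=9.0):
    """Map a mark at pixel (x_px, y_px) inside a grid of size
    grid_w x grid_h to (valence, arousal) scores.

    The pixel origin is assumed to be the top-left corner (as in most
    image toolkits), so the y-axis is flipped: a mark higher up the
    grid yields a higher arousal score.
    """
    span = scale_max - scale_min
    valence = scale_min + (x_px / grid_w) * span          # left = disliking, right = liking
    arousal = scale_min + (1.0 - y_px / grid_h) * span    # bottom = calm, top = intense
    return valence, arousal
```

Under these assumptions, a mark at the centre of a 500 × 500 grid maps to the neutral midpoint (5.0, 5.0).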

The EmojiGrid: an emoji-labelled Valence (horizontal axis) × Arousal (vertical axis) self-report tool.

Applications

The EmojiGrid was inspired by Russell's Affect Grid[1] and was originally developed and validated for the affective appraisal of food stimuli,[2] since conventional affective self-report tools (e.g., the Self-Assessment Manikin[3]) are frequently misunderstood in that context.[2][4] It has since been used and validated for the affective appraisal of a wide range of affective stimuli such as images,[5][6] audio and video clips,[7] 360 VR videos,[8] touch events,[9] food,[10] and odors.[11][12][13] It has also been used for the affective analysis of architectural spaces,[14] to assess the affective experience of trail racing,[15] and to assess the emotional face evaluation capability of people with early dementia.[16] Since it is intuitive and language-independent, the EmojiGrid is also suitable for cross-cultural research.[4][17]

Implementation

In a computer-based response tool, the area within the horizontal and vertical grid borders should be responsive (clickable), so that users can report their affective response by pointing and/or clicking inside the grid. In practice, this may be achieved by superimposing (1) a clickable image of the unlabeled grid area on top of (2) a larger image showing the grid area together with the emoji labels. The images needed to implement the EmojiGrid are freely available from the OSF repository. An implementation of the EmojiGrid rating task in the Gorilla experiment builder is freely available from the Gorilla Open Materials platform.
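The two-image arrangement described above reduces to a coordinate translation: a click on the full labelled image only counts if it falls inside the inner, unlabeled grid area. A minimal sketch of that geometry (the offsets and sizes are hypothetical and depend on the actual image files used):

```python
def click_to_grid(click_x, click_y, grid_left, grid_top, grid_w, grid_h):
    """Translate a click on the full labelled EmojiGrid image into
    coordinates relative to the inner (responsive) grid area.

    (grid_left, grid_top) is the top-left corner of the unlabeled grid
    within the larger labelled image.  Clicks on the surrounding
    emoji-label border are ignored and return None.
    """
    gx = click_x - grid_left
    gy = click_y - grid_top
    if 0 <= gx <= grid_w and 0 <= gy <= grid_h:
        return gx, gy          # valid response inside the grid
    return None                # click landed on the label border
```

For example, with a 400 × 400 grid inset 100 pixels from each edge, a click at (150, 150) on the full image maps to (50, 50) inside the grid, while a click at (50, 50) lands on the emoji border and is discarded.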

Further reading

  • P. Kuppens, F. Tuerlinckx, J. A. Russell et al., “The relation between valence and arousal in subjective experience”, Psychological Bulletin, 139(4), 917-940 (2013). doi: 10.1037/a0030811
  • A. M. Mattek, G. L. Wolford, and P. J. Whalen, “A mathematical model captures the structure of subjective affect”, Perspectives on Psychological Science, 12(3), 508-526 (2017). doi: 10.1177/1745691616685863
  • E. Van der Burg, A. Toet, Z. Abbasi et al., “Sequential dependency for affective appraisal of food images”, Humanities and Social Sciences Communications, 8(1), paper nr. 228 (2021). doi: 10.1057/s41599-021-00909-4
  • E. Van der Burg, A. Toet, A.-M. Brouwer et al., “Serial dependence of emotion within and between stimulus sensory modalities”, Multisensory Research, 1-22 (2021). doi: 10.1163/22134808-bja10064

References

  1. S2CID 4837807.
  2. ^ .
  3. .
  4. ^ .
  5. .
  6. .
  7. .
  8. , retrieved 2021-11-28
  9. .
  10. .
  11. .
  12. .
  13. .
  14. ^ Sanatani, R.P. (2020). "User-specific predictive affective modeling for enclosure analysis and design assistance", Imaginable Futures: Design Thinking, and the Scientific Method. 54th International Conference of the Architectural Science Association 2020. Auckland, New Zealand: Architectural Science Association (ANZAScA). pp. 1341–1350.
  15. S2CID 244197232.
  16. .
  17. .