Inducing Novel Sound–Taste Correspondences via an Associative Learning Task

Research output: Contribution to journal › Journal article › Research › peer-review

Documents

  • Fulltext: Final published version, 832 KB, PDF document

Interest in crossmodal correspondences, including those between sounds and tastes, has grown rapidly in recent years. However, the mechanisms underlying these correspondences are not well understood. In the present study (N = 302), we used an associative learning paradigm, based on previous literature, with simple sounds lacking consensual taste associations (i.e., square and triangle wave sounds at 200 Hz) and taste words (i.e., sweet and bitter) to test the influence of two potential mechanisms in establishing sound–taste correspondences and to investigate whether either learning mechanism could give rise to new and long-lasting associations. Specifically, we examined an emotional mediation account (i.e., using sad and happy emoji facial expressions) and a transitive path (i.e., sound–taste correspondence mediated by color, using red and black colored squares). The results revealed that the associative learning paradigm mapping the triangle wave tone onto a happy emoji facial expression induced a novel crossmodal correspondence between this sound and the word sweet. Importantly, this novel association was still present two months after the experimental learning paradigm. None of the other mappings, emotional or transitive, gave rise to any significant associations between sound and taste. These findings provide evidence that new crossmodal correspondences between sounds and tastes can be created by leveraging the affective connection between the two dimensions, helping elucidate the mechanisms underlying these associations. Moreover, they show that such associations can last for several weeks after the experimental session through which they were induced.

Original language: English
Article number: e13421
Journal: Cognitive Science
Volume: 48
Issue number: 3
Number of pages: 31
ISSN: 0364-0213
DOIs
Publication status: Published - 2024

Bibliographical note

Publisher Copyright:
© 2024 The Authors. Cognitive Science published by Wiley Periodicals LLC on behalf of Cognitive Science Society (CSS).

Research areas

  • Associative learning, Color, Crossmodal correspondences, Emotions, Semantic, Sound, Taste

ID: 387697082