Evolving in-game mood-expressive music with MetaCompose

Marco Scirea, Julian Togelius, Peter Eklund, Sebastian Risi

Publication: Contribution to book/anthology/report/conference proceeding › Conference contribution in proceedings › Research › Peer-reviewed

Abstract

MetaCompose is a music generator based on a hybrid evolutionary technique that combines FI-2POP and multi-objective optimization. In this paper we employ the MetaCompose music generator to create music in real time that expresses different mood-states in a game-playing environment (Checkers). In particular, this paper focuses on determining whether differences in player experience can be observed when: (i) using affective-dynamic music compared to static music, and (ii) the music supports the game's internal narrative/state. Participants were tasked with playing two games of Checkers while listening to two (out of three) different set-ups of game-related generated music. The possible set-ups were: static expression, consistent affective expression, and random affective expression. During game-play, players wore an E4 wristband, allowing various physiological measures to be recorded, such as blood volume pulse (BVP) and electrodermal activity (EDA). The collected data supports, on three out of four criteria (engagement, music quality, coherence with game excitement, and coherence with performance), the hypothesis that players prefer dynamic affective music when asked to reflect on the current game-state. In the future, this system could allow designers/composers to easily create affective and dynamic soundtracks for interactive applications.
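The FI-2POP technique named in the abstract is a feasible-infeasible two-population genetic algorithm: feasible individuals are selected on the actual objective, infeasible ones on how little they violate the constraints, and offspring migrate to whichever population matches their feasibility. The following toy Python sketch illustrates that core idea only; the genome (a list of integers), the non-negativity constraint, and the sum objective are all hypothetical simplifications, not the authors' MetaCompose implementation.

```python
import random

# Toy FI-2POP sketch: two populations evolve side by side.
# Feasible genomes compete on the objective; infeasible genomes
# compete on (minimal) constraint violation. Children migrate to
# the population matching their own feasibility.

GENOME_LEN = 8
POP_SIZE = 20

def random_genome():
    return [random.randint(-5, 10) for _ in range(GENOME_LEN)]

def is_feasible(g):
    # Hypothetical constraint: all genes non-negative.
    return all(x >= 0 for x in g)

def objective(g):
    # Hypothetical objective, to maximise.
    return sum(g)

def violation(g):
    # Distance from feasibility, to minimise.
    return sum(-x for x in g if x < 0)

def mutate(g):
    g = list(g)
    i = random.randrange(len(g))
    g[i] += random.choice([-1, 1])
    return g

def step(feasible, infeasible):
    # Draw parents from both populations (with replacement).
    parents = []
    if feasible:
        parents += random.choices(feasible, k=POP_SIZE // 2)
    if infeasible:
        parents += random.choices(infeasible, k=POP_SIZE // 2)
    # Children migrate to the population matching their feasibility.
    for child in (mutate(p) for p in parents):
        (feasible if is_feasible(child) else infeasible).append(child)
    # Truncation selection, each population on its own criterion.
    feasible.sort(key=objective, reverse=True)
    infeasible.sort(key=violation)
    return feasible[:POP_SIZE], infeasible[:POP_SIZE]

def fi_2pop(generations=200, seed=0):
    random.seed(seed)
    feasible, infeasible = [], []
    for g in (random_genome() for _ in range(POP_SIZE)):
        (feasible if is_feasible(g) else infeasible).append(g)
    for _ in range(generations):
        feasible, infeasible = step(feasible, infeasible)
    return max(feasible, key=objective) if feasible else None
```

The key design point, as opposed to a penalty-function GA, is that infeasible individuals are not discarded or merely penalized: they evolve toward feasibility under their own selection pressure and can re-enter the feasible population later, which helps when the feasible region is disconnected.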

Original language: English
Title: Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion
Number of pages: 8
Publisher: Association for Computing Machinery
Publication date: 12 Sep 2018
Article number: 8
ISBN (Electronic): 978-1-4503-6609-0
DOI: 10.1145/3243274.3243292
Status: Published - 12 Sep 2018
Event: Audio Mostly 2018 on Sound in Immersion and Emotion - Wrexham, United Kingdom
Duration: 12 Sep 2018 - 14 Sep 2018

Conference

Conference: Audio Mostly 2018 on Sound in Immersion and Emotion
Country: United Kingdom
City: Wrexham
Period: 12/09/2018 - 14/09/2018


Cite this

Scirea, M., Togelius, J., Eklund, P., & Risi, S. (2018). Evolving in-game mood-expressive music with MetaCompose. In Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion [8]. Association for Computing Machinery. https://doi.org/10.1145/3243274.3243292
Scirea, Marco ; Togelius, Julian ; Eklund, Peter ; Risi, Sebastian. / Evolving in-game mood-expressive music with MetaCompose. Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion. Association for Computing Machinery, 2018.
@inproceedings{2d8a829bea1849699f98655d48d1947d,
title = "Evolving in-game mood-expressive music with MetaCompose",
abstract = "MetaCompose is a music generator based on a hybrid evolutionary technique that combines FI-2POP and multi-objective optimization. In this paper we employ the MetaCompose music generator to create music in real time that expresses different mood-states in a game-playing environment (Checkers). In particular, this paper focuses on determining whether differences in player experience can be observed when: (i) using affective-dynamic music compared to static music, and (ii) the music supports the game's internal narrative/state. Participants were tasked with playing two games of Checkers while listening to two (out of three) different set-ups of game-related generated music. The possible set-ups were: static expression, consistent affective expression, and random affective expression. During game-play, players wore an E4 wristband, allowing various physiological measures to be recorded, such as blood volume pulse (BVP) and electrodermal activity (EDA). The collected data supports, on three out of four criteria (engagement, music quality, coherence with game excitement, and coherence with performance), the hypothesis that players prefer dynamic affective music when asked to reflect on the current game-state. In the future, this system could allow designers/composers to easily create affective and dynamic soundtracks for interactive applications.",
keywords = "Affective expression, Evolutionary algorithms, Music generation",
author = "Marco Scirea and Julian Togelius and Peter Eklund and Sebastian Risi",
year = "2018",
month = "9",
day = "12",
doi = "10.1145/3243274.3243292",
language = "English",
booktitle = "Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion",
publisher = "Association for Computing Machinery",
address = "United States",

}

Scirea, M, Togelius, J, Eklund, P & Risi, S 2018, Evolving in-game mood-expressive music with MetaCompose. in Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion., 8, Association for Computing Machinery, Audio Mostly 2018 on Sound in Immersion and Emotion, Wrexham, United Kingdom, 12/09/2018. https://doi.org/10.1145/3243274.3243292

Evolving in-game mood-expressive music with MetaCompose. / Scirea, Marco; Togelius, Julian; Eklund, Peter; Risi, Sebastian.

Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion. Association for Computing Machinery, 2018. 8.


TY - GEN

T1 - Evolving in-game mood-expressive music with MetaCompose

AU - Scirea, Marco

AU - Togelius, Julian

AU - Eklund, Peter

AU - Risi, Sebastian

PY - 2018/9/12

Y1 - 2018/9/12

N2 - MetaCompose is a music generator based on a hybrid evolutionary technique that combines FI-2POP and multi-objective optimization. In this paper we employ the MetaCompose music generator to create music in real time that expresses different mood-states in a game-playing environment (Checkers). In particular, this paper focuses on determining whether differences in player experience can be observed when: (i) using affective-dynamic music compared to static music, and (ii) the music supports the game's internal narrative/state. Participants were tasked with playing two games of Checkers while listening to two (out of three) different set-ups of game-related generated music. The possible set-ups were: static expression, consistent affective expression, and random affective expression. During game-play, players wore an E4 wristband, allowing various physiological measures to be recorded, such as blood volume pulse (BVP) and electrodermal activity (EDA). The collected data supports, on three out of four criteria (engagement, music quality, coherence with game excitement, and coherence with performance), the hypothesis that players prefer dynamic affective music when asked to reflect on the current game-state. In the future, this system could allow designers/composers to easily create affective and dynamic soundtracks for interactive applications.

AB - MetaCompose is a music generator based on a hybrid evolutionary technique that combines FI-2POP and multi-objective optimization. In this paper we employ the MetaCompose music generator to create music in real time that expresses different mood-states in a game-playing environment (Checkers). In particular, this paper focuses on determining whether differences in player experience can be observed when: (i) using affective-dynamic music compared to static music, and (ii) the music supports the game's internal narrative/state. Participants were tasked with playing two games of Checkers while listening to two (out of three) different set-ups of game-related generated music. The possible set-ups were: static expression, consistent affective expression, and random affective expression. During game-play, players wore an E4 wristband, allowing various physiological measures to be recorded, such as blood volume pulse (BVP) and electrodermal activity (EDA). The collected data supports, on three out of four criteria (engagement, music quality, coherence with game excitement, and coherence with performance), the hypothesis that players prefer dynamic affective music when asked to reflect on the current game-state. In the future, this system could allow designers/composers to easily create affective and dynamic soundtracks for interactive applications.

KW - Affective expression

KW - Evolutionary algorithms

KW - Music generation

U2 - 10.1145/3243274.3243292

DO - 10.1145/3243274.3243292

M3 - Article in proceedings

BT - Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion

PB - Association for Computing Machinery

ER -

Scirea M, Togelius J, Eklund P, Risi S. Evolving in-game mood-expressive music with MetaCompose. In Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion. Association for Computing Machinery. 2018. 8. https://doi.org/10.1145/3243274.3243292