Intonation does not work the same way in robots
For my Master's thesis in Information Systems I wanted to combine linguistics with human-robot interaction by running a user study on intonation in robot speech. For this I found the perfect pair of supervisors: Paul Boersma, co-founder of Praat and an expert in phonetics, and Maartje de Graaf, a researcher on social cues in human-robot interaction and, moreover, a woman in the field I personally look up to.
What is the effect of humanlike intonation in robot speech on the naturalness of verbal interactions between humans and robots?
To answer this question I implemented humanlike intonation in the speech of a NAO robot, following Dutch intonation rules. To be able to run user studies I designed a short conversation about films and games between the robot and a participant. I tested this with 120 students, looking at both objective measures, such as gaze and interruptions, and subjective measures, collected through a questionnaire on naturalness among other things. The findings of this experiment can be found in this paper, which was published in the HRI 2020 proceedings and nominated for best paper in the "User Studies" category. You can also take a look at the video below, which is my presentation at the conference.
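For readers curious what shaping intonation on a NAO robot can look like in practice, here is a minimal sketch, not the thesis code itself, using the NAOqi Python SDK's ALTextToSpeech module, which accepts inline tags for pitch and pauses. The robot's IP address and the contour values below are purely illustrative assumptions and do not reflect the Dutch intonation rules applied in the study.

```python
# Minimal, hypothetical sketch of pitch shaping on a NAO robot via NAOqi.
# The \vct (pitch, percentage of default) and \pau (pause, ms) tags are
# inline speech tags supported by NAO's text-to-speech engine; the exact
# contour here is only an example, not the study's intonation model.
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"  # placeholder address of the NAO robot
tts = ALProxy("ALTextToSpeech", ROBOT_IP, 9559)

# Default, flat rendering of a question from the conversation.
tts.say("Do you like science fiction films?")

# The same question with a rough pitch rise on the final word and a short
# pause, approximating a more humanlike question contour.
tts.say("Do you like science fiction \\vct=120\\films\\vct=100\\ \\pau=200\\?")
```

The idea in the experiment was the same at a higher level: the flat, default speech serves as the baseline condition, while the version with intonation applied is what participants compared it against.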