Session Chair: Florian Keusch, University of Mannheim, Germany
Unlocking new technology – 360-degree images in market research
Relevance & Research Question:
Using 360-degree images in research studies offers many benefits for researchers, clients, and consumers alike: it allows us to present more realistic concepts and products for evaluation, and it gives respondents the ability to examine products and concepts in more detail and in a more realistic context, which should increase respondent engagement.
Methods & Data:
In an experimental design, we compared the responses and behavior of respondents exposed to traditional (i.e., static, front-facing) images vs. 360-degree concepts (N = 600 completes). We focused on engagement metrics (direct engagement and passive in-survey measures) and measured the possible impact of 360-degree images on the overall survey data.
Results:
We will show that, unsurprisingly, respondents reacted positively to the new way of displaying concepts and products; in particular, we will highlight how engagement measures increased. We will also discuss the impact on the data we observed, and we will present our recommendations on whether we believe replacing traditional images with 360-degree images would affect benchmarks or trends.
Added Value:
This research examines the impact of the new 360-degree technology on survey data and gives an outlook on how it can be adapted to serve market research needs.
A new experiment on the use of images to answer web survey questions
Oriol J. Bosch1,2, Melanie Revilla2, Daniel Qureshi3, Jan Karem Höhne3,2
1London School of Economics and Political Science, United Kingdom; 2Universitat Pompeu Fabra, Spain; 3University of Mannheim, Germany
Relevance & Research Question: Taking and uploading images may provide richer and more objective information than text-based answers to open-ended survey questions. Thus, recent research has started to explore the use of images to answer web survey questions. However, little is known yet about the impact of answering with images on four aspects: break-off, item nonresponse, completion time, and question evaluation. Moreover, no research has explored the effect on these four aspects of adding a motivational message encouraging participants to upload images, or of the device used to participate. This study addresses three research questions: 1. What is the effect of answering web survey questions with images instead of text on these four aspects? 2. What is the effect of including a motivational message on these four aspects? 3. How do PCs and smartphones differ on these four aspects?
Methods & Data: We conducted a web survey experiment (N = 3,043) in Germany using an opt-in access online panel. Our target population was the general German population aged 18 to 70. Half of the sample was required to answer with smartphones and the other half with PCs. Within each device group, respondents were randomly assigned to 1) a control group answering open-ended questions with text, 2) a first treatment group answering open-ended questions with images, and 3) a second treatment group answering with images after being prompted with a motivational message.
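The 2 x 3 design described above (device fixed per half-sample, answer format randomized within each device group) can be sketched as follows. This is a minimal illustration, not the authors' code; the condition names, seed, and balanced round-robin assignment are assumptions.

```python
import random

# Illustrative labels for the three answer-format conditions.
CONDITIONS = ["text", "images", "images_plus_motivation"]

def assign_conditions(respondent_ids, seed=42):
    """Shuffle one device group's respondents and spread them evenly
    across the three experimental conditions."""
    ids = list(respondent_ids)
    random.Random(seed).shuffle(ids)  # fixed seed only for a reproducible sketch
    return {rid: CONDITIONS[i % len(CONDITIONS)] for i, rid in enumerate(ids)}

# Device is not randomized here: each half-sample answers on one device type.
smartphone_half = assign_conditions(range(0, 1521))
pc_half = assign_conditions(range(1521, 3043))
```

Shuffling and then assigning round-robin yields near-equal group sizes, whereas drawing each condition independently would leave the cells unbalanced by chance.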
Results: Overall, results show higher break-off and item nonresponse rates, as well as less favorable question evaluations, for participants answering with images. Motivational messages slightly reduce item nonresponse. Finally, participants completing the survey with a PC show lower break-off rates but higher item nonresponse.
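The two outcome rates compared above can be made concrete with a small sketch. The definitions below are common simplifications (break-off = started but not completed; item nonresponse = empty or missing answers), not necessarily the coding rules used in the study.

```python
def breakoff_rate(n_started, n_completed):
    """Break-off rate: share of respondents who started but did not finish."""
    return (n_started - n_completed) / n_started

def item_nonresponse_rate(answers):
    """Item nonresponse: share of items with no usable answer.
    Treats None and blank strings as missing; real coding rules may differ."""
    missing = sum(1 for a in answers if a is None or str(a).strip() == "")
    return missing / len(answers)
```

For example, `breakoff_rate(1000, 850)` returns 0.15, and an answer list with two blanks out of four items yields an item nonresponse rate of 0.5.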
Added Value: To our knowledge, this is the first study that experimentally investigates the impact on break-off, item nonresponse, completion time, and question evaluation of asking respondents to answer open-ended questions with images instead of text. We also go one step further by exploring 1) how motivational messages may improve respondents' engagement with the survey and 2) the effect of the device used to answer on these four aspects.
Artificial Voices in Human Choices
Carolin Kaiser, René Schallner
Nuremberg Institute for Market Decisions, Germany
Relevance & Research Question:
Today, most available voice assistants speak in an unemotional tone. However, as the technology becomes more humanoid, this is about to change. From a marketing perspective, this is especially interesting, as a voice assistant's emotional tone may affect consumers' emotions, which play an important role in shopping. For example, happy consumers tend to seek more variety in product choice and are more likely to engage in impulse buying. Against this background, we explore how the tone of a voice assistant affects consumers' shopping behavior.
Methods & Data:
We developed a deep learning model to synthesize speech in German with three different emotional tones: excited, happy, and uninvolved. We performed listening tests with two experts, 120 university students, and 224 crowd workers to ensure that people perceive the synthesized emotional tone. Afterwards, we conducted lab experiments in which we asked 210 participants to interact with a prototypical voice shopping interface speaking in different emotional tones, and we measured their emotions and shopping behavior.
Results:
The listening tests confirmed the high quality of the synthesized emotional speech: experts recognized the emotion category with an almost perfect accuracy of 98%, university students with 90%, and crowd workers without any German skills still achieved 71%. The lab experiment shows that the tone of voice affects participants' valence and arousal, which in turn affect their trust, product satisfaction, shop satisfaction, and buying impulsiveness.
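The recognition accuracies reported above are simply the share of clips whose intended tone listeners identified correctly. A minimal sketch of that computation, with hypothetical labels:

```python
def recognition_accuracy(intended, perceived):
    """Share of speech samples whose intended emotional tone
    (excited / happy / uninvolved) listeners identified correctly."""
    hits = sum(i == p for i, p in zip(intended, perceived))
    return hits / len(intended)

# Hypothetical labels for four clips: one "uninvolved" clip misheard as "happy".
intended = ["happy", "excited", "uninvolved", "happy"]
perceived = ["happy", "excited", "happy", "happy"]
```

Here `recognition_accuracy(intended, perceived)` returns 0.75; applied per listener group, this is the figure behind the 98%, 90%, and 71% results.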
Added Value:
In human-human interaction, people often catch the emotions of others. With the increasing use of voice assistants, the question arises whether people also catch the expressed emotions of voice assistants. Several studies manipulating voices have found that the same social mechanisms prevalent in human-human interactions also exist in human-computer interactions. However, there is also research showing that people interact differently with computers than with humans; for example, they are more likely to accept unfair offers from computers than from humans. Given this contradictory evidence, this study sheds light on emotional contagion in interactions between voice assistants and consumers. This is especially important since voice assistants can potentially reach and influence a huge number of consumers, in contrast to a single human shop assistant.