Conference Agenda

Overview and details of the sessions of this conference.

Session Overview
Session
B 3: Smartphone and Sensors as Research Tools
Time:
Thursday, 10/Sep/2020:
3:30 - 4:30

Session Chair: Stefan Oglesby, data IQ AG, Switzerland

Presentations

How does (work-related) smartphone usage correlate with levels of exhaustion?

Georg-Christoph Haas1,2, Sabine Sonnentag2, Frauke Kreuter1,2,3

1Institut für Arbeitsmarkt- und Berufsforschung der Bundesagentur für Arbeit (IAB), Germany; 2University of Mannheim, Germany; 3University of Maryland, United States of America

Relevance & Research Question: Smartphones make digital media and other digital means of communication constantly available to individuals. This constant availability may have a significant impact on individuals' exhaustion levels. In addition, being available often brings social pressure (e.g., "telepressure") at work or at home that may lead to a further increase in exhaustion at the end of the day. On the other hand, constant connectivity may enable frequent contact with one's social networks, which might decrease exhaustion. We examine whether employees perceive "being available" as a burden or as a resource in their daily work.

Methods & Data: We use a combination of data from a probability-based population panel from Germany (Panel Study Labour Market and Social Security -- PASS) and a research app (IAB-SMART), which passively collected smartphone data (e.g., location, app usage) and administered short daily surveys. Since app participants (N=651) were recruited from PASS, we are able to link both data sources. The PASS data provides us with sociodemographic variables (e.g., age, education, gender) and background information, which enables us to calculate population weights. From the passively collected app data, we can construct a series of predictors such as daily smartphone usage and instant switches between apps. The level of exhaustion is measured by a survey question, which was repeated daily for seven days every three months, i.e., we have one to 14 measures per individual. Accounting for several selection processes within the data collection, we end up with an analysis sample of 163 individuals and 693 days, which we use in a multilevel regression model.
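
As a rough illustration of the kind of pipeline described above, the sketch below aggregates a hypothetical app-usage event log into daily predictors and fits a random-intercept model with Python's statsmodels. All file names, column names and thresholds are assumptions for illustration only; they do not reflect the authors' actual data model or feature definitions.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical raw event log from a research app: one row per foreground
# app event, with a pseudonymous person id, a timestamp, the app name and
# the time the app was in the foreground (duration_s).
events = pd.read_csv("app_events.csv", parse_dates=["timestamp"])
events["date"] = events["timestamp"].dt.date
events = events.sort_values(["person_id", "timestamp"])

# Feature engineering: total daily usage and the number of "instant"
# switches between apps (a different app opened within 5 seconds of the
# previous event; the 5-second rule is an illustrative choice).
events["gap_s"] = (
    events.groupby(["person_id", "date"])["timestamp"].diff().dt.total_seconds()
)
events["switch"] = (
    events["app"] != events.groupby(["person_id", "date"])["app"].shift()
) & (events["gap_s"] < 5)

daily = events.groupby(["person_id", "date"]).agg(
    usage_minutes=("duration_s", lambda s: s.sum() / 60),
    app_switches=("switch", "sum"),
).reset_index()

# Merge hypothetical daily exhaustion ratings from the short in-app surveys
# and panel background variables (again, illustrative column names).
surveys = pd.read_csv("daily_surveys.csv")     # person_id, date, exhaustion
surveys["date"] = pd.to_datetime(surveys["date"]).dt.date
background = pd.read_csv("panel_background.csv")  # person_id, age, gender
df = daily.merge(surveys, on=["person_id", "date"]).merge(background, on="person_id")

# Multilevel (random-intercept) regression: days nested within individuals.
model = smf.mixedlm(
    "exhaustion ~ usage_minutes + app_switches + age + C(gender)",
    data=df,
    groups=df["person_id"],
)
print(model.fit().summary())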

Results: Our analysis is at an early stage; therefore, we are not able to share results at the time of submitting this abstract.

Added Value: First, we assess whether and how daily smartphone usage correlates with individuals' levels of exhaustion. Second, our analysis shows how a combination of survey and passive data can be used to answer a substantive research question. Third, we share our experience of engineering valid variables from unstructured mobile phone data that can be used in a variety of fields in general online research.



The quality of measurements in a smartphone app to measure travel behaviour for a probability sample of people from the Netherlands

Peter Lugtig1, Danielle McCool1,2, Barry Schouten2,1

1Utrecht University, The Netherlands; 2Statistics Netherlands, The Netherlands

Relevance & Research Question:

Smartphone apps are starting to be commonly used to measure travel behaviour. Their advantage is that they can use the location sensors in mobile phones to keep track of where people go, and at what time, with relatively high precision. In this presentation, we report on a large fieldwork test conducted by Statistics Netherlands and Utrecht University in November 2018 and on the quality of travel data obtained through hybrid estimation, combining passively collected sensor data with a diary-style smartphone app.

Methods & Data: A random sample of about 1,900 individuals from the Dutch population register was invited by letter to install an app on their smartphone for a week. The app then tracked people's location continuously for a week. Based on an algorithm, the app divided each day into "stops" and "tracks" (trips), which were fed back to respondents in a diary-style list, separately for every day. Respondents were then asked to provide details on stops and trips in the diary.
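
The segmentation step can be illustrated with a simple dwell-time rule: a run of consecutive GPS points that stays within a small radius for several minutes is labelled a stop, and everything in between is a track. The sketch below is a generic illustration of this idea, not the algorithm used by Statistics Netherlands; the radius and dwell thresholds are assumptions.

from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Point:
    lat: float
    lon: float
    t: float  # Unix timestamp in seconds

def haversine_m(a: Point, b: Point) -> float:
    """Great-circle distance between two points in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (a.lat, a.lon, b.lat, b.lon))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(h))

def segment_day(points, radius_m=200, min_dwell_s=300):
    """Label consecutive GPS points as 'stop' or 'track'.

    A run of points that stays within radius_m of its first point for at
    least min_dwell_s seconds is treated as a stop; everything else is a
    track. Thresholds are illustrative, not those used in the study.
    """
    labels = []
    i = 0
    while i < len(points):
        j = i
        while j + 1 < len(points) and haversine_m(points[i], points[j + 1]) <= radius_m:
            j += 1
        label = "stop" if points[j].t - points[i].t >= min_dwell_s else "track"
        labels.extend([label] * (j - i + 1))
        i = j + 1
    return labels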

Results:

Having both sensor data and survey data allows us to investigate measurement error in stops, trips, and the details reported about them. A few types of errors may occur:

1) False positives: a stop was presented to a respondent that was not in fact a stop (and, by definition, also a spurious track connecting this stop to another one).

2) False negatives: stops were missing from the diary (often because a respondent forgot their phone, or GPS tracking was not working properly).

How can we identify false positives and negatives? How did respondents react to false positives, and how can we correct for them in estimates of travel behaviour?
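
Conceptually, once each algorithm-detected stop can be matched against the respondent's diary confirmations, the two error types fall out of a simple cross-tabulation. The sketch below is only an illustration of that idea; the set-based representation is an assumption, not the study's data model.

def classify_stops(detected, confirmed):
    """Cross-tabulate algorithm-detected stops against diary-confirmed stops.

    detected and confirmed are sets of stop identifiers for one
    respondent-day (illustrative representation only).
    """
    false_positives = detected - confirmed   # shown in the diary, rejected by the respondent
    false_negatives = confirmed - detected   # reported by the respondent, missed by the sensor
    true_positives = detected & confirmed
    return true_positives, false_positives, false_negatives

# Example usage with made-up stop labels:
tp, fp, fn = classify_stops({"home", "office", "supermarket"}, {"home", "office", "gym"})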

Added Value: We will discuss each type of error, its size, and the context in which it occurred. Finally, we will discuss the overall impact of both false positives and false negatives on the statistics of interest. We conclude with a discussion of how to move forward in combining sensor and survey data in tracking studies for social science, market research and official statistics.



Data privacy concerns as a source of resistance to participate in surveys using a smartphone app

Caroline Roberts1,2, Jessica Herzing1,2, Daniel Gatica-Perez3,4

1University of Lausanne, Switzerland; 2FORS, Switzerland; 3EPFL, Switzerland; 4Idiap Research Institute, Switzerland

Relevance & Research Question: Early studies investigating willingness to participate in surveys involving smartphone data collection apps – and particularly, to consent to passive data collection – have identified concerns relating to data privacy and the security of shared personal data as an important explanatory variable. This raises important practical and theoretical challenges for survey methodologists about how best to design app-based studies in a way that fosters trust, and about the implications for data quality. We address the following research questions: 1) How do data privacy concerns vary among population subgroups and as a function of internet and smartphone usage habits? 2) To what extent do expressed data privacy concerns predict stated and actual willingness to participate in an app-based survey involving passive data collection?

Methods & Data: The data were collected in an experiment embedded in a three-wave, probability-based general population election study conducted in Switzerland in 2019. At wave 1, half the sample was assigned to an app-based survey and the other half to a browser-based survey; at wave 2, the browser-based respondents were invited to switch to the app. At wave 1, respondents in both groups were asked about their attitudes to sharing different types of data and about their data privacy and security concerns. The quantitative findings are complemented with findings from user experience research.

Results: Consistent with other studies, preliminary results show statistical differences in levels of concern about data privacy and in the degree of comfort with sharing different data types across subgroups (e.g., based on age, sex and response device), and confirm that privacy concerns are an important predictor of actual participation in a survey using an app.

Added Value: Given the often weak relationship between attitudes and behaviours, and the apparent paradox between privacy attitudes and actual online data-sharing behaviours, the possibility to assess how data privacy concerns affect actual participation in an app-based study of the general population is of great value. We propose avenues for future research seeking to reduce public resistance to participating in smartphone surveys involving both active and passive data collection.