Part of: Emily Carr University of Art + Design
Course: Contemporary Dialogues in Design
Critical Response: Condie, J., Lean, G. and Wilcockson, B. (2017). 'The Trouble with Tinder: The Ethical Complexities of Researching Location-Aware Social Discovery Apps', in The Ethics of Online Research (Advances in Research Ethics and Integrity, Volume 2), edited by K. Woodfield. Bingley: Emerald Publishing, 135–158.
Keywords: ethical considerations, user-generated content, feminist ethics, data mining, informed consent
'The Trouble with Tinder: The Ethical Complexities of Researching Location-Aware Social Discovery Apps' is Chapter 6 of The Ethics of Online Research (Advances in Research Ethics and Integrity, Volume 2, 2017). Dr. Jenna M. Condie and Dr. Garth Lean are the primary researchers and authors of the chapter.
Dr. Jenna M. Condie and Dr. Garth Lean are both Senior Lecturers at the University of Western Sydney, Australia. Dr. Condie earned her Ph.D. from the University of Salford. She is a digital researcher interested in how digital technologies transform people and places and how social media affects communities and neighborhoods. Her research methods are usually participatory, activist, and digital. Her work is motivated by the urgent need to address social injustice, urban inequity, and climate change, with housing, transportation, and gender as crucial focus areas. She currently teaches eResearch and Social Analytics (Condie, 2018, 2021). Dr. Lean earned his Ph.D. from the University of Western Sydney and now teaches Tourism and Heritage Studies in the School of Social Sciences and Psychology. His research interests include travel, tourism experiences, personal transformation, and digital technology, such as augmented reality, virtual reality, location tracking, digital geography, and digital research, as well as sexual racism and gendered encounters on location-aware social apps (Lean, 2021). A third author, Brittany Wilcockson, was also involved, but no further information about her could be found.
In this paper, the authors examine a trending subject, the location-aware social discovery app Tinder, and the role it plays in the modern travel experience. They also predict that Tinder and similar location-aware technologies will hold a notable place in future social relations and encounters.
So what is Tinder, and why is it getting so much attention? Tinder is one of the best-known dating apps, with millions of users worldwide. It uses proximity-based location and lets users view and swipe on the profiles of potential partners: right to accept or like, left to reject. When two individuals swipe right on each other, i.e. like each other's profile, they are 'matched' and can then chat privately. Tinder also has a reputation as a hookup app rather than a dating app. However, the authors were interested in its premium 'Passport' feature, which lets users change their geolocation and match with people in other parts of the world, allowing them to make connections before they arrive at a destination. This points to the massive amounts of personal data collected daily through the app, right down to where users live, travel, and meet.
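To make the matching mechanic concrete, below is a minimal sketch of the mutual-like logic described above. It is purely illustrative: the data structure and function name are assumptions and do not reflect Tinder's actual implementation.

```python
# Minimal sketch of a mutual-like ("match") rule, purely illustrative.
# Nothing here reflects Tinder's actual implementation.

likes: dict[str, set[str]] = {}  # user id -> set of user ids they swiped right on


def swipe_right(swiper: str, target: str) -> bool:
    """Record a right swipe and return True if it creates a mutual match."""
    likes.setdefault(swiper, set()).add(target)
    # A match exists only when the other user has already liked the swiper.
    return swiper in likes.get(target, set())


# Usage: two users swiping right on each other produces a match.
swipe_right("alice", "bob")         # False -- bob has not liked alice yet
print(swipe_right("bob", "alice"))  # True  -- mutual like, they are 'matched'
```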
Even before the formal research began, the authors raised concerns about ethical considerations. Ethical consideration here means that research policies share core tenets such as human dignity, autonomy, protection, safety, maximizing benefits and minimizing harm, respect for persons, justice, and beneficence (Markham, 2015). The paper outlines the authors' methodological approach and how they applied different methods in different contexts, concentrating on informed consent, privacy, copyright, and the distinction between manual and automated data-mining approaches to user-generated content.
The key terms are defined as follows:
- Informed consent means ensuring that the information presented to participants is clear and effective and that they genuinely understand it (Garcia, 2018).
- Data mining is the practice of examining a vast amount of data to identify trends and patterns (Twin, 2021).
- User-generated content refers to content created and posted to online social networking sites by individuals who are not affiliated with or sponsored by brands (Ramby, 2021); a short sketch after this list illustrates both data mining and user-generated content on invented data.
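To ground the two definitions above, here is a minimal, purely illustrative sketch of data mining applied to user-generated content: counting the most frequent words across a handful of invented profile bios to surface a simple pattern. The data, and any pattern it suggests, are made up for the example.

```python
# Minimal sketch of data mining over user-generated content:
# find the most frequent words across a set of (invented) profile bios.
from collections import Counter

bios = [
    "love to travel and hike",
    "coffee, travel, and live music",
    "here for travel buddies and good coffee",
]

word_counts = Counter(word.strip(",.").lower() for bio in bios for word in bio.split())
print(word_counts.most_common(3))  # e.g. [('travel', 3), ('and', 3), ('coffee', 2)]
```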
The authors considered waiving informed consent on the grounds that Tinder user profiles were publicly available, i.e. the data had already been produced, so consent would apply only to the reproduction of that data for research purposes. In contrast, they cite a poll indicating that most individuals are unlikely to agree to their online data being used for research. They also came across an Application Programming Interface (API) called "The Hoes" that automatically extracts data from Tinder profiles. The name itself raised concerns, and the authors began asking themselves why they even needed this data. They originally wanted to understand how Tinder's travel feature worked and how people presented themselves online through it. They therefore decided to drop this quantitative approach and shift to traditional participatory methods.
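The shift the authors describe is essentially a move away from automated harvesting toward consent-based collection. The sketch below contrasts the two in the most schematic way possible; it is entirely hypothetical and does not reflect Tinder's interface, the tool the authors mention, or their eventual participatory method.

```python
# Hypothetical contrast between automated harvesting and consent-based
# collection. No real API is used; fetch_public_profile() is a made-up
# stand-in for any automated scraper.

def fetch_public_profile(profile_id: str) -> dict:
    """Placeholder for an automated scraper; returns a fake record."""
    return {"id": profile_id, "bio": "example bio text"}


def automated_collection(profile_ids: list[str]) -> list[dict]:
    # Harvests every profile it can reach; the users are never asked.
    return [fetch_public_profile(pid) for pid in profile_ids]


def participatory_collection(volunteers: list[dict]) -> list[dict]:
    # Keeps only records from people who explicitly agreed to take part.
    return [v for v in volunteers if v.get("consented")]
```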
There were many complexities in conducting this research, which led to changes in the authors' methodological decisions. However, they also note that situating the study within epistemological and ontological frameworks earlier would have saved them time and energy and allowed them to arrive at a feminist ethics and participatory approach sooner. Feminist ethics refers to understanding, criticizing, and correcting how gender operates within moral beliefs and practices (Norlock, 2019).
Furthermore, excerpts from their original ethics applications, which address the nature of their study, appear throughout the paper. Including these allowed the authors to reflect on and critically analyze the ethical concerns at hand. The lines they cite from NatCen (2014) and Ipsos MORI (2015) caught the most attention:
"…people are more likely to say no than yes to their online data being used for research purposes…Just because data is available and accessible does not necessarily mean it is "fair go" for research purposes."
The quote raises the question of how much people actually know about 'data sharing' when they sign up for or subscribe under privacy terms and conditions; likewise, how much do they understand, and do they even read them? On sharing personal data on the internet more generally, the article 'Customer Data: Designing for Transparency and Trust', published in Harvard Business Review in May 2015, asked customers similar questions. Responses varied from country to country, but on average only 25% of users were aware that their data footprints contained location information, and only 14% were aware that they were also revealing their online browsing history. It is not that customers were unaware their data was being captured; rather, they did not realize how much of it online services had collected (Morey et al., 2015).
Turning to big data: it is a compilation of information gathered about an individual, from internet searches and likes on social media to family background, credit card transactions, and more. Court Stroud posted a series of articles in Forbes depicting the horrors of big data manipulation; analysis has become so advanced that it can predict future behaviours by answering when, what, and where across different contexts of shared online information. Collectors of big data then use this information to target online ads as one surfs the web. Nevertheless, Stroud points out that when data collection first began, users were promised that their data would not be directly connected to their names. Today that is no longer the case, as data collectors have enough information about a user to identify them in real life (Stroud, 2018a).
The preceding example demonstrates that users must take responsibility for understanding what data they disclose about themselves, how it can be used, and what to do if it is misused. At the same time, privacy and terms-and-conditions statements are filled with jargon and appear to be written so that few people can understand them; in that sense, they do not constitute informed consent (Stroud, 2018b).
To return to the quote: most people who share data publicly do not know about the consequences, so when asked upfront they reject the idea of sharing their data even for worthwhile research. At the back of their minds they know that sharing extra information about themselves online is unwise, but because most ignore or do not understand the privacy and terms-and-conditions statements, they fall into the pitfall anyway. Some change is therefore needed in how these statements, or the features they govern, are designed. A good example is how Apple devices show a pop-up when a newly downloaded app first runs; it is straightforward and to the point, asking the user whether to allow the app to track their activity across other apps and websites. This gives users control over part of the essential data shared with companies.
Two further noteworthy points recur in the paper. The first relates to ethical considerations in general: the authors show how methodological changes become necessary to reach a research destination when ethical issues involving various stakeholders stand in the way. The second is how they made the participatory approach more engaging by creating a digital storytelling map, which benefited participants and was conducted with sound ethical practices. Overall, the paper gives a raw and honest insight into how many iterations and significant changes are needed to reach a successful outcome. It also shows that one should follow ethical guidelines and maintain a sense of ethicality even after receiving board approval; doing so will also help keep the guidelines up to date in this fast-paced world.
Absorbing the information in this paper and the related articles immediately triggered thoughts about the three-way interrelation between design, technology, and behavioural science. Even though human behaviour was not part of this paper, the thought of how much manipulation is happening with the help of big data is shocking. Maya Shankar, Google's Head of Behavioral Science, gave a talk at the End Well Symposium (2019) and shared multiple examples of how a single small change can significantly impact a system. Arturo Escobar's Designs for the Pluriverse (2018) and Pamela Pavliscak's Emotionally Intelligent Design both mention how "design designs" us. Curious questions therefore arise: how is this connected to the notion of big data manipulation and our role as designers within it? And how much knowledge of ethics do people have when it comes to such topics?
References
- Condie, J. (2018). Profiles. Westernsydney.edu.au. https://www.westernsydney.edu.au/staff_profiles/uws_profiles/doctor_jenna_condie.
- Condie, J. (2021). Jenna M Condie | Western Sydney University - Academia.edu. Westernsydney.academia.edu. https://westernsydney.academia.edu/JennaCondie.
- Condie, J., Lean, G., & Wilcockson, B. (2021). GSMD-502-F091-2021. [online] Courses.ecuad.ca.
- Escobar, A. (2018). Designs for the Pluriverse. Duke University Press.
- Garcia, C. (2018). Everything You Need to Know About Informed Consent - Atlan | Humans of Data. Atlan | Humans of Data. https://humansofdata.atlan.com/2018/04/informed-consent/.
- Lean, G. (2021). Profiles. Westernsydney.edu.au. https://www.westernsydney.edu.au/staff_profiles/WSU/doctor_garth_lean.
- Markham, A. (2015). 'Ethical considerations in digital research contexts'. ResearchGate. https://www.researchgate.net/publication/313470218_%27Ethical_considerations_in_digital_research_contexts%27.
- Morey, T., Forbath, T., & Schoop, A. (2015). Customer Data: Designing for Transparency and Trust. Harvard Business Review. https://hbr.org/2015/05/customer-data-designing-for-transparency-and-trust.
- Norlock, K. (2019). Feminist Ethics. The Stanford Encyclopedia of Philosophy (Summer 2019 Edition). https://plato.stanford.edu/entries/feminism-ethics/.
- Pavliscak, P. (2018). Emotionally intelligent design. O'Reilly Media.
- Ramby, K. (2021). What is User-Generated Content? The Ultimate Guide to UGC | Stackla. Stackla. https://stackla.com/resources/blog/what-is-user-generated-content/.
- Shankar, M. (2019). Google's Head of Behavioral Science on Why We Do What We Do? | Maya Shankar, PhD. Youtube.com. https://www.youtube.com/watch?v=NwCPtiPZwO4.
- Stroud, C. (2018a). What You Need To Know About Big Data, Stripped Of All The Gobbledygook. Forbes. https://www.forbes.com/sites/courtstroud/2018/04/27/what-you-need-to-know-about-big-data-stripped-of-all-the-gobbledygook/?sh=4ced461a1a75.
- Stroud, C. (2018b). How The Struggle Over The Use And Abuse Of Consumers' Data Could Play Out. Forbes. https://www.forbes.com/sites/courtstroud/2018/04/29/how-the-struggle-over-the-use-and-abuse-of-consumers-data-could-play-out/?sh=6e5d35b045bc.
- Twin, A. (2021). Data Mining Definition. Investopedia. https://www.investopedia.com/terms/d/datamining.asp.