IGF 2023 Launch / Award Event #169 Design Beyond Deception: A Manual for Design Practitioners

Time
Sunday, 8th October, 2023 (23:30 UTC) - Monday, 9th October, 2023 (00:15 UTC)
Room
WS 4 – Room B-1
Subtheme

Digital Divides & Inclusion
Digital, Media, and Information Literacy

Theme
Digital Divides & Inclusion

The Pranava Institute
  • Titiksha Vashist, The Pranava Institute
  • Shyam Krishnakumar, The Pranava Institute
  • Dhanyashri Kamalakkanan, The Pranava Institute

Speakers
  • Caroline Sinders - Critical Designer, Researcher, and Founder, Convocation Research + Design

  • Cristiana Santos - Assistant Professor in Privacy and Data Protection Law at Utrecht University; Expert of the Data Protection Unit, Council of Europe

  • Chandni Gupta - Deputy CEO and Digital Policy Director at the Consumer Policy Research Centre, Australia

  • Maitreya Shah - Lawyer; Fellow at the Berkman Klein Center for Internet and Society

  • Titiksha Vashist - Co-Founder and Lead Researcher at The Pranava Institute

 

Onsite Moderator

Titiksha Vashist

Online Moderator

Dhanyashri Kamalakkanan

Rapporteur

Dhanyashri Kamalakkanan

SDGs

3. Good Health and Well-Being
5. Gender Equality
9. Industry, Innovation and Infrastructure
10. Reduced Inequalities
17. Partnerships for the Goals

Targets: This project strongly emphasises the need for action at the practitioner level to tackle deceptive design, an issue closely aligned with Goals 9 and 17. Technology that is trustworthy, safe, and non-harmful for users from the global majority is what truly constitutes a resilient internet. Deceptive design, also commonly known as “dark patterns”, obscures or impairs consumer autonomy or choice and tricks users into taking actions that they may not otherwise take. Deceptive designs have a disproportionate impact on marginalised communities, including senior citizens, women and gender minorities, children, families with lower incomes, and people who are less digitally connected. These design choices undermine privacy, consumer protection, and trust in online products and services. Hence, a safe, trusted, and transparent web is crucial for technological development, as well as for access to information and innovation. This ties closely to issues of inequality (users from the global majority/Global South are often not kept in mind when digital experiences are designed, and are left out linguistically, among other things) and of limited access to and participation in the internet (for example, deception encountered through screen readers by people with visual disabilities). A better web is one where these groups are centred to create a more equitable, diverse and safe technological future.

Format

The session structure will be as follows:

  • Introduction: What is deceptive design, and why is it crucial now? (5 minutes)

  • Documenting daily life: Participants share their personal experiences of deception online – when have they felt tricked or deceived by a platform? (10 minutes)

  • Insights from the research: How deceptive design goes beyond visual interfaces towards its understudied forms, with a focus on vulnerable groups and underrepresented communities. (10 minutes)

  • Launch and introduction to the manual (with session participants): Introducing the manual, with a design practitioner leading a discussion on how it can serve as a set of tools and frameworks that designers can use in their daily practice, and on the way forward: how can we build strategies to tackle deception and a movement towards responsible design as a community? (20 minutes)

Duration (minutes)
45
Language
English
Description

Deceptive design, commonly known as “dark patterns”, obscures or impairs consumer autonomy or choice and tricks users into taking actions that they may not otherwise take. Deceptive designs have a disproportionate impact on marginalised communities, including senior citizens, women and gender minorities, children, families with lower incomes, and people who are less digitally connected, particularly in the Global South. These design choices undermine privacy, consumer protection, and trust in online products and services.

2022 was a crucial year for deceptive design regulation, as policymakers in different countries began to take notice of deceptive practices. Important examples are the Federal Trade Commission in the US highlighting the rise in deceptive design and the European Union adopting the Digital Services Act, which includes provisions that tackle deceptive design. Consumer councils worldwide have begun to take up deceptive design as a problem that actively harms consumers and have sought to understand how current laws can be adapted to ensure consumer protection in the digital economy. Digital antitrust initiatives in multiple jurisdictions have also touched on the challenges involving deceptive design.

While academic literature focuses on sorting, creating taxonomies of, and finding evidence of deceptive design on the internet, practitioners such as designers within technology-building ecosystems remain largely unaware of the harms caused by their daily practice. There is a need for further research that informs practitioners about more responsible, ethical and trusted design practices. Studies have shown that designers are crucial actors in technology and product development processes, with designers' values flowing into the technological artefacts they build. It is crucial, therefore, to see designers as key stakeholders in deceptive design practices, who are central to building a safe and trusted web.

In the past year, researchers at The Pranava Institute have been holding consultations with global experts in HCI, human-centered design, privacy and data protection, and civil society to create a manual for design practitioners, built around a five-step methodology that helps them tackle deception in practice. Supported by the University of Notre Dame and IBM's Tech Ethics Lab, the Design Beyond Deception project seeks to create a manual of ethical design which can be used by designers independently, in teams, or within their workplace to create safe and trusted digital environments and experiences. Rooted in multi-disciplinary academic work, practitioner-led research, and interviews with global experts, the manual provides frameworks, questions, reflections, values and visual prototypes to help create alternatives to current practices. We seek to launch this manual at IGF 2023 in order to involve the larger research and practitioner community, and to share our work for the benefit of all.

Project website: https://design.pranavainstitute.com
Access the manual here: https://drive.google.com/file/d/1DRWhMyd4w2SMmXNISMPORJRikYOMoxUY/view?…

1. While the manual to be launched is focused on design practitioners, we seek to engage the wider IGF community on deceptive design as a crucial issue for ensuring a safe and trustworthy internet for all. The session will encourage participants to reflect on their interactions with digital platforms and identify instances of deceptive design that affect them. In the slots for opinions and questions from the audience, the onsite moderator will be attentive to the physical queue and will check with the online moderator for raised hands or written comments, in which case questions will be taken on a round-robin basis (that is, starting with the online hands and written chats, then following the physical queue, and so on). The online moderator has the main task of maintaining the order of the raised hands and written chat, reading out the questions, and giving the floor to online audience speakers. In this way we will achieve equal footing between the online and onsite audiences.

2. The manual and supporting links will be shared onsite, as well as with online participants, in order to make access easy.

3. Collaborative tools such as Miro will be used to gather feedback and comments from audiences, with the online moderator focusing solely on engaging the online audience through Miro, polls, and chat storms.

Key Takeaways

To address dark patterns and their associated harms, multi-stakeholder intervention is necessary. This requires engagement with a wider community of researchers, designers, technologists and policymakers on deceptive design as a crucial internet governance issue in order to ensure a safe and trustworthy internet for all. This will make digital rights like privacy actionable, and ensure consumers are protected online.

Call to Action

As technologies evolve and newer interfaces emerge, deception can take different forms. Adopting principles of ethical and human-rights-centered design is crucial when building new technologies, including AI-generated interfaces and AR/VR. Regulatory measures must not limit themselves to existing interfaces and taxonomies, but should instead locate deception within human-technology interaction in order to design a collective future that is beyond deception.