Organizer 1: Gupta Kaushalya, World Wide Web Foundation
    Organizer 2: Cheng Sage, Access Now
    Organizer 3: Kibughi Sheila, World Wide Web Foundation

    Speaker 1: Falbe Trine, Technical Community, Western European and Others Group (WEOG)
    Speaker 2: Nnenna Nwakanma, Civil Society, African Group
    Speaker 3: Adedolapo Adegoroye, Private Sector, African Group

    Moderator

    Gupta Kaushalya, Civil Society, Asia-Pacific Group

    Online Moderator

    Cheng Sage, Civil Society, Asia-Pacific Group

    Rapporteur

    Kibughi Sheila, Civil Society, African Group

    Format

    Panel - Auditorium - 60 Min

    Policy Question(s)


    Ecosystem Snapshot and Visibility in Lived Experience
    Through this question we try to gain a clearer current picture of the definitions, understanding, and narratives around deceptive design. We aim to identify specific scenarios and harms as a means to ground policy ideas in lived experience.

    Question 1: What do you think are the typical scenarios where deceptive design manifests in daily lives?

    Barriers and Opportunities for change
    In this section we will identify gaps and opportunities for specific policy interventions and guidelines to address deceptive design.

    Question 2: What do you think are the most prominent barriers to the adoption of ethical and trusted design, and where may specific policies and guidelines around deceptive design (e.g. privacy, consumer protection, antitrust, child protection…) have fallen short?

    Question 3: What could companies find useful and motivating about integrating more ethical or trusted design practices, and what guidelines, rules, or practices would you suggest so that the adoption of ethical design is sustainable?

    Connection with previous Messages:

    SDGs

    1.4
    5.b
    10. Reduced Inequalities
    10.3
    17.6


    Targets: Deceptive designs cause disproportionate harm to vulnerable groups and marginalised communities, who are new to the web and hence less aware of these practices. Lower levels of digital literacy exacerbate their vulnerability and deter them from accessing and reaping the benefits of the web to empower themselves and to leverage technology to better their lives. To promote women's empowerment, more emphasis needs to be placed on using enabling technology, particularly information and communications technology. Unfortunately, deceptive designs have quite the opposite effect. Promoting appropriate legislation, policies, and actions to address these issues can help ensure equal opportunities and reduce inequalities of outcome.

    This is why we at the Web Foundation set up the Tech Policy Design Lab. We urgently need more collaborative and innovative ways to tackle the significant technology policy challenges we face, from privacy and online safety to misinformation, discrimination, and internet fragmentation, to name only a few. At present we lack sufficient spaces where companies, governments, and civil society can work together constructively to create product and policy solutions that can shape a better, fairer, safer web. The Tech Policy Design Lab is the flagship initiative of the Contract for the Web, which is helping change this. We aim to bridge the gap between companies, governments, civil society, and those who use online services — applying the right mix of expertise and real experiences to create effective, workable product and policy solutions to some of the biggest technology challenges of our time. Finally, we believe that the sum is greater than its parts.
The Lab is multistakeholder, meaning we invite participants to join the process from the beginning: by contributing research, recommending other participants, co-creating solutions, testing and adopting the solutions (governments, companies), or advocating for decision-makers to adopt the solutions (civil society). We often find that technology policy is shaped in Silicon Valley, whereas the bulk of users are located in the Global Majority. We are changing this to ensure that diverse voices are heard and play a role in the solutions-building process. At the Web Foundation, our Lab’s approach is to work with stakeholders across sectors and learn directly from those affected by technology so that we can work together constructively to redesign our digital spaces. The Lab is a place where people’s experiences drive policy and product design, and where solutions take into account the full diversity of those who use digital tools. Through the Web Foundation and Access Now’s global network, we strive for a human-centred policy development approach in order to create a web for everyone.

    Description:

    Deceptive designs, or “dark patterns”, refer to verbal and visual language in the interface design of apps and websites that obscures or impairs consumer autonomy and choice, altering decision-making to lead us towards actions we might not otherwise take. Think of when companies make it easy to subscribe to a service but near-impossible to cancel, or when you have to jump through endless hoops to tell a service not to scoop up and sell your personal data.

    Deceptive designs jeopardize the digital protections of online populations, especially those at risk. They reinforce a cyber norm that puts people’s personal data and assets at risk, attenuate trust, and further deprive already marginalised communities of agency. Moreover, deceptive designs compound inequality between regions and communities by disproportionately impacting those with fewer tools and channels to defend themselves. What makes this issue tricky to mitigate is that the language we use to address deceptive designs means different things to different stakeholder groups in their own contexts. Many people — such as children and teens, the elderly, or disadvantaged communities — don’t have the necessary information and support to discern “trustworthy” or "responsible" designs from manipulative ones when they use digital tools, services, and platforms.

    One of our key findings from a series of workshops and dialogues between professionals from policy, business, design, and advocacy backgrounds is the disconnect between policy development, design education, and public discourse. In addition, there is a major lack of incentives and business interest to invest in responsible design practices. As lawmakers move to clamp down on manipulative and deceptive designs, multi-stakeholder collaborations are urgently needed to build guidelines and benchmarks for what responsible design looks like, as part of a wider shift towards a future in which people take back control over their own data and online properties.

    This session invites everyone — policymakers, civic technologists, experts in both the public and private sectors, designers and digital product teams, and the people of the internet. We aim to build an effective, synergistic approach that connects efforts across sectors and enables a continuing channel to learn directly from those affected by technology. We hope participants will leave this session with directions for systematic interventions that combine policy change with public advocacy and design education.

    We need the designs that shape the digital world and online experiences to be responsible, so that people, no matter who and where they are, can live their lives online with dignity, autonomy, and a sense of trust. We urgently call for an integrated solution that mobilises governments, companies, tool developers, and design practitioners to centre human rights and digital protection and explore responsible design solutions at scale.

    Expected Outcomes

    The Web Foundation is currently running a multi-stakeholder Tech Policy Design Lab, with reports, recommendations, and commitments culminating at the end of September 2022 to be shared widely and sustained through our network of partners. Including a session at the IGF on deceptive design will directly contribute to a report that will be shared widely and influence government and platform policy decisions.

    Through this session, we are looking to gather ideas around developing a model for engaging a set of committed stakeholders to formulate, implement, and follow through on policy interventions:

    1) What kinds of engagement models seem relevant for combatting deceptive design?
    2) What are ideas we could use to rally and drive broader outreach?

    Learn how to participate in the network of civil society, industry, and government practitioners who will carry the work forward from the Tech Policy Design Lab.

    Hybrid Format: Our speakers will be present onsite and online to represent diverse viewpoints from across the world. This session comes from the Tech Policy Design Lab, which is run by the Web Foundation and its global design firm partners, 3x3 and Simply Secure, as well as Access Now. We will be leveraging our experience of bringing stakeholders together from across sectors globally to co-create solutions at our workshop series. Careful consideration will be given to facilitating seamless transitions between online and onsite speakers and attendees. Towards this aim, we will draft a detailed agenda, prepare facilitation materials, and host a rehearsal session ahead of the event.

    We may use a voting tool (Menti), unless IGF has a preferred method to poll audiences, in which case we are happy to comply with your preferred approach. We will use the chat function and Google Doc to engage audience questions. Most importantly, we will follow up with a call to action for participants to get involved in advancing the multistakeholder approach to combating deceptive design through a variety of initiatives.

    Online Participation



    Usage of IGF Official Tool.