
IGF 2022 Networking Session #26 YouRBan: Being LGBTQIA+ on social media

    Time
    Thursday, 1st December, 2022 (08:45 UTC) - Thursday, 1st December, 2022 (09:45 UTC)
    Room
    Speaker's Corner
    Organizers
    • Gender Standing Group - ISOC
    • Elnur Karimov, Kyushu University, Academia, Eastern Europe
    • Veronica Piccolo, Youth SG ISOC, Technical Community, WEOG
    • Umut Pajaro Velasquez, Gender SG ISOC, Technical Community, GRULAC
    Speakers
    • Jenni Olson, GLAAD, Civil Society, WEOG
    • Thereza Koole, Transactivist, Civil Society, GRULAC
    Onsite Moderator

    Umut Pajaro Velasquez

    Online Moderator

    Elnur Karimov

    Rapporteur

    Veronica Piccolo

    SDGs

    5. Gender Equality
    10. Reduced Inequalities

    Targets: This proposal links to SDG 10 by exploring the potential of access and innovation for the common good, reducing inequities, and addressing pitfalls before it is too late; and SDG 5 by giving space to underrepresented voices, such as women and gender-expansive people.

    Format

    The networking session will consist of a gathering in two modes, online and onsite. The first part will consist of a presentation on the topic by the speaker; the session will then split into two breakout-style rooms, one onsite and one online. Each room will be led by a moderator, who will stimulate the conversation among participants as they share their thoughts about the talk and raise any topic they consider important to bring into the conversation, or into possible future gatherings or panels at the IGF. Five minutes before the end, the onsite and online participants will gather again in one hybrid space, where the moderators will summarize the main topics discussed in both rooms and offer some final remarks about the networking session. The session will guarantee the hybrid format following the proposed agenda: 10 minutes for the speaker’s presentation, 45 minutes of conversation among the onsite and online participants, and 5 minutes for conclusions or new questions to be answered in future talks. To guarantee full participation for everyone in the session regardless of location, both moderators will grant the floor, upon request, to anyone who wants to join the discussion with questions and comments related directly or indirectly to the presentation and topic. The conclusion will be a summary of the discussion shared by the rapporteur.

    Duration (minutes)
    60
    Language
    English
    Description

    Social media platforms have become the most essential means of communication for youth, and especially for LGBTQIA+ people who are artivists, activists, people starting to explore their identities, and even sex workers, because those platforms can help create a sense of community, a sense of belonging to something bigger, and/or a space of agency where our identities can be showcased. However, with the implementation in recent years of new technologies and tools to manage the content on those sites, these identities are being censored or having their visibility restricted. This is another way of extending the marginalization of this community and the exclusion of its rights, as a community and as humans, and this is why we should care about it and address it in spaces such as the IGF, where the essential conversations that later define the reality of the Internet take place.

    The online and onsite moderators will lead the conversation among participants, giving equal time to both rooms (if the session allows it) and taking into account mainly the order in which participants requested the floor. Participants can ask questions or contribute in Spanish, English, and Portuguese, but as this is an international event, English as the lingua franca is encouraged.

    Key Takeaways

    There are huge efforts coming from the LGBTIQ+ community to raise awareness about the constant violation of their civil and human rights, a violation that is not limited to the online sphere but mostly translates into real-life threats and criminal offenses against them.

    There are elements suggesting a phenomenon of diversity-washing operated by the main big platforms. In this regard, the speakers pointed to their experience of the inconsistency between social media platforms’ policies on diversity and what they actually do, whether by taking down LGBTIQ+ accounts and content or by allowing anti-LGBTIQ+ hate and propaganda.

    Call to Action

    Social media platforms should hire more LGBTIQ+ people to work on content moderation and on their diversity policies. Social media platforms should also invest more resources in promoting diversity in a way that is inclusive of the LGBTIQ+ community.

    Governments should not leave platforms alone to self-regulate content moderation; they should rather establish a legal framework that ensures fairness and legality in how platforms carry out their activities, especially when those activities can harm people.

    Session Report

    Umut Pajaro Velasquez, chair of the Internet Society Gender Standing Group, in their introductory remarks described the networking session as part of the Gender Standing Group’s efforts to expand the vision beyond the binary model of gender in the internet governance ecosystem, and more specifically as an attempt to understand the reasons behind the application of social media bans against LGBTIQ+ people.

    Thereza Koole, an independent transactivist from Brazil, presented on the importance of raising awareness about diversity. For LGBTIQ+ people, diversity means having the right to be different, free, healthy, and respected without being exposed to extremism and violence, but also having the space to express themselves in different ways, including on social media. The reality, however, is very different, as they are either targeted by haters or disproportionately censored. The speaker pointed out the difficulties she has endured first-hand and what she is doing to change that: as a pre-service teacher, she addresses the concept of diversity in her classes, and as an activist, she raises awareness in public spaces where LGBTIQ+ people often experience violence and backlash in the form of laws that prohibit free speech.

    Thereza commented that there is a need to talk directly about the problems, but it needs to be done professionally, involving the public at large in the advocacy process, starting with allies and professionals working in digital companies. In fact, while it is important to reach the users, it is even more important to reach the professionals behind the code. For this reason, the speaker deemed it important that LGBTIQ+ representation be promoted inside big tech companies.

    The second speaker, Jenni Olson, talked about her work at GLAAD, a U.S.-based LGBTIQ+ media advocacy organization. She mentioned its social media safety program, an accountability initiative focused on monitoring and advocating for LGBTIQ+ safety, privacy, and expression on five major social media platforms (Facebook, Instagram, Twitter, YouTube, and TikTok). The program works on algorithmic bias, data privacy, anti-LGBTIQ+ hate, and particularly on haters devoted to gaslighting the public into rolling back LGBTIQ+ rights by perpetuating disinformation about LGBTIQ+ people.

    Jenni mentioned that in the US context, platforms have to be constantly encouraged not only to develop better policies but also to enforce their existing ones by taking down anti-LGBTIQ+ content that violates their own hate speech policies. What these platforms do not understand is that, in certain cases, anti-LGBTIQ+ rhetoric results in real-world harm. The speaker mentioned the case of the mass shooting outside a gay bar in Bratislava, whose perpetrator was reported to have extensively posted anti-LGBTIQ+ content.

    In this context, GLAAD urges platforms to improve their content moderation processes through advocacy, recommendations, and other tools such as its Social Media Safety Index, which rates the platforms on a number of indicators including respect for digital rights. Notwithstanding this, a clear need emerged for regulatory oversight and accountability, like what is being carried out in the EU.

    The discussion about accountability was taken up by Gabriel from Nigeria, who has carried out research indicating that accounts are taken down not on the platforms’ own initiative, but after being reported by a number of users.

    Jenni replied that, in reality, there are factual elements that confirm the inconsistency between what platforms say in their policies and what they do. This argument was supported by the moderator, Umut, who shared their research on the existing gap between the platforms’ diversity policies and the reality of the facts.

    This latest comment was followed up by the first speaker, Thereza, who clarified that there are social media platforms, like Reddit, that allow the use of words without restriction and have nonetheless been successful in taking down many accounts that were using hate speech and promoting wrongful prejudice. The other speaker, Jenni, replied that this is an example of “artisanal” moderation, where it is the community that self-moderates. Historically, Reddit used to be an unhealthy space, but over the last couple of years improvements have been made thanks to the community itself doing the moderation.

    Finally, an unidentified participant commented on the fact that TikTok, a China-based platform, lifted the ban despite not being based in a liberal or democratic country. Jenni confirmed that TikTok is peculiar compared to other social media platforms, which are mainly established in the libertarian cultural environment of Silicon Valley. However, there are pros and cons. On the one hand, it was surprising that TikTok agreed to implement the conversion therapy ban upon GLAAD’s request; on the other hand, there were episodes of bans being implemented in Arabic-speaking countries, where the authorities made TikTok remove the word “gay” in Arabic. Continuing on the cultural differences among the companies, Jenni stressed the recent changes for the worse made on Twitter after its takeover by a new owner.

    At the end of the session, the moderator, Umut, thanked the speakers and participants and wrapped up the session.