Session
Artificial Intelligence (AI) & Emerging Technologies
ChatGPT, Generative AI, and Machine Learning
Future & Sustainable Work in the World of Generative AI
Virtual/Augmented Reality
Organizer 1: Niels Van Paemel
Organizer 2: Deborah Vassallo
Organizer 3: Julia Piechna, NASK - National Research Institute
Organizer 4: João Pedro Damas Martins, EURid
Organizer 5: Sabrina Vorbau, European Schoolnet - Insafe
Organizer 6: Joachim Kind
Speaker 1: Catherine Van de Heyning, Civil Society, Western European and Others Group (WEOG)
Speaker 2: Jenna Manhau Fung, Technical Community, Asia-Pacific Group
Speaker 3: Bernard Kao, Civil Society, Asia-Pacific Group
Speaker 4: Roberta Metsola, Government, Western European and Others Group (WEOG)
Speaker 5: Ricardo Campos, Civil Society, Latin American and Caribbean Group (GRULAC)
Niels Van Paemel, Civil Society, Western European and Others Group (WEOG)
João Pedro Damas Martins, Civil Society, Western European and Others Group (WEOG)
Julia Piechna, Technical Community, Eastern European Group
Round Table - 90 Min
LEGISLATION: What measures can governments take internationally to fight effectively against online practices that may fall into legal loopholes?
AWARENESS: Given the misuse of artificial intelligence to victimize and exploit children and young people, which awareness campaigns would be most effective? And where indicators of exploitation are not apparent, what message should be conveyed to young people?
PREVENTION: Taking a multistakeholder approach, how can we use AI to fight AI?
What will participants gain from attending this session?
1. Understanding the risks of sextortion and online grooming: Participants will learn about the various techniques used by online predators to groom children and the potential risks associated with online activities.
2. Information on how perpetrators are using AI tools to extort, entice or groom children online.
3. Knowledge about deepfakes and deep nudes, and the tools and techniques tech companies can develop to detect and prevent them.
4. Better resilience: Raising participants' awareness of possible threats and of the techniques used by perpetrators will help them better protect themselves during social media activity.
Description:
In cases of sextortion and grooming of minors online, AI-powered tools can make these crimes easier for perpetrators by allowing them to monitor social media activity at a larger scale and identify potential targets based on specific keywords, interests, or behaviours. These tools can analyse vast amounts of data and quickly identify children who may be vulnerable to manipulation. Perpetrators can use chatbots to create a false sense of trust and intimacy with children. They can initiate conversations about topics that interest the child and pretend to share the same interests. As the chatbot learns more about the child, it can use that information to tailor its responses and gain the child's trust even further.
The emergence of deepfakes has introduced new risks, with deep nudes becoming increasingly prevalent. These edited versions of ordinary photos or videos, which depict individuals in the nude, account for 96% of all deepfakes. Sadly, more and more young people are falling prey to this technology. Deep nudes are used not only by perpetrators of financial sextortion but also to create pornographic videos featuring famous people. Shockingly, a fabricated nude photo of Michelle Obama has been created and circulated on the internet, while a bot service on Telegram has edited over 100,000 photos of unsuspecting women. Additionally, in Singapore, dozens of photos of women have been taken from social media and posted on a sex forum. The same technique is also used for revenge porn.
As the technology behind deepfakes becomes more advanced and accessible, anyone with a phone will soon be able to create deceptively realistic content. In fact, AI technology is currently the only way to reliably identify deepfakes. It is therefore crucial that people become more aware of the potential dangers and that the topic receives greater attention.
As listed above, the expected session outcomes will build on:
- Understanding the risks of sextortion and online grooming.
- Information on how perpetrators are using AI tools.
- Knowledge about deepfakes and deep nudes, and the related tools and techniques to detect and prevent them.
- Better resilience and awareness of prevention mechanisms and victim support.
Session outcomes will not only feed into the IGF 2023 messages but will also be summarized and published on the Better Internet for Kids portal (www.betterinternetforkids.eu) and further disseminated via its social media channels on Facebook, Twitter and LinkedIn.
Moreover, as the session topic is of great concern, the organizing team will make further efforts to continue the session discussion beyond IGF 2023, at regional, national and international events (e.g. national IGFs, events organized by the Insafe-INHOPE network of Safer Internet Centres).
Hybrid Format: One on-site moderator and two online moderators will jointly host the session. One online moderator will act as the bridge between the online speakers and attendees and the on-site moderator, alerting the latter each time a question arises. The second online moderator will handle written responses, making sure comments are answered promptly.
Speaker interventions may be supported by a short slide deck, making the discussion more accessible and easier to follow. A few minutes will also be given to the audience for Q&A. The on-site moderator and the two online moderators will facilitate the Q&A, making sure that questions from on-site and online participants are considered equally.
The round table will be accompanied by a dedicated hashtag on social media platforms. Participants are encouraged to use the hashtag to share feedback with the wider IGF community and to engage in further discussions on the topic. The organizers may use interactive tools (e.g. Mentimeter) to further increase participation in the session.