IGF 2018 LIGHTNING SESSION #21: From open data to open government for better governance: the case of French policies

Etalab is the French State task force for open data and open government. Through this mandate, the French State has made the opening of public data and public algorithms a fundamental principle for improving its policies and actions. This lightning session will present the French open data and open algorithm policies, the ongoing work on data and algorithms, and how they contribute to a more open global governance.

 

Since 2011, France has had a very active policy on open data and open government, developing a strong legal framework for transparency, accountability and the co-creation of public digital resources.

Open data and open government policies contribute to a more open global governance.

In France, the data governance model (led by the national chief data officer), the open data platform data.gouv.fr and a strong legal framework (Law 78-753 of July 1978 on freedom of access to administrative documents; the “Free of charge” Act; and the Digital Republic Act) have laid the foundations for key infrastructures such as “reference data” as a public service and the principle of source code as open data.

The new legal framework for public algorithms enforces transparency for algorithms that make individual decisions. Moreover, the French implementation of the GDPR sets rules for automated decision-making. These new rules are complex to implement and raise many questions: what is automated decision-making? Which public administrations use algorithmic processing? Are existing algorithmic practices compliant with the law? Should we share just the source code of the algorithm, or should additional explanations be provided? How can an algorithm best be explained?

It is also leading the French administration to prepare the next generation of public algorithms and to realize that reinforcing open data is necessary to make AI more accountable, in particular on questions such as gender or racial discrimination.

Session Time
Report

- Session Type (Workshop, Open Forum, etc.): Lightning session

- Title: From open data to open algorithms... and back: a French experience

- Date & Time: nov 14th from 14h to 14h20

- Organizer(s): Niels Braley

- List of speakers and their institutional affiliations (Indicate male/female/ transgender male/ transgender female/gender variant/prefer not to answer):

Amélie Banzet, Head of Open Government, Etalab, DINSIC, Prime Minister's Office

Maud Choquet, Legal Advisor, Etalab, DINSIC, Prime Minister's Office

- Theme (as listed here): Emerging Technologies

- Subtheme (as listed here): Democracy

 

- Please state no more than three (3) key messages of the discussion. [150 words or less]

Since 2011, France has had a very active policy on open data and open government, developing a strong legal framework for transparency, accountability and the co-creation of public digital resources.

Open data and open government policies contribute to a more open global governance.

In France, the data governance model (led by the national chief data officer), the open data platform data.gouv.fr and a strong legal framework (Law 78-753 of July 1978 on freedom of access to administrative documents; the “Free of charge” Act; and the Digital Republic Act) have laid the foundations for key infrastructures such as “reference data” as a public service and the principle of source code as open data.

The new legal framework for public algorithms enforces transparency for algorithms that make individual decisions. Moreover, the French implementation of the GDPR sets rules for automated decision-making. These new rules are complex to implement and raise many questions: what is automated decision-making? Which public administrations use algorithmic processing? Are existing algorithmic practices compliant with the law? Should we share just the source code of the algorithm, or should additional explanations be provided? How can an algorithm best be explained?

It is also leading the French administration to prepare the next generation of public algorithms and to realize that reinforcing open data is necessary to make AI more accountable, in particular on questions such as gender or racial discrimination.

 

 

- Please describe any policy recommendations or suggestions regarding the way forward/potential next steps.

Public AI and algorithm accountability is necessary to protect our democracy. Citizens should be able to understand when and how automated decisions are taken concerning their lives.

AI and algorithm accountability cannot be achieved without a strong open data and open source code policy, and an adapted legal framework. The data and source code used to run an algorithm can be biased, including gender or racial discrimination bias. Thus, opening this data and source code is necessary to ensure transparency and accountability and to identify potential bias. In order to design accountable algorithms and AI, the legal framework should set rules on data governance, open data and open source code.

 

- What ideas surfaced in the discussion with respect to how the IGF ecosystem might make progress on this issue? [75 words]

The IGF ecosystem should promote public algorithm and public AI accountability. It should support the development and exchange of best practices and doctrine on rules that reinforce open data and open source code, leading to public algorithm and AI accountability.

Many questions still need to be addressed: what is automated decision-making? Which public administrations use algorithmic processing, and for what reasons? Who is developing the algorithms? Who should be entitled to access these algorithms? How can governments make sure that existing algorithms are not distorted? How can an algorithm be audited to make sure it corresponds to the legal text?

The IGF ecosystem should also work on developing best practices: is transparency of an algorithm's source code enough? Should we share the source code or provide more explanations? How can an algorithm best be explained to citizens? Is it worthwhile without publishing the related data?

 

- Please estimate the total number of participants.

20

 

- Please estimate the total number of women and gender-variant individuals present.

Approximately 70% of participants

 

- To what extent did the session discuss gender issues, and if to any extent, what was the discussion? [100 words]

Gender issues were discussed when talking about AI accountability and fighting AI discrimination, in particular gender and racial discrimination.