The future of privacy





IGF 2010  

VILNIUS, LITHUANIA

14 SEPTEMBER 2010

SESSION 66

1130

THE FUTURE OF PRIVACY





Note: The following is the output of the real-time captioning taken during Fifth Meeting of the IGF, in Vilnius. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the session, but should not be treated as an authoritative record.   



********



>> CHRISTINE RUNNEGAR:  Good morning everyone and welcome to The Future of Privacy.  I'm Christine Runnegar from the Internet Society.  Let me welcome Cristos Velasco, our remote moderator, and let me hand over to Katitza Rodriguez from the Electronic Frontier Foundation.



>> KATITZA RODRIGUEZ:  Thank you.  We have distinguished speakers today.  To my right we have Mr. Hugh Stevenson, Deputy Director for International Consumer Protection, Office of International Affairs, U.S. Federal Trade Commission; Rafael Garcia Gozalo, Head of the International Department, Agencia Espanola de Proteccion de Datos, Spain; Catherine Pozzo di Borgo, Council of Europe Consultative Committee of Convention 108 (T-PD); to her left we have Mr. Joseph Alhadeff, Vice President for Global Public Policy and Chief Privacy Officer for Oracle Corporation; to his left we have Rosa Barcelo, Legal Advisor, European Data Protection Supervisor; to her left we have Ms. Ellen Blackler, Executive Director, Regulatory Planning and Policy, AT&T; to her left we have Kevin Bankston, Senior Staff Attorney, Electronic Frontier Foundation; and to his left we have Pedro Less Andrade, Senior Policy Counsel Latin America, Google, Inc.

I have made these brief introductions.  We have five minutes for each speaker, starting with Mr. Hugh Stevenson, if you could start the presentation - oh, no.  Christine, if you could start the presentation.



>> CHRISTINE RUNNEGAR:  Thank you, if I could have the slides, please.  Thank you.  It's my great honour to present Insights on the Future of Privacy from the Internet technical community.  This is the start of an ongoing project: the Internet Society is collecting innovative, thought-provoking, forward-looking perspectives on the Future of Privacy from the Internet technical community with expertise in Privacy, Data Protection, Identity Management and other related fields.  I don't seem to have any slides as yet, but I'll start anyway, because time is precious.



So starting off with the Internet Society, the Internet Society says - In the future, privacy will be redefined in response to changing social, technical and regulatory realities. While the concept of privacy will remain contextual, individuals will become more actively engaged with the protection of their privacy by actively managing their identity and related personal data.  People will be more informed custodians of their personal data - able to decide when the sharing of personal information requires explicit consent, and choosing appropriate levels of security and protection.  Internet-based solutions to support user-managed privacy protection are emerging, and Internet Society is helping to provide clarity about their use to individuals, enterprise and governments.

Then if we turn to the Internet Architecture Board.  The W3C, the IETF and the Internet community of privacy experts must work together to provide an online experience that conforms with user expectations of privacy and the emerging regulatory environment.  To keep up with the speed of innovation, the IETF needs to develop privacy guidelines, building blocks and tools that are useful for an entire class of applications.  Technical work needs to be backed up by providing incentives to incorporate privacy into system design and at the same time to keep the speed of innovation and openness of the Internet intact.  The best technology will not help end users if it does not get implemented properly and deployed in a privacy friendly way.

Then if we turn to the W3C. As a universal, distributed application platform, the Web links personal data across individuals, organizations, and countries. New sensor APIs also give Web applications access to users’ location and to their physical environment. Everyday events – from the morning run to the credit card payment – are automatically brought online and shared online by friends and strangers.  Technology helps users defend against some intrusions, and it helps users understand who learns what about them. But when the data about preferences and habits that fuels the business models behind today's ecosystem of free services is gleaned from users’ online interactions, the policy framework needs to encourage privacy friendly behavior.    

The Kantara Initiative Privacy & Public Policy Work Group (P3WG) believes that it is important to support the open development of globally-applicable privacy standards, both technical and regulatory, in order to continue having confidence in the Internet ecosystem.  To do so, it actively engages with individuals, enterprises, policymakers, regulators and adoption communities on best practices and common solutions.  Fundamental to effective privacy are transparent architectures that secure private information and enable information-sharing in a secure, privacy-enhancing manner.  Only by multistakeholder collaboration will viable solutions emerge, be deployed, and maintained.

If we turn to OASIS, the Organisation for the Advancement of Structured Information Standards, this is what they have to say.  The state of privacy and information protection has changed substantially because of changes in technology, business models and the role of the individual, bringing ever more significant challenges to the effective application of traditional privacy management.  Implementing policies for increasingly federated networks, systems and applications is a problem, as typical policy expressions provide little insight into how to actually implement them, and there is a lack of standards-based technical privacy frameworks or reference models that can enable the development and implementation of privacy and associated security requirements.  An effective solution would be a collection of privacy and security policy-configurable, IT-based, systematic behaviors that satisfy the requirements of privacy and security policies within a wide variety of contexts and implementation use-case scenarios.

Another contribution, from Dr. Jose Manuel Gómez-Pérez, R&D Director, Intelligent Software Components (iSOCO) S.A.  The Internet provides us with more and more online services, virtualized online for convenience, which relieve us from cumbersome installation and configuration processes.  However, as usual, such advantages have a price to pay, in this case in terms of a potential privacy loss.  In general, the processes by which our own personal data are manipulated are often opaque to us, but citizens have the right to have knowledge of the logic involved in any process concerning their personal data.  These rights can only be enforced by a combination of legal but also automated means that analyze the provenance of the data to support users in understanding such processes, that are capable of determining what has been done with the data and by whom (attribution), that determine whether the processes are compliant with established contracts (accountability), and that facilitate the analysis of the (potentially large and complex) processes by the users themselves (abstraction).
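
To make the attribution and accountability ideas above concrete, here is a minimal sketch of how provenance records might support such checks.  The record fields, the actors and the agreed-purposes "contract" are illustrative assumptions only, not a reference to any particular system.

from dataclasses import dataclass
from typing import List

@dataclass
class ProvenanceRecord:
    actor: str       # who processed the data (attribution)
    action: str      # what was done, e.g. "read" or "share"
    purpose: str     # declared purpose of the processing
    data_item: str   # which personal data item was touched

# The "contract": purposes the data subject actually agreed to.
AGREED_PURPOSES = {"service-delivery", "billing"}

def attribution(log: List[ProvenanceRecord], data_item: str) -> List[str]:
    """Return the actors who handled a given data item."""
    return [r.actor for r in log if r.data_item == data_item]

def accountability(log: List[ProvenanceRecord]) -> List[ProvenanceRecord]:
    """Return the records whose purpose falls outside the agreed contract."""
    return [r for r in log if r.purpose not in AGREED_PURPOSES]

log = [
    ProvenanceRecord("mail-service", "read", "service-delivery", "address"),
    ProvenanceRecord("ad-broker", "share", "advertising", "address"),
]
print(attribution(log, "address"))  # ['mail-service', 'ad-broker']
print(accountability(log))          # the "advertising" record is flagged

Abstraction, the third property mentioned, would amount to summarizing such logs for the user rather than showing every record.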

James Clarke, Future Internet Assembly ‘caretaker’ says - It is important to ensure that privacy is addressed as fundamental to the design and development of the “Future Internet” as an aspect of maintaining the digital rights, dignity and sovereignty of the citizen.  

We received two contributions from Antonio F. Gómez Skarmeta, one on behalf of the University of Murcia, Spain (UMU), a partner in the ETSI Industry Specification Group (ISG) on Identity and Access Management for Networks and Services (INS).  Today, the need for Identity Management is present whenever the user needs to log in or a provider needs information about the user.  Information, authentication and authorization should be consistent and act as the glue between the different applications the user interacts with.  This will be especially important for Network and Service Providers in relation to the user's control over his or her identity and privacy.  In a future Distributed Identity Management Platform, the user should be able to deal with various services, specify preferences regarding the information revealed and, especially, benefit from privacy policy enforcement with respect to usage, the attribute provider's privacy policy and the user's own policies on what to disclose.

Antonio Skarmeta also provided his own contribution as a researcher and says Future Identity Frameworks should provide a privacy-enabled Future Internet using identities such that user control is maximized at all layers.  The user is at the centre of the control of his or her data, of where they are stored and of who uses or accesses them.  More controlled privacy than today: not letting technology dictate the level of privacy.  The capability to establish zones of privacy as in real life.  Controlled linkability and identity disclosure for accountability.  The capability of “limited identities” for minors.  Identity, privacy and trust as a key enabler of a citizen living use case in the Future Internet.
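
As a rough illustration of the user-controlled attribute disclosure described in these two contributions, the sketch below lets the user state once which attributes each kind of service may receive and has the identity platform enforce it.  The policy structure, attribute names and service categories are illustrative assumptions, not part of any existing framework.

# The user's disclosure policy: which attributes each service type may see.
USER_DISCLOSURE_POLICY = {
    "webshop": {"pseudonym", "shipping-address"},
    "forum": {"pseudonym"},
    "e-government": {"legal-name", "national-id"},
}

# The attributes held by the identity platform on the user's behalf.
USER_ATTRIBUTES = {
    "legal-name": "A. Example",
    "pseudonym": "runner42",
    "shipping-address": "Vilnius, LT",
    "national-id": "XX-0000",
}

def release_attributes(service_type: str) -> dict:
    """Release only the attributes the user's policy allows for this service."""
    allowed = USER_DISCLOSURE_POLICY.get(service_type, set())
    return {name: value for name, value in USER_ATTRIBUTES.items() if name in allowed}

print(release_attributes("forum"))  # only the pseudonym is disclosed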

Sam Coppens, a researcher and PhD candidate at the Multimedia Lab of Ghent University, says – Nowadays, the Internet has become such a big information space that people need technologies to filter out the information of interest.  This leads to a situation where the Web has become a giant storage space for profile information on which technologies rely to target the user with information of interest.  This profile information gets exchanged, even traded, on the Internet like any other piece of information.  People no longer have any control over this profile.  The problem gets worse because all of this information is stored on a medium that has no expiration date.  So users must recover full control over their profile information, and this information, actually information in general, on the Internet should get an expiration date.
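
A minimal sketch of this "expiration date" idea, assuming a simple key-value profile store; the field layout and the 30-day lifetime are illustrative choices only.

import time

def store(profile: dict, key: str, value: str, ttl_seconds: float) -> None:
    """Store a profile attribute together with the moment it should expire."""
    profile[key] = (value, time.time() + ttl_seconds)

def purge_expired(profile: dict) -> None:
    """Remove every attribute whose expiration date has passed."""
    now = time.time()
    for key in [k for k, (_, expires) in profile.items() if expires <= now]:
        del profile[key]

profile: dict = {}
store(profile, "interest:running", "morning-run-route", ttl_seconds=30 * 24 * 3600)
purge_expired(profile)  # the entry disappears once the 30 days have elapsed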

Then, finally, from the Center for Democracy and Technology, talking about The Future of Privacy and Global Information Flows – which will be particularly interesting for the next speaker.  Substantive consumer protections facilitate global flows of information.  There is growing recognition that the opt-in versus opt-out debate is insufficient.  Emphasis is placed on the responsibilities of companies to comply with the full range of Fair Information Practice principles.  Likely implementation of accountability programs, consumer access and control tools and other mechanisms to both protect consumer privacy and encourage innovation.  Recognition of these principles within the U.S. and the U.S. government.  The U.S. privacy bill is introduced and receives industry support.  The U.S. Department of Commerce initiative emphasizes that privacy protections and global commerce and innovation are intertwined.  Finally, privacy protections should facilitate, rather than impede, free speech, the creation of user-generated content, and the proliferation of innovative platforms and services.

All I need to say now is a big thank you to the Internet technical community for providing the contributions for this workshop.  A number of these people are present in the room and I hope we hear from them in the discussion section.  



Thank you.



>> HUGH STEVENSON:  Thank you, Christine.  My name is Hugh Stevenson.  For those not familiar with my agency, it is an independent agency of the U.S. Government with jurisdiction over antitrust matters and consumer protection.  Five minutes, I don't think, is quite enough time to resolve The Future of Privacy.



Let me just address three questions from my perspective, which doesn't necessarily reflect the positions of my agency or the U.S. Government.  Some of the questions are: where are we now, and where might we go forward?



In terms of where we are in the United States generally, the Commission has worked on privacy since the 1970s.  At the FTC, particularly since the mid-'90s, privacy has become one of our highest consumer protection priorities, and we have focused on enforcement in connection with privacy online, in connection with spam and spyware and security issues, on the establishment of our Do Not Call registry, and on the establishment of our identity theft database.



We have focused on a lot of privacy issues in (off microphone) to go forward with the issues, and examples might include the use of IDs, international aspects of data security, the effects of global marketing, behavioral advertising and identity management.

So where are we now?  In technology and innovation we see a number of challenges, I'm sure familiar to many of you here.  We see developments in advertising, in cloud computing, in mobile services, social networking and data transfer - think of how much data was moving internationally in the 1970s versus the volume now.



We have been examining those challenges against the existing approaches we have been using, particularly the fair information practices model focusing on notice and choice, and the model focusing on actual harms to consumers.  We've had a series of roundtables at the FTC, we've had (off microphone)



You'll hear that there's a process of reexamination of privacy law and practice in the European Union, and as we examine the privacy guidelines, the FTC has been trying to put together a report on this project using the results of these roundtables and the results of the various earlier studies that we have done.



And now a couple of ideas that we have considered in the report and elsewhere, a report that we expect the agency ultimately will issue.  The first is that we encourage businesses to integrate privacy and security into their systems at the outset - prevention, and not just after the fact.  I think that's responsive to one of the comments here on the importance of incentives to incorporate privacy into system design.  The second is to simplify consumer choice; there is increasing recognition that consumer choice alone is not always helpful in helping consumers to handle their information.  (off microphone) is an important issue.

Third, to reduce consumer confusion.  As the technical comments just noted, people do not understand how much control they have, and there is information suggesting that people, in fact, do not understand well what is done with the information they provide online.

So it might be useful to consider how processes can be used to simplify the choices consumers have to make.

And, finally, I mention international cooperation.  This is very much a long-term process; privacy is an issue about which there is sometimes narrow consensus, and we believe in working to promote and establish networks that focus on collaboration in terms of privacy enforcement and on what more we can do in that area.



We're expecting the agency will issue this report and we look forward to your comments and input on that report once it comes out.  Thank you.



>> RAFAEL GARCIA GOZALO:  Hello, good morning, everybody.  Can you hear me? Hello? Hello, good morning.  Hello?  Good morning, everybody.  My name is - is this on? Hello?  Fourth time.  I'm Rafael; I work for the Agencia Espanola de Proteccion de Datos, the Spanish data protection authority.  My comments will be made from the perspective of a data protection authority.  In a sense, they will be similar to some of the comments made by Hugh.

And first of all, I would say the main idea, the main concern for data protection authorities today with regard to the Internet, is the fact that existing laws, existing regulations, seem difficult to apply or difficult to implement when it comes to dealing with the Internet or with Internet-based services.  The reason is that our regulations are tied to our territory, our country or our region, but the Internet is global by definition, so there's a lack of coherence between the scope of application of laws and regulations and the reality that they are supposed to deal with.



What can the solution be?  Well, first of all, I think it's necessary to review these laws and regulations with the aim of better adapting them to the logic of the Internet.  There are reviews and processes going on both in the United States and in Europe, and also, I would say, at wider levels.  Perhaps you've heard that the last International Conference of Data Protection and Privacy Commissioners, held in Madrid last year, adopted a resolution including a set of international standards on data protection and privacy principles, which tries to be, I would say, different things: perhaps a model for countries which are trying to adopt laws and regulations, but perhaps also the beginning of an international text and hopefully an international legally binding text.  That would help to solve some of the problems that we are encountering now when we have to deal with Internet cases.



But as Hugh mentioned, I think that this is not the only solution.  It's very, very important to improve international cooperation among data protection authorities and regulators and the agencies that are responsible for the enforcement of privacy and data protection regulations.  And in that sense, there are initiatives that have been launched recently to facilitate better cooperation among data protection authorities; one of them has been underway for some months, and it will probably start functioning at, let's say, a good speed in a few weeks.



In fact, when I mention .. I would say that it's important to adapt to the new phenomenon, the new situation, and also the international standards adopted in Madrid, but I would say this is not enough - I don't think that it's enough to review just the laws.  It's also necessary to review them in a harmonized manner.  I think that's another …  To really have this improved cooperation among data protection authorities, our respective legal frameworks need to be more compatible or more harmonized.  One of the difficulties that may hamper cooperation is the fact that, in some cases, even though we are working towards the same goal, the instruments that we use are slightly different in some cases and widely different in others.  So, as I say, the possible solutions: a review of the legal frameworks, better international cooperation, and a review of international cooperation that will have to be helped by and based on better harmonization of the existing frameworks.



Thank you.



>> CATHERINE POZZO di BORGO:  Thank you, very much.  Hello, good morning, everybody.  I'm Catherine Pozzo di Borgo.  I am delighted to be representing the Consultative Committee of Convention 108 (T-PD), which is the Council of Europe committee dealing with the protection of personal data.  This committee comprises the bodies responsible for data protection and it works exclusively on the application of Convention 108.  As to our topic, the future of privacy, private life and data protection are subjects to which the Council of Europe attaches the utmost importance.



It is important that the Convention should retain its place not only at the European level, but also at the international level.  So the Council of Europe has mandated the Consultative Committee to review the principles of Convention 108 in the light of current technologies and of the demands of civil society and industry.



So to do so, data protection must firstly take account of different rights, including the freedom of expression, the right to information, and intellectual property rights.  Secondly, data protection principles must dovetail with the application of other legal texts geared to ensuring the protection of individuals, particularly minors, for example on the Internet.  Thirdly, and lastly, data protection principles must be compatible with increased mobility, market globalisation, and the opportunities provided by new technologies for customized exchange services.

So for Convention 108, we take into consideration the technological aspects involved and the various developments.  So what are the possible avenues of inquiry when examining the Convention?  There are several avenues of inquiry, but I'll go over them here only as examples, without order of priority, and this presentation should not prejudge in any way the future work in this field within the Bureau.

The discussion inside the Bureau, first of all, might concern the aim pursued by the Convention, or its scope, or the data protection principles themselves, and the guarantees set out by the Convention in 1981.  So, should we think about changing the aim of Convention 108?  Rather than the aim of this Convention being purely punitive, my thought is to take a more positive approach by entitling the individual to control the use of his or her personal data.  It may be necessary, then, to clearly enshrine such a right at a time when technologies, more by configuration than by necessity, generate and store traces of the use of services, which facilitates more detailed knowledge of the person and his behavior, without any supervision.

Should we think about revising certain concepts of the Convention?  Some of them were written in 1981 and might have lost their relevance in the light of technological developments: the identifiable or “distinct” nature of the person, for example, or the concept of a file, which might be enlarged to the broader concept of processing.

Should we think about revising some of the data protection principles and providing for new ones?

This question covers two main subjects: the data subject's consent, first of all, and privacy by design technologies.  About the consent of the data subject, we note that Convention 108 does not assign any official status to the data subject’s consent, whereas a number of services running on the Web do systematically ask for the person's consent in order to legitimate the relevant processing of information.  Nevertheless, consent may not be a sufficient basis for legitimacy unless it is specific, informed and obtained by fair means, so it may be appropriate to write it down as a principle in the Convention.

Then there is the need to encourage the development of “privacy by design” technologies.  This would seem necessary, as a first approach, in order to restore confidence in the use of the new technologies and provide individuals with an improved data protection system.  Privacy by design might be taken as a basic principle in this field.

Should we reconsider the scope of the guarantees of Convention 108?  Security is the key to Internet trust, but Convention 108 adopts a restricted approach to security, confining it to protecting data against destruction and preserving confidentiality.  But maybe, in order to restore trust in networks and security, we might consider a requirement on security geared not only to preventing unauthorized access, but also to giving data subjects control over access to their data.

This approach may be accompanied by an obligation on those responsible for data processing to ensure the data subjects are promptly informed if their personal data are wrongfully disclosed or used in a manner incompatible with the purpose for which they were gathered.  

Should we think about laying down new rights for the data subjects?  The Bureau must analyze the need for such new rights: the right to object in the field of geolocation facilities, for instance, the right not to be stalked, the right of enlarged access, the right not to be bound by a decision taken by a machine, or the right to have data erased.

Also, of all of these rights, perhaps the most important is the right to oblivion, or the right to be left alone.  The right to oblivion should probably be more carefully analysed in the light of current debates on the subject at the international level.  The idea that data storage should be permanent is a matter of concern.  Moreover, this can be addressed by programmes that facilitate the erasure of data.  So this may be called into question.

My conclusion: as I've pointed out, in Convention 108 the Council of Europe has an instrument which has been genuinely effective for 30 years, but which it would now like to analyze in order to guarantee its continuing efficiency and stability, in line with future technological developments.

I've pointed out some avenues which will be explored by the department itself, and I would now like to invite the representatives of other institutions and society to participate in further reflection.

Thank you.





>> JOSEPH ALHADEFF:  Thank you.  I would like to thank the organizers for the opportunity to speak here.  Some of the topics I'm going to talk about related to “The Future of Privacy” are more of a thematic nature.  The first is accountability: how companies and organizations become accountable, both in the context of Web 2.0, 3.0, whatever number we get to next, and in the general remit of their operations.  We start changing the paradigm slightly, because the concept of accountability means the obligation flows with the information.

It's a greater recognition that we live in a world of global information flows and those aren't likely to be rolled back.  How do you deal with that paradigm?  Accountability is one answer.  It is at the heart of Canadian privacy law, PIPEDA, which is the Canadian privacy act, and of APEC's Asia-Pacific privacy framework.  It's one of the principles in the privacy guidelines, and it is really what binding corporate rules are based on.  In APEC, there are cross-border privacy rules, and they're working on analogous ways of looking at things.  We're working more in global systems and not the point-to-point systems on the basis of which rules like the Directive and other instruments were drafted.

There's a change in the way information is used, how it flows and how it is managed.  To that end, a lot of people speak of “privacy by design”.  Too often they speak of privacy by design as if it's a silver bullet that lives in technology.  If the time you start thinking of privacy is the time someone is implementing it in technology, it is too late to get it right.

You need to think about privacy as an issue that deals with the model or the idea before you actually get into coding something related to it.  That is when you need to start thinking about it.

So what you have to think about when you look at privacy by design is privacy is a component not just of technology, but more of people, processes, practices and technology.  Because without looking at all of those things, you will not design privacy into a system.  You will merely make some aspects of the system, perhaps, have better privacy tools associated with it.  But that is not, in fact, “privacy by design”.

As we look at “The Future of Privacy”, we also have to look at more and more technologies and business models coming on board.  Part of the problem in looking at those technologies and business models is that we forget to look at them in their use and context.  Taking a look at technology in the abstract and trying to figure out the appropriate regulatory paradigm for that technology, without considering how it's being used and the context of use, is not a productive exercise, and it usually constrains innovation unnecessarily.  Understanding how technology will be applied and in what context is a critically important concept, and we have to think about how to do a better job navigating it.

As we go into this space and as we think about regulatory paradigms, whether it's the privacy standard that was developed in Madrid or whether it's accountability approaches, one of the things we have to start understanding also is: how do we measure?  How do we evaluate the processes that we're doing?  Especially from a regulatory perspective, who should have oversight responsibilities over some of these issues?  Because that is a new science only developing now.

On the same side, we have to look at regulation, because we have to look at how regulation is effective.  That's one of the things being looked at in the context of the review of the Directive.  How does the Directive promote positive outcomes?  How do you make sure you don't have heavy administrative burdens built into a regulation that don't produce effective and positive outcomes?  How is regulation made more effective?  There's innovation to be done on the regulatory side as well as innovation to be done on the technology and business model side.

And, finally, I'll just make one observation.  There has been a lot of discussion related to control.  I think one of the issues with control is to be careful not to create overly granular expectations of what people should or will control, because I think at some point control becomes too much of a burden as well.  So control has to be factored in with usability, because if everyone had to make a choice every time a cookie is being set, no one would want to use the Internet anymore.  It would become burdensome and cumbersome.  What is a way to assert and manage control without having a burden of control that becomes too great?  I think all of us have had the experience of speaking with a grandparent or parent and talking them through that level of settings and controls.

Thank you.



>> ROSA BARCELO:  So, Katitza, thank you for the invitation to participate here and it is a pleasure.  So I will try to be very loud, but in getting an answer to    



>> More.



>> ROSA BARCELO:  I will put forward five points that will be like wishful thinking regarding a possible future of privacy.  My office would like to see these points in the future of privacy.

So just to ensure that you understand, I would like to say a few words about my office.  I work for the European Data Protection Supervisor.  This is a body, an institution similar to a Commission department.  We are based in Brussels; our function is to advise the Commission, the Parliament and the Council when they put forward legislation or other measures on data protection and privacy.  So my five points regarding the future of privacy have to be understood from the European perspective and in this reality.

Now, again, my five points or six have to be understood not in a vacuum, but against the European legal background, the European regulatory framework.  And it is important to remember that in Europe we have the two main directives, the Data Protection Directive, 95/46, and the e-Privacy Directive that was amended last year.  So they have to be understood in this framework.  And moreover, it has to be understood in light of the fact that the Data Protection Directive is now under review.

At the end of this year, the Commission will adopt a communication that will put forward the main changes of the possible review of the Data Protection Directive.  Then in mid-2011, the Commission will come forward with a proposal which will go to the Parliament and the Council.

Let me go through my five points quickly.  Number one, my office is happy about the review of the Data Protection Directive.  Number two, this should not mean watering down the protection of personal data and privacy.  The basic rights, such as the right to information and transparency and the right of access to one's personal data, should remain and should not be changed.  However, we might want to see some changes in how the rights work.  For example, access has traditionally been given on paper.  This might need to be slightly changed to put more emphasis on the online world.

Third, some new rights and principles may need to be added.  We would like to have a principle of accountability.  This would be a new principle that would require data controllers not only to comply, but also to put in place all the organisational measures needed to show that they have complied.  It would require them to have everything ready to show compliance to the regulator; this would favour a culture of compliance.  A new right that we would support would be a right to privacy by design.

Number four, we would also support the broadening of the rules on security breach notification, which currently only apply to ISPs and network operators.  We think that security breach notification would be a good thing to make people more aware of data protection and to enhance security in companies.  We would also like to see more enforcement and court cases.  We would like the legislation to support collective action, i.e. the possibility for consumer associations to go to court on behalf of various consumers.

And, finally, the last point as you may know the processing of data by police and justice is not fully covered in the European Union.  This is a vacuum that has to be covered.  We think we need a comprehensive framework that encompasses also the protection of personal data in this sector.

Thank you.



>> Hello?  They're getting louder as we get louder.



>> ELLEN BLACKLER:  I'm Ellen Blackler with AT&T.  I just want to talk a little bit about what we see as “The Future of Privacy”.  I think the focus is really on usability.  Other panelists have talked a little about that.  The industry really needs to innovate around usability.

Joe mentioned the notion of control; there seems to be broad consensus that we ought to make it usable for consumers.  That is something the industry and regulators need to work on, because we've seen it's easy to overwhelm consumers with a lot of information that's not useful to them.

With this technology that has so much potential and so many great innovative minds, we need to turn to finding a way to enable consumers to control their privacy in a way that makes sense for them.  We know the lengthy privacy notices and lengthy privacy settings require a lot of action from consumers, and that is really not a 2.0 way of thinking.  So we also need to look at this question of interoperability of permissions, so consumers don't have to tell every single person that they interact with on the Internet how they want their privacy treated.

I've heard privacy advocates ask: why can't I as a consumer tell you my privacy policy, and you decide whether to comply with it?  I think it would be nice if we could try to give consumers some way to state their preferences once and have the Internet adapt to each consumer's preferences.
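
As a rough sketch of how a state-once, machine-readable preference could work, imagine the browser attaching a privacy preference to every request and a cooperating site adapting its behavior.  The header name "X-Privacy-Preference" and the values below are hypothetical illustrations, not an existing standard or any company's implementation.

def handle_request(headers: dict) -> dict:
    """A cooperating site reads the user's stated preference and adapts."""
    preference = headers.get("X-Privacy-Preference", "no-preference")
    response = {"content": "page body"}
    if preference == "no-tracking":
        # Honour the preference the user stated once in the browser.
        response["tracking_cookies"] = []
    else:
        response["tracking_cookies"] = ["session", "analytics"]
    return response

# The user sets the preference once; every request carries it automatically.
print(handle_request({"X-Privacy-Preference": "no-tracking"}))

The open question Ellen raises is precisely whether sites would honour such a signal and how that would be enforced.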

I will add to that this notion of what we talk about as the evolution of privacy by design: featuring privacy, so we build in these different privacy features but it's easy for consumers to see what's happening when it is happening to them.  So that as they use a particular feature, it becomes more obvious to them what kind of information is collected and how it is being used.  And we have a lot of smart consumer scientists in the industry, and we really need to work on that as a user interface issue and figure out ways to make these things more clear to consumers - not transparent in the sense that they are described accurately in a privacy notice, but transparent in the sense that consumers can see it happening to them when they use a particular feature.

And, lastly, one of the big paradigm shifts that I think we need to get our arms around is this idea that privacy is no longer about building a wall to keep the information safe.  That's important, but we see consumers wanting to share, so we need to find a way to let consumers share what they want to share while they have control.  That's a little bit of a shift away from …  the best ways to lock up information about consumers.



>> KATITZA RODRIGUEZ:  Thank you.  Back to the Electronic Frontier Foundation, and now moving the discussion on The Future of Privacy vis-a-vis the government.



>> KEVIN BANKSTON:  Thank you for inviting me to speak, thank you for listening.  This is my first IGF, hopefully not my last.  Forgive me if I don't get all the acronyms correct, but my primary perspective is that of a litigator of privacy issues in the United States where I primarily work on the issues that, as Rosa notes, are not dealt with in the Data Privacy Directive, which is government access to your communications and related data. My job is to ensure the government complies with appropriate legal standards when it wants to conduct electronic surveillance or obtain other user data from communications services, and when those standards are in dispute, do my best to ensure that the courts adopt the highest standards possible. So, for example, I am one of the lead attorneys in EFF's lawsuits against our National Security Agency and our friends at AT&T over our government's warrantless wiretap programme, and am also involved in a lot of disputes over the appropriate standards for when the government wants to access your e mail or track your cell phone.

Despite my admittedly provincial perspective as a litigator of U.S. law, I think I've identified a few useful concepts concerning communications privacy against government that we should consider critically when we consider The Future of Privacy.  I believe that preserving The Future of Privacy when it comes to our telephone and Internet communications turns on overcoming the past and moving away from the outdated assumptions and prejudices that have guided communications privacy law and policy in the 20th Century.  Specifically, I would ask you to consider three dichotomies that pervade American privacy law when it comes to government access, and also that of many other nations.  EFF believes that these dichotomies have been rendered false and counterproductive by changes in technology, and that we must move past these ideas for a privacy policy that works for the 21st Century.

The first of these outdated privacy dichotomies, I think, is the dichotomy between the data that you store in your home or office, on your own laptop and in your own filing cabinet, and the data you store with a third party provider, say your email provider or a cloud storage provider.  The former has typically been viewed as very strongly protected in the law, while the latter has not been strongly protected in the law.  But in an age where millions of us are trusting web-based e-mail services such as Microsoft's Hotmail or Google’s Gmail to store years' worth of our private correspondence, and trusting cloud services like Google Docs to store our most private documents, it's time for privacy law to treat such online storage as an extension of our home and an extension of our office.

The second increasingly false dichotomy I see in the government surveillance arena is that between surveillance of your communications as they are happening - contemporaneous or real-time wiretapping, which the law strongly protects against - and surveillance of your past communications, which in the U.S. the government is allowed to do under much more liberal standards.  In the U.S., for example, if the government wants to wiretap your email, it must obtain from a judge a search warrant based on a showing of probable cause that a crime has been committed or is being committed.  Not only that, but they can only use this technique for certain very serious crimes, they have to demonstrate that they've exhausted all other investigative techniques, and they have a duty to minimize their interception of communications that aren't relevant to their investigation.  So it's a very high standard.  In contrast, if they want to obtain your stored email, our Justice Department's practice is to merely have a prosecutor issue a subpoena to the e-mail provider--without any court oversight, without probable cause, without exhausting other methods, without minimizing the acquisition of irrelevant data.  Yet in 2010, can we really say that 30 days of wiretapping--which is the standard length of a wiretap in the U.S.--is really so much more invasive than obtaining five years' worth of all of your e-mail correspondence?  I don't think so, and I think that dichotomy is false at this point.



Finally, the third and final false dichotomy--or, increasingly false dichotomy--is that between the content of your communications, which is typically strongly protected, and non-content transactional data or metadata, which is typically much less protected.  We at EFF question whether this was a meaningful or justified distinction even in the telephone context, but today on the Internet the line between the two is increasingly blurry, and your so-called transactional data can often be as revealing, if not more revealing, than, say, a wiretap on your telephone.  A dossier of everyone who you communicate with, by email and IM and Skype and telephone, when you communicate and how much you communicate with them, can be intensely revealing.  A good example of this is a study done by researchers at MIT called the “Gaydar” study, where they were able to predict people's sexual orientations based on who they were friends with on Facebook.  One can imagine how similar analysis of your social contacts and your network and who you communicate with can be intensely revealing also of political, religious or economic persuasion.  Meanwhile, the monitoring of other data that is arguably transactional, such as the location of your cell phone, or the clickstream data that shows all of the web sites that you visit, or the search logs that indicate what you search for on Google or Bing or your search engine of choice, should in our view be considered just as invasive as looking at your email or listening in on your telephone calls.

We at EFF believe that we need to move past these three false dichotomies and reexamine the laws that control government surveillance of communications with new eyes: ignore the past distinctions that technology has rendered moot, and focus solely on the invasiveness of the surveillance techniques at issue.  In the U.S., we've sought to address this and update our admittedly woefully outdated electronic privacy laws through something called the Digital Due Process coalition, a fairly unprecedented coalition effort between civil society groups like the EFF, the American Civil Liberties Union, and the Center for Democracy and Technology, in cooperation with companies such as Google, Microsoft, AOL, Amazon, Facebook, and even our sometimes-courtroom opponent, AT&T.  Though we may have our many disagreements, we all agree that this privacy framework is outdated and needs to be strengthened, clarified and unified for the 21st Century.

In closing, my final hope for the future of privacy is that this Digital Due Process effort might prove to be a model that could be adopted in other countries, where civil society and industry can join together to ensure a model privacy regime when it comes to governmental surveillance.

Thank you.



>> PEDRO LESS ANDRADE:  Well, first of all I would like to say, like EFF, let me share with you some thoughts today.  As we heard from the prior speakers, there is agreement that certain privacy rules should be revised and updated.  I want to focus my (off microphone), which has to do with liability exceptions in connection with privacy.  So in order to present this case, I'm going to focus on the work of search engines.

I'm going to take some ideas from a great article written by the former director of the Spanish DPA; he wrote a great article on Internet privacy and freedoms.  Search engines are critical: they show what is available online at the source.  They're the windows through which we can view the social landscape, with its good things and bad things, when we face possible attempts to threaten one's privacy.  Those attempts do not come from the people looking at information; they arise from the data being displayed.  And this data is stored on web sites, which search engines simply index and record.  How should we react to the possible attacks on our privacy that may derive from information that search engines show?  That is one of the questions.  We think that's… I think that we will agree that search engines shouldn't be taking decisions over the information.  They shouldn't be hiding existing information on the basis that it is a violation.  This would open the door to information manipulation.  This essentially … that is pretty dangerous.  So we think that search engines shouldn't be controlling … - okay - now you hear me.  We think that search engines shouldn't … information that exists.  If this were the case, it would be a falsified reality.  It would be the first step towards complete mistrust of search engines.

Therefore, it is those who are responsible for providing information who must respect that protection.  I - and it's (off microphone) and, if necessary, remove information or prevent others from accessing it; this can be done with just a line of code.  Let me give you an example of this.  We have the official gazette of a given country that purports to - promises to make public the acts of government.  (off microphone) a sanction involving embarrassing conduct of a person appears in the online version of the Spanish official gazette.  Every time you type the name of this individual, this information shows up.

In this case, it could be, for example, someone who had a need to do something in the street and was convicted for some - some indecent exposure in the street.  For this person who is suffering this, every time he types his name in the search, this kind of information shows up, and this has to do with a right that we have already discussed.

The search engine is just showing reality and is in fact helping with access to information.  It is the source that is making certain acts public, and it is the responsibility of that source, having made this information public, to take the necessary measures so that, for search engines, it is not found as easily as it was.
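
As a rough illustration of the "line of code" point - the idea that the publisher, not the search engine, is the right party to limit how easily a page is found - here is a minimal sketch of the robots exclusion convention that well-behaved crawlers check before indexing.  The site name and path are made up for illustration.

from urllib.robotparser import RobotFileParser

# What the publisher would serve at https://gazette.example/robots.txt
robots_txt = """
User-agent: *
Disallow: /sanctions/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A compliant crawler skips the disallowed page instead of indexing it.
print(parser.can_fetch("*", "https://gazette.example/sanctions/case-123"))  # False
print(parser.can_fetch("*", "https://gazette.example/laws/new-act"))        # True

Whether this kind of source-side measure is sufficient protection for the individual is, of course, exactly the policy question being debated on the panel.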

This has been so (off microphone).  They had to consolidate this, and they say the web site and the search engine should apply the applicable measures in order to prevent the indexing of personal data.  However, this approach has not been adopted in different countries.

In Argentina, for example, the National Court of Appeals ruled in favor of search engines on the question of search engine liability for … third-party web sites.  This was the case of a celebrity whose images or name were linked by web sites to sexual activities, but she never sought (off microphone) from these web sites and went directly against different search engines.  This has a lot to do with incentives, because if there is no incentive to go and try to track down the perpetrator of these activities, the intermediary - in this case a search engine, but it could be any online platform, like application platforms or even social networks or other kinds of online platform that really don't know what's inside the platform - will be put front and centre in terms of trying to correct this kind of behavior.  So technologies like search engines are not (off microphone) technologies.  Search engines and freedom go, and should go, hand in hand - let's not assume control of the information.

Otherwise, they would represent a danger for democracy and freedom.  Search engines should be instruments of freedom, democracy and knowledge.  In order to ensure this, it's important to include liability exceptions in new, revised privacy principles.

So an approach to this could be that users who … decide about the information that is stored, processed or referenced should be considered data controllers.  A data controller would be someone who decides about the content and use of the data, regardless of whether such data are collected, stored, processed or disseminated by him, by an agent or by an intermediary service provider.  This is something I bring to the audience for analysis.

Also, another very important issue has to do with freedom of expression and the pressure on Internet intermediaries, which is also a form of indirect censorship: instead of going directly to the source, an action is brought against a technical operator to remove the search results.  So thank you very much.



>> KATITZA RODRIGUEZ:  Thank you, everyone.  We have 45 minutes for dialogue.  I would like to start out with a specific question, maybe trying to pick up some of what everyone has said, and to hear you elaborate on a specific point.  From an individual perspective, citizens want effective control over their own personal information, and given the overall privacy complaints floating around, I would like to ask the companies and regulators present here how citizens can exercise control over their own personal information.  There was also a proposal on the table about the accountability project.  Please, can you elaborate more on that, and what is the actual status of the discussion held in some intergovernmental forums?  Furthermore, how do agencies plan to work together, and how will the accountability principles actually be enforced?  And taking into account the different standards being reviewed or revised so far, how will all those standards be harmonized?



>> JOSEPH ALHADEFF:  On the concept of control, it's less any specific regulatory language and more the idea that more and more people talk about the need for the user to be able to control their information.  As you get into the further discussion, the level of granularity of that control starts to become, or can become, overwhelming if everything is left to the individual to have to police on their own.  There are some individuals who are clearly up to the task.  If we provide them those tools, that's great.  A vast portion of individuals either would not wish to, or would not be able to, exercise that level of granular control, and those need to be accommodated; the comment from AT&T talked about the concept of usability.  The concept is not that the idea of control is bad, but to make control real, one has to understand how it's usable, in what context and at what level it's appropriate.  So if a person is expected to exercise control by micromanaging a supply chain, that's not going to work.  Understanding who is going to manage the supply chain is what's important to them.

On the concept of accountability, I think there's lots of work at the OECD and in Canada, and at APEC there is a memorandum; I would refer you to the phrasing of that.  In terms of how enforcement agencies would collaborate, we don't exactly know the shape of the accountability principles and how they're going to work, so this is a work in progress in which industry is working with data protection commissioners and others to discuss how these things work, to make them practical and effective with burdens minimized, to the extent they can be, in terms of bureaucratic filings and things of that nature.

Sometimes the word harmonization is used in place of regulation.  I don't know that harmonization is the right word.  You'll find differences based on legal frameworks, but we have to consider how they become more interoperable.  The laws of the jurisdictions will have to work in a way that is less conflicting than now, where you have obligations that don't flow through, or obligations that work at cross purposes.  So the concept on the regulatory side may be interoperability, and that might be a pathway towards harmonization.



>> KATITZA RODRIGUEZ:  I would like to hear from Rafael and Mr. Stevenson about their own views on the harmonization dialogue, and how it would be possible to reconcile the U.S. and European regulatory frameworks on this topic.



>> RAFAEL GARCIA GOZALO:  Thank you.  I'm not that familiar with the U.S. system, about how it works, but I have an impression, and this is my personal impression, that both the European legal systems and the American or U.S. legal system are evolving from, perhaps, different perspectives and in different ways compared to the situations we had before.  Let me give some examples.  One would be breach notification: this obligation, or this possibility, was unfamiliar within the European framework just some months or years ago, but it was part of the American privacy culture, and now we are trying to develop our security breach notification rules.  That would be an example of how the European system is evolving towards, let's say, adopting some of the American practices or experiences.  And now, as Hugh was mentioning, some of the, I would say, cornerstones of the American data privacy regime are being reviewed, trying to adapt them to the new situation, specifically because of the consequences of the Internet, but not only because of these consequences.

So I would say that we're following different paths, but we are converging.  Now, this international standards project that we adopted at the Madrid international conference, I would like to say just a few words about that.  It was the result of the work of a working group of authorities from around the world, as well as representatives from industry, academia and civil society, and the goal of this work was to develop a set of rules that could be universally accepted.  So it's not - I mean, it is a (off microphone) document; it follows the experience of all existing international documents, such as Convention 108, the OECD guidelines, the European Directive, the United Nations guidelines and the APEC framework.  We tried to identify common elements among all these international instruments and tried to pull them together in a coherent way.  That's the problem: it is sometimes difficult to take institutions from different geographic and legal origins and make them work smoothly.  That was our purpose.  I would say the international standards are not incompatible with all the international instruments or with national laws or regulations; they are just a model, something simple to get ideas from.  So it's very flexible and, as I say, it's not incompatible with other laws.



>> KATITZA RODRIGUEZ:  Thank you, Rafael.  I have the representative of the Council of Europe next to me.  I would like to ask her how she sees the role of Convention 108 within this discussion.  The reality is that despite all these regulatory standards, we have many examples of privacy fiascos like those that happened with social media, where companies have changed their privacy policies time after time, and personal information that we had previously decided to share only with a few friends then became public; our information is being disclosed to everyone.  This is not the protection we should have.



I would like to hear from the U.S. Government, the FTC, about the protection of privacy in the U.S., but I would also like to hear from civil society on these recent social network fiasco examples.  So I would like to hear about that.  Thank you.



>> HUGH STEVENSON:  In response to your last point, one of the challenges is conveying information in a way that people understand.  One of the challenges that comes up with the use of detailed privacy notices is that, while sometimes less is more, in this case more can be less in terms of people understanding what is there.  We've had experience in the U.S. with (off microphone) law requiring privacy notices to be sent to people, and they may serve certain purposes for accountability, but they don't serve a lot of purpose in terms of consumers understanding the detail they're being provided.  Sometimes it may be that what we need to do is to focus on what detail is most important for people to understand, and on when is the moment that they need to get that information to make a meaningful decision, so that consumers are, as a practical matter, really more in control of what they're sharing and of how they're sharing and using their information and making decisions about that.

On the other issue you raised about the international standards and whether harmonization is the right word - I think that's a better word - privacy and data protection law is a very large field.  It really covers a whole lot of different issues.  It covers so many different things and involves balances with so many other things, as the European Court of Justice decisions show: balances with freedom of expression, balances with intellectual property rights.

The Finnish case, a European human rights case balancing the need for (off microphone) - balances between human rights occur in various places in the law.  It's a large task to take on, thinking about how you bring the standards that we're using closer together.  It's really a broad area of law in which there are areas of considerable difference.  I think that's one of the challenges that we have here: how do you bring the standards closer together?  Part of that process is what Rafael is suggesting: having the conversation about where different pieces work.  How do the rules apply?  How can we cooperate?  We've had experience with this in other areas; my agency does consumer protection, and we look at the overlap of common approaches and common interests and really use that piece of it to begin the collaborative process and to work together on those kinds of things.  As Rafael says, there are definitely areas of overlap even where the rules are different; it's a long-term process.



>> KEVIN BANKSTON: I wanted to jump back a bit to talk about usability and then tie it in to social networks and privacy policies.  Somewhat in reply to Ellen and Joe, perhaps I'm seeing false dichotomies everywhere I look at this point, but it seems to me that granular control and usability are not necessarily opposing values or mutually exclusive, and that you can have both at the same time.  I'm not one to hold up Facebook as the paragon of good examples on privacy, but I think their privacy settings revamp in April demonstrates this.  It's still not perfect, but they substantially improved their privacy settings, making them simpler while at the same time making them more granular.  I don't think this is a tradeoff, or if it is, it's a false tradeoff like security versus privacy.  I think these are values that can be brought together simply with good design.

A part of that is a sentiment that Hugh pointed out: when is the right moment for the information to be presented to the user about what their privacy level is?  If we're counting on the privacy policy to educate the user about how the product works and affects their privacy, they're not going to know.  

It comes down to how the interface educates users about their privacy as they use it.  So, to use Facebook as an example on this point: one of the positive things it did was add a new setting so that, for each status you post, you can set whether you want the content available to everyone, just your friends, friends of friends, or a particular list of friends.  Yet in the current interface, to see the level you're about to publish to, you actually have to go and pull down a little setting; it's not immediately evident when you post, even though there's plenty of real estate on the page where they could easily say "this is about to go to everybody" or "this is about to go to your friends."  So I think there are a great many opportunities in terms of interface design to make usability and control "harmonious", to use a word that's been going around, and we should look beyond the privacy policy for opportunities for "just in time" disclosure about how information is being used.
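
To make the idea concrete, here is a minimal sketch in Python (hypothetical, not Facebook's actual implementation) of a per-post audience setting that is surfaced at the moment of publishing rather than hidden behind a pull-down:

from dataclasses import dataclass, field
from enum import Enum
from typing import List

class Audience(Enum):
    EVERYONE = "everyone"
    FRIENDS = "your friends"
    FRIENDS_OF_FRIENDS = "friends of friends"
    CUSTOM_LIST = "a custom list of friends"

@dataclass
class StatusPost:
    text: str
    audience: Audience = Audience.FRIENDS              # set per post, not per account
    custom_list: List[str] = field(default_factory=list)

def audience_banner(post: StatusPost) -> str:
    # "Just in time" disclosure: say who will see the post before it is published.
    if post.audience is Audience.CUSTOM_LIST:
        return f"This is about to go to {len(post.custom_list)} selected friends."
    return f"This is about to go to {post.audience.value}."

if __name__ == "__main__":
    draft = StatusPost("Greetings from IGF 2010 in Vilnius!", Audience.EVERYONE)
    print(audience_banner(draft))   # -> This is about to go to everyone.

The point of the sketch is only that the banner is computed from the same per-post setting the user controls, so the "just in time" disclosure cannot drift out of sync with what will actually happen when the post is published.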



>> We need to think of other tools.  We need to think of privacy centres, where users can easily reach a lot of information about how their privacy is cared for.  Also, it is not enough to try to educate the user through the privacy policy alone; this kind of education should be included from elementary school and high school onwards.  When we analyse a platform we analyse its speed, but we need to start analysing how it treats our privacy, and that should become a competitive advantage when choosing one platform over another.  I also think this is important in terms of changes to privacy policies: we should push for simpler privacy policies, presenting the information (off microphone) in a way that makes it easier for users to read and understand.  We also need to inform users about changes in advance, giving people time before we officially change the policy so they can adapt and make an informed decision about how to use the service.



>> KATITZA RODRIGUEZ:  I would like to ask Cristos Velasco, our remote moderator, whether any remote participants have questions.



>> CRISTOS VELASCO:  Good morning.  I am Cristos Velasco from NACPEC.  I have two comments and one question from a remote hub located in Argentina.  First, somebody referred to the Madrid International Standard on the Protection of Personal Data and Privacy that emerged from the last Privacy and Data Protection Commissioners' Conference in Madrid.  This standard has been very helpful for developing countries that have not yet enacted data protection laws.  In the case of Mexico it was very useful, and it came right on time, because Mexico had struggled for more than ten years to enact a data protection law for the private sector; finally, last July, the executive enacted a data protection law that regulates personal data and information in the possession of the private sector.

The Madrid resolution helps out in many ways; however, reading and reviewing the whole document, I believe it is very silent on the issue of cross-border jurisdiction and conflict of laws, particularly when more than two countries want to pursue a data protection investigation that they are legitimately entitled to pursue.  This is going to start happening more and more in developing countries and in Latin America.  First of all, I would like to know your thoughts on this issue: how could such conflicts of laws be avoided or resolved in the future, since the topic of this panel is The Future of Privacy?

Another challenge for developing countries is capacity building: the availability of financial and human resources, and training for the staff of the data protection authority so that it can enforce the national data protection legislation.  I would like to hear your views on these aspects as well.



Finally, there is a remote participant from Argentina, Analia from the University of Buenos Aires, who would like to hear the opinion of this panel on whether it is imperative to have data protection rules for RFID technologies.  Thank you.



>> Thank you, Cristos.  With regard to the international standards, we know that some countries that have been developing new rules are following, or at least looking at, the international standards; that was one of the aims of the standards.  We know this is happening in Mexico, and it is happening in other countries as well.

About the absence of specific provisions on cross-border jurisdiction and conflict of laws, the answer is very simple: it was impossible to reach an agreement on that front.  As I said before, we were looking for commonalities and common ground, and that is the way it works; we simply did not manage, and did not have the time, to develop provisions on conflict of laws and jurisdiction, and, as I say, we did not reach an agreement on that point.

I should say that it was the opinion of the working group, and also of the international conference, that as the harmonization of legal regimes advances it will become easier to solve these types of problems.  Of course, it will not solve the problem by itself, but it will make things a lot easier, because even if you are not the data protection authority dealing with (off microphone) a specific case, you will be dealing with an authority whose rules may be similar to the ones you apply, with elements in common.  As I said, this is not the solution, but it is part of the solution.  On that specific point, I have to confess, it is a story of failure.



>> You are   



>> One.



>> You have to answer another one.



>> Only one left to answer, yes.  I want to ask because there was a previous question that was not answered, regarding what the relationship is with Convention 108 and how it differs from that instrument.  You could answer that question, and then, if you have a question from the floor, feel free to raise your hand.



>> Just in terms of RFID, I think this was a finding of (off microphone) a couple of years back, but there is no conceptual reason why the principles of the Directive should not apply to RFID.  The question is not that you need new principles; you need to think about how some of them apply.  Within the context of RFID, there is very little space to provide you with a notice.

One of the things to consider, when you think of notice in RFID, is whether it is the equivalent of a washing-machine label.  There may be a tag, it may be encrypted, and it may have a potential for (off microphone); it is not that notice does not apply, but a question of how you apply notice.  The same goes for the OECD principles: it is not that they do not apply, but a question of how you apply the principles in the RFID context.
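
As a purely illustrative sketch of "how you apply notice" when the tag itself has almost no room for text, one approach (hypothetical names, not any standard's actual format) is for the tag to carry only an identifier plus a short reference that a reader or kiosk resolves to the full, human-readable notice:

from dataclasses import dataclass

@dataclass
class RfidTagPayload:
    item_id: str      # e.g. an EPC-style item identifier
    notice_ref: str   # short code pointing at the full privacy notice

# Assumed, illustrative registry; in practice this could be a networked lookup service.
NOTICE_REGISTRY = {
    "N1": "This garment carries an RFID tag used for stock control. "
          "The tag can be deactivated at the till on request.",
}

def notice_for(tag: RfidTagPayload) -> str:
    # Apply the notice principle by resolving the tag's reference,
    # instead of trying to fit the whole notice on the tag itself.
    return NOTICE_REGISTRY.get(tag.notice_ref, "No notice registered for this tag.")

if __name__ == "__main__":
    tag = RfidTagPayload(item_id="urn:epc:id:sgtin:0614141.107346.2017", notice_ref="N1")
    print(notice_for(tag))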



>> Can we comment on that?



>> Well, in principle I agree with that view.  But because RFID is new, and because the environment is not so simple, I don't think we can completely disregard the possibility of specifying how data protection laws apply.  Companies have been given the responsibility of defining how this will apply, and it remains to be seen whether that will work in practice.

>> GENEVA:  My name is Geneva, I'm an instructor in Turkey and I'm also    okay, is it better now?  I'm Geneva, I'm an instructor in Turkey and I'm (off microphone) ambassador.  I also believe in the future of privacy; it is an effective tool and must not be underestimated, because I believe national legislation could be really effective, more flexible and more up to date, as well as serving the general purpose we have when it comes to harmonization and cooperation.  I also have a question about consent, in reviewing the documents and instruments for privacy protection and data protection: how do we make sure consent is really freely given, especially when I think about Internet services and (off microphone) those services?  Thank you.



>> I'll jump in, I guess.  There is no way around it: we need to continue improving the mechanisms we use to get people's consent and to inform them of what they are consenting to.  When we have been talking about improved usability and control, that is certainly what I meant.



>> CATHERINE POZZO di BORGO:  Okay.  If I can add something about consent: this is what we are thinking about in the Consultative Committee of Convention 108.  Consent cannot be just consent; I think it needs qualification, I mean, to be defined.

That is why I said it should be defined as being, you know, given fairly, with enough information, freely given, specific, et cetera.  So I think we have to reflect on this inside the group.  And if I can just add something about harmonization, which we were talking about a bit earlier, I should have mentioned that the Council of Europe is thinking of reviewing the Convention, not so much in relation to new technology as, I think, to open it to the world, if I may say so.  Legally speaking, the Convention is a treaty, and I think the Council of Europe wants to open it to the world and to third parties; it will call for signatures from states which are not European, from, you know, third-party states.

Thank you.  



>> KATITZA RODRIGUEZ:  Any other questions?



>> KEVIN BANKSTON:  Yes, I just wanted to add a quick comment on consent.  I don't have a magic bullet or a good answer for this, but I do think we need to reconsider what we mean by consent.  I'll illustrate by example: I have iTunes on my Mac, and when the software was recently updated it presented me with a 55-page terms of service and a checkbox that says "I have read, understand and agree to these terms of service."  I'm a lawyer, and even I don't have the time to read that; even as a lawyer with expertise in this area, I probably don't have the ability to spot all the issues in it or to understand every provision.  And yet this fiction that we all actually read those 55 pages, understood them and agreed to them is, at this point, the basis of all eCommerce.  That is not sustainable.  I do not know what the answer is, but that cannot be it.



>> SERGE KISOVICH:  Hi, I'm Serge Kisovich.  We have been talking about getting the data protection agencies to the international standards and so on, but I would like to play devil's advocate from the perspective of competition: what would happen if a national data protection supervisory agency were able to enforce strict supervision of data within its own jurisdiction, as they did in Switzerland for example, and competition between national data protection standards, rather than cooperation, did not bring the advantage we hope for?



>> We both have the fear here that there would actually be the opposite possibility, which is lax enforcement of data protection to attract companies with (off microphone), I don't know.  That would be my concern.



>> I think the problem we keep struggling with in industry, I know, is that the market for increased privacy protection seems to be nascent.  Those of us in the field wish there were stronger demand from consumers.  The complexity of the market has made it harder for consumers to cast their dollar votes on privacy, and that is something I think we all struggle to overcome, to expose the market we believe is out there so that companies can put their resources into innovating.  So, in light of that, I think you might end up with the opposite problem: people putting their data into the least secure place, and consumers being drawn to use it.



>> Actually, I think it could happen both ways.  What you mention is actually already happening: companies can choose the most lenient authority when they need authorization to transfer data from a member state to others.  I hear more companies saying, if I go to this authority, it sets the threshold very high; another will say this is fine; but there are still the others.  In a way, this is taking place already.



>> KATITZA RODRIGUEZ:  Any other question, please?



>> AUDIENCE:  Yeah.  (off microphone) Italy.  My question is about The Future of Privacy, and actually The Future of Privacy (off microphone).  We used to think the invasion of privacy comes from governments or from corporations and businesses, but I think that more and more the invasion of privacy comes from other online users; I am thinking of children and grown-ups, thinking of Facebook, (off microphone).  So do you think that technology regulation is going to take into account that sort of user-to-user engagement?



>> KATITZA RODRIGUEZ:  Let's take two or three questions and then reply, because time is running out.



>> I'm (off microphone) from Brazil and I represent Internet providers on the Brazilian committee.  I would like to ask the Council of Europe representative about Convention 108: is this expansion based on the (off microphone) document as it stands, or is it open to discussion and revision?



>> KATITZA RODRIGUEZ:  Two more questions, please.  There's one here, too.



>> GERSHON JANSSEN:  Okay.  I'm going to ask a question to the full panel.  Today during this session I heard a lot about the policy side of privacy, which is of course very important and the basis of everything, but the operational side of privacy has been left undiscussed.  I'm Gershon Janssen, representing the OASIS open standards organization; we focus on technical standards, and we believe there is a need for privacy standards at an operational level in order to implement privacy management and privacy rules.

We recently started a new technical committee called the Privacy Management Reference Model (PMRM for short), and from that perspective I'm interested in what the panel has to say on the following: the focus now is on the policy aspects of privacy, but what about the technological consequences, how to implement this, and how to assess a given system's compliance with privacy rules?  What measures or standards do you think are required?  Thank you.
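
As a toy illustration of what "operational" privacy can mean (this is not the OASIS PMRM itself, and every name below is hypothetical), a policy statement such as "data collected for purpose X may only be used for purpose X" can be expressed as a machine-checkable rule that a system evaluates before each use:

from dataclasses import dataclass
from typing import Set

@dataclass
class PrivacyRule:
    data_category: str            # e.g. "email_address"
    permitted_purposes: Set[str]  # purposes declared when the data was collected

@dataclass
class UseRequest:
    data_category: str
    purpose: str

def is_compliant(rule: PrivacyRule, request: UseRequest) -> bool:
    # Purpose limitation as an operational check: a use is allowed only if its
    # purpose was among those declared for that data category at collection time.
    return (request.data_category == rule.data_category
            and request.purpose in rule.permitted_purposes)

if __name__ == "__main__":
    rule = PrivacyRule("email_address", {"account_management", "service_notices"})
    print(is_compliant(rule, UseRequest("email_address", "service_notices")))  # True
    print(is_compliant(rule, UseRequest("email_address", "ad_targeting")))     # False

The only point of the sketch is that once a privacy rule is machine-readable, compliance can be tested and audited automatically rather than merely asserted in a policy document.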

>> KATITZA RODRIGUEZ:  One more.



>> GRACE BOMU:  I'm Grace from Kenya, and I'm asking this in my personal capacity.  I'm asking about the (off microphone): how far has the thinking on this come, for example in the context of Europe and even America?  Have we got to a point where there is a bit of consensus on, for example, how long it should take before my data or my expressions on the Internet are forgotten?  Are we close to getting to a sort of time limit, say three years, after which the things I said when I was younger are completely forgotten?



>> KATITZA RODRIGUEZ:  Who wants to start answering?



>> I can respond to the question on, you know, the right to be left alone.  At the moment we are thinking more about the right, you know, to be forgotten on the Internet, but maybe it is not a good idea; we don't know yet, we have to discuss it.  And maybe another idea would be, indeed, to introduce the idea of fixing a date at which all the data would become expired, and in that way to think about the right to be left alone or forgotten.
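
In code terms, the "fixed expiry date" idea mentioned above could look something like the following minimal sketch (hypothetical, tied to no particular law or product): every stored item carries an expiry timestamp derived from a retention period, and a scheduled job deletes whatever has passed it:

from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

RETENTION = timedelta(days=3 * 365)   # e.g. the "three years" from the question

@dataclass
class StoredItem:
    owner: str
    content: str
    created_at: datetime

    @property
    def expires_at(self) -> datetime:
        return self.created_at + RETENTION

def purge_expired(items: List[StoredItem], now: datetime) -> List[StoredItem]:
    # Keep only items whose expiry date has not yet passed.
    return [item for item in items if item.expires_at > now]

if __name__ == "__main__":
    store = [
        StoredItem("user_a", "an old post", datetime(2006, 1, 1)),
        StoredItem("user_a", "a recent post", datetime(2010, 6, 1)),
    ]
    print(len(purge_expired(store, datetime(2010, 9, 14))))   # -> 1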



>> To answer the question on the new (off microphone) of the net user, now in control of the technologies rather than passive: the user is no longer only receiving information; the user is creating information and uploading information.  I would say this is one of the most characteristic changes of Web 2.0, and you are right, it has been difficult to take that into account.  From the regulators' perspective, once again, it has been difficult to accept that someone who is simply on Facebook, or has a blog or something like that, is also in some sense responsible for what he or she is doing, and to put that into perspective.  As I say, it has been difficult, but it is happening.

But, again, your specific question was whether technology can solve this problem of the user being an active player on the Web.  My answer would be: not by itself.  I think technological elements help, but again, this is something that has to do with the way people behave.  I always use the same example: it is like traffic.

You may have an excellent car with all the safety features (off microphone) and still drive it improperly; the user is in control of it.  The situation here has the same profile: technology may help, and may be really useful in some cases, but at the end of the day it comes down to individual behaviour.



>> KATITZA RODRIGUEZ:  I would like to give the mic to the representative of the Council of Europe so she can provide a final answer to the question from the representative from Brazil.  Then I would like to give the floor to Christine for her final remarks.  I would also like to take the opportunity to thank everybody for attending this workshop.  There was not enough time to address all the issues and questions we touched on today, but we hope to continue this ongoing dialogue.



>> CATHERINE POZZO di BORGO: Just because I forgot to respond to the question coming from Brazil: I think that third countries will be called on to sign if they want to adopt Convention 108.  That is the reason why we are thinking about revising the Convention and why we have decided to examine it completely.  So I think, you know, the Council of Europe is ready to welcome new parties and is ready to carry out a revision of the Convention for that reason.



>> CHRISTINE RUNNEGAR:  Okay.  Thank you.  Just a few final words; four things.  First, I would like to thank everyone for coming along today and participating in the workshop.  Secondly, to pick up on what OASIS Open said, it is very important in the discussion about privacy not to forget to include the technical community.  Thirdly, just to point out that the Internet Architecture Board also provided a position paper as part of the contribution to the Internet Society presentation today.  I have a copy available here with me if anyone would like one; otherwise you can find it online at the URL I provided on the slides.  Finally, this is a bit of an advertisement.  We would like to capture your views on privacy.  Down the back somewhere is my colleague, Greg Wood; Greg, if you would stand up.  You can make yourself known to Greg, and he would like to do a mini-interview with you in a nice, quiet room and capture your views on The Future of Privacy.  Thanks again.



(Applause).   

********