Privacy Protection in the Federal Public Service

Unresolved Issues of the Digital Era

by Chantal Bernier. Posted May 12, 2015


Introduction

In a single day, March 23, 2015, privacy protection in the Federal Public Service came under the spotlight for two reasons: the Ottawa Citizen reported that complaints about weak security measures protecting personal information in the Federal Government had reached a record high, and CBC and Radio-Canada disclosed new Edward Snowden revelations questioning the legality of the collection of personal information by the Communications Security Establishment Canada (CSEC).

As is the case for all other institutions, privacy protection in the Federal Public Service in the digital age has become an unprecedented challenge, in its importance as well as in its nature. Even experienced managers find themselves unequipped to deal with the convergence of two towering phenomena: information technology that is wreaking havoc with all traditional patterns of data protection, and a public security environment that calls for the collection and analysis of personal information at an unprecedented rate. The coupling of an entirely new data collection capacity with a new appetite for it forces us to question established schemes of protection and to develop new measures in this area.

Beyond the technical measures required by these new information technologies, the Federal Public Service must update its privacy protection policies so that they reflect the unique challenges of these technologies. If no corrective legislative measures are taken (and there is no real appetite for them), the Treasury Board Secretariat (TBS) becomes the main source of standards for privacy protection in a digital environment. I propose five main steps, drawn from observations I made while leading the Office of the Privacy Commissioner of Canada (OPC) over almost six years. I start with the main challenges that OPC studies have identified within the Federal Public Service as a result of the arrival of digital technologies:

  1. Management of new information technologies’ vulnerabilities;
  2. Definition of personal information in the digital environment;
  3. Debate on the storage of personal information in the cloud;
  4. Differential repercussions of the Internet on the need for public transparency and privacy requirements; and,
  5. Emerging challenges.

 

I will treat them separately in order to define the issues, and propose policy directions for the protection of personal information.

Management of New Information Technologies’ Vulnerabilities

The risks to personal information protection brought on by the arrival of new information technologies in the Federal Public Service, as everywhere else, can be summarized as follows: i) their complexity is such that it overwhelms the ordinary abilities of employees and senior staff; ii) these technologies hold data on devices so small that the best controls can miss them, and so powerful that if the data is lost, the privacy of thousands of individuals can be compromised in one fell swoop; iii) the virtual mode of access to these technologies complicates control over that access; and iv) the permanent files created, correctly or erroneously, can be disseminated massively, appropriately or not.

Through all the privacy breaches I saw, especially in the digital environment, these risks materialized around four constants: i) the small size and enormous complexity of the devices become increasingly challenging because of insufficient employee digital training; ii) governance structures are incomplete relative to the reality of the risks; iii) protection from indiscretion is lacking; and iv) new technologies are adopted without proper assessment of the risks involved.

To illustrate the lack of digital literacy: an employee left an unencrypted USB key on the desk of a colleague, without any physical protection, thinking that a USB key was more secure than an e-mail. To this day, the USB key is missing. It contained medical information concerning approximately 6,000 individuals. In another case, a lack of digital literacy led an employee to record in his electronic organizer the reason for a meeting with another employee (disciplinary action), unaware that the content of his organizer could be read by 17 people who, moreover, knew the employee.

This lack of literacy stems from a governance weakness: employees are not properly trained before being allowed to use information technology devices.

These incomplete governance structures have been found in OPC studies even in departments with excellent personal data protection policies. Simply put, those policies were not accompanied by an effective implementation mechanism. For example, mobile devices were not identified, registered, or entrusted to anyone. With no one responsible, protection of the devices was entirely lacking. Devices containing personal data were lost and never found because there was no mechanism to protect them, hence none to find them, or at least to trace the persons responsible for their protection.

The lack of protection from indiscretion is also quite frequent: OPC studies from 2008 to 2014 uncovered the severity of the problem within the public service as well as in the private sector. In the Federal Public Service, we witnessed employees searching the medical records of a former lover, distributing the tax returns of celebrities, or accessing the tax returns of new lovers and their families. Even if these indiscretions are quite rare, they reveal the systemic weaknesses that make them possible: access authorizations are too broad, and controls such as journaling and review are insufficient.

We already know the main differential repercussion of the digital environment: the smallest mistake can cause enormous damage. For example, a file that really got my attention led me to write “Ten Things HR Professionals Need to Know” in 2012. It was a case in which a Director General’s capability assessment was mistakenly sent to 321 colleagues. The mistake: someone hurriedly pressed a button without being aware of the consequences; furthermore, the department had not restricted the distribution of human resources information through e-mail. The result: the humiliation of the person and damage to her reputation. It also triggered an OPC investigation. I would venture to add, along with the harm to the person, the loss of employee confidence in the department’s management of personal information.

How can these blunders be avoided? Among my “Ten Tips”, the recommendations regarding the digital environment are:

  • Avoid sharing sensitive information electronically, even though electronic means have become the default for every other kind of communication;
  • Continuously ensure that an employee has mastered a technology before handing it over as a working tool, and test the user’s proficiency; and,
  • Develop a regime of access authorizations that is as restrictive as possible while preserving the functionality of the organization, support it with a journaling process, and regularly review the access data the journal contains.
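The journaling-and-review control in the last tip can be sketched in a few lines. This is an illustrative sketch only, under assumed names (`AccessJournal`, `authorizations`); it is not drawn from any TBS or OPC system.

```python
from collections import defaultdict
from datetime import datetime, timezone

class AccessJournal:
    """Append-only journal of record accesses, with a periodic review step.

    Illustrative sketch: a real system would persist entries and tie
    authorizations to a managed identity store.
    """

    def __init__(self, authorizations):
        # authorizations: employee id -> set of file ids the employee may open
        self.authorizations = authorizations
        self.entries = []

    def record(self, employee, file_id, when=None):
        # Journal every access, authorized or not; review happens afterwards.
        self.entries.append((employee, file_id, when or datetime.now(timezone.utc)))

    def review(self):
        # Flag accesses that fall outside the employee's authorized scope.
        flagged = defaultdict(list)
        for employee, file_id, _when in self.entries:
            if file_id not in self.authorizations.get(employee, set()):
                flagged[employee].append(file_id)
        return dict(flagged)

journal = AccessJournal({"alice": {"F-001", "F-002"}, "bob": {"F-003"}})
journal.record("alice", "F-001")   # within alice's authorization
journal.record("bob", "F-002")     # outside bob's authorization
print(journal.review())            # {'bob': ['F-002']}
```

The design choice matters more than the code: authorizations are restrictive by default (an unknown employee has an empty set), and every access is journaled so that the review can surface indiscretions after the fact.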

 

However, the complexity of information technologies does not affect employees exclusively. Senior public service managers, experts in economics or political science, do not necessarily have the reflex of taking ownership, as they should, of privacy protection on new technology devices. This is what the OPC’s 2010 audit of the use of wireless technologies within five federal entities concluded. All five entities had adopted those technologies, and none had carried out an adequate risk assessment. The expected consequences followed: employees did not protect their devices with strong passwords, the devices were not kept in a safe place, and adequate protection policies had not been established because the risk had never been determined. I believe, however, that this complacency has now given way to greater acknowledgment of the risk, especially since the loss, at Employment and Social Development Canada, of a hard drive containing the financial data of some 600,000 individuals.

My recommendations in this regard were part of the Special Report of Inquiry into this incident, submitted to Parliament on March 25, 2014. Briefly, they were:

  • Protection of personal data in the digital environment should be addressed as an ecosystem of interacting components, i.e., physical, technological, administrative and employee security checks, including the digital literacy needed to handle those work tools.
  • Protection of personal data must be treated as an institutional issue, not a distinct and separate one belonging exclusively to information technology administrators or to the access to information and privacy office. Its implementation must be accompanied by a governance structure that:
    • Reflects the accountability regime established by the Privacy Act, which assigns this responsibility to the most senior public servants within an organization; and
    • Ensures that the supervision the Act requires is present at all levels, so that this regime of personal information protection is respected.

Definition of Digital Personal Information

The Internet has challenged the established definitions of both “personal information” and the private sphere. Two notions have recently been contested within the Federal Public Service: whether an Internet subscriber’s data and IP address are private, and whether access to personal accounts on social networks is free or protected.

1. Personal Data on the Internet

The question of whether the subscriber information tied to an IP (Internet Protocol) address (the subscriber’s name, address and other identifiers) constitutes personal information became pressing in recent years, as successive bills would have allowed executive and security authorities to access these data without court approval. Much of the argument turned on two contradictory understandings. One held that an IP address and the related subscriber data carry no more significance than a phonebook listing, and that the absence of such a phonebook for the Internet cannot determine the legal status of the data. The other, with which I agree, held that the subscriber data behind an IP address constitute a key to the subscriber’s inner life by giving access to his Internet searches (his areas of interest, his worries, his allegiances) and, consequently, cannot be reduced to the static and limited data of a physical address and a phone number.

In June 2014, in its decision in R. v. Spencer, the Supreme Court ended the debate: it declared that the subscriber data behind an IP address, by giving access to Internet searches, are so revealing as to constitute protected personal information, to which law enforcement can gain access only with court authorization.

The consequences for the Federal Public Service are mostly felt within the RCMP and CSIS, but they have a wider reach: the interpretation of personal information under the Privacy Act now extends to the subscriber identity behind an IP address.

Consequently, the federal institutions have to abide by the following constraints:

  • The subscriber data behind an IP address, or an IP address that can lead to the identification of the subscriber, can only be collected if there is a direct link with the programs or activities of the institution; and,
  • These data must be obtained from the individual concerned, unless doing so would defeat the purpose for which they are collected (for example, a police investigation).

 

The OPC’s analytical framework, A Matter of Trust: Integrating Privacy and Public Safety in the 21st Century, published in 2010, sets out four steps for integrating privacy protection obligations into public security measures. They also apply to a regime of access to personal data on the Internet:

  • Establishing the legitimacy of the measure on the basis of empirical data proving its necessity, its proportionality, its efficiency relative to the need, and the absence of less intrusive alternatives;
  • Implementation of security measures in order to protect data gathered and used legitimately;
  • Development of an internal governance framework that ensures conformity with these security measures; and,
  • Development of an external and internal supervision framework that ensures the accountability of the organization in regard to its duties regarding privacy protection.

 

Therefore, privacy protection is not a hindrance to the carrying out of the main duty of Canada’s government, i.e., to protect the security of its population. It rather provides an implementation framework that protects fundamental freedoms as well as personal security.

2. Access to Personal Accounts on Social Networks

Whether federal institutions have a right of access to individuals’ personal accounts has been contested in at least two major OPC cases: a Privacy Impact Assessment (PIA) of a program that would have allowed surveillance of public servants’ social network accounts in order to monitor their political activities, and a study of the surveillance of an activist’s Facebook account by two departments.

In the PIA’s case, the OPC’s reaction had a sobering effect on the project: it violated Section 4 of the Privacy Act, as there was no direct link between the project and the activities or programs of the institution. Even if public servants are required, in varying degrees depending on their position, to refrain from political demonstrations, and the public service is entitled to ensure that this rule is obeyed, the broad data gathering inherent in monitoring social network accounts would have far exceeded what was necessary to ensure that these restrictions on partisan activities were respected.

The project was strongly criticized by the upper echelons of the public service following the OPC’s comments; it nevertheless illustrates how far digital surveillance capacity can stray in the absence of a framework.

Another illustration of this phenomenon appeared in a 2013 OPC study. An activist alleged that two departments had gathered her personal information from her Facebook account. Neither department denied it. However, both argued that they had not broken the Privacy Act: Facebook accounts are in the public domain, hence the information appearing there is also public and not protected by the Act.

The OPC rejected this argument: information does not lose its character as personal information simply because it is available on the Internet. It still belongs to an identifiable person and is intended for selected people, not the government. And if there is no direct link with the department’s programs or activities, it remains out of bounds for the department in question.

The study highlights the uncertainty surrounding the legal status of personal information deliberately posted on the Internet. To clarify this status and the obligations of federal institutions in this regard, the OPC’s Special Report to Parliament of January 28, 2014, Checks and Controls: Reinforcing Privacy Protection and Oversight for the Canadian Intelligence Community in an Era of Cyber-Surveillance, recommends:

  • Regulating access to open sources of personal information accessible to the public; and,
  • Developing guidelines that specifically define the gathering, use and dissemination of personal information online and on social media sites.

 

This recommendation remains valid … and awaits implementation.

Hosting Personal Information in the Cloud

Governments’ commitment to protecting data on the Internet has led some to demand that government data be hosted on national territory. In practice, this excludes government institutions from the financial and functional benefits of the cloud, because cloud computing suppliers are mostly American. Edward Snowden’s revelations in June 2013 increased mistrust to the point where governments that had planned to loosen these rules had to backtrack.

The Government of Canada has, wisely, not imposed hosting of electronic data in Canada. However, the Canada Revenue Agency reserves the right to allow, or not, the storage of accounting and financial information outside of Canada. The increasing use of cloud computing casts doubt on the pertinence of this rule, which at the very least needs to be explained in the cloud computing environment.

The British Columbia and Nova Scotia governments require their institutions to store their data within Canada, subject to a few exceptions, thereby excluding or complicating the use of cloud computing. I believe that requiring electronic data to reside in Canada, however well intentioned, weakens the security of personal data because it eliminates a particularly secure hosting platform: the reputable cloud computing supplier.

I will now move to the strategic factors that should guide Federal institutions for converting to cloud computing.

1. Benefits and Risks of Cloud Computing

A policy paper published jointly by the OPC, the Office of the Information and Privacy Commissioner of Alberta and the Office of the Information and Privacy Commissioner of British Columbia, and a Fact Sheet authored by the OPC, describe the benefits and risks of data stored with cloud computing hosts. In its favour, cloud computing is an on-demand Internet service that does not require users to have their own technological infrastructure, allowing for “on-demand self-service, broad network access, resource pooling, rapid elasticity and measured service”. As a result, the user saves money, benefits from a lighter management load and improved efficiency and, where the supplier is dependable, enjoys greater data security because the data is handled by professionals. In this regard, the OPC clarifies:

“For businesses that are considering using a cloud service, cloud computing could offer better protection of personal information compared with current security and privacy practices. Through economies of scale, large cloud providers may be able to use better security technologies than individuals or small companies can, and have better backup and disaster-recovery capabilities. Cloud providers may also be motivated to build privacy protections into new technology, and to support better audit trails.”

Among the potential risks of cloud computing, the OPC cites the physical distance of the data hosting sites, the supplier’s multiplicity of clients, the possible misuse of the data (using it for ends other than those for which it was gathered) and, because of the low cost of storage, the retention of excessive amounts of data.

The OPC concludes that, in regard to the implementation of cloud computing, “Privacy is not a barrier, but it has to be taken into consideration.”

One can summarize as follows the relevant factors the Federal Government should consider when implementing cloud computing:

  • How is the existing infrastructure improved if cloud computing is adopted?
  • Which data can be stored in the cloud, and according to what criteria?
  • How would users of government services know that the data is in the cloud?
  • Is the cloud supplier dependable, and certified under ISO/IEC 27018 for privacy protection in cloud computing?

 

This brings me to the ideal combination: the technological security of cloud computing by renowned suppliers, paired with a contractual mechanism that ensures conformity with the ISO/IEC 27018 cloud computing security certification.

2. The ISO/IEC 27018 Standard for Privacy Protection in Cloud Computing

The OPC acted upon its beliefs concerning cloud computing: for a long while, the Office provided its expert advice for the development of standard ISO/IEC 27018 Information technology — Security techniques — Code of practice for protection of personally identifiable information (PII) in public clouds acting as PII processors, adopted on April 25th, 2014.

This standard dramatically increases the security of personal information in the cloud by creating a certification baseline that combines the supplier’s technological strength with a framework ensuring that conformity is genuinely solid: by contract, whose observance is verified through audits, the client organization using cloud computing keeps control over the data, and the supplier cannot use it for any purpose other than those defined by the client. Moreover, the supplier must support the client in meeting the client’s own legal obligations. Finally, the cloud computing supplier’s obligations are subject to audit by the client as well as by the certifying organization, to ensure they remain adequate. To be certified, the supplier must implement all the security measures required by Standard ISO/IEC 27018. A supplier that does not respect the standard loses not only its clientele but also its certification.

How is this normative development relevant to the Canadian Public Service? It allows, at lower cost, maximal protection of personal data by storing it in the most sophisticated technological infrastructures, under the most efficient and demanding governance model. Government institutions are increasingly asked to share their data beyond Canada’s borders and to hire suppliers to enhance the efficiency of their services when they do not possess the resources to render those services themselves. The ISO/IEC 27018 Standard is universal and accepted by the various players in the transborder flow of data.

Following the recommendations of the OPC, the Treasury Board Secretariat (TBS) has published, in the framework of its policies of information management, the document Privacy Matters: The Federal Strategy to Address Concerns About the USA Patriot Act and Transborder Data Flows, as well as the Guidance Document: Taking Privacy into Account Before Making Contracting Decisions. These documents should now be supplemented by the ISO/IEC 27018 Standard. But first, let’s have a look at the ISO Standards.

ISO, the International Organization for Standardization, and IEC, the International Electrotechnical Commission, together constitute the specialized system for international standardization. Their members include states, institutions and experts. Compliance with an ISO standard is certified by an organization accredited for that purpose. Certification is maintained, or revoked, following regular audits.

TBS already uses universal ISO Standards. For example, the TBS standard for geospatial data is based on the implementation of ISO Standards 19115 and 19128. The ISO/IEC 27018 Standard would be the perfect and most comprehensive contractual model for implementing the conversion of federal institutions to certified cloud computing, thus achieving economies of scale and greater data security.

Avoiding cloud computing is obsolete; adopting it without guidance would be irresponsible. The adoption of the ISO/IEC 27018 Standard by TBS would show other federal institutions the way towards secure cloud computing for personal data according to universally recognized parameters.

Balance Between Public Transparency and Privacy

The legislative framework defining the balance between transparency and privacy rests on the complementarity of the Privacy Act (PA) and the Access to Information Act (AIA). Section 19 of the AIA bridges the gap between the two: it forbids the head of a federal institution from disclosing documents containing personal information as defined by the Privacy Act, i.e., information about an identifiable person. There are three exceptions: the identifiable person consents to the disclosure; the public already has access to the information; or the AIA allows disclosure in a specific case.

The weakness arises with administrative tribunals: since the Internet became public, the imperative of open proceedings, a principle that rivals the fundamental right to privacy in strength, has taken on entirely new consequences. In fact, I am of the opinion that the transparency of judicial tribunals should also be reviewed in light of the differential consequences of the Internet. But judicial tribunals are not part of the Federal Public Service. Administrative tribunals are, and are therefore subject to the Privacy Act.

The Federal Public Service has eleven administrative tribunals, of which four work regularly with personal data: the Canada Agricultural Review Commission; the Public Service Labour Relations Board; the Human Rights Tribunal; and the Social Security Tribunal. The Public Servants Disclosure Protection Tribunal also publishes decisions that contain personal, even highly sensitive, information, but it is subject to such disclosure restrictions that the tension between transparency and privacy is resolved within the legislative framework applicable to that tribunal. The other tribunals, for their part, are proceeding quite cautiously towards resolving the natural tension between transparency and privacy.

In 2009, the OPC published, jointly with its provincial and territorial counterparts, the guidance document Electronic Disclosure of Personal Information in the Decisions of Administrative Tribunals. The incentive to act came from observing real cases of the Internet’s differential consequences for the transparency principle. It is worth noting that, in the Internet environment, the exposure created by the transparency principle falls not on the tribunal, whose impartiality the principle is meant to ensure, but on the parties, whose identity is of no public interest. The massive and permanent distribution of this information can unjustly damage the parties’ reputations and even destroy their hopes of finding work over a matter of no public significance. This in turn hinders access to justice: complainants decide not to exercise their rights for fear of losing their reputation through the Internet posting of their case.

The Guidance Document of the Canadian privacy commissioners is based on Section 8 of the Privacy Act, which restricts the communication of personal information without the consent of the individual concerned, subject to narrow exceptions that seldom apply to the decisions of administrative tribunals. In summary, here are the proposed parameters, subject of course to the specific rules applicable to each tribunal:

  • Employees of the tribunal should inform the parties, as soon as a recourse has been filed, of the risks relating to privacy and what the safeguard measures are, and encourage the parties not to disclose more personal information than what is strictly needed;
  • The decisions should not divulge any identifier, directly or indirectly. Transparency applies to the reasoning of the tribunal and not to the parties. For example, names should be replaced with initials, and addresses deleted or generalized;
  • The decision might contain an identifier when, in accordance with Section 8(2)(m) of the Privacy Act, it is in the public interest to publish the parties’ identities (for example, in criminal or fraud cases);
  • The tribunal would develop criteria to exercise its discretion in the application of the public interest concept.
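The second parameter above, replacing party names with initials before publication, can be illustrated with a minimal redaction routine. The function name and matching logic are hypothetical assumptions for illustration; a real tribunal pipeline would work from curated party lists and handle name variants.

```python
import re

def redact_decision(text, party_names):
    """Replace each known party name with initials, e.g. 'Jane Doe' -> 'J.D.'

    Illustrative sketch only: redaction is driven by an explicit list of
    party names, since transparency applies to the tribunal's reasoning,
    not to the parties' identities.
    """
    for name in party_names:
        initials = ".".join(part[0] for part in name.split()) + "."
        text = re.sub(re.escape(name), initials, text)
    return text

decision = "The grievance filed by Jane Doe against her employer is allowed."
print(redact_decision(decision, ["Jane Doe"]))
# The grievance filed by J.D. against her employer is allowed.
```

The key design point is that redaction happens before the decision reaches the Internet, where distribution is massive and permanent; removing a name after indexing is far harder.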

 

This discretion is an absolute necessity, for reasons of personal security (a plaintiff contesting her disability pension was threatened by thugs who, having seen the tribunal’s decision on the Internet, knew her address and the amount and payment date of her pension) as well as of reputation and financial integrity (two complainants could not find employment for ten years because any Internet search on them revealed their grievance).

The TBS must now build on the OPC’s work and issue policies aimed at re-establishing a fair balance between public transparency and privacy for administrative tribunals in the digital era.

Conclusion: Emerging Challenges

I summarized at the outset the real and important challenges facing the Federal Public Service in relation to privacy protection, and the double effect of two fundamental transformations in our means of communication: the arrival of new, complex, powerful and vulnerable information technologies, and the increase in cyber-surveillance capacity in a public security environment largely dependent on personal information.

Security technology is clearly advancing, mainly in one direction: risk assessment is being refined, and multiple technological restrictions are applied according to the risk analysis. New applications, like information hubs, where information management is centralized while the separation of the different databases is respected, abide by the Act in this regard.

However, the limits of Internet surveillance in a free and democratic society have yet to be defined in view of its progress as well as the evolution of risks concerning physical security.

Privacy protection challenges to Internet surveillance lie at the heart of the relationship between citizens and the state. Article 12 of the Universal Declaration of Human Rights eloquently illustrates this essential character of the right to privacy:

“Article 12. No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.”

First raised in the debate on Bill C-13, now the Protecting Canadians from Online Crime Act, the issue of the legitimacy of Internet surveillance reappears in Bill C-51, whose short title is the Anti-terrorism Act, 2015. As the Bill makes its way through the legislative process, it highlights how Internet surveillance capabilities force an update of public service obligations towards privacy, namely:

  • Section 8 of the Canadian Charter of Rights and Freedoms protects against unreasonable search, which under the Privacy Act extends to any collection of data unrelated to government programs or activities, and limits surveillance to legitimate, individualized inquiries based on reasonable suspicion;
  • Restrictions regulating the sharing of personal data between government agencies, in order to avoid citizen profiling that produces new information beyond what was collected from the individual himself, and beyond the purpose for which the data was gathered;
  • The natural justice principles of impartiality and accountability, applied specifically to Internet surveillance, which is meant to be secret and must therefore define its own form of accountability to the citizen.

 

The other double-edged sword of technological evolution lies in data analysis capabilities, which open the way to data mining.

Just as census data, on a smaller scale and even anonymized, have supported government decisions at all levels, as well as business decisions, by tracking demographic, social and economic movements, we will now have to develop an ethical framework for the analysis of the Big Data we store, to draw conclusions for the greater good of citizens. These data can improve government services, refine decisions and adapt programs appropriately. Solutions seem to favour a governance framework based on anonymity, necessity and consent, i.e.:

  • When the public service develops policies and programs and needs unidentified demographic data, as for a census, the Privacy Act does not forbid it, on the condition that an effective anonymization process is applied. This would include separating the relevant demographic data from nominative data in such a way that the demographic data no longer relate to an identifiable person, re-identification being so difficult as to be improbable.
  • For all nominative data needed by the operations of a public institution, Section 4 of the Privacy Act allows their gathering, and Section 7 allows for their compatible use.
  • If the Public Service needs to use personal data for purposes other than those that justified its collection, even in the public interest, it has to request and obtain the consent of the person concerned. For example, if a department wished to contact people for medical research purposes, it would have to explain the purpose of the research and how the personal data would be used, and ask whether, in the interest of science, they would consent to this new use of their personal data.
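The separation of nominative and demographic data described in the first point can be sketched as follows. The field names and the salted-token scheme are illustrative assumptions, not drawn from any federal system; real anonymization requires a formal re-identification risk assessment.

```python
import hashlib
import secrets

def split_record(record, salt):
    """Split one record into a nominative store and an analytic (demographic) store.

    A salted hash links the two stores; the salt is kept with the nominative
    store only, so the analytic data alone do not relate to an identifiable
    person. Sketch only: field names here are hypothetical.
    """
    token = hashlib.sha256((salt + record["name"]).encode()).hexdigest()[:16]
    nominative = {"token": token, "name": record["name"], "address": record["address"]}
    demographic = {"token": token, "age_band": record["age_band"], "region": record["region"]}
    return nominative, demographic

salt = secrets.token_hex(16)  # held separately from the analytic store
person = {"name": "A. Tremblay", "address": "100 Main St.",  # illustrative data
          "age_band": "35-44", "region": "QC"}
ident, demo = split_record(person, salt)
assert "name" not in demo and "address" not in demo  # analytic store holds no identifiers
```

The point of the split is that analysts work only with the demographic store; rejoining it to identities requires both the nominative store and the separately held salt, which keeps re-identification improbable.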

 

These basic parameters show a general trend, insufficient, however, to settle the ethical challenge of reconciling privacy and the public interest in the analysis of big data. This conversation, like the one on Internet surveillance, must be held more often to ensure privacy protection within a new technological frame.

At this point, the priority of the public service should be to develop a normative framework that reflects what Canada considers legitimate in the gathering and use of data in the digital age. In a way, Bill C-51 has provoked this debate, on both Internet surveillance and the analysis of personal data. But the debate is not what it should be: the die is cast, and the discussion is confined to a narrow political contest instead of a social blueprint that addresses the real challenges thoughtfully and empirically.

That’s the next step we need to take in order to preserve privacy in the digital age, and it has become urgent.

 

Appendix

Chantal Bernier, Legal Counsel, Dentons Canada LLP, is a Senior Fellow at The Graduate School of Public and International Affairs (GSPIA), University of Ottawa and Former Interim Privacy Commissioner of Canada. For more information, visit Dentons.com.