Speech by Seán Sherlock TD on Data Protection Bill 2018
18 April 2018
Parliamentary Party Chair and Spokesperson on Social Protection, Community and Rural Development and the Islands; Agriculture and the Marine
The first issue I wish to raise relates to the complexity of the data protection code. In Ireland we have chosen to retain most of the Data Protection Acts 1988 and 2003, although we will in the future apply them only to cases involving national security, defence or international relations. For every other case, we will have the new EU data protection regulation which has 173 recitals and 99 articles. We will also have this implementing Bill which has 165 sections and three Schedules. The Bill and the regulation will have to be read side by side.
Data protection compliance is as pervasive an issue in modern society as tax compliance and the code is becoming as complex, attracting its own cohort of professional advisers who are needed to explain all of it to those of us who must comply with it.
The rules do not just apply to the Mark Zuckerbergs and Denis O'Briens of this world. The reality is that every one of us who has even a basic mobile phone that stores names and numbers is a data controller in the eyes of the law. Years ago the code was extended to apply to manual as well as electronic records. The rules will not apply to what are called "purely personal or household activities", but if someone has any name or number on his or her phone that he or she no longer needs for personal or household activities, strictly speaking this law will apply to that person and he or she will be in breach of it. I wonder how many citizens are aware of this.
The rules relating to children provide an example of this complexity. First, there is the so-called digital age of consent. This is the age at which a child has the capacity to consent to the processing of personal data by the provider of an "information society service". The regulation deals with personal data, that is, data that identify or are about an individual person and include names, addresses, dates of birth and anything that a person shares on social media such as his or her preferences, status updates and, in many cases, private conversations over messaging apps, among other elements.
A key issue in the GDPR is the notion of profiling. Profiling essentially means using algorithms, data analytics and artificial intelligence to predict what a person might like, what his or her fears and biases might be, what advertisements he or she would like to see and to what messages he or she would respond well. While this may all sound of little consequence, recent scandals such as that involving Cambridge Analytica show that profiling is an extremely powerful tool that can be used to undermine democracy and manipulate ordinary citizens. The tools used to profile users are complex computer algorithms built on machine learning. A major question in the context of data protection is whether the typical user of an online platform on which he or she shares data actually has a good enough understanding of how these algorithms work and what the consequences might be of sharing the data that are fed to them.
Before proceeding, I acknowledge the work of people like Professor Barry O'Sullivan, to whom I have deferred on this matter as a guide on its technical elements. Professor O'Sullivan, who is a professor of computer science at University College Cork, highlighted to me that an important point to remember was that the Internet was designed as a democratic environment in which all users were treated and regarded equally. As a consequence, no special allowance is made for children online, but they do deserve special attention.
There has been considerable debate in the Houses on the digital age of consent. It is Professor O'Sullivan's contention, as an expert in the field, that the digital age of consent is not about when children can go online, use devices and so on; rather, it relates only to situations in which the personal data of a child are processed. The Bill sets the Irish digital age of consent at the lowest possible age - 13 years - but do we believe a child of 13 is able to understand the power of algorithmic profiling? Do we believe children understand how their actions online influence the adverts they see or the content suggested to them? The answer is surely not. Of course, one can easily make this argument against 16 year olds also, but we should err on the side of caution and set the digital age of consent at the upper end of the allowed age band, that is, 16 years. Doing so would keep Ireland in line with Germany, the Netherlands, France and others that have best-in-class approaches to protecting children online. A digital age of consent lower than 16 years is also out of line with the age of consent for other matters, for instance, giving consent for health procedures.
It is important that parents have the power to give consent for young children, which a digital age of consent of 16 years would provide. Parents need to be in a position to parent their children, not to be excluded in this instance at the age of 13 years. Advocates for a digital age of consent of 13 years might argue that children have a right to a voice online and a right to participation. No one is arguing against this. However, the digital age of consent only applies when their personal data are being gathered and processed. The argument should be about providing children under the digital age of consent with platforms that do not exploit their personal data as a condition of their use. This is particularly important for vulnerable younger children who should not have to rely on parental consent. The issue in this instance becomes the provision of safe places online that protect their anonymity and platforms that do not profile them on the basis of personal data. Relying on commercial platforms such as Facebook, Instagram, Snapchat and others to provide such safe places or to attempt to use them as such puts vulnerable children at risk. Such risks can arise from the unintentional breach of their anonymity, their targeting through advertising and marketing campaigns based on their interactions online and the potential exposure to individuals online who might not have their best interests at heart.
Many special interest groups and children's rights representatives have argued for a digital age of consent of 13 years. Some of the arguments include the question of how we should support children whose parents are, for example, abusive and, therefore, cannot be relied on to give consent; the need for a child in care to have access to his or her birth family, other family members and friends where it is not clear who has the power of parental consent; and those children who need access to support and preventive services.
It is important to stress that it is dangerous to rely on social media platforms to provide a mechanism for vulnerable or at-risk children to network with like-minded children. First, the presence of these children on such platforms is often easy for a parent to discover. For example, if the child and the parent both use their phones to access Facebook and each has the mobile number of the other stored, Facebook can recommend each of them as a possible friend to the other. In other words, the anonymity of the child on social media sites is not protected in the way that has been argued.
Second, the closed groups might themselves be dangerous since predatory individuals might lurk in them to befriend vulnerable children.
The solution that must be provided to support the children of abusive parents is not the standard social media platforms but safe platforms that do not process personal data, which would make them exempt from the digital age of consent. Who has parental responsibility for the child is something that should be clarified in the Data Protection Bill 2018. For example, such consent could be given by the person or entity that has guardianship of the child. This is something the Bill needs to include and that can easily be solved.
We have to acknowledge the complexity of the world in which children live. In this party we take the view that we should err on the side of caution and that the most appropriate digital age of consent is 16 years.
The regulation defines an information society service as a service normally provided for remuneration at a distance by electronic means and at the individual request of a recipient of the service. It includes services that are not directly remunerated by those who receive them, if they represent an economic activity. Google search, as an obvious example, falls within the definition of an information society service. The Government has decided to set at 13 years the age at which a child can consent to the processing of his or her personal data in connection with an offer from an information society service. I note that a Seanad amendment provides for a review of this age limit after three years. However, the Bill contains provisions relating to codes of conduct for the protection of children who, for this purpose, are all persons under 18 years of age. We will have 13 year olds with the capacity to sign away their personal data, even though for the next five years of their lives they will be the subject of special protective codes because in the eyes of the law they are still children. It is a contradiction in terms.
On top of this, we have to incorporate the common law rules relating to children and contracts. As I see it, these rules have not been displaced or amended by the Bill. Article 8 of the regulation specifically states its provisions "shall not affect the general contract law of Member States such as the rules on the validity, formation or effect of a contract in relation to a child". What is the position on this apparent contradiction? The digital age of consent is not about a child's general capacity to enter into a contract; it is solely about his or her capacity to consent to the processing of personal data. It does not affect the general law of Ireland on the validity or effect of a contract with a child, or so it states. As I said, under Irish common law, an individual does not have full capacity to make a contract until he or she is 18 years of age. There is an exception for the sale to a minor of what the law calls "necessaries". If these common law rules are left in place and unamended, it seems that the outcome will be that a minor will have the capacity at 13 years to consent to the processing of his or her personal data but may not have the capacity to enter into the substantive contract in question on the grounds that it is not a contract for a "necessary". Whether the contract is for a necessary depends entirely on a court's view of the contract in question and, in particular, the goods or services involved and the minor's need for them. If that is so, the contract will be unenforceable against the minor such that, for example, any debt accruing will not be recoverable. My point is not so much about whether these are good or bad rules. My question is whether it is an improvement to have two sets of rules that can conflict with each other and which cease to apply at different ages. That is the nub of what I am saying.
I will speak briefly about electoral activities. Following the Cambridge Analytica saga, the mass manipulation of personal data banks for electoral and referendum purposes has attracted a great deal of interest and comment. The Minister responded in the Seanad with amendments to section 45. The section now provides that, as an exception to a general prohibition, the processing of personal data revealing political opinions is lawful where it is carried out in the course of electoral activities in the State by a political party or candidate or a referendum commission. I wonder if this tightly drafted exception has missed the point. As I understand it, Cambridge Analytica is not accused of processing personal data that revealed political opinions. It is accused of amassing data from Internet usage that enables it to compile a crude psychological profile of an individual in order to assist an advertiser in deciding how to shape a political message that would specially appeal to that individual, a custom designed message that would not be seen by the broad mass of voters. If that is correct, I do not see how section 45 would apply to these activities at all. The section, as amended, does seem broad enough to cover political canvassing and the marking of the register, but, as the Minister seemed to admit in the Seanad, it is not a form of exemption envisaged by the regulation. In other words, there does not seem to be a specific provision of the regulation that enables the Oireachtas to carve out an exception enabling the processing of political opinions for electoral purposes. The best the Minister can point to is Recital (56) which states "Where in the course of electoral activities, the operation of the democratic system in a Member State requires that political parties compile personal data on people's political opinions" and so forth. I note that the Minister has said he intends to bring forward an amendment. 
I do not know if it will speak specifically to that issue. I am hopeful it will. We are putting him on notice that it is our intention to speak to this issue further on Committee Stage.
We in this House and the Upper House, councillors and every other elected representative in the country are not always electioneering, but we aim always to be ready for an election. In the interim, we act as citizens advice bureaux, social workers, legal advisers, advocates and facilitators. We explain and, to some extent, humanise the bureaucracy of the State on behalf of our constituents. This daily routine of our lives has nothing to do with section 45. It is not electoral activity and does not involve processing data on political opinions, but in the process we compile varied and detailed files which, of their nature, are full of personal information, much of which is very sensitive. When elections are held, we contact the people in question and remind them of all the occasions on which we engaged with them in the preceding years. There is nothing underhand about this. It would be astonishing if we did not do it, but is it legal to do so?
Sections 55 and 56 of the Bill seem to have been designed to override the general provisions of Article 21 of the regulation. That article gives an individual the right to object to the processing of his or her personal data. The two sections of the Bill state this right does not apply to direct mailing or data processing carried out in the course of electoral activities. I am entirely unclear on whether this covers direct mailing during an election from a database compiled outside election time from constituency clinic cases. There is a need for clarity in that regard because it goes to the heart of what we do. If no clarity is provided, we are in an existential crisis. If, for example, we are obliged under the general rules not to retain constituent data for longer than is necessary to deal with a clinic case, how would we have the wherewithal to generate any direct mailing list at election time? I can well appreciate that the law should apply to our activities and the information we amass and keep, but I cannot help wondering, having regard to our own activities, how well we and, by extension, the rest of the country will, in practice, comply with the detail of the rules we are about to make.