Privacy and the Actuary¹
On 12 March this year new privacy laws took effect.
There has been much reference in the press to the APPs, or Australian Privacy Principles. Perhaps your internal legal and compliance colleagues have introduced new privacy practices and procedures, or it may all be news to you.
But why should you care? Why is it relevant to you or your organisation? After all, actuaries don’t generally deal with individuals. Isn’t it just more red tape that the compliance team will deal with?
In this article we give you a snapshot of some of the most important changes, and then consider privacy issues in the context of the actuarial profession and of trends and developments in the use of actuarial services.
BACKGROUND
Data privacy laws protecting the personal information of individuals have been in place in Australia since 2001.
Up to now, privacy compliance often consisted of ensuring that there was a privacy policy – usually one that looked the same as all the others – tucked away on a website or included in disclosure material. The popular mantra for withholding access to personal information has been “we can’t provide that because of privacy”.
The new regime changes much of that. There are greater obligations placed on organisations about:
- how they collect, use and disclose personal information;
- how individuals can obtain access to their information and require it to be corrected and updated;
- the security measures organisations need to take to protect personal information; and
- what constitutes a valid privacy policy and approach, and what organisations must tell people about their activities.
Australia is following overseas trends: approximately 80 countries have introduced similar laws, reflecting evolving attitudes towards data privacy. As more and more business is done online, national boundaries are less significant than they were in a paper-based environment. With the proliferation of cloud solutions and data centres located overseas, personal information is freely – often unknowingly – transferred into other jurisdictions, and that transfer is regulated both by Australian law and by the laws of other countries.
At the same time, sophisticated data tracking, mining and analytics are becoming valuable business tools, used, for example, to identify and pursue sales of products and services based on an individual’s preferences gleaned from their online browsing habits.
But we know from a range of surveys – including from the Office of the Australian Information Commissioner (OAIC) in its Survey of October 2013² – that individuals are becoming more aware of their digital footprint, the value of their privacy and their rights (see the breakout box below).
Echoing this, the new Australian privacy regime has tougher requirements on the collection, handling and disclosure of personal information, and the OAIC has enhanced powers to compel compliance and to punish breaches.
PRIVACY AND THE ACTUARY
Actuarial work, by its very nature, involves the analysis and interpretation of large volumes of data. Much of the time, the data that actuaries use will be personal information about individuals. For example, actuaries may handle such information as an individual’s name, date of birth, employment details and home address³. Where an actuary is working with this kind of information, the Australian Privacy Principles (APPs) will clearly apply, and the actuary will be under certain obligations in how they deal with that information. It is critical that actuaries are aware of their privacy obligations under the APPs when dealing with personal information, and take those obligations seriously – particularly now that breaches may be met with a fine of up to $1.7 million⁴.
Often the data that actuaries use will not be personally identifiable, and it need not be for the actuary to produce meaningful results for their clients. This, however, does not mean that privacy law is any less of a concern. Even where data is not directly identifiable, it may still be protected by privacy laws, because ‘personal information’ is defined in the Privacy Act to include information about an individual who is reasonably identifiable⁵. If pieces of data can be put together in such a way that an individual can be identified from them, the data will constitute personal information, and the APPs will apply to it.
De-identification of data has long been proclaimed as an effective means of safeguarding privacy and side-stepping the obligations imposed by privacy laws – de-identified data is not ‘personal information’ to which the laws apply. Today, however, with unprecedented volumes of data available, it is uncertain how effective de-identification is, and will continue to be, in keeping data anonymous. With more and more data in the hands of data processors, it is increasingly likely that de-identified data will be capable of being ‘re-identified’.⁶
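To make the mechanics concrete, the sketch below shows the simplest form of re-identification, a linkage attack: a ‘de-identified’ dataset is joined to an identified one on the quasi-identifiers they share, such as postcode, date of birth and sex. The datasets, column names and values are entirely hypothetical and illustrative only; real attacks apply the same idea at far greater scale.

```python
# A minimal linkage-attack sketch. All datasets, column names and values
# are hypothetical; real attacks apply the same idea at far greater scale.
import pandas as pd

# A "de-identified" dataset: names removed, quasi-identifiers retained.
claims = pd.DataFrame({
    "postcode":  ["2000", "2000", "3121"],
    "dob":       ["1980-04-02", "1975-11-30", "1980-04-02"],
    "sex":       ["F", "M", "F"],
    "diagnosis": ["asthma", "diabetes", "fracture"],
})

# A separate, identified dataset (e.g. a public register or marketing list).
register = pd.DataFrame({
    "name":     ["Jane Citizen", "John Smith"],
    "postcode": ["2000", "2000"],
    "dob":      ["1980-04-02", "1975-11-30"],
    "sex":      ["F", "M"],
})

# Joining on the shared quasi-identifiers re-attaches names to
# the "anonymous" records.
reidentified = claims.merge(register, on=["postcode", "dob", "sex"])
print(reidentified[["name", "diagnosis"]])
```

Run as-is, the merge re-attaches both names to their ‘anonymous’ claim records – which is precisely why the Privacy Act asks whether an individual is reasonably identifiable, not merely whether identifiers have been stripped.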
So, when does data become personal data? The answer to this is not entirely clear, and is only likely to get more complicated. This is particularly so with the proliferation of Big Data and data analytics, which is an area in which the actuarial profession is becoming increasingly involved.
BIG DATA AND PRIVACY
Data is everywhere. Every day, we produce over 2.5 quintillion bytes of data⁷, which streams from just about everything we do in our daily lives. Big Data, as a means of making use of this enormous wealth of information, is here to stay.
Big Data is characterised by the staggering volume, velocity and variety of data which is being analysed and interpreted using highly sophisticated processes. A useful and succinct definition is provided by Microsoft, which describes Big Data as ‘the term increasingly used to describe the process of applying serious computer power… to seriously massive and often highly complex sets of information’.⁸
CHANGING COMMUNITY ATTITUDES TO PRIVACY
The Office of the Australian Information Commissioner (OAIC) has been monitoring public attitudes towards, and awareness of, privacy for over 20 years. In October 2013 the OAIC released a research report, the Community Attitudes to Privacy Survey.
The Survey tracks important changes in community attitudes towards privacy and personal information. Australians are much more aware of their rights, and have a much clearer expectation that organisations will tell them how their personal information is protected and handled, and will notify them if it is lost.
The Survey found that Australians had varying levels of trust of those who handled their personal information. Health services received a 90% trust rating but social media organisations only received 9%. This reflects the view of Australians that the biggest privacy threats are online services and one of the biggest concerns is having their personal information sent offshore – 90% have concerns about this practice.
The 2013 Survey also tracked changes in behaviour since the previous survey in 2007. As well as having an increased awareness of their rights, individuals were changing their behaviour to protect themselves: 63% of survey participants had refused to deal with an organisation because of concerns about how their personal information would be handled, up from 40% in 2007.
The clear message from the 2013 Community Attitudes to Privacy Survey is that organisations need to change their behaviours to meet community expectations. The new privacy law now reflects those expectations.
Big Data and data analytics represent an area of enormous potential growth for actuaries, who are well equipped to use advanced mathematical and computer processes to give structure to vast amounts of data, and to draw meaningful analysis of risk and opportunity from it.⁹ They also, however, create some important concerns for privacy, and challenge some of the fundamental concepts on which our privacy laws are based.
(a) De-identification and Re-identification of data
As mentioned previously, only personal information is protected by our privacy laws; information that is de-identified will not be subject to the obligations imposed by the APPs. Big Data, however, poses a real threat to the idea of de-identification, because of the sheer amounts of data involved. The cumulative effect of all this data is that it is much more likely that de-identified data sets can be matched with personal information, thereby allowing the data to be re-identified¹⁰. Whether de-identification can remain an effective protection of personal information in the age of Big Data is yet to be seen.
[Graphic omitted. Source: ADMA Big Data Guide, p 24: http://www.adma.com.au/assets/Uploads/Documents/Big-Data-Best-Practice-Guidelines2.pdf]
In the UK, the Information Commissioner’s Office has issued a Code of Practice for the anonymisation of personal information, which provides organisations with good practice guidelines for the de-identification of data, as well as a framework for assessing and mitigating the risk of re-identification¹¹. The Code acknowledges that ‘the risk of re-identification through data linkage is essentially unpredictable because it can never be assessed with certainty what data is already available or what data may be released in the future’.¹² While it provides practical guidance on the issue, it nonetheless emphasises that the question of whether de-identified information still constitutes personal information is a difficult one, and will depend largely on the particular circumstances.¹³ As a practical measure, the Code suggests that an organisation apply the ‘motivated intruder’ test to gauge the risk of re-identification: would a person with access to resources, making enquiries and using investigative techniques, be able to identify an individual from the de-identified data?¹⁴
In April 2013, the Office of the Australian Information Commissioner released a draft paper on the de-identification of data and information for consultation. The paper considers steps that can be taken to de-identify data effectively, and proposes applying a similar ‘motivated intruder’ test in determining whether it is reasonably likely that data can or will be re-identified. It is clear from this guidance that merely stripping data of its personal identifiers is no longer enough to ensure that de-identification is effective.
In Australian privacy law, it is the reasonableness of the likelihood that the information can be re-identified that matters. While it may be technically possible for data to be re-identified, the exercise may be so complicated and time-consuming that it is not a practical possibility.¹⁵ In assessing the likelihood of de-identified data reasonably identifying an individual, factors such as the cost, difficulty, practicality and likelihood of re-identification may be taken into account.¹⁶
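One way of putting a number on that likelihood – a technique common in the analytics literature, not one prescribed by the Privacy Act or the OAIC guidance – is to measure a dataset’s k-anonymity: count how many records share each combination of quasi-identifiers, and treat the smallest group size as a rough proxy for how exposed the most identifiable record is. A minimal sketch, with hypothetical columns and data:

```python
# A minimal k-anonymity check: a rough proxy for re-identification risk,
# widely used in the analytics literature but not a test prescribed by
# the Privacy Act or the OAIC guidance. Columns and data are hypothetical.
import pandas as pd

def k_anonymity(df: pd.DataFrame, quasi_identifiers: list) -> int:
    """Size of the smallest group of records sharing one combination of
    quasi-identifier values. A group of size 1 means a record is unique
    on those attributes, and so most exposed to a motivated intruder."""
    return int(df.groupby(quasi_identifiers).size().min())

data = pd.DataFrame({
    "postcode":   ["2000", "2000", "2000", "3121"],
    "birth_year": [1980, 1980, 1975, 1980],
    "sex":        ["F", "F", "M", "F"],
})

# k = 1 here: the 3121 record is unique on these attributes, so an intruder
# holding a matching auxiliary dataset could single that individual out.
print(k_anonymity(data, ["postcode", "birth_year", "sex"]))
```

A record that is unique on its quasi-identifiers (k = 1) is exactly the kind of record a ‘motivated intruder’ with a matching auxiliary dataset could single out, however thoroughly the direct identifiers were stripped.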
(b) Notice and Consent?
The APPs are built on the fundamental concepts of notice and consent, but how do these concepts reconcile with the way Big Data and data analytics work?
In particular, APP 5 requires that an entity collecting personal information take reasonable steps to notify the individual of the circumstances of the collection, including the purposes for which the information is collected and whether it is likely to be disclosed to another entity. Furthermore, APP 6 requires that if personal information is to be used or disclosed for a purpose other than that for which it was collected, the individual must give their consent.
In the context of Big Data and data analytics, however, this kind of transparency is necessarily lacking. Individuals are often unaware that their data is being collected, and still less aware of the uses to which it is being put. In fact, much of the value of personal data lies in its future secondary uses, which are uncertain and unpredictable at the time of collection, when notice is given and consent is sought. The principles of notice and consent are therefore difficult to apply. Having a robust privacy policy is a good start, but the burden still rests on the individual to understand the increasingly complicated ways in which their data is being collected and used. In short, as Microsoft stated in its Global Privacy Summit Report, ensuring individual control over personal data is ‘an increasingly unattainable objective’.¹⁷
With concepts of notice and consent falling short in the age of Big Data, there is much to be said for shifting the burden of privacy protection away from the individual and on to the organisations collecting and using the data. If anyone knows with any degree of certainty how personal data is used, and is likely to be used in the future, it is the organisations handling that data. It makes sense, then, to hold those organisations accountable for protecting privacy, rather than expecting individuals to protect themselves through empty vessels of notice and consent.¹⁸
This lack of transparency inherent in Big Data breeds a lack of trust, and a public sentiment that the way organisations deal with data is ‘creepy’.¹⁹ As individuals become more aware that their personal data is valuable, a rigorous approach to protecting privacy may well become a competitive advantage.²⁰ By making privacy protection a feature, rather than doing the bare minimum to comply with the law, an organisation is likely to engender trust, and thereby enjoy better business relationships.
THE FTC AND ‘RECLAIM YOUR NAME’
In 2012, the US Federal Trade Commission (FTC) launched a comprehensive investigation into privacy practices in the ‘data broker’ industry, issuing notices to nine separate firms ordering them to provide detailed information about how they collect and use personal data. The investigation was sparked by growing public unease following the media storm surrounding a now infamous campaign run by Target, in which Big Data analytics was used to predict whether customers were pregnant – right down to their due date!²¹
So far, the FTC has issued 10 notices to data broker firms, officially warning them that they may be violating their obligations in the way they deal with personal information. On top of these notices, the FTC has suggested to all firms involved in similar kinds of Big Data work that now is a good time to run a compliance check-up.²² The focus for the FTC is squarely on transparency, and how it can practicably be achieved in a world of Big Data.
Interestingly, FTC Commissioner Julie Brill has proposed a ‘Reclaim Your Name’ initiative: a centralised mechanism that would inform individuals of how their data is being collected and used, allow them to access the data that exists about them, and provide them with an opportunity to ‘opt out’ if they disagree with the way their data is being used.²³ The FTC views this as a viable and crucially important tool that will empower individuals, giving them meaningful control over their personal information. Brill argues that the transparency such an initiative seeks to achieve ‘would go a long way toward restoring consumer trust in the online and mobile ecosystems, allowing us to continue to enjoy all the convenience, entertainment and wonder that cyberspace has to offer’.²⁴
The idea is beginning to catch on. Acxiom, one of the world’s largest data brokers with information on about 700 million individuals worldwide,²⁵ recently launched a similarly conceived portal, the first of its kind, which allows individuals to see the data the company holds about them, make changes if that data is inaccurate, or ‘opt out’ altogether.²⁶ This has been applauded by the FTC as an important first step in improving transparency in a Big Data world, and other firms have been urged to follow suit.
Is ‘Reclaim Your Name’ the answer to Big Data privacy concerns? Not everyone is convinced. The major pitfall of an initiative such as this is that it places the burden on the consumer to take proactive steps to assert control over their personal data. Considering that people rarely take the time to read privacy policies that are put in front of them, is it reasonable to expect that they will utilise a mechanism such as that proposed by the FTC?
In the meantime, ‘Reclaim Your Name’ is still only a proposal, and the practicalities of such a scheme are yet to be ironed out. While it may not be a complete answer to the problem of protecting privacy in the age of Big Data, it is certainly a step in the right direction in enhancing the accountability of those performing Big Data functions. Where Big Data is involved, organisations need to be aware that accountability and transparency will be key to privacy protection into the future.
1 The author acknowledges the contribution of Annalisse Morrow of Sainty Law in the preparation of this article.
2 Office of the Australian Information Commissioner, Community Attitudes to Privacy Survey, Research Report (October 2013).
3 Robert Reitz, ‘Actuaries, Data Security and Hill Street Blues’.
4 s 80W(5), Privacy Act 1988 (Cth).
5 s 6, Privacy Act 1988 (Cth).
7 Microsoft News Centre, ‘The Big Bang: How the Data Explosion is Changing the World’.
9 Above, n 4.
11 Ibid, 18.
12 Hunton & Williams LLP, ‘UK ICO Publishes Anonymisation Code of Practice’ (21 November 2012).
13 Omar Tene and Jules Polonetsky, ‘Big Data for All: Privacy and User Control in the Age of Analytics’ (2013) 11 Northwestern Journal of Technology and Intellectual Property 239.
14 Above, n 9, 22.
15 Office of the Australian Information Commissioner, Australian Privacy Principles Guidelines (1 March 2014) 21.
16 Explanatory Memorandum, Privacy Amendment (Enhancing Privacy Protection) Bill 2012, 60.
17 Charles Duhigg, ‘How Companies Learn Your Secrets’, The New York Times (online), 16 February 2012.
20 Julie Brill, ‘Demanding Transparency from Data Brokers’, The Washington Post (online), 16 August 2013.
21 Forbes, ‘Finally you’ll get to see the secret consumer dossier they have on you’ (25 June 2013).
22 Katy Bachman, ‘Senate Commerce Report Says Data Brokers “Operate Behind a Veil of Secrecy”’, Adweek (18 December 2013); see also About the Data.
24 Julie Brill, ‘Demanding Transparency from Data Brokers’, The Washington Post (online), 16 August 2013.
25 Forbes, ‘Finally you’ll get to see the secret consumer dossier they have on you’ (25 June 2013).
26 Katy Bachman, ‘Senate Commerce Report Says Data Brokers “Operate Behind a Veil of Secrecy”’, Adweek (18 December 2013); see also About the Data.
CPD: Actuaries Institute Members can claim two CPD points for every hour of reading articles on Actuaries Digital.