Data governance in Australia: A licence to drill (Part I)

Ellie Rennie
Jan 17, 2020

For some years, one of Australia’s big four banks has sent letters and push notifications to its customers warning them that their money might be at risk if they sign up to certain fintech companies. These financial services start-ups currently use digital data capture (otherwise known as screen scraping) to get read-only access to their users’ internet banking transaction details. With this data they are able to offer savings and investment services. The CEO of one such start-up, Raiz, pointed out that banks also use the same screen scraping technologies for various activities, and conform to the same security standards and financial services regulations. Raiz does not appear to have experienced a major scandal or breach, whereas a Prudential Inquiry found the bank to have a number of shortcomings in its governance, culture and accountability frameworks, particularly in relation to non-financial risks.
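
To make the mechanism concrete, here is a minimal sketch of what screen scraping involves. Every URL, form field and selector below is hypothetical, and the code is illustrative rather than any particular company’s implementation; the key point is that the customer hands their internet banking credentials to a third party, which logs in on their behalf and parses whatever pages the bank serves back.

```python
# Illustrative sketch of "digital data capture" (screen scraping).
# All URLs, form fields and CSS selectors here are hypothetical.
import requests
from bs4 import BeautifulSoup


def scrape_transactions(username: str, password: str) -> list[dict]:
    session = requests.Session()
    # The third party logs in *as the customer*, using the customer's own credentials.
    session.post(
        "https://examplebank.example/login",  # hypothetical login endpoint
        data={"username": username, "password": password},
    )
    # It then requests the same transaction page the customer would see in a browser...
    page = session.get("https://examplebank.example/transactions")
    soup = BeautifulSoup(page.text, "html.parser")
    # ...and extracts read-only transaction details by parsing the page's markup.
    return [
        {
            "date": row.select_one(".date").get_text(strip=True),
            "description": row.select_one(".desc").get_text(strip=True),
            "amount": row.select_one(".amount").get_text(strip=True),
        }
        for row in soup.select("table.transactions tr.txn")
    ]
```

Because the whole arrangement depends on the customer sharing their banking password with someone else, it is easy to see why banks frame the practice as risky, and why a regulated alternative was sought.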

The new Consumer Data Right (CDR) will address some of the perceived vulnerabilities in relation to fintech when it comes into effect for the financial sector in mid-2020, requiring that banks securely share data with approved entities at a customer’s request. It will also significantly strengthen existing privacy laws, including by providing meaningful redress if things go wrong. The CDR will progressively be rolled out to other sectors, with energy and telecommunications among the first to experience it.
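
By way of contrast, the sketch below shows roughly what consented, CDR-style data sharing looks like from an accredited recipient’s side. The endpoint path and version header follow the published Consumer Data Standards for banking, but the base URL is hypothetical and the consent and accreditation steps are omitted; the point is simply that the recipient presents a consent-scoped access token rather than the customer’s banking password.

```python
# Indicative sketch of CDR-style data sharing by an accredited data recipient.
# The base URL is hypothetical; the path and "x-v" version header follow the
# published Consumer Data Standards for banking. Consent flow and accreditation
# checks are omitted for brevity.
import requests

BASE_URL = "https://api.examplebank.example/cds-au/v1"  # hypothetical bank endpoint


def get_transactions(access_token: str, account_id: str) -> dict:
    response = requests.get(
        f"{BASE_URL}/banking/accounts/{account_id}/transactions",
        headers={
            "Authorization": f"Bearer {access_token}",  # consent-scoped token, not a password
            "x-v": "1",                                  # requested endpoint version
        },
    )
    response.raise_for_status()
    return response.json()
```

The customer authorises the sharing with the bank directly and can withdraw that consent, the recipient is bound by the CDR’s privacy safeguards, and the customer’s login credentials never leave the bank.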

The CDR will necessarily trigger a process of review and change inside companies. In addition to implementing technical systems that allow them to comply, companies will need to examine their internal governance and risk processes. The intention, however, is not to curtail the use of data but to expand it.

Currently, many of us are suspicious of online services: half of those who commence the sign-up process with Raiz do not complete it, suggesting they are either uncomfortable with the possibility of a data breach or feel there should be more safeguards in place if something goes wrong. The Digital Rights in Australia study found that “Most Australians are concerned about their privacy online and are concerned about privacy violations by corporations” (Goggin et al. 2017, 3).

Will the CDR change our willingness to share data with corporations? To answer that question, we need to understand the dynamics of distrust, the interplay between organisations and regulations, and how the nuances of privacy play out through data practices. In this post I provide an overview of how these have been considered to date. In a follow-up piece I will examine what good data governance means, drawing from interviews with Australian company directors and data risk specialists.

“Campus Le Monde” by Alain Bousquet is licensed under CC BY-NC-ND 4.0

What do we distrust?

Many of the kinds of events that can undermine confidence in a digital economy existed prior to the internet, including breaches of contract, privacy or statutory duty, collusion, and misleading or deceptive conduct (Ryan 2019, 14). These are complicated in socio-technical systems: online transactions frequently occur across jurisdictions; processes can be carried out by code without human oversight or direct recourse; and the user doesn’t always have the ability to understand what they are agreeing to (Solove 2013). While corporations are bound by the laws of the countries within which they reside, democratic nation-states have granted broad discretion to online intermediaries, meaning they are “free to set out whatever terms they see fit, and we are each free to accept or refuse them accordingly” (Suzor 2019, 109). Citizens have little say over the scope of data collection about them or “the development of private platforms, run by private entities, with often opaque decision-making processes, behavioural analytics and identity profiling and data on-selling” (Goggin et al. 2017, 5).

According to the Office of the Australian Information Commissioner, around 9 in 10 Australians consider the following activities to be a misuse of information: “an organisation that a person has not dealt with before obtains their personal information (87%); personal information is revealed to other customers (87%); personal information is used for a purpose other than the one it was provided for (86%)” (OAIC 2017). We also know that people’s concerns vary by sector. A survey by the ANU’s Centre for Social Research & Methods (Biddle et al. 2018) found that perceived trustworthiness in looking after personal information is highest for the health sector, followed by financial institutions, then state and Australian government departments. Market research, e-commerce and social media have the lowest scores. People expect government departments to share data and overestimate the extent to which this already occurs.

In terms of privacy-related practices, the Deloitte Australian Privacy Index 2019 found growing consumer awareness of privacy and a desire to take control of personal data: 63% of consumers have deleted apps due to privacy concerns; 89% have denied an app access to their location, photos, contacts or features such as camera and microphone due to privacy concerns; 52% have used privacy-enhancing applications (VPNs, encrypted messaging apps, private browsing mode); and 46% are likely to provide false personal information when engaging with an app because of privacy concerns. However, the Office of the Australian Information Commissioner found that 61% of Australians do not regularly read privacy policies; 50% do not shred documents or clear browsing history; and 43% do not change settings on social media sites (OAIC 2017).

The term “privacy paradox” is often used to describe these contradictions, whereby people are concerned about their privacy and yet do not act to protect their data. Companies grapple with how to respond. Legal scholars have pointed to the failure of the Notice and Choice system, whereby users give their consent by agreeing to Terms of Use/Service. The intention behind Notice and Choice is that bad practices are minimised through the forces of competition (by us choosing to engage with a service or not), yet our agency is limited when the product or service we need is only available at one site. In addition, users may not be able to process the complexity of these legal agreements; the various ways data might be aggregated by different entities over time make it impossible to weigh up the costs and benefits; and privacy is dealt with at the individual transaction level rather than holistically (Solove 2013). Even those with high digital literacy may feel helpless and resigned to the fact that their data may be exploited for uses beyond what they feel comfortable with (Hargittai & Marwick 2016). Qualitative research has also shown that individuals’ views on data practices can be nuanced, reflecting the fact that people’s interactions can span various roles and needs.

Because we cannot meaningfully control how our information is used, privacy management relies to a large extent on industry and organisational practices (Xu et al. 2011). In digital economies, companies have developed mechanisms to “replace or represent the social capital of trust” (Ryan 2019, 33), including the use of contracts as well as reputation and feedback ratings. Other mechanisms to provide the public with information include self-regulatory standards and codes. For instance, signatories to the Student Privacy Pledge commit to not collecting or sharing children’s data beyond authorised educational purposes or as authorised by a child’s guardian, as well as disclosing in a clear manner what data they collect and what it is used for. However, one study found that the Pledge can provide false assurance, as some signatories were violating its terms (Pfeffer-Gillett 2017).

One difficulty for organisations is that the methods used to manage and mitigate risks are themselves complex technological systems, many of which are outsourced and costly (Ryan 2019, 17). While these systems present challenges particularly for small businesses and non-profits who may not have the technical capacity to manage them, non-technical processes for handling data across large organisations and with third parties may present bigger risks.

Institutions and organisations

In her book ‘Who Can You Trust?’, Rachel Botsman defines trust as “a confident relationship with the unknown” (Botsman 2017, 20). The missing element here is that trust also involves an expectation about another’s action — you expect that the other person or entity will choose to take a particular course of action when the available options and consequences are before them (Dasgupta 1998, 51). If trust is a “device for coping with the freedom of others” (Gambetta 1998, 219), then when it comes to data we are mostly unable to cope.

Organisations and institutions often substitute for trust relations, particularly when exchange occurs between people who are not known to each other (Cook, Hardin and Levi 2005). In such situations, institutional arrangements are necessary because trust is unobtainable. Institutions also enable cooperation in situations where there are unequal power relationships. Distrust can be useful in society in that it can drive us to create structures that help limit exploitation and protect those who cannot protect themselves (Cook, Hardin & Levi 2005).

Distrust, rather than trust, is therefore the starting point for understanding the relationship between the public and data industries. Legal and self-regulatory frameworks are a response to the condition of distrust and the multiple uncertainties that we face in relation to data. Regulation provides assurance to consumers and possible recourse when things go wrong. As Philippa Ryan writes, “Businesses that demonstrate active engagement with these rules and recommendations can then advertise their willing compliance on their websites. This transparency helps to build a strong brand and positive reputation with the regulator and establish trust with customers” (Ryan 2019, 57).

UK think tank Doteveryone surveyed over 1,000 people in the tech industry on their hopes and concerns related to technology. They found that “workers are calling for an end to the era of moving fast and breaking things” (Miller & Coldicutt 2019, 6), with 78% of workers favouring responsible innovation guidelines and government regulation as the preferred mechanism. Almost one third “have seen decisions made about a technology that they felt could have negative consequences for people or society” (Miller & Coldicutt 2019, 8).

Social Licence

The Productivity Commission (2017) asserted in its Data Availability and Use report (which led to the CDR) that it is “crucial that Australia builds and maintains a social licence for data sharing and use” (177). The idea of consumer benefit underpins this:

people and organisations are more willing to share information when they trust how it is being used and can see personal benefits stemming from access to their data that go beyond the immediate service they access […] For example, the vast majority of Australians are willing to share their de-identified health data for research purposes, as they see a benefit in the discovery of new drugs and therapies (373).

The assumption within the report is that as more data is made available the benefits will grow, which in turn “builds public trust in the system and supports continued use of data” (373). The benefits are described as a “wider range of products and services, more informed decision-making and choices, and improved convenience” (373). The reforms proposed by the Productivity Commission were intended to build social licence by “ensuring that consumers have the right to practical control of their data, and are made aware of how it is shared and used”; “assuring data custodians and users that data can be shared, released or used without adverse consequences to them or the individuals whom the data is about”; and “promoting community trust that data custodians and their authorised users will safeguard their data” (transparency).

The term ‘social licence to operate’ was invented by business and rose to prominence in the context of the resource industries and mining from the 1990s. Changing community attitudes on the impact of these industries, fuelled by highly publicised disasters such as chemical spills, had resulted in a loss of acceptance among the public, threatening operations (Moffat et al 2016). Applying it to the consumer data practices of businesses highlights the extractive nature of corporate and government data use (also suggested in the language we use to describe data processes; information is ‘mined’ then ‘processed’ from its ‘raw’ state to become valuable).

Social licence is broadly taken to mean that “acting ‘responsibly’ endows the organisation with a perceived legitimacy among external observers who may otherwise constrain or frustrate organisational activities” (Parsons et al 2014, 83). Attaining social licence involves strategies for possessing and maintaining reputational capital (Gunningham et al 2004). The concept acknowledges that social norms can precede legal rules (Gehman, Lefsrud and Fast 2017), implying that to achieve societal acceptance for some actions, companies need to go beyond compliance (Jenkins 2018). It is closely related to the concept of legitimacy as discussed across political science, law and economics. Organisational legitimacy is the acceptance by the public and authorities of an organisation’s right to exist and pursue activities in the manner it chooses. The fields of new institutional theory and management studies have examined how organisational legitimacy is related to rules, cultural alignment, status and reputation, including how legitimacy is won and lost (Deephouse and Suchman 2008).

How do organisations establish social licence? Surveys of community views in the mining sector suggest that social licence depends upon procedural fairness (the extent to which a community feels included in decision-making); distributional fairness (receiving a fair share or return); and belief that governance arrangements, including regulation and law, are adequate (Moffat & Zhang 2014). According to the Productivity Commission, social licence for data use is built when people have “a sound basis for believing in the integrity and accountability of institutions (public and private) managing data”; an “inalienable ability to choose to participate in extracting benefit from data sharing”; “control over how their data is used”; and when they “understand the potential community-wide benefits of data sharing and use” (169). The report references the NZ government-funded Data Futures Partnership, which provides organisations with eight key questions to consider, grouped into the categories of value, protection and choice. The value questions ask who will use the data, who will benefit and what the data will be used for; protection is concerned with the security of data, whether it is anonymous and whether it can be accessed and corrected; choice involves consent when asking to use data and whether data can be sold. If the ‘value’ questions are answered positively, there are further prompts to consider active engagement with specific groups that may be affected, including Indigenous people (Data Futures Partnership 2017).

There are problems with applying social licence to data use. Parsons, Lacey and Moffat (2014) point out that social licence can encourage practices that achieve community acceptance without necessarily considering broader moral issues. The focus can be on determining what a company can get away with, maintaining existing power relations and quietening voices of dissent, as opposed to morally grounded action. In the case of data, the question of how to obtain social licence is complicated by the fact that there is not necessarily a local community to consult with, and that the digital economy often spans multiple jurisdictions and regions with vastly different cultural attitudes towards privacy. Moreover, as Meese, Jagasia and Arvanitakis (2019b) argue, the CDR can only be exercised in the marketplace, when individuals engage with organisations subject to the scheme, rather than being a fundamental right that applies in any context (the basis of Europe’s GDPR).

In relation to how the renewable energy sector understands consumers, researchers have warned that “we need to guard against conceptualising individuals as having a simplistic agency, able to autonomously perform the roles we have identified” (Walker and Cass 2007, 465). In the case of consumer data practices, individuals are not necessarily equipped to know how their data is being used, or may find themselves accepting sub-standard practices when they encounter a particular online service. If what we find to be an acceptable level of privacy in one context does not apply in another (Nissenbaum 2009), what steps will companies need to take to ensure they are meeting consumer expectations? If users of a service agree to terms and engage in practices while holding distrust, this is hardly a basis for a company to claim ‘social licence’.

In the second part to this post I plan to explore the following:

· What good data governance within organisations looks like;

· Whether start-ups should be seen as riskier than established organisations;

· The extent to which artificial intelligence and machine learning are currently being considered in relation to public trust; and

· The role of regulation in changing data practices within organisations.

To be continued…

Note: I have been conducting interviews with company directors and technology leads within Australian companies for a project that is being led by Data61 in partnership with the Governance Institute of Australia. The ideas above are based on reading in relation to that project but should not be taken as findings from that project. Thanks to Rob Hanson from CSIRO’s Data61 and Jee Lee (RMIT & University of Canberra) for discussions and for pointing me to some of the resources referenced here.

References

Abraham, C. et al. (2019). How Digital Trust Drives Culture Change. MIT Sloan Management Review, 60(3), 1–8.

Australian Competition and Consumer Commission (ACCC) (2019). Digital Platform Inquiry.

Biddle, N., Edwards, B., Gray, M. & McEachern, S. (2018). Public attitudes towards data governance in Australia. CSRM Working Paper, no. 12.2018. Australian National University.

Cook, K. S., Hardin, R. & Levi, M. (2005). Cooperation without Trust. Russell Sage Foundation.

Data Futures Partnership (2017). A Path to Social Licence: Guidelines for Trusted Data Use.

Deephouse, D. L. & Suchman, M. (2008). Legitimacy in Organizational Institutionalism. In R. Greenwood, C. Oliver, R. Suddaby & K. Sahlin (Eds.), The SAGE Handbook of Organizational Institutionalism (49–77). Sage.

Dasgupta, P. (1998). Trust as a Commodity. In D. Gambetta (Ed.), Trust: Making and Breaking Cooperative Relations (49–72). Basil Blackwell.

Deloitte. (2019). Deloitte Australian Privacy Index 2019.

Esayas, S. Y. & Daly, A. (2019). The Proposed Australian Consumer Right to Access and Use Data: A European Comparison. European Competition and Regulatory Law Review, 3, 187–202.

Gambetta, D. (1998). Can We Trust Trust? In D. Gambetta (Ed.), Trust: Making and Breaking Cooperative Relations (213–238). Basil Blackwell.

Gehman, J., Lefsrud, L. M., Fast, S. (2017). Social Licence to Operate: Legitimacy by another name? Canadian Public Administration, 60(2), 293–317.

Goggin, G., Vromen, A., Weatherall, K., Martin, F., Webb, A., Sunman, L., & Bailo, F. (2017). Digital Rights in Australia. The University of Sydney.

Gunningham, N., Kagan, R. A. & Thornton, D. (2004). Social Licence and Environmental Protection: Why businesses go beyond compliance. Law & Social Inquiry, 29(2), 307–341.

Hargittai, E. & Marwick, A. (2016). “What can I really do?” Explaining the Privacy Paradox with Online Apathy. International Journal of Communication, 10, 3737–3757.

Jenkins, K. (2018). Can I See Your Social Licence Please? Policy Quarterly, 14(4), 27–35.

Meese, J., Jagasia, P. & Arvanitakis, J. (2019a). Consumer rights to personal data: Data access in the communications sector. Australian Communications Consumer Action Network.

Meese, J., Jagasia, P. & Arvanitakis, J. (2019b). Citizen or consumer? Contrasting Australia and Europe’s data protection policies. Internet Policy Review, 8(2). DOI: 10.14763/2019.2.1409

Miller, C. & Coldicutt, R. (2019). People, Power and Technology: The Tech Workers’ View. Doteveryone.

Moffat, K., Lacey, J., Zhang, A. & Leipold, S. (2016). The social licence to operate: a critical review. Forestry: An International Journal of Forest Research, 89, 477–488.

Moffat, K. & Zhang, A. (2014). The Paths to Social Licence to Operate: An integrative model explaining community acceptance of mining. Resources Policy, 29(2014), 61–70.

Nissenbaum, H. (2009). Privacy in Context: Technology, policy, and the integrity of social life. Stanford University Press.

Office of the Australian Information Commissioner (OAIC) (2017). Australian Community Attitudes to Privacy Survey 2017. Office of the Australian Information Commissioner, Sydney, Australia.

Office of the Australian Information Commissioner (OAIC) (2019). Notifiable Data Breaches Quarterly Statistics Report: 1 April to 30 June 2019.

Parsons, R., Lacey, J. & Moffat, K. (2014). Maintaining legitimacy of a contested practice: How the minerals industry understands its ‘social licence to operate’. Resources Policy, 41, 83–90.

Pfeffer-Gillett, A. (2017). Peeling Back the Student Privacy Pledge. Duke Law & Technology Review, 16, 100–140.

Poppo, L. & Schepker, D. J. (2010). Repairing Public Trust in Organizations. Corporate Reputation Review, 13(2), 124–141.

Productivity Commission (2017). Data Availability and Use. Report No. 82.

PwC (2018). Digital Trust Insights 2018.

Ryan, P. (2019). Trust and Distrust in Digital Economies. Routledge.

Solove, D. J. (2013). Introduction: Privacy Self-Management and the Consent Dilemma. Harvard Law Review, 126(7), 1880–1903.

Suzor, N. P. (2019). Lawless: The secret rules that govern our digital lives. Cambridge University Press.

Treasury (2019). Consumer Data Right Privacy Protections.

Walker, G. & Cass, N. (2007). Carbon Reduction, ‘The Public’ and Renewable Energy: Engaging with socio-technical configurations. Area, 39(4), 458–469.

Xu, H., Dinev, T., Smith, J. & Hart, P. (2011). Information Privacy Concerns: Linking individual perceptions with institutional privacy assurances. Journal of the Association for Information Systems, 12(12), 798–824.

Ellie Rennie

Professor at RMIT University, Melbourne. Australian Research Council Future Fellow 2020–2025: “Cooperation Through Code” (FT190100372). Twitter: @elinorrennie