Bulletin January 2018 (Vol. 19 No. 2)

Professor Anne SY Cheung said the European Union (EU), the UK and the US have raised concerns about the problem of profiling, predictive decisions and discrimination, and the harm that may result. This is because the use of big data is very different from our traditional understanding of how to regulate personal data. “The traditional approach is essentially one of notice and consent: the collection of personal data is allowed only for a specific and limited purpose. But in the age of big data, the more data one has, the more accurate and arguably useful one’s conclusions will be. So the collector tries to collect as much data as possible, and only after they have it and have done their analysis will they find correlations and identify the purpose,” she said.

It can be difficult to control the use of personal data in these circumstances. The EU will implement a new regulation in May 2018 on profiling and the use of anonymous data. Among other things, for decisions made about EU citizens using data collected through automated processes, the individual will need to be notified and will have the right to correct or object. However, this only applies to EU citizens (including those working abroad) and in specific areas such as employment and credit scoring.

There are still grey areas. Professor Cheung cites the example of ‘well-being’ apps that track physical activity and other data on individuals. In the United States, one employer encouraged employees to use such apps and tied this to the health insurance premium it offered its staff. Health data is regulated in the US but not well-being data. The case has ended up in court. “If I’m very concerned about my privacy and don’t want to join such a scheme, would I be punished by having to pay more for my insurance?” she said.

Ethical, risk-based approach needed

Professor Bacon-Shone sees two key issues at play with big data use: transparency and fairness.
Transparency means allowing people to correct incorrect information about themselves and making it clear how decisions are made, while fairness means avoiding situations like the US bail example. Importantly, it is not only individuals who face unfair decisions – groups can be targeted, too. For example, Mac computer users who visit the travel website Orbitz will be shown more expensive hotel options than PC users.

“When you’re basically saying to a computer, ‘here is all the data, make the best decision for me’ without understanding how that decision is reached, whether it is fair, whether it has unintended consequences, then you have really very challenging questions. These are ethical questions, not just technological,” he said.

Professor Cheung concurs. “We should be talking about the ethical use of big data and artificial intelligence because the law is always behind the technology,” she said. “I’ve been arguing we should move to a risk-based or harm-based regime instead of just focussing on notice and consent – looking at the use of such data, the risk level, how likely it will be shared with a third party, who will be the downstream party. We should be targeting those uses rather than just seeking broad consent, and we should be getting more specific in terms of the possible usage and the context of that usage.”

Individuals also need to learn to protect themselves. Professor Cheung herself does not use her real name or put much personal information on Facebook (although Facebook managed to correctly guess her secondary school from the data on her friends). She does not use WeChat, electronic wallets, well-being apps or, for the most part, location services. Her friends have told her that she is being too cautious and that privacy is dead. “I don’t know how long I can resist this trend,” she admitted. “Most people embrace technology and the conveniences and advantages it brings. Of course, big data and artificial intelligence do have advantages that we cannot deny. But in terms of potential risk to the individual, it depends on awareness and education, a discussion about the issues and the approaches of leaders in both government and industry,” she said. For, as the sidebar on China shows, big data raises challenges not only of privacy but also of governance.

China: Big data, big brother?

The use of big data in China is of an altogether different level of concern from commercial uses of personal information. The central government is in the process of rolling out a social credit system that draws on big data to rate each individual’s reputation based on their political leanings, purchase history, social interactions and other factors.

“China is like a big data laboratory,” said Professor Cheung, who has been studying the situation there with colleague Dr Clement Chen. “Arguably, there is 360-degree surveillance watching individuals and gathering data. They have real-name registration [for mobile and internet services] and close connections between the government and the banking system and internet companies.”

The social credit system was announced in 2014 and, although it will not be fully implemented until 2020, Professor Cheung and Dr Chen have already found that individuals suffer consequences for a low score. On about five million occasions (as of August 2016), ‘judgment defaulters’ who defied unspecified court orders were blocked from buying airline tickets. Such individuals were also stopped from travelling on high-speed trains. Low-scorers have also been barred from employment in the civil service and public institutions, and even their children can suffer by being disqualified from studying in private schools. China does not have a law to protect personal data.

Provinces and cities are also introducing their own scoring systems in addition to the national one, and it has been suggested that people even be scored for filial piety – how well they take care of their parents. “How would they know? We don’t know. It could be from neighbours, your parents or travel tickets you purchase,” Professor Cheung said. “This is more than a privacy issue, it is a governance issue, too, because it concerns the relationship between the citizens and the State. Some scholars agree with the government rhetoric that this is to restore trust and sincerity in China after corruption and dishonesty got out of hand. Some say China is the real Orwellian state, with big brother and small brother watching together, which one cannot escape because people use their phones and the internet and there is real-name registration. It’s unresolved, which makes it interesting and challenging to study.”
