The first item on the CBC radio news last evening concerned criminality, the criminal use of data in elections. Not the issue of Russian interference to facilitate the election of Donald Trump as president of the United States. Instead, CBC reported on the four hours of testimony that the Canadian whistleblower Chris Wylie, a data scientist who helped found Cambridge Analytica, gave before a committee of the British parliament. He described the role of the data-aggregating firms hired by Vote Leave in the Brexit campaign, led by Cambridge Analytica, in funneling money illegally to the Canadian company AggregateIQ (AIQ) to collect Facebook data and, in the last days of the Brexit vote, influence “persuadables.” Further, he opined that it was reasonable to conclude that the effort altered the outcome of the Brexit vote.
Two criminal acts were allegedly involved. First, the data was illegally collected. Second, money significantly in excess of the amount Vote Leave was permitted to spend was funneled through several data collection companies in order to appear to fall within the permitted limits. On this charge, Wylie backed up the testimony of another whistleblower, Shahmir Sanni, who provided Parliament with concrete evidence of the breach of spending limits. Of course, the companies continued to insist that they had complied with all legal and regulatory requirements.
Wylie testified that the British vote was but one instance of such efforts. The activities ranged around the world, from the Trump election to the Kenyan presidential race, clearly implying that Cambridge Analytica and its parent and related companies were systematically involved in manipulating voters illegally and undermining the democratic electoral process. This past Friday, the first panel of the Walter Gordon Symposium, “Making Policy Count: The Social Implications of Data-Driven Decision Making,” took up the issue of contemporary policing and surveillance.
One message came through loud and clear. Police departments are barely into the computer age and are ill-equipped, to say the least, to deal with law enforcement related to abuses in the use of data analytics. Electoral commissions do not have staff who even comprehend such efforts, let alone are trained to counter them, whether mounted by Russian hackers or domestic cheats.
Rosemary Gartner, the chair of the panel and a Professor Emerita in the University of Toronto program on Criminology & Sociolegal Studies, zeroed in on how large-scale counting can be unfair in individual cases: measuring blood alcohol levels, meting out punishments, or even deciding what is considered a crime worthy of police attention. In light of the big news items, these concerns, however significant, seemed picayune when our whole faith in democratic institutions has been under attack.
Peter Sloly, former Deputy Chief of the Toronto Police Service, who now works for Deloitte, did zero in on mass surveillance and digital crime at its broadest, from cyberfraud to cyberbullying. However, police lack even the most basic servers to do their work, let alone to counter such criminality. At the same time, there has been an exponential increase in surveillance. The ethical issue of most concern seems to be privacy. In the name of privacy, cameras cannot be used to record and charge speeders who race down our residential streets endangering the lives of children. When the Conservatives under Mike Harris regained power in Ontario in the mid-1990s, the experimental use of automatic speed cameras (photo radar) was phased out.
A report by Drive Safely Michigan stressed improving safety on residential streets by proposing alternatives to surveillance: measures that decrease speed or reduce through traffic on local residential streets, and, more generally, a “traffic calming program” (stop signs, speed limit signs, turn prohibitions, one-way streets, warning and portable signs, speed bumps, rumble strips, street closures, traffic diverters and even road narrowing) to control speeds, while at the same time warning or “advising” drivers with permanent markings or signs about the measures introduced. We have all seen the huge multiplication of these techniques, but I personally – and this is clearly anecdotal – have observed only increased speed on my residential street.
Why not assign officers to monitor traffic? How many? When? Should they issue warnings or tickets? Two problems arise: the large cost, and the fact that effectiveness is restricted to the periods and places where officers are deployed. What about automated speed enforcement devices, that is, speed radar and a 35 mm camera interfaced with a computer, which may or may not automatically issue tickets? If a vehicle travels down a residential street above a preset threshold speed, the camera photographs the vehicle and its license plate. Tickets could be sent out automatically, indicating the date, time, location, posted speed and travel speed of the vehicle. Instead, in most jurisdictions, only warning letters are issued. Enforcing speed limits by general surveillance is most frequently viewed as an unwarranted expansion of surveillance. The fact that such surveillance might be significant in analyzing the traffic problems that induce speeding and in suggesting intervention measures gets pushed to one side in the debate.
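The enforcement logic described above is simple enough to sketch in a few lines of code. The sketch below is purely illustrative: the class names, the threshold margin and the notice format are my own hypothetical choices, not a description of any actual deployed system.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class SpeedReading:
    plate: str            # read from the license-plate photograph
    speed_kmh: float      # measured by the radar unit
    posted_kmh: float     # posted limit for the street
    timestamp: datetime
    location: str

# Hypothetical enforcement margin: trigger only above limit + margin
THRESHOLD_MARGIN_KMH = 10.0

def process_reading(reading: SpeedReading, issue_tickets: bool) -> Optional[str]:
    """Return a notice (ticket or warning letter) if the vehicle exceeds
    the preset threshold, or None if it does not."""
    if reading.speed_kmh <= reading.posted_kmh + THRESHOLD_MARGIN_KMH:
        return None  # under threshold: no photograph, no notice
    # As the text notes, most jurisdictions issue only warning letters,
    # so the same capture logic can feed either kind of notice.
    kind = "TICKET" if issue_tickets else "WARNING"
    return (f"{kind}: plate {reading.plate} recorded at "
            f"{reading.speed_kmh:.0f} km/h in a {reading.posted_kmh:.0f} km/h zone, "
            f"{reading.location}, {reading.timestamp:%Y-%m-%d %H:%M}")
```

The policy choice debated in the text lives entirely in the `issue_tickets` flag: the surveillance and capture machinery is identical whether the output is an automatic ticket or a mere warning letter.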
There seems to be a misfit between the ethical principles at stake and the nature of contemporary crime. When I interview people on the issue, their concern is not privacy per se, but theft and fraudulent use of private information. They are not so much concerned with keeping their personal information private as preventing its misuse and criminal use. Perhaps, instruments to build in “Privacy by Design” might be helpful, but detection and intervention with actual criminality might be a greater issue.
Professor Akwasi Owusu-Bempah from the Department of Sociology raised the issue of race and the criminal justice system through the old issue of carding: collecting information on “suspicious” individuals, a process that disproportionately, and significantly so, focused on visible minorities, a practice evidently detrimental to policing itself and to the integrity of the criminal justice system. Surveillance of what police do in their interactions with the public has almost completely undermined the practice of carding. I thought I had received a double message. On the one hand, traditional values, such as fairness and privacy, were critical. On the other hand, in order to protect those values, the police themselves had to be continually subjected to surveillance.
Dr. Valerie Steeves, Associate Professor in the University of Ottawa Department of Criminology, directly addressed the issue of big data and the search for patterns using algorithms both to prevent crime and to apprehend criminals. For one, big data can be and has been used to undermine the thesis that harsh measures of incarceration cut down criminal activity, and to establish that the decline in traditional crimes has taken place independently of such efforts. As far as prevention is concerned, using large data sets and algorithms has not proven useful in identifying potential criminality. The feeding frenzy accompanying the mastery of large data and analytics seems to her to be misguided; one must be humble in presenting proposals, implementing them and evaluating the results. Relying on efforts to create smart cities with monitoring sensors everywhere may also be misguided. Steeves was very wary about the process of privatizing the public sphere.
My sense was that the panelists were preoccupied with traditional ethical concerns of privacy, transparency and fairness – valuable as those concerns may be – while remaining out of touch with the need to understand, and to be equipped to counter, the pervasive kinds of criminality in the use of big data, now given almost free rein by the absence of both the tools and the training even to detect, let alone interfere with, this raging epidemic. Just because individuals generally are not being killed does not mean that enormous harm is not being done – from the pervasive fear that someone will steal my identity and hack into my financial accounts to the undermining of the very political structure on which the health of our society depends.
Hegel in his writing on police in the Philosophy of Right noted that the police were part of civil society and not the state, that they were given exceptional powers of coercion, but only to serve and protect the members of civil society, including, and most importantly, their right to vote in fair elections. The administration of justice is first and foremost needed to ensure that offences against property and persons are negated and the safety of persons and property sustained.
Police and the system of justice more generally were created in a modern nation-state first and foremost to deal with a subjective willing of evil – whether that evil be predatory sexual behaviour, racist victimization or criminal mischief-making. The latter activities, quite aside from a myriad of other pressures and influences, undermine the ability of individuals to make rational choices. Private actions outside of our individual or collective control that either do or could injure others and wrong them must be prevented and offset or compensated for when offences are committed. This is why traffic cameras to monitor speeding and automatically issue tickets should be instituted – not because they are perfect instruments, but because the benefits to personal safety and well-being far outweigh risks to privacy or error.
For the issue is not merely countering injury, but reducing the possibility of injury to as close to zero as is feasible given the need and desire to protect other norms. If police lack the training, if police lack the tools – and I use police in the broadest sense to include institutions such as an electoral commission – if police lack the budgets to counter both actual and possible offences of this order, instead of preventing and limiting harm, the system of justice will be abetting such harm.
This does not mean that surveillance need become ubiquitous. Rather, careful judgement and a weighing of ethical norms as well as effectiveness are required to mediate between suspicion and commission of criminality, between suspicion and surveillance, between suspicion and inquiry, between suspicion and what is actually injurious as distinct from what is believed to be injurious, and between what is supposedly suspect and what is claimed to be injurious but is really innocent.
Let me give an example of a failure of policing and the justice system having nothing to do with large-scale data and analytics. It was the second item on the CBC 6:00 p.m. radio news last evening. It concerned the case of sexually predatory behaviour at Michigan State University. Yesterday, a former dean of the university, William Strampel, was charged with failing to prevent a sports doctor, Larry Nassar, from sexually abusing students. It had already been proven that Nassar had for years violated girls and young women, particularly gymnasts, with his finger examinations. This once world-renowned sports physician was sentenced to up to 175 years in prison.
William Strampel was the dean of the College of Osteopathic Medicine and was responsible for oversight of the clinic where Nassar worked. Strampel failed to enforce orders requiring, at a minimum, that Nassar not examine students unchaperoned. Nassar was eventually fired in 2016, but between 2014 and 2016, when Strampel had been fully apprised of the risk Nassar posed to students, he failed to set up procedural safeguards, thereby allowing Nassar to commit a series of additional sexual offences.
However, in the course of the investigation, evidence turned up that Strampel’s computer held 50 photos of female genitalia, nude and semi-nude women, sex toys and pornography. Further, Strampel himself had solicited nude photos from at least one student and had harassed, demeaned, propositioned and even sexually assaulted students. Strampel insisted in his defence that he was not guilty of any of the charges and that responsibility for enforcing the restrictions on Nassar’s practice rested with the university’s Title IX investigators, not with himself. Whether true or not, why was the university itself not charged with negligence with respect to its duty to serve and protect its students?
This is an old-fashioned case of an injustice, though one involving the accumulation of data as evidence. But it is not a case of analytics and large data. The question it raises is that if existing institutions are so grossly negligent in ensuring protection and safety for those for whom they are directly responsible, how can they be tasked with the much larger goal of preventing and inhibiting the epidemic of crimes committed through the use of analytics and large-scale data?
The root of the problem, in my estimation, is the widespread belief in untrammelled individualism. It is why Mike Harris pushed the policy cancelling the use of automatic speed cameras in Ontario. The belief that personal conscience is the supreme judge of morality is widespread precisely at a time when the consciences of individuals are being subjected to widespread manipulation. It is why sexual predators complain that their rights to privacy are being abrogated. It is why they argue that laws should be introduced only with the explicit consent of each individual whose will they bind. The source of justice, in this misguided view, is each individual’s unrestricted and unguided conception of virtue and the common good. The result is the diminution of inherited practices of order and good governance, practices that not only respect the individual’s rights to consent and freedom but reinforce them, precisely by also respecting the community values and norms already developed to defend our institutions against new assaults. Such a defence now entails relatively minor investments in items like automatic ticketing speed cameras, which save money (and lives), as well as massive investments in the technology and skills necessary to counter cyber-criminality.