Not much of a consultation, not much of a plan: Our submission regarding the federal government’s proposed approach to addressing harmful content online

Natasha Tusikov (Assistant Professor of Criminology, York University) and I have submitted the following to the federal government regarding their proposal to regulate harmful (or, more accurately, illegal) content online.

The tl;dr: We’re very much in favour of democratic government regulation of the digital space, but this proposal seems thrown together. It’s basically a list of legislative and procedural bullet points handed down from on high, with no evidence or explanations provided outlining the scope of the problems or justifying the proposed solutions. We need actual consultations before we can get to a (properly justified) plan.

The proposal also ignores the structural aspects (i.e., the business model) that drive for-profit social media companies' anti-social behaviour. We also highlight the importance of regulating online companies' activities generally, not just with respect to speech. Finally, we question whether the government currently has the capacity to assess and regulate the digital sphere, and call on the government to re-institute an Economic Council of Canada-type organization, and to reform the relevant regulatory institutions along the lines of Australia's Competition and Consumer Commission.

Our analysis is based on the work on internet and knowledge governance that we've been undertaking for the past several years, which we've also been sharing via op-eds, most recently in this three-part series with CIGI on resetting Canada's digital-regulation agenda.

Consultation on Proposed Approach to Address Harmful Content Online
Submission by Dr. Natasha Tusikov (York University) and
Dr. Blayne Haggart (Brock University)
24 September 2021

We are writing this submission in our capacity as academics who have researched and written extensively in the areas of platform and internet regulation.

Natasha Tusikov is an Assistant Professor of Criminology at York University. Her book, Chokepoints: Global Private Regulation on the Internet, deals directly with internet companies' efforts to police illegal and harmful content and activities by their users. She is a research fellow with the Justice and Technoscience Lab (JusTech Lab), School of Regulation and Global Governance (RegNet) at the Australian National University. She has also published scholarly research and op-eds in the areas of internet governance, the Internet of Things, smart cities and data governance, and regulating hate speech on social media. Before obtaining her PhD at the Australian National University, she was a strategic criminal intelligence analyst and researcher at the Royal Canadian Mounted Police in Ottawa.

Blayne Haggart is an Associate Professor of Political Science at Brock University and a Senior Fellow with the Centre for International Governance Innovation (CIGI). He is the author most recently of “Democratic Legitimacy in Platform Governance” (Telecommunications Policy 45, no. 6 (2021), with Clara Iglesias Keller) and “Global platform governance and the internet-governance impossibility theorem” (Journal of Digital Media & Policy 11, no. 3 (2020)).

Together we are co-editors (with Prof. Jan Aart Scholte) of the 2021 edited volume, Power and Authority in Internet Governance: Return of the State? (Routledge). We have also published several op-eds on the regulation of the digital sphere in The Toronto Star, The Globe and Mail, The Conversation and through the Centre for International Governance Innovation.

We are strongly in favour of government regulation of internet intermediaries and the goal of creating an online environment that is more conducive to socially healthy exchanges. The primary question when it comes to internet intermediaries is not whether they should be regulated by government, but how government should regulate them. However, we have significant substantive and procedural concerns with the government's proposed approach to address harmful content online. In this note we highlight four in particular:

  1. The presentation of a detailed fait accompli before engaging in meaningful, substantive public consultations;
  2. The lack of evidence presented explaining and justifying these particular interventions;
  3. Its ineffectiveness in addressing the fundamental structural problems with social media that give rise to the harms the government says it wants to address; and
  4. Its narrow focus on regulating social media companies, which overlooks the necessity of regulating the broader digital economy, including online marketplaces and the financial technology industry.

Based on these concerns, which we outline below, we call on the government:

  1. To undertake substantive, open consultations to determine how Canada should address these and other related issues.
  2. To present research and evidence outlining the problems being addressed and justifying the government’s chosen approach.
  3. To pursue a regulatory framework that involves structural reforms to address incentives baked into social media companies’ advertising-dependent and data-fuelled business models.
  4. To consider deep institutional reforms to regulate the digital economy, including regulation to address monopolistic behaviour and institutional reforms to strengthen and promote in-house digital policymaking expertise.

We address each of these in turn.

1) Lack of consultation

Most fundamentally, the government has presented Canadians with a fully formed policy proposal without first engaging in consultations to determine the best way to proceed with regulating illegal content online. This approach is not in keeping with best practices in government policymaking. Instead, it is obvious that the government has looked to other countries for ideas about what we should do here. For example, the requirement that internet intermediaries make content found to be harmful/illegal inaccessible within 24 hours is clearly borrowed from the German NetzDG regime, while the exclusive focus on illegal content (terrorist content, content that incites violence, hate speech, revenge porn and child sexual exploitation) seems to reflect critiques that the UK Online Harms legislation's focus on both illegal and harmful speech may unnecessarily stifle legal (if distasteful) speech.

While one of the advantages of being a laggard when it comes to online intermediary regulation is that we can learn from the experiences of other countries, such policies need to be considered and justified in the Canadian context. This includes considering how we could improve on other countries’ experiences. Internet intermediary regulation, and regulation generally, is not a matter of plug-and-play.

By presenting Canadians with the answer rather than the question, the government is pre-empting the necessary conversation over how these companies should be regulated.

We also note that both the German and UK approaches, which have so obviously inspired the Canadian proposal, were themselves the outcomes of extensive consultative and legislative processes. In particular, the UK Online Safety Bill currently before the UK Parliament was the product of years of consultations beginning in 2017, including a White Paper. The consultation process for the Online Harms White Paper (document here), for example, consisted of 19 questions, including several open-ended ones.

One of the upsides of the UK’s extensive consultations and reporting is that although opposition to it exists (and will continue to exist), nobody can say they’ve been taken by surprise by the resulting legislation, or that it hasn’t been properly considered. Conversely, one of the major downsides of the Canadian proposal is that it reads like a hastily assembled grab-bag of ideas that other countries have implemented, rather than something that has been subject to sound vetting by policy experts and interested Canadians.

Internet intermediaries need to be regulated by government. However, it’s not enough just to do something; we also need to get the legislation right. This requires a much more open and thoughtful process than what the government has put in place. Rather than start with one specific, intricate solution, Canadians and the government need to start with the issue itself, and various options, before settling on one. The UK Online Harms consultation process provides one model in this regard. Another, as we discuss in the context of smart-city regulation, is Brazil’s two-step consultation process leading to its pathbreaking internet bill of rights legislation (Marco Civil): Hold consultations designed as a structured conversation addressing issues of concern; then consult on a specific draft plan.

Recommendation 1

That the federal government scrap the current proposal and engage in actual, two-step consultations with Canadians to address internet intermediaries' socially harmful behaviours.

2) Lack of evidence presented

Neither the government's Discussion Guide nor its Technical Paper contains any contextual information or links to research that would allow an interested Canadian or non-expert to understand the nature of the problems being regulated or the implications of the government's proposed solution. Instead, these documents offer only a highly detailed, legalistic description of several interlocking processes and policies whose implications will be lost on anyone without a deep, technical understanding of the machinery of government and a pre-existing knowledge of the issue areas in question. There is nothing here to educate Canadians about the policies and issues at play.

For example, child sexual exploitation is obviously anathema to any society. However, neither the Discussion Guide nor the government's Technical Paper contains any information contextualizing either the problem or the proposed solution. Most obviously, the possession and distribution of child sexual abuse images are already illegal. Internet intermediaries already do not allow this material on their services. What is the extent of this problem on, say, Facebook? Our point isn't that the existence of this illegal content on Facebook isn't a problem – it may very well be, and if it is, the government should definitely take action. Rather, it is to highlight that both documents fail even to discuss the scope of the problems and how they relate to internet intermediaries. Nothing has been presented to Canadians to justify why these particular issues have been selected and packaged together, and why this particular approach has been proposed. We address some of what's not covered in this proposal in Points 3 and 4.

The UK Online Harms White Paper offers a sobering contrast to the informational wasteland the government is offering Canadians. It is replete with links to surveys, examples, and other reports – to evidence – contextualizing and justifying its decisions. To its credit, it also indicates (as in Box 29, discussing AI and hate speech) where the UK government still lacks a full understanding of the issues and requires more research. Nothing in the two documents being presented to Canadians, meanwhile, even tries to justify the government's response. These are not discussion documents; they are legislative bullet points that have the practical effect of shutting most Canadians out of this important discussion.

Again, we point to both the UK Online Harms and the Brazilian Marco Civil policy processes as examples for the government to follow. Unlike the Canadian process, both were designed to educate citizens about the issues (explicitly so in Brazil's case), not just to present a detailed solution from on high.

No matter how serious the problems being addressed, you still have to explain your proposals. In this case, the government has not even attempted to do so. If the government is interested in pursuing sound, effective, evidence-based internet-regulation policy, it must explain the problem and justify its proposal.

Recommendation 2

That, as part of a revamped consultation process, the government present to Canadians properly researched explanations and justifications of the nature of the problems being addressed and of why these specific solutions are being proposed.

3) Inadequate consideration of fundamental structural problems

The government’s discussion paper proposes to impose an obligation on regulated entities to monitor their systems for the categories of harmful content, including by establishing flagging, notice, and appeal systems and using automated systems.

This approach effectively asks social media companies to continue their already existing flagging programs. These programs, however, are too often plagued by inaccurate or abusive flagging, or rely unduly on users to police problematic content. The discussion paper also seems to assume that the imposition of fines will encourage platforms to remove problematic content. Yet these global companies have shrugged off massive fines in the United States and Europe.

The core problem with this approach is that calling for more flagging systems simply feeds into these companies' existing reliance on automation and their preference for self-regulation, both central features of their business model. Social media companies minimize costs by automating many activities and by outsourcing the human component of their content-moderation systems to low-paid, often foreign, workers, a pattern similar to the labour offshoring that countless industries have engaged in for decades.

Regulation of social media — indeed, regulation of any online content and services — must begin with an understanding of the business models and, more broadly, the assumptions underlying the digital economy. Social media companies, which make most of their money via advertising, have business models designed to maximize user engagement and to promote viral content. Given their commercial reliance on user engagement as a growth metric, companies are often reluctant to enact measures that deal with bad actors, such as ridding their systems of bots and fake accounts, or setting rules that may limit viral content. Spreading harmful content can be a profitable activity for both platforms and users. During the pandemic, people have profited from pushing fake cures and medical conspiracy theories.

To address this structural problem, there must be structural reforms to social media companies' business models. In short, governments must consider reforming advertising as a revenue source, with the goal of minimizing social media companies' reliance on user engagement as a growth metric. Advertising is not the only way a company can make money: companies could generate revenue from subscriptions, as Netflix does, or governments could nationalize social media services and fund them as public goods. Governments could also get more involved in regulating social media companies' algorithms so that they respond to democratically determined priorities, rather than reflecting the profit-driven motives of foreign companies.

Recommendation 3

Regulation must entail structural reforms to address and counter negative incentives baked into social media companies’ advertising-dependent and data-fuelled business models.

4) Broader regulation of the digital economy

When it comes to platform regulation, a disproportionate amount of scholarly and public debate focuses on a few social media companies. Along with legislation addressing the problematic business models of social media companies, the government needs to consider effective regulation of other areas of the digital economy, drawing on tools such as competition policy and consumer protection.

Government action to limit monopolistic online companies is vitally important because of the sheer scope and power of the handful of mostly American companies that dominate the provision of services online. Amazon, in particular, raises monopoly concerns in its dual role as marketplace operator and merchant. As an operator, Amazon is in an unrivalled position to privilege its products and control prices, while as a merchant, it can siphon data from its business rivals to create and push its Amazon-branded products. Similar problems of anti-competitive behaviour are evident in Apple and Google’s duopoly of mobile operating systems and app stores.

One solution to this monopoly problem is structural separation, which would prohibit dominant actors from directly competing with the businesses reliant on their services. Under structural separation, the operators of search engines, social media platforms, app stores and marketplaces could not compete directly with the third-party businesses reliant upon those services, as is now being proposed in the United States in a suite of antitrust bills.

Another digital sector in vital need of reform is online payments. As Dr. Tusikov has argued elsewhere, given the concentration of the online payment industry, payment providers wield significant power to determine what content and services they approve for payment, acting as what can be called "revenue chokepoints." Payment providers exert a form of what international political economy scholar Susan Strange called "structural power": the ability to set the rules under which others operate. This structural power is evident in payment providers' decade-long war on sex on the internet, in which providers, especially those headquartered in the United States and with global operations, have a pattern of denying financial services to people and businesses involved in publishing legal sexual content.

Alongside monopoly problems in the digital economy is the growing role of technology companies such as Apple and Google in providing financial services (i.e., fintech), as well as the increasing popularity of cryptocurrencies. To rein in this structural power and to provide effective regulation over the rapidly evolving fintech sector, Canada could follow the lead of the Australian government. Australia has undertaken a parliamentary review of mobile payments and digital wallet services, and a Treasury-commissioned review (the so-called Farrell Report) of the payment system.

Alongside recommendations to strengthen licensing and competition requirements, the Farrell Report directly tackles another key issue in digital economic governance: platform exceptionalism. Platform exceptionalism contends that services delivered online or via an app should be treated differently from the same services delivered offline (for example, Uber and taxis, or Airbnb and hotels). The Farrell Report calls on Australian regulators to set rules based on the nature of the service, not on the entity providing the service. Under this rule, platforms providing payment services would not be treated differently from traditional financial institutions offering the same services. Simply put, the claim of being a "platform" would no longer confer a perceived or actual regulatory advantage.

As we set out in a three-part series on regulating the online economy for the Centre for International Governance Innovation, what's needed is a concerted response: deep institutional reforms to regulate the digital economy in the Canadian public's interest.

We have two suggestions. First, create a successor to the Economic Council of Canada to provide in-house advice to the government of the day on novel economic and social issues. Elsewhere, we and others (chiefly Jim Balsillie, founder and chair of the Centre for International Governance Innovation) have called for such a governmental institute focused on applied policy issues, including the economic and social challenges of a digital/datafied society. The government needs highly qualified, expert in-house analysts to help set policy and evaluate outside advice with an eye to promoting policy in the national interest.

Second, and more importantly, the next government needs to reinvigorate its own bureaucracy to deal with the challenges of the twenty-first century. It could start by reforming or replacing the Competition Bureau, and its enabling laws, with an agency and legislative framework modelled on the Australian Competition and Consumer Commission (ACCC). The ACCC has been at the forefront of mid-sized countries' attempts to regulate the digital economy, including social media companies. Canada also hasn't had a dedicated consumer protection agency or ministry for decades now; this is another area in which the ACCC offers a useful model.

Recommendation 4

The government needs to consider deep institutional reforms to regulate the digital economy, including regulation to address monopolistic behaviour and institutional reforms to strengthen and promote in-house digital policymaking expertise.
