an interdisciplinary virtual symposium
Digital Regulation in the Public Interest: Surveying the Field
This first Digital Regulation in the Public Interest symposium was held on November 7, 2023. The event consisted of three virtual panels and a hybrid book launch. All online sessions included simultaneous ASL interpretation. The book launch was hosted in Plaza 600F on the Brock University campus.
Presenters were invited to share research focused on public and private governance, broadly defined, in our increasingly (commodified) knowledge-driven digital society. In particular:
- The political economy of data, intellectual property, and digital systems
- Critical approaches to digital policies, including platform governance, social media, and machine learning/artificial intelligence
- Social and environmental justice (broadly defined) in the digital/knowledge-driven era
- Power and control in the digital society
- Indigenous digital/knowledge-policy issues
- State and non-state, domestic and transnational governance
- Who benefits in and from the knowledge-driven society? Who loses out?
Find additional events and presentations related to the theme of Digital Regulation in the Public Interest.
(Note that this recording does not include a presentation by Nicole Goodman, Associate Professor of Political Science at Brock University.)
panels and presentations
Panel 1 – Digital and Tech Regulation
Jonathan Obar, York University
The Consumer Privacy Protection Act (CPPA) is a component of Canada’s Bill C-27, currently under review by the federal government. The government describes the CPPA as an attempt to strengthen privacy law governing Canada’s private sector, and emphasizes that improvements to transparency requirements are central to this goal. Transparency here refers to information disclosures by a commercial organization about its engagements with personally identifiable data, expressed in the context of increasing use of big data and artificial intelligence. Meaningful transparency refers to forms of information disclosure that help individuals understand data collection and use, and the associated implications for them. Meaningful transparency should be considered a support for meaningful consent processes, providing individuals with access to information that can help ensure safe and informed decision-making online.
This talk will review two sections of the CPPA: the section on openness and transparency, and the section on consent obtained by deception. Both sections emphasize attempts to encourage better self-regulation regarding transparency, and each is included in a list of CPPA sections linked to monetary penalties for non-compliance. The discussion will address the extent to which these proposed improvements to Canada’s privacy law will lead to corresponding information protections.
View a PDF of this presentation
Visit The Biggest Lie on the Internet, Jonathan Obar’s project site
Sara Bannerman, McMaster University
Copyright is one of the most-lobbied policy areas. Copyright lobbyists can be lumped into several groups: copyright owners, copyright users, and copyright intermediaries. Digital platforms fall into all three groups, and play a growing role in lobbying about copyright in Canada. This paper builds on existing research into tech company lobbying by examining tech company lobbying about copyright in Canada. Building on past studies of copyright policy battles (fought, to a significant extent, by lobbyists), this paper offers a macro view extending over 27 years, from 1996 to the present. Drawing on data from the federal lobbyist registry, it examines who lobbies, who is lobbied, and how these have changed over time. It provides quantitative evidence of the rise of tech lobbying in Canadian copyright, and drills down into the campaigns and targets of that lobbying.
This paper is part of a larger project seeking to understand the lobbying activities of digital platforms (Amazon, Facebook, Google, Sidewalk Labs, Netflix, and Twitter) in Canada.
View a PDF of this presentation
Visit The Tech Lobby, Sara Bannerman’s project site
Nicole Goodman, Brock University
Aleks Essex, Western University
Election technologies including online voter registration, electronic poll books, optical scan tabulators, and electronic forms of voting are becoming part and parcel of cyber elections around the globe. Canada is one of the largest and longest-standing deployers of election technologies, especially at sub-national and local levels. Yet, despite the frequency and magnitude of use, Canada is the only country in the world that has operated cyber elections for over two decades without any regulatory oversight. This regulatory gap has resulted in repeated cyber incidents and threats to electoral integrity. While other countries that have experienced such incidents have cancelled or halted electronic voting programmes, in Canada they persist. In response, some attempts to regulate election technologies in Canada are underway, with one territorial regulation recently signed into force. While each has promise, regulatory developments in the space have illuminated issues with the standards development process, gaps that remain unaddressed, and challenges in achieving comprehensive management as elections transition from manual to digital.
This chapter critically analyzes the three election technology standards efforts in Canada: online voting, tabulators, and electronic poll books. It discusses the benefits and limitations of the processes undertaken to achieve the different standards and of the final regulations themselves, the problems they will not be able to solve, and how we can achieve and maintain electoral integrity as elections become more digital, knowing that regulatory gaps remain. Our argument is two-fold. First, we make the case that changes to existing standards processes could improve outcomes and uptake; second, that while standards are a useful tool, they alone cannot and will not help us achieve responsible and thorough management of elections. That will only be accomplished by taking a systems approach to the management of election technologies, which involves turning the longstanding discourse and siloed approaches to regulation on their head.
PDF of this presentation not currently available
Karen Louise Smith, Brock University
What is an engaged scholar for digital regulation in the public interest? There are many roles that may be considered suitable for academics to assume within the regulatory process associated with digital technologies in Canada. Scholars often conduct empirical research and are called upon to participate in regulatory consultations, sit on advisory committees, or provide expert testimony as witnesses in the regulation of digital technology. While these roles are important, they represent a limited view of the opportunities to participate in regulating digital technology in the public interest. Drawing upon the author’s 20-year history as a participant at the intersection of technology and the public interest, this presentation outlines three alternative interventionist tactics to support public interest values in digital regulation in Canada. First, the tactic of complaint will be examined, including articulating critiques of problematic systems and opting out of them. Second, the tactic of information requests will be explored: asking an organization for one’s personal information, or seeking public records through freedom of information processes. Third, the tactic of design, used to build understanding of technologies or to create alternative systems, will be explored. Case examples for this talk will be drawn from a range of projects spanning infrastructures for universal access, surveillance, e-learning, and open source software development.
View a PDF of this presentation
Panel 2 – Applied Regulation
Kate Cassidy, Brock University
Michelle Chen, Brock University
Generative Artificial Intelligence (AI) technologies have emerged as a powerful force, promising greater efficiency and innovation in the workplace. However, successful integration of generative AI requires more than just technical infrastructure; it demands a careful examination of responsible AI usage.
Our research aims to delve into the human-centered dimensions (those related to people, interactions, and behavior) of AI adoption and utilization within organizations. This includes exploring stakeholder perceptions, organizational culture, group collaboration processes, and individual skills; all factors that play a significant role in the implementation of AI.
Our inquiry will seek answers to the following questions:
1. What does generative AI usage mean in the workplace?
2. How do the human-centered dimensions of the workplace impact AI adoption, and conversely, how does AI usage influence these dimensions?
3. What does responsible AI usage entail?
4. How can we develop policies and training to ensure responsible AI usage that prioritizes the human-centered aspects of the workplace?
By addressing these questions, our work will contribute to the emerging research on generative AI use in the workplace. Specifically, a) our research takes a significant step towards bridging the gap between technology adoption and ethical considerations by emphasizing the concept of responsible AI use, b) it will shed light on aspects often overlooked during AI integration in the workplace, offering a more comprehensive model that integrates human factors as crucial components, and c) it will help develop insight to support policy, training, and best practices for the ethical and responsible implementation of generative AI in the workplace.
View a PDF of this presentation
Alison Innes, Brock University
A podcasting project by the Toronto Police Service (TPS) hit the headlines in February 2023 when the CBC revealed the TPS was paying $337,000 for a sole-source contract to produce the audio and video podcast “24 Shades of Blue.” While reporting at the time focused mostly on the financial nature of the TPS contract with Obie & Ax Inc., the story raises larger issues for consideration about the use of podcasts by government agencies to influence public conversation. Documents obtained through freedom of information requests reveal that the police service chose to use the third-party contractor to avoid the appearance of copaganda, defined as media with a pro-police bias that minimizes the violence and trauma police cause to marginalized communities (Makim 2020; Corbett 2020). The TPS’ attempt to influence public conversation while obfuscating its editorial influence requires closer scrutiny on behalf of the podcast’s subjects and the communities the TPS claims to protect.
Citizens should expect ethical, professional, and transparent communication from their government agencies. Using the ethical guidelines for social media established by Bowen (2013; Bowen and Stacks 2014), I assess the ethical failures of TPS’s “24 Shades of Blue” and compare it to police podcasts in the United States and Australia. Drawing on best practices of professional communication and public relations, I seek to define the characteristics of a professional podcast and use this to assess government agency podcasts. My work also considers how podcasting may be used in a transparent manner to build trust and negotiate power imbalances between government agencies and their citizens (D’Ignazio and Klein 2023; Kahn 2017).
View a PDF of this presentation
Maya Karanouh, Brock University
The exponential growth in user acquisition and popularity of ChatGPT, an artificial intelligence (AI)-powered chatbot, was accompanied by widespread mainstream media coverage. This article presents a quantitative data analysis of the early trends and sentiments revealed by applying text mining and natural language processing (NLP) methods to a corpus of 10,902 mainstream news headlines related to ChatGPT and artificial intelligence, from the launch of ChatGPT in November 2022 to March 2023. The sentiment analysis found that ChatGPT and artificial intelligence were perceived more positively than negatively in the mainstream media. With regard to word frequency, over sixty-five percent of the most frequent words focused on Big Tech issues and actors, while topics such as jobs, diversity, ethics, copyright, gender, and women were poorly represented or completely absent, accounting for only six percent of the total corpus. This article is a critical analysis of the power structures and collusion between Big Tech and Big Media.
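As a rough illustration of the kind of pipeline this abstract describes, the minimal sketch below scores headline sentiment and counts word frequencies. It is not the author’s code; the sample headlines, the NLTK/VADER lexicon choice, and the token filtering are illustrative assumptions.

```python
# Minimal sketch (not the author's pipeline): sentiment scoring and word-frequency
# counts over a small list of sample headlines. Assumes NLTK is installed and the
# VADER lexicon has been downloaded via nltk.download("vader_lexicon").
from collections import Counter
from nltk.sentiment import SentimentIntensityAnalyzer

headlines = [
    "ChatGPT sparks a new wave of AI investment",          # illustrative examples,
    "Artificial intelligence raises copyright concerns",   # not from the study corpus
]

# Sentiment: average VADER compound score across headlines (> 0 leans positive)
sia = SentimentIntensityAnalyzer()
compound_scores = [sia.polarity_scores(h)["compound"] for h in headlines]
print("mean sentiment:", sum(compound_scores) / len(compound_scores))

# Word frequency: lowercase, split on whitespace, drop very short tokens
tokens = [word for h in headlines for word in h.lower().split() if len(word) > 3]
print(Counter(tokens).most_common(10))
```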
View a PDF of this presentation
Zachary Spicer, York University
Ian Stedman, York University
Cities are increasingly digital in nature. Those that infuse technology into service delivery, programming, and policy processes are often referred to as ‘Smart Cities’. Despite the presence of some smart city mega-projects, like Toronto’s Quayside project, how these cities are procuring technology, interfacing with private technology vendors and firms, and using data-intensive infrastructure is largely unknown. This presentation shares the first comprehensive inventory of smart city design, governance, and procurement in Canada, focusing on every Canadian city over 100,000 in population (N=57). This inventory captures the types of technologies being procured by Canadian cities, the size and scope of the internal teams working to install smart city technologies, the privacy and data governance standards being put in place, the amount of community engagement, interaction with private actors and external organizations, and the overall maturity of the smart city planning process. Overall, this project finds that while many cities lack a dedicated smart city plan or proposal, they are indeed procuring smart city technology. This procurement is occurring in a rather undirected fashion, with very little consideration for privacy and data governance standards.
View a PDF of this presentation
Panel 3 – Contesting Theories of Knowledge and Technology
John R. Heckman, Brock University
This presentation will explore the Ukrainian Ministry of Digital Transformation and its groundbreaking digital application, “Diia,” as an influential force in state and domestic governance of digital data. Launched in 2020, “Diia” provided Ukrainians with a digital portal to carry their personal documents. However, its impact has transcended mere convenience, transforming into an asset for resistance and resilience in the face of the Russian invasion.
View a PDF of this presentation
Additional Resources:
Ukraine’s digital revolution is proving vital for the country’s war effort, an article by Mykhailo Fedorov published in the Atlantic Council, July 2023
Pascal Lupien, Brock University
This paper presents an analysis of the evolving cyber politics landscape in Latin American countries from the perspective of the region’s Indigenous social movements. It examines the types of regulations and tools that Latin American governments have adopted to assert their control over the online world, how these measures constitute barriers for Indigenous organizations, and how social media and other digital technologies are shifting the balance of power between Indigenous movements and the state. Based on three years of qualitative fieldwork in Ecuador, Bolivia, and Chile, this paper argues that the ability of Indigenous groups to use digital technologies for political engagement is threatened by restrictive communications policies, libel laws, authoritarian national security legislation, and government use of surveillance technologies. Indigenous claims challenge established political and economic structures of power, but communities lack the resources to ensure their own information security, to detect online attacks and to defend themselves against authoritarian practices. As a result, Indigenous organizations are faced with censorship, self-censorship, and accusations of defamation and terrorism. This limits the potential of digital technologies to serve as instruments of democratization.
View a PDF of this presentation
Additional Resources
Pascal Lupien, Gabriel Chiriboga & Soledad Machaca (2021) Indigenous movements, ICTs and the state in Latin America, Journal of Information Technology & Politics, 18:4, 387-400, DOI: 10.1080/19331681.2021.1887039
Liam Midzain-Gobin, Brock University
As Linda Tuhiwai Smith notes in Decolonizing Methodologies: Research and Indigenous Peoples, Indigenous communities are among the most researched communities on earth, with much of that research being used to further colonial projects and undermine Indigenous sovereignty. This situation persists in today’s political climate, where many organizations – both public and private – identify reconciliation as a chief concern. One of the reasons for this is that control of information remains in the hands of colonial institutions.
Taking university-based research as its focus, this intervention looks to the implementation of Ownership, Control, Access and Possession (OCAP) principles, which are intended to address this issue. The intervention offers reflections on some of the difficulties and possibilities associated with implementing OCAP in research projects where settler researchers are working with Indigenous community organizations producing digital data. Using the framework of Digital Sovereignty in relation to Indigenous communities and research practices, the reflections will be based on two community-engaged projects, one with a collection of urban Indigenous organizations in Niagara, and one with First Nations in New Brunswick. Issues addressed include working with multiple partners, unclear institutional requirements, and working with funding bodies.
View a PDF of this presentation
Sasha Skaidra, University of Alberta
The Canadian Office of the Auditor General’s (OAG) audit of immigration removals in Canada involved the use of data analytics to scrutinize the Canada Border Services Agency’s (CBSA) digitized inventories of refugee, resident, and migrant files. I will discuss the implications of my recently published article, “Data laundering border violence: Performance measures and immigration enforcement,” in the journal Public Integrity. I elaborate on a gap I identify in the Public Administration literature regarding how digital inventories operate. The OAG noted in its audit of the CBSA that digital inventories helped to obscure the actual number of migrants under state surveillance and in detention because files move between databases. My presentation discusses the digital aspect of data laundering. My article argues that “performance audits” (used widely in the public sector) are a governmental technology called data laundering and, in the case of the OAG and CBSA, rationalize the violence inherent in immigration enforcement. Data laundering obscures the fact that policing migration depends on broad discretionary powers, leading to opaque and inconsistent data practices. “Laundering” signals auditing’s inability to be sufficiently adversarial with a sector of law enforcement whose poor data-keeping practices maintain an illusion of recordkeeping as a form of power. Audit dependence on quantitative forms of data increases violence against immigrants; when violent deportation and detention measures are quantified, this presumes an acceptable ledger of force that accounts for, and in so doing legitimizes, state enactment of violence upon vulnerable people.
View a PDF of this presentation
Panel 4 (hybrid) – Book Launch
Authors: Blayne Haggart (Brock University), Natasha Tusikov (York University)
Discussant: Liam Midzain-Gobin (Brock University)
From publisher’s website: From the global geopolitical arena to the smart city, control over knowledge—particularly over data and intellectual property—has become a key battleground for the exercise of economic and political power. For companies and governments alike, control over knowledge—what scholar Susan Strange calls the knowledge structure—has become a goal unto itself.
The rising dominance of the knowledge structure is leading to a massive redistribution of power, including from individuals to companies and states. Strong intellectual property rights have concentrated economic benefits in a smaller number of hands, while the “internet of things” is reshaping basic notions of property, ownership, and control. In the scramble to create and control data and intellectual property, governments and companies alike are engaging in ever-more surveillance.
The New Knowledge is a guide to and analysis of these changes, and of the emerging phenomenon of the knowledge-driven society. It highlights how the pursuit of control over knowledge has become its own ideology, with its own set of experts drawn from those with the ability to collect and manipulate digital data. Haggart and Tusikov propose a workable path forward—knowledge decommodification—to ensure that our new knowledge is not treated simply as a commodity to be bought and sold, but as a way to meet the needs of the individuals and communities that create this knowledge in the first place.
Read a PDF of this open access publication
About the series
Hosted by the Faculty of Social Sciences, this series aims to showcase the variety of work being conducted by faculty and student researchers across Brock University, to uncover an array of perspectives, and to foster potential synergies and collaborations.
Cross-disciplinary and cross-Faculty participation is encouraged.
Learn how to participate in this Symposium Series.