Digital Regulation in the Public Interest

an interdisciplinary virtual symposium

Thursday, March 20, 2025

9 a.m. to 4:50 p.m.
online presentations

Register to attend any or all virtual panels (as described below).

5 to 7 p.m.
in-person reception

Panels will be followed by an in-person reception from 5 to 7 p.m. in the Plaza Building, Room 600F, open to all.

Join the online audience

Register to receive the link to join this free virtual symposium.

This event is part of 60 Research Talks at Brock, celebrating 60 years of impactful research at Brock University.

Presenters and Abstracts

View the event agenda here.

Opening Remarks

9 to 9:15 a.m.

Regulating knowledge

9:15 to 10:45 a.m.

Tanner Mirrlees, Communication and Digital Media Studies, Ontario Tech University

Building upon the political economy approach to the educational technology (EdTech) industry, and addressing the symposium’s topics of digital platforms, IP, data, power, and control, this paper examines how EdTech firms are reshaping public education institutions through contentious business models borrowed from the digital media and entertainment industries (DMEI). It identifies four dominant EdTech industry business models—platform or surveillance capitalism, public-private partnerships, content monetization, and crowdsourced labour—and probes how these models infiltrate public academic institutions, reshaping teaching and learning practices, and transforming universities and colleges into new sites of revenue generation. The paper interrogates how private market logics, rather than public interest goals, drive EdTech’s purported “disruption” and “transformation” of higher education. Google Workspace for Education’s data-extraction practices, Coursera’s proprietary content subscriptions, ChatGPT’s copyright dispossession, and Course Hero’s crowdsourcing of educational materials are used to illustrate the EdTech industry’s models. The paper highlights some governance risks posed by the largely US-based EdTech industry’s encroachment into Canadian public institutions, emphasizing how firms embed their models into higher education, converting it into a new market for their products, services, and data exploits.

View a PDF of this presentation

Karen Louise Smith, Communication, Popular Culture and Film, Brock University

Post-secondary education institutions in Canada regularly conduct privacy impact assessments (PIAs) as educational technology (edtech) is procured. PIAs have historically served to identify and mitigate the risks associated with the collection and processing of personal information. As artificial intelligence is deployed across edtech systems to automate decision-making, personalize content, conduct facial recognition tasks, generate text and multimedia content, or automate transcription and other pedagogical tasks, the need to evaluate the impacts of edtech is intensifying. This paper provides a speculative regulatory future to outline the socio-technical factors that might best serve to increase the readiness of the post-secondary sector to obtain and utilize algorithmic impact assessments (AIAs) in Canada. Possibilities for state privacy and AI regulation, self-regulation by edtech companies, advocacy by coalitions of educational institutions, motivated educators and professionals, and student complaints will be critically considered within a socio-technical constellation of factors.

Visit Dr. Smith’s FACULTY PAGE 

View a PDF of this presentation

Michelle Chen and Kate Cassidy, Communication, Popular Culture and Film, Brock University

This study explores the impact of Generative AI (GAI) on communication as a profession and advocates for an understanding of the human and cultural dynamics influencing GAI’s application. As more organizations employ GAI to assist with or replace communication tasks and roles, it becomes increasingly important to situate GAI within cultural dynamics when examining its use in the workplace. This study seeks to better understand how GAI shapes, and is shaped by, organizational communication and culture, with an eye toward responsible GAI practices. The research is guided by the following question: How do individuals within knowledge-based organizations perceive and make sense of the integration of GAI into their daily communication practices, and what are the implications for responsible use? Through interviews with professionals, the study explores perceptions of, attitudes toward, and experiences of using GAI in the workplace. Preliminary findings reveal unregulated, untrained, and unsupervised use of GAI, exposing organizations to potential reputational harm. Findings also show that participants use GAI as a collaborator and communicator, which has changed work team dynamics and created tension around ‘effort’. Consequently, the shifting perceptions of ‘original work’, ‘work ethics’ and ‘communication skills’ have broader implications for the communication profession.

Visit Dr. Cassidy’s FACULTY PAGE

Visit Dr. Chen’s FACULTY PAGE

View a PDF of this presentation

Ann Gagné, Centre for Pedagogical Innovation, Brock University

The Web Content Accessibility Guidelines (WCAG) are foundational to digital information sharing on the web, including social media, educational technology, and now generative AI tool usage. The WCAG guidelines should ideally be a space where conversations around public access to information in an accessible manner occur. However, the main barrier to this work is the real difficulty in making the WCAG standards understandable to those who do digital knowledge creation work. Supporting accessible information sharing, along with the possibilities of legacy storage and curation of information, is a disability justice issue (Sins Invalid, 2015), highlighting the need for sustainability, interdependence, and collective access. This paper will provide concrete strategies on how to open WCAG conversations to the public outside of specialized language and frameworks in order to support more ethical and inclusive digital environments. It will emphasize the trickle-down effect that inaccessible digital content has on ideas, policies, and practices, as well as demonstrate how inaccessible content creates exclusionary silos of information with larger impacts on public safety and health.

View a PDF of this presentation

Real and virtual bodies

11 a.m. to 12:30 p.m. 

Elsie Sheppard, Communication and Media Studies, McMaster University

Modern corporations collect massive amounts of personal information used for the behavioural modification of consumers, a practice known as surveillance capitalism. Companies use this data to target advertisements and market products to specific groups, manipulating consumers to increase corporate profits. Many scholars claim the rise of surveillance capitalism in North America began in the mid- to late 20th century; however, many economic sectors, such as historical patent medicine firms, utilized features of surveillance capitalism in the late 19th century. This paper examines historical records of the Lydia E. Pinkham Medicine Company to help further identify the origins and impact of modern surveillance capitalism. It illuminates the difference between what I term traditional and covert market research. While many patent medicine firms and other historical businesses used traditional market research methods such as questionnaires and copy testing, I argue that the Pinkham Medicine Company used covert techniques (such as solicited letters and consumer surveillance) outside the realm of traditional market research and more akin to features of modern surveillance capitalism. This historical discussion of personal data collection helps to situate modern data collection by Big Tech companies and to ask questions surrounding the ethics of mass data collection and monopolies of knowledge.

View a PDF of this presentation

Kathleen Cherrington, Gender, Feminist & Women’s Studies, York University

This presentation explores the evolving dynamics of intimacy and consent in the digital age, focusing on AI companions, digital sex work, and human connection. Through qualitative analysis of interviews with companions of AI robots, love dolls, clients of digital sex workers, and an AI robot itself, alongside my auto-erotic ethnography with an AI chatbot named Maximilian, it examines how these technologies challenge traditional understandings of consent and intimacy.

View a PDF of this presentation

Natasha Tusikov, Social Science, York University

Period-tracking apps like Flo and Clue that enable users to monitor their menstruation cycles have existed for about a decade. A newer technology is digital-contraceptive apps that promise a precise prediction of the start of the menstrual cycle and ovulation, essentially operating as an algorithm-fuelled rhythm method. Controversially, several countries, including Canada, now license these contraceptive apps as software-driven medical devices. Health experts, however, warn that these “fertility awareness-based methods” are less effective than other contraceptive methods. Further, these apps raise important governance challenges related to safety, privacy, transparency, and accountability, particularly where abortion is restricted or criminalized, and as users may not understand the efficacy or data-leaking nature of these apps.

View a PDF of this presentation

Recalibrating power  

1:30 to 3 p.m.

Marika Jeziorek, Wilfrid Laurier University / Balsillie School of International Affairs

This paper offers a comparative analysis of two digital platforms designed to support migrant populations: Martynka, a digital humanitarian platform created in Poland by Nastya Podorozhnya, a Ukrainian living in Poland, to support displaced Ukrainian women, and the Newcomer App, developed by Edith Law, a newcomer professor at the University of Waterloo, to assist migrants in Ontario. Drawing on gender-focused theories and concepts of responsibilization, this research examines how these apps function as mediators within migration governance systems shaped by state outsourcing and fragmented responsibilities. Applying a gendered lens, the paper examines the role of these apps in involving migrants, and in particular women, in creating their own solutions, while also evaluating the apps’ impacts, limitations, and contributions to more inclusive migration governance.

View a PDF of this presentation

Kaushar Mahetaji and David Nieborg, Faculty of Information, University of Toronto

We offer an empirical case study of TikTok One, a suite of platform tools meant for marketers to develop ad campaigns. We systematically map the individual platform tools that comprise TikTok One—the contexts in which they are used, their connections to other tools and users, and their evolution over time. Our findings illustrate that TikTok structures TikTok One to increase the number and diversity of (1) users and user groups and (2) interactions between its userbase. Essentially, TikTok grows as a “multisided market” using TikTok One, and in the process, the company governs platform labour.

View a PDF of this presentation

Kyle Wyndham-West, Communication Studies & Media Arts, McMaster University

My doctoral research will investigate how blockchain technologies impact visual artists, focusing on the OpenSea and SuperRare platforms, and their use of cryptographic technologies in intellectual property and artists’ rights management. Blockchain introduces novel opportunities for creating, owning, and trading art, such as fractional ownership and decentralized networks. While these technologies promise new artistic, economic, and intellectual property opportunities within Web 3.0, they also risk perpetuating existing inequalities and exclusion within the art market, and the greater technological arena.

View a PDF of this presentation

Ideological and regulatory futures

3:15 to 4:45 p.m.

Stefan Dolgert, Political Science, Brock University

TESCREAL (Transhumanism, Extropianism, Singularitarianism, Cosmism, Rationalism, Effective Altruism, Longtermism) is a newly cohering ideology that sees the future of humanity as inextricably linked with transitioning our biological life into new digital forms, by which we must then spread human consciousness to other planets to avoid eventual extinction. It is backed by many of the wealthiest people on earth (Elon Musk, Peter Thiel, etc.) and is already affecting political decisions in the here and now. This vision for human survival presents itself as a realistic framework for achieving an unlimited human future, but beneath the optimistic sheen lie dangerous assumptions and a radically inegalitarian agenda. TESCREALists assume that humanity’s true purpose can be found in expansion into the solar system and beyond, but for them that means those future (digital) humans populating the universe are infinitely more valuable than the flesh-and-blood humans of the present. Getting us to the stars, they believe, will require radical prioritization and dramatic sacrifices, and only by transforming ourselves into a sufficiently “fit” civilization will we be able to survive evolution’s contest. I will narrate a tour of this seemingly benign futurism, highlighting the eugenics, racism, and elitism that make TESCREAL a form of techno-fascism.

Visit Dr. Dolgert’s FACULTY PAGE 

View a PDF of this presentation

Nicole Goodman, Political Science, Brock University, and Helen Hayes, Art History and Communication Studies, McGill University

The sheer magnitude of data and advances in technological development that both enable and are enabled by it have transformed our economy, politics, and way of life with equal parts possibility and risk. These circumstances have prompted calls – of varying degrees and at different moments over the last two decades – for regulation. Yet, navigating the complex regulatory web that technologies and stakeholders inevitably weave has meant that while governments in Canada have made some headway to manage risk, in many domains, regulatory progress has been slow or has not had the desired effect. This talk serves as the introduction to an edited collection. It examines the state of the literature and why a gap exists in the regulation of technologies and data, and argues for how we should define regulation and conceive of the governance of digital technologies and data. We argue that addressing the challenges of the digital age requires comprehensive, ethical, and inclusive policy solutions, which may include a wider network of actors and regulatory agents. In some cases, the solution may lie outside of regulatory regimes entirely.

Visit Dr. Goodman’s FACULTY PAGE

Charles Conteh, Political Science, Brock University

Over the past three decades, breakneck trends in information and communications technology, including automation, mass digitization and, more recently, the deployment of artificial intelligence in production and service-delivery processes, have fundamentally altered all sectors of the economy and society in Canada and worldwide. The Niagara Community Observatory, a Brock University research institute, has been conducting several studies that look broadly at the benefits and challenges of these trends. More specifically, the institute has been investigating data-driven automation, robotics and precision technologies in the agri-food sector. These trends point to the imperatives of “smart” precision agriculture techniques, processes embedded in cyber-physical systems, and platforms that are radically transforming the agri-food value chain. However, there are pervasive and recurring issues and challenges associated with managing, using, harmonizing, sharing, storing and protecting the massive volumes of data generated by automation and robotics technology. In particular, such large volumes of data have generated endemic legal and ethical incertitudes regarding security, privacy, and ownership. The proposed presentation will highlight these issues and make the case for more effective and coordinated policy and regulatory frameworks that facilitate the development of standards for data interoperability and address pressing security and privacy issues.

Visit Professor Conteh’s FACULTY PAGE 

View a PDF of this presentation

Blayne Haggart, Political Science, Brock University

Current approaches to AI (however it is defined) and data regulation tend to focus on ensuring that it is, in the words of the recent multilateral Statement on Inclusive and Sustainable Artificial Intelligence for People and the Planet, “human rights based, human-centric, ethical, safe, secure and trustworthy.” The focus on human rights suggests an outcomes-focused approach to AI development and data regulation, where the collection of data, and its use via AI-based technologies, is legitimated based on whether or not it supports human rights. A human rights-based approach to regulation, while hard to critique on its face, leaves unsettled questions about which human rights, and whose rights, should take precedence. This vagueness in turn hampers the development of effective regulation. This presentation proposes an alternate approach to data and AI regulation based on Karl Polanyi’s concept of fictitious commodities. A Polanyian framework focuses on the initial act of commodifying data, and emphasizes the contextual, social embeddedness of data. If applied as a regulatory principle, it would more effectively redress the actual harms from AI-based technologies while better highlighting the social cost of AI-based technological development.

Visit Dr. Haggart’s FACULTY PAGE

View a PDF of this presentation

Reception

5 to 7 p.m.

Plaza Building, Room 600F

Come meet our panelists, as well as fellow Brock researchers and students with an interest in digital policy.
All are welcome to join us for light refreshments and deep conversation.

This event is intended to showcase the variety of research and activity related to public and private governance, widely defined, in our increasingly (commodified) knowledge-driven digital society. Part 1 – Digital Regulation in the Public Interest: Surveying the Field was held in November 2023.

About the series

Hosted by the Faculty of Social Sciences, this series aims to showcase the variety of work being conducted by faculty and student researchers across Brock University, to uncover an array of perspectives, and to foster potential synergies and collaborations.

Cross-disciplinary and cross-Faculty participation is encouraged.

Learn how to participate in this Symposium Series.