Beyond the App Store: Reproductive Governance and the Limits of Digital Autonomy

VOLUME 8

ISSUE 5

February 3, 2026

Introduction

Digital contraceptive and fertility-tracking apps are increasingly central to how reproductive autonomy is imagined, marketed and managed. Promising algorithmic precision and personalized insights, platforms such as Flo, Clue and Natural Cycles are often framed as tools of empowerment. Yet beneath this language lies a commercial infrastructure that extracts intimate data on menstruation, sexual activity, hormonal patterns and mood through opaque, transnational systems of platform governance. These platforms do not merely support reproductive care; they reorganize it, embedding users in sociotechnical systems shaped by market logics and regulatory ambiguity.

This paper interrogates how digital contraceptive apps govern reproduction through mechanisms of responsibilization, commodification and consent. While some, like Natural Cycles, are certified as Software as a Medical Device (SaMD), and others, like Clue, emphasize compliance with the European Union’s General Data Protection Regulation (GDPR), regulatory protections remain partial and uneven. Users are expected to navigate complex consent regimes, absorb data-related risk and participate in daily self-monitoring, all while platforms monetize reproductive labour through behavioural analytics, venture capital and health data markets.

This paper contributes to a growing body of scholarship that examines menstrual tracking and digital contraceptive technologies not simply as health tools, but as sociotechnical systems through which power, responsibility and accountability are governed.1 Existing research has analyzed menstrual tracking apps through feminist science and technology studies, critical menstrual studies, and privacy and surveillance frameworks, demonstrating how app design, data practices and normative assumptions shape bodily knowledge, self-management and user responsibility. Policy-oriented analyses further highlight the regulatory challenges posed by fertility and femtech apps that operate as consumer-facing digital platforms while simultaneously claiming health-related legitimacy, often falling between health regulation, privacy law and consumer protection regimes.2 Rather than centring user experience or app efficacy, this paper examines how platforms structure consent, allocate responsibility and manage accountability through legal, technical and commercial infrastructures, positioning digital contraception as a site of reproductive governance shaped by market logics, regulatory ambiguity and unequal distributions of risk.3

Drawing on feminist political economy, data justice and reproductive justice, this paper critically examines how these apps mediate reproductive governance. Through a qualitative, document-based analysis of privacy policies, terms of service and legal materials, including the Canadian lawsuit Lam v Flo Health Inc (2024),4 we show how platforms construct reproductive subjects, assign responsibility and obscure accountability. We argue that digital contraceptive technologies must be understood not as neutral health innovations but as infrastructures of governance: commercial systems that responsibilize users, depoliticize care and entrench structural inequality through discourses of autonomy and innovation.

Theoretical Framework: Feminist Political Economy and Reproductive Data Governance

Digital contraceptive platforms such as Flo, Clue and Natural Cycles do not merely provide information or facilitate health tracking. They operate as commercial infrastructures that extract value from users’ reproductive labour and intimate data, embedding them in sociotechnical systems shaped by market logics, regulatory ambiguity and gendered expectations. This paper draws primarily on feminist political economy to interrogate how these platforms govern reproductive autonomy through commodification, responsibilization and privatized care.

Feminist political economy highlights how capitalism depends on the extraction of reproductive labour — care work, emotional labour and bodily maintenance — that is often unpaid or underpaid and disproportionately performed by women.5 Digital contraceptive apps extend this logic into the platform economy. They demand constant data input, self-monitoring, and behavioural compliance through discourses of empowerment, transforming reproductive self-care into digital labour. These platforms derive value not from improving reproductive outcomes, but from embedding users in commercial ecosystems that monetize intimacy, attention and risk.

This commodification is closely tied to the logic of neoliberal responsibilization, whereby individuals are framed as responsible health citizens tasked with managing reproductive risks through self-surveillance and digital compliance.6 Rather than addressing structural inequalities in access to reproductive care, these platforms displace accountability onto users, rendering care a matter of personal optimization rather than public provision. Whether through SaMD certification, GDPR compliance or feminist branding, platforms offer the appearance of safety and legitimacy while obscuring how they offload risk onto individuals.

These dynamics are not merely economic but deeply political. As Nick Srnicek notes, platform capitalism is a response to declining profitability in traditional sectors, with data extraction serving as a new mode of value generation.7 Digital contraceptive apps exemplify this shift: even when they invoke medical and scientific authority through research partnerships, clinical expertise or regulatory designations, they remain commercial platforms structured to scale engagement and accumulate data while navigating the blurred boundary between wellness and medical classification.

While this paper centres feminist political economy, it draws upon insights from data justice and reproductive justice to foreground how these economic processes intersect with racialized, classed and gendered vulnerabilities. As shown by scholars such as Simone Browne (2015), Ruha Benjamin (2019), and Linnet Taylor (2017), data governance regimes often disproportionately expose marginalized users to surveillance, exclusion and harm.8 These frameworks remind us that the impacts of commodified care are uneven: not all users enter these platforms from the same social position or with equal protections.

While existing scholarship on menstrual tracking and digital reproductive technologies has provided important insights into privacy risks, ethical design and user experience, such approaches often centre individual choice, transparency or informed consent as the primary sites of intervention. A feminist political economy approach shifts attention away from individual users and toward the structural conditions under which digital reproductive technologies operate. It asks how value is extracted from reproductive data and labour, how responsibility and risk are redistributed between platforms and users, and how care is reorganized through market-based infrastructures. By foregrounding these dynamics, this framework reveals how reproductive autonomy is reconfigured under platform capitalism — not as collective or publicly supported care, but as an individualized project of self-management embedded in commercial systems. This perspective strengthens reproductive justice analysis by highlighting how digital contraceptive platforms reproduce unequal distributions of risk and protection, particularly for those already marginalized by gender, race, class or legal status.

Taken together, this theoretical orientation frames digital contraceptive platforms not as neutral technologies but as infrastructures of reproductive governance. They extract value from intimacy, responsibilize users for their own risk, and reinforce longstanding dynamics of reproductive control through economic and technological means. This framework informs our analysis of how platforms construct user consent, manage legal obligations and reproduce inequality under the veneer of empowerment.

Methodology: Document-Based Analysis of Platform Governance

This paper employs a qualitative, document-based analysis to examine how digital contraceptive platforms govern reproductive data and responsibility. Document-based methods are well suited to policy analysis because they allow researchers to examine how governance is articulated, legitimized and operationalized through publicly available texts rather than through user behaviour alone. The analysis focuses on a close, comparative reading of platform privacy policies, terms of service, regulatory certifications, marketing materials and, where relevant, legal documents, including court filings related to Lam v Flo Health Inc (2024).9 These documents were treated as instruments of governance that structure consent, allocate responsibility and define the limits of platform accountability.

Drawing on interpretive policy analysis and critical discourse approaches, the documents were analyzed to identify recurring themes related to consent, risk, responsibility and value extraction, as well as silences and ambiguities within platform disclosures. This interpretive approach aligns with feminist political economy by foregrounding how power operates through legal and commercial infrastructures, rather than through individual choice alone. Rather than evaluating app effectiveness or user experience, the analysis examines how platforms present themselves to users, regulators and courts, and how reproductive autonomy is discursively and materially reorganized through these governance texts.

Three widely used apps — Flo, Clue and Natural Cycles — were selected for analysis due to their global reach, reputational positioning as health tools, and distinctive governance models. All three collect highly sensitive reproductive data and operate across multiple jurisdictions, including within and outside of the European Union’s GDPR regime. Flo, in particular, was chosen due to its involvement in the Canadian class action lawsuit Lam v Flo Health Inc (2024), which foregrounds legal accountability and cross-border data flows.10 Clue was selected as a GDPR-compliant app that markets itself explicitly as feminist and privacy-conscious. Natural Cycles, certified as a software-based medical device, provides a contrasting case that links algorithmic fertility management with regulatory legitimacy.

The analysis draws on feminist political economy, with supporting insights from data justice and reproductive governance, as an interpretive framework. Documents were read critically and comparatively to identify how platforms construct user consent, assign responsibility and frame risk. Attention was paid not only to content but also to silences — what is obscured or omitted in these texts. Following interpretive policy analysis and critical discourse approaches,11 the analysis focused on how reproductive subjects are constructed through language, how normative logics are encoded in policy documents, and how power is exercised through formal and informal regulatory mechanisms.

To emphasize policy relevance, the analysis also identifies regulatory ambiguities and potential leverage points for reform, especially where platforms blur the line between wellness and medical classifications or circumvent data protection obligations. While document-based research cannot capture user experiences or platform back-end operations, it allows for a rigorous examination of how platforms present themselves to regulators, users and the public. The analysis does not seek to test causality but to surface the structural and ideological conditions under which digital reproductive technologies operate.

Comparative Analysis: Governing Reproductive Intimacy Across Platforms

While digital contraceptive apps such as Flo, Clue and Natural Cycles all frame themselves as empowering tools for reproductive self-management, they differ in how they structure data governance, navigate regulatory regimes and assign responsibility to users. This comparative analysis examines how each platform enacts reproductive governance through specific combinations of legal classifications, commercial strategies and technological infrastructures. In doing so, it highlights how these apps operationalize neoliberal and platform-capitalist logics, normalizing the extraction of reproductive data by positioning choice, personalization and convenience as mechanisms of governance.

Flo: Surveillance as Default, Accountability as Exception

Flo exemplifies the commodification of intimacy within platform capitalism. With more than 380 million downloads globally since it was launched in 2015, the app collects highly sensitive reproductive data — including information about menstruation, sexual activity, pregnancy and emotional states.12 While it markets itself as a privacy-conscious wellness tool, Flo has repeatedly faced legal scrutiny for violating user trust. In Lam v Flo Health Inc (2024), a Canadian class action lawsuit, the plaintiff alleged that Flo shared reproductive health data with third parties such as Facebook and Google, despite promising confidentiality.13 The case advanced claims of breach of contract, unjust enrichment and violations of statutory privacy laws across several provinces.14

Although Canada’s federal privacy law, the Personal Information Protection and Electronic Documents Act (PIPEDA), does not permit a private right of action, the court invoked it to assess the standard of meaningful consent. Crucially, the judge rejected Flo’s attempt to apply California law and enforce a class action waiver, ruling that such terms were unconscionable due to the adhesive nature of the platform’s terms and the sensitivity of the data. This ruling marked a significant shift away from judicial deference to platform contracts and signalled increasing judicial recognition of the power asymmetries embedded in digital health governance.15

Flo’s platform model illustrates how reproductive responsibility is increasingly outsourced to individuals through opaque consent mechanisms and technical infrastructures. Users must navigate lengthy, complex terms just to access basic features, with little recourse when trust is violated. This design reflects the broader logic of neoliberal responsibilization, whereby individuals are tasked with managing their own reproductive risk while platforms monetize their data with limited oversight.16 Through empowerment-based discourses, Flo embeds users in a system of surveillance-based reproductive governance that normalizes extraction of personal data and obscures accountability.

Following the overturning of Roe v Wade, Flo introduced an optional “Anonymous Mode,” which it frames as a privacy-protective feature designed to decouple reproductive health data from personal identifiers.17 However, this protection is conditional: users must actively opt in, accept functional limitations, and rely on Flo’s technical and contractual assurances, reinforcing a governance model in which responsibility for data protection is individualized rather than structurally guaranteed.

From a feminist political economy perspective, Flo demonstrates how commercial platforms exploit legal ambiguity and platform architecture to externalize risk and capture value from unpaid reproductive labour — both in terms of data and emotional input. The lack of clear regulatory enforcement mechanisms, especially across jurisdictions, further compounds this asymmetry. The Lam case demonstrates the urgent need for stronger regulatory tools, including limitations on forum-shifting clauses, requirements for plain-language consent, and sector-specific protections for reproductive health data.

Clue: Procedural Transparency and Soft Responsibilization under the GDPR

Clue, developed by Berlin-based BioWink GmbH, presents itself as a feminist, science-based alternative in the reproductive app market. It is explicitly branded around user empowerment, data transparency and ethical governance, positioning itself in contrast to competitors such as Flo.18 The app is subject to the GDPR and emphasizes that it does not sell user data, allows for data export and deletion, and outlines data use in accessible language.

Clue’s privacy policy and user-facing materials frame transparency and informed choice as the primary safeguards against data misuse, distinguishing between data collected for core functionality and that shared with third-party services such as Google Analytics.19 However, this procedural transparency should not be mistaken for structural accountability. Clue still operates within a commercial ecosystem shaped by venture capital imperatives, market metrics and platform logics. Its feminist framing is constrained by the economic demands of platform capitalism, which prioritize scalability, engagement and analytics over collective accountability or systemic equity.

Clue’s model illustrates a form of soft responsibilization: users are encouraged to make informed choices, but those choices are structured within opaque technical and legal infrastructures that most users cannot fully interpret or resist. As Taylor (2017) and Lina Dencik et al. (2019) argue, transparency alone is insufficient for justice.20 From a feminist data justice perspective, ethical branding cannot offset the embedded asymmetries these platforms reproduce.

In this sense, Clue represents a regulatory best-case scenario — a GDPR-compliant app with feminist aesthetics — but still reflects the broader limitations of current governance frameworks. Stronger privacy protections do not necessarily dismantle the economic incentives underpinning reproductive data extraction. As Blayne Haggart and Natasha Tusikov (2023) note, existing regimes often prioritize platform adaptability and self-regulation over enforceable user rights or democratic oversight — even within jurisdictions explicitly committed to rights-based regulation, such as the European Union.21 Clue thus exemplifies the tension between ethical governance rhetoric and platform capitalism’s structural constraints, raising critical questions about whether meaningful accountability is possible without addressing the underlying political economy of digital health.

Natural Cycles: Certification without Accountability

Natural Cycles, developed in Sweden, occupies a distinctive position in the reproductive app landscape due to its SaMD certification. It foregrounds its legitimacy through regulatory credentials, including the CE marking required for medical devices placed on the European Economic Area market, clearance by the US Food and Drug Administration and certification to standards developed by the International Organization for Standardization (ISO).22 Approved by regulators in Australia, Canada, the European Union and the United States, the app is subject to enhanced standards for product safety and data handling.23 This regulatory status lends it a veneer of clinical legitimacy and distinguishes it from non-certified competitors. Yet certification is often a one-time, static approval that does not entail ongoing oversight of how the platform structures user interaction, monetizes data or responds to harm.

Despite its medical framing, Natural Cycles exemplifies how algorithmic governance reshapes reproductive behaviour through opaque, non-negotiable logics. Users must engage in daily self-tracking — inputting basal body temperature and cycle data — and follow app-generated behavioural instructions to “maintain effectiveness.” This model aligns with neoliberal responsibilization, a governance mode that offloads risk and accountability to individuals while allowing platforms to commercialize reproductive labour with limited scrutiny.24 The burden of reproductive risk is displaced onto users, who must perform continuous digital labour without institutional guarantees or recourse.

This governance model seeks to depoliticize reproductive health, embedding it in algorithmic systems that appear neutral but are shaped by commercial logics. It assumes a user base with digital literacy, reliable internet access and stable life conditions, masking systemic inequalities in access to contraception, care, education and digital infrastructure. Reproductive autonomy is thus reconfigured into a commodified, data-driven process, where responsibility is individualized but profits are centralized.

Although Natural Cycles is included in registries such as ORCHA (Organisation for the Review of Care and Health Apps), which assess digital health apps based on structured compliance checklists, these evaluations focus on formal requirements rather than lived impacts, equity considerations or longer-term accountability.25 The broader regulatory ambiguity, where digital contraception tools oscillate in classification between consumer wellness products and medical devices, allows platforms to invoke scientific credibility while evading obligations typical of traditional health actors.26

From a feminist political economy perspective, this reflects a broader trend in which biomedical legitimacy is co-opted to enable platform expansion and data extraction, with personalized care operating as a legitimating governance frame. Natural Cycles illustrates how the promise of innovation and empowerment can obscure power asymmetries, unpaid digital labour and insufficient oversight. Addressing these governance gaps requires more than compliance checkboxes — it demands enforceable accountability mechanisms, including independent audits, strengthened regulatory mandates and participatory frameworks that centre reproductive rights and user protection over commercial imperatives.

Discussion: Reproductive Autonomy, Risk and the Limits of Platform Governance

Despite differences in design and jurisdiction, all three apps rely on user responsibilization, shifting accountability for reproductive risk management onto individuals while minimizing institutional oversight. This burden is particularly acute for women, gender-diverse people, racialized users and legally precarious individuals, who are disproportionately exposed to the harms of data leakage, algorithmic misjudgement and regulatory ambiguity. As such, these cases support this paper’s central claim: that digital contraceptive platforms, despite their empowering rhetoric, entrench asymmetrical systems of commodification and control that displace risk onto users in the absence of meaningful structural accountability.

This paper’s analysis has shown how digital contraceptive apps, across Flo, Clue and Natural Cycles, enact reproductive governance through distinct legal and market strategies, while reinforcing a shared neoliberal logic that shifts the burden of care, risk and decision-making onto individuals and away from institutions. Flo monetizes reproductive data while framing consent as user control, obscuring the deep asymmetries of knowledge and power in its data practices. Clue, though branding itself as feminist and transparent, relies on a model of soft responsibilization that overlooks wider disparities in access, legal protection or data literacy. Natural Cycles leverages biomedical legitimacy through SaMD certification, yet offloads responsibility onto users, who must engage in continuous digital labour to maintain effectiveness. In all three cases, self-surveillance is normalized as care, and platform governance is presented as choice.

These dynamics reflect what feminist political economy scholars describe as neoliberal responsibilization: a governance logic that reconfigures reproductive autonomy into a project of self-optimization, displacing responsibility away from institutions and onto individuals.27 Users, especially women and gender-diverse individuals, are tasked with managing reproductive risk in datafied environments that offer few structural protections in return. This is not digital emancipation; it is the manufacture of reproductive precarity through empowerment-based responsibilization.

Moreover, the convergence of reproductive governance and platform capitalism reveals a deeper political economy of care. These platforms do not merely provide digital convenience; they extract value from users’ reproductive data, compliance and emotional investment while obscuring the terms of that extraction. Consent becomes the price of participation, and algorithmic authority replaces public accountability.

From a reproductive justice perspective, this governance model is deeply unequal. Not all users engage these platforms from the same social position. Racialized, low-income and precariously situated users face greater risks when data is collected, interpreted or leaked, particularly in jurisdictions with hostile reproductive policies. Taylor (2017) reminds us that data justice demands fairness not only in transparency, but in how people are treated and protected through data systems.28 Haggart and Tusikov (2023) build on this by showing that hybrid governance regimes, where private platforms fill public roles, often reinforce asymmetries of power with little democratic oversight.29

What emerges is a clear disjuncture between the language of choice and the infrastructure of surveillance. These apps mediate intimate decisions, such as whether and when to have children, but do so through systems that commodify care and obscure constraint. If reproductive autonomy is to mean more than digital compliance, governance models must go beyond voluntary certifications and privacy policies. They must confront the commercial logics that structure these platforms, and centre collective accountability, enforceable rights and democratic oversight as core components of digital reproductive governance.

Conclusion

Digital contraceptive platforms present themselves as tools of empowerment, but their underlying architectures tell a different story. Flo, Clue and Natural Cycles do not simply support reproductive tracking — they govern reproductive life through algorithmic routines, data commodification and opaque privacy infrastructures. Far from disrupting historical patterns of control, these platforms digitize and commercialize them.

By reframing reproductive autonomy through neoliberal responsibilization, these apps shift systemic risk onto individuals. Users are expected to interpret algorithms, consent to complex terms, and manage fertility through digital self-discipline, while platforms profit and evade institutional accountability. Although their legal models and branding vary, all three apps reinforce a shared logic: that reproduction is a matter of individual management, not collective care or public responsibility.

A feminist political economy perspective reveals that these are not neutral health tools, but market-oriented infrastructures built on extracting value from reproductive labour. Addressing this requires more than improved transparency or piecemeal regulation. It demands structural change: governance models that centre democratic oversight, enforceable rights and public accountability. Independent audits, clear limitations on data use, and participatory regulatory processes must replace voluntary compliance and opaque platform discretion. The key question is not whether these platforms can empower users, but under what political and economic conditions reproductive technologies can serve justice. Only when data governance frameworks prioritize equity over profit, and autonomy over compliance, can digital contraception support meaningful reproductive freedom.

Endnotes

1. Daniel A. Epstein et al., “Examining Menstrual Tracking to Inform the Design of Personal Informatics Tools,” Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (2017); Johanna Levy and Nuria Romo-Avilés, “‘A Good Little Tool to Get to Know Yourself a Bit Better’: A Qualitative Study on Users’ Experiences of App-Supported Menstrual Tracking in Europe,” BMC Public Health 19, no. 1213 (2019).

2. Katharine Kemp, Your Body, Our Data: Unfair and Unsafe Privacy Practices of Popular Fertility Apps (March 2023).

3. Linnet Taylor, “What Is Data Justice? The Case for Connecting Digital Rights and Freedoms Globally,” Big Data & Society 4, no. 2 (2017); Blayne Haggart and Natasha Tusikov, The New Knowledge: Information, Data and the Remaking of Global Power (Rowman & Littlefield, 2023).

4. Lam v Flo Health Inc, 2024 BCSC 391 (CanLII), Supreme Court of British Columbia, March 7, 2024, https://www.canlii.org/en/bc/bcsc/doc/2024/2024bcsc391/2024bcsc391.html.

5. Nancy Fraser, “Contradictions of Capital and Care,” New Left Review 100 (July–August 2016); Tithi Bhattacharya, ed., Social Reproduction Theory: Remapping Class, Recentering Oppression (Pluto Press, 2017).

6. Adele E. Clarke, Laura Mamo, Jennifer Ruth Fosket, Jennifer R. Fishman and Janet K. Shim, Biomedicalization: Technoscience, Health, and Illness in the U.S. (Duke University Press, 2010); B. J. Brown and Sally Baker, Responsible Citizens: Individuals, Health and Policy under Neoliberalism (Anthem Press, 2012).

7. Nick Srnicek, Platform Capitalism (Polity Press, 2016), 10–14.

8. Simone Browne, Dark Matters: On the Surveillance of Blackness (Duke University Press, 2015); Ruha Benjamin, Race After Technology: Abolitionist Tools for the New Jim Code (Polity Press, 2019); Taylor, “What Is Data Justice?”

9. Lam v Flo Health Inc.

10. Lam v Flo Health Inc.

11. Carol Bacchi, Analysing Policy: What’s the Problem Represented to Be? (Pearson, 2009); Teun A. van Dijk, “Principles of Critical Discourse Analysis,” Discourse & Society 4, no. 2 (1993): 249–283, https://doi.org/10.1177/0957926593004002006.

12. Flo Health Inc., “About Us – Flo,” accessed May 8, 2025, https://flo.health/about-flo.

13. Federal Trade Commission, Decision and Order: In the Matter of Flo Health, Inc., 2021, https://www.ftc.gov/system/files/documents/cases/flo_health_order.pdf; Federal Trade Commission, “FTC Finalizes Order with Flo Health, a Fertility-Tracking App That Shared Sensitive Health Data with Facebook, Google, and Others,” News Release, June 22, 2021, https://www.ftc.gov/news-events/news/press-releases/2021/06/ftc-finalizes-order-flo-health-fertility-tracking-app-shared-sensitive-health-data-facebook-google; Mozilla Foundation, “Flo Ovulation & Period Tracker,” Privacy Not Included, August 9, 2022, https://foundation.mozilla.org/en/privacynotincluded/flo-ovulation-period-tracker/.

14. Federal Trade Commission, Decision and Order, 2021; Federal Trade Commission, “FTC Finalizes Order with Flo Health.”

15. Margaret Jane Radin, Boilerplate: The Fine Print, Vanishing Rights, and the Rule of Law (Princeton University Press, 2012).

16. Brown and Baker, Responsible Citizens.

17. Flo Health Inc., Flo Anonymous Mode: Protecting Reproductive Health Data (September 2022).

18. Clue, “Privacy Policy,” last modified January 20, 2025, https://helloclue.com/privacy.

19. Clue, “Privacy Policy.”

20. Taylor, “What Is Data Justice?”; Lina Dencik, Arne Hintz, Joanna Redden and Emiliano Treré, “Exploring Data Justice: Conceptions, Applications and Directions,” Information, Communication & Society 22, no. 7 (2019): 873–881, https://doi.org/10.1080/1369118X.2019.1606268.

21. Haggart and Tusikov, The New Knowledge.

22. Natural Cycles, “What Are the Certifications Behind NC° Birth Control?” accessed December 13, 2025.

23. Natural Cycles, “Privacy Policy,” last modified December 2, 2024, https://www.naturalcycles.com/app-legal/privacy-policy.

24. Clarke et al., Biomedicalization; Brown and Baker, Responsible Citizens; Deborah Lupton, “‘Mastering Your Fertility’: The Digitised Reproductive Citizen,” in Negotiating Digital Citizenship: Control, Contest, and Culture, eds. Anthony McCosker, Sonja Vivienne and Amelia Johns (Rowman & Littlefield, 2015), 81–93, https://ssrn.com/abstract=2679402.

25. Organisation for the Review of Care and Health Apps (ORCHA), “Data Privacy Matters … Period: ORCHA Report on the Data Security of Period Tracking Apps” (2022).

26. Catriona McMillan, “Contraception, Fertility Tracking, and the Limits of Medical Devices Regulation,” Law, Technology and Humans 5, no. 2 (2023): 78–90, https://doi.org/10.5204/lthj.3047; Catriona McMillan, “Rethinking the Regulation of Digital Contraception under the Medical Devices Regime,” Medical Law International 23, no. 1 (2023): 3–25, https://doi.org/10.1177/09685332231154581.

27. Clarke et al., Biomedicalization; Brown and Baker, Responsible Citizens.

28. Taylor, “What Is Data Justice?”

29. Haggart and Tusikov, The New Knowledge.

ISSN 2563-674X

doi:10.51644/bap85