Bias In Artificial Intelligence Systems And Its Impact On Law

Authors

Cinthia Obladen de Almendra Freitas, Pontifícia Universidade Católica do Paraná

DOI:

https://doi.org/10.5380/rrddis.v5i9.100107

Keywords:

Law and Technology, Artificial Intelligence, bias

Abstract

The article presents the concepts and foundations needed to understand what algorithmic bias is, specifying its sources and forms and illustrating problems and points of attention. The objective is to show that bias in Artificial Intelligence systems is a complex issue that requires critical analysis from both a legal and a technological point of view, without neglecting ethical aspects, pointing to the explainability of Artificial Intelligence. The work adopts a deductive research method and starts from the premise that Artificial Intelligence consists of algorithms that process data, taking AI as a non-thing and discussing the repercussions on Law arising from bias in AI systems. The article presents several considerations ranging from responsibility, auditability, and accountability to elements of Constitutional Law, Human Rights, and Fundamental Rights, also addressing governance and data protection.

Author Biography

Cinthia Obladen de Almendra Freitas, Pontifícia Universidade Católica do Paraná

Permanent Professor in the Graduate Program in Law (PPGD) at PUCPR. PhD in Applied Informatics from PUCPR. Master's degree in Electrical Engineering and Industrial Informatics from UTFPR. Civil Engineer from UFPR. Consulting Member of the Digital Law and Data Protection Commission of the OAB/PR. Consulting Member of the Instituto Nacional de Proteção de Dados (INPD).

References

BAER, Tobias. Understand, Manage and Prevent Algorithmic Bias: a guide for business users and data scientists. Germany: Apress Media, 2019.

BOLUKBASI, Tolga; CHANG, Kai-Wei; ZOU, James; SALIGRAMA, Venkatesh; KALAI, Adam. Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings. Proc. of 30th Conference on Neural Information Processing Systems (NIPS 2016), Barcelona, Spain, 2016. Disponível em: https://papers.nips.cc/paper/6228-man-is-to-computer-programmer-as-woman-is-to-homemaker-debiasing-word-embeddings.pdf Acesso em: 14 abr. 2025.

BRASIL. Lei 13.709, de 14 de agosto de 2018, Lei Geral de Proteção de Dados - LGPD, 2018.

BUOLAMWINI, Joy; GEBRU, Timnit. Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Proc. of Machine Learning Research, Conference on Fairness, Accountability, and Transparency, v. 81, 2018. p. 1-15. Disponível em: https://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf Acesso em: 14 abr. 2025.

COLLETT, Clementine; DILLON, Sarah. AI and Gender - Four Proposals for Future Research. Cambridge: The Leverhulme Centre for the Future of Intelligence, 2019. Disponível em: https://api.repository.cam.ac.uk/server/api/core/bitstreams/04e69f8b-ccf1-4c3a-affd-8c28ad273873/content Acesso em: 14 abr. 2025.

DOOLEY, Roger. Como Influenciar a Mente do Consumidor: 100 maneiras de convencer os consumidores com técnicas de neuromarketing. Trad. Luciene Scalzo. São Paulo: Elsevier, 2012.

DWORK, Cynthia; HARDT, Moritz; PITASSI, Toniann; REINGOLD, Omer; ZEMEL, Rich. Fairness through awareness. In: Proceedings of the 3rd Innovations in Theoretical Computer Science Conference (ITCS '12), 2012. p. 214-226. Disponível em: https://doi.org/10.1145/2090236.2090255 Acesso em: 14 abr. 2025.

D’IGNAZIO, Catherine. Data is never a raw truthful input and it is never neutral. Entrevista para Zoe Corbyn, The Guardian, 2020. Disponível em: https://www.theguardian.com/technology/2020/mar/21/catherine-dignazio-data-is-never-a-raw-truthful-input-and-it-is-never-neutral Acesso em: 14 abr. 2025.

FERRARA, Emilio. Fairness and Bias in Artificial Intelligence: a brief survey of sources, impacts, and mitigation strategies. 2023. Disponível em: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4615421 Acesso em: 14 abr. 2025.

FILIMOWICZ, Michael. Systemic Bias: Algorithms and Society. London: Routledge Taylor & Francis Group, 2022.

FREITAS, Cinthia Obladen de Almendra. O Direito e a Inteligência Artificial como Não-Coisa. CONPEDI Law Review, XIII Encontro Internacional do CONPEDI Uruguai – Montevidéu, v. 10, n. 1, p. 88-109, jul.-dez., 2024.

FREITAS, Cinthia Obladen de Almendra. Riscos e Proteção de Dados Pessoais. RRDDIS - Revista Rede de Direito Digital, Intelectual & Sociedade, v. 2, p. 225-247, 2023.

FREITAS, Cinthia Obladen de Almendra; BATISTA, Osvaldo Henrique dos Santos. Neuromarketing e as Novas Modalidades de Comércio Eletrônico (m-s-t-f-commerce) frente ao Código de Defesa do Consumidor. Derecho y Cambio Social, v. 42, 2015, p. 1-22.

FREITAS, Cinthia Obladen de Almendra; BARDDAL, Jean Paul. Análise preditiva e decisões judiciais: controvérsia ou realidade? Democracia Digital e Governo Eletrônico, v. 1, p. 107-126, 2019.

HAN, Byung-Chul. Não-coisas: transformações no mundo em que vivemos. Trad. Ana Falcão Bastos. Lisboa: Relógio D’Água Editores, 2022.

MIN, Alfonso. Artificial Intelligence and Bias: challenges, implications, and remedies. 2023. Disponível em: https://www.researchgate.net/publication/374232878_ARTIFICIAL_INTELLIGENCE_AND_BIAS_CHALLENGES_IMPLICATIONS_AND_REMEDIES Acesso em: 14 abr. 2025.

ROSS, Howard J. Everyday bias: identifying and navigating unconscious judgments in our daily lives. London: The Rowman & Littlefield Publishing Group, Inc., 2020.

SMITH, Genevieve; RUSTAGI, Ishita. Mitigating Bias in Artificial Intelligence: An Equity Fluent Leadership Playbook. Berkeley Haas Center for Equity, Gender and Leadership, July, 2020. Disponível em: https://haas.berkeley.edu/wp-content/uploads/UCB_Playbook_R10_V2_spreads2.pdf Acesso em: 14 abr. 2025.

SPIEGEL, Murray R. Probabilidade e estatística. Coleção Schaum. Trad. Alfredo Alves de Farias. São Paulo: McGraw-Hill do Brasil, 1978.

SVENSSON, Jakob. Modern Mathemagics: values and biases in tech culture. In: FILIMOWICZ, Michael (ed.). Systemic Bias: Algorithms and Society. London: Routledge Taylor & Francis Group, 2022.

WEST, Sarah Myers; WHITTAKER, Meredith; CRAWFORD, Kate. Discriminating Systems: Gender, Race, and Power in AI. AI Now Institute, 2019. Disponível em: https://ainowinstitute.org/publication/discriminating-systems-gender-race-and-power-in-ai-2 Acesso em: 14 abr. 2025.

KAZMIER, Leonard J. Estatística Aplicada à Economia e Administração. Trad. Carlos Augusto Crusius; Revisão Técnica Jandyra M. Fachel. São Paulo: Pearson Makron Books, 1982.

Published

2025-07-22

How to Cite

Freitas, C. O. de A. (2025). Bias In Artificial Intelligence Systems And Its Impact On Law. Revista Rede De Direito Digital, Intelectual & Sociedade, 5(9), 172–198. https://doi.org/10.5380/rrddis.v5i9.100107

Issue

Vol. 5, No. 9 (2025)

Section

Parte III - Inovação, direito digital e tecnologia
