Criminalizing Algorithmic Bias

Document Type: Original Article

Authors

1. Department of Criminal Law, Faculty of Legal Studies, Pharos University, Alexandria, Arab Republic of Egypt.

2. Faculty of Law, Helwan University.

Abstract

Algorithmic bias has emerged as a major ethical challenge for society as artificial intelligence develops. The reliance of governments, private bodies, and technology companies on algorithms to process massive datasets has raised growing fears of unfair bias in their decision-making systems. The problem is aggravated by the opacity of how these algorithms operate, which makes their functioning difficult to verify and audit, and by the fact that control over them rests with private companies. Traditional legal rules face a genuine crisis as algorithmic bias increasingly shapes the way law interacts with society. Some researchers therefore propose legislative penalties for these modern discriminatory practices, extending the protection of equality principles from physical spaces into virtual ones. This study holds critical significance because it addresses the practical bias problems that artificial intelligence produces in employment, in web applications, and in gender disparities. Individual rights suffer serious violations as algorithmic neutrality erodes, because developers keep secret the details of how their algorithms are designed and operate, including the sorting, analysis, classification, and execution of automated decisions, which may exhibit bias based on race, background, age, or other factors. Clarifying how biased algorithms bear on criminal law still requires deeper understanding, owing both to legal ambiguity and to the secrecy of proprietary information.
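To make concrete the mechanism the abstract describes, namely an automated decision rule that never inspects a protected attribute yet still produces skewed outcomes, the following minimal Python sketch may help. It is illustrative only: the scoring weights, the "postal_zone" proxy feature, the group labels, and all data are invented assumptions, not a model of any real system.

# A minimal, hypothetical sketch of proxy discrimination in an
# automated screening rule. All names, weights, and data are invented.
import random

random.seed(0)

def screening_score(applicant):
    # The rule never reads the protected "group" field, but
    # "postal_zone" is correlated with group membership below,
    # so it acts as a proxy for the protected attribute.
    score = 2.0 * applicant["years_experience"]
    if applicant["postal_zone"] == "A":
        score += 5.0  # facially neutral bonus tied to the proxy
    return score

def make_applicant(group):
    # Synthetic population: group g1 lives mostly in zone A,
    # group g2 mostly in zone B (a common real-world pattern).
    zone_a_prob = 0.8 if group == "g1" else 0.2
    return {
        "group": group,
        "postal_zone": "A" if random.random() < zone_a_prob else "B",
        "years_experience": random.randint(0, 10),
    }

applicants = [make_applicant("g1") for _ in range(500)] + \
             [make_applicant("g2") for _ in range(500)]

THRESHOLD = 12.0  # applicants at or above this score are shortlisted

def selection_rate(group):
    members = [a for a in applicants if a["group"] == group]
    selected = [a for a in members if screening_score(a) >= THRESHOLD]
    return len(selected) / len(members)

r1 = selection_rate("g1")
r2 = selection_rate("g2")
print(f"selection rate, group g1: {r1:.1%}")
print(f"selection rate, group g2: {r2:.1%}")
# The "four-fifths" rule of thumb flags disparate impact when one
# group's selection rate falls below 80% of the other group's.
print(f"impact ratio (g2/g1): {r2 / r1:.2f}")

Run as written, the rule shortlists the two synthetic groups at noticeably different rates even though it never reads the "group" field; this is the proxy-discrimination pattern at issue in the sentencing and hiring examples cited in the footnotes.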
 

Keywords


1. Collosa, A.: Recent court decisions dealing with algorithms, biases, and discrimination in their use. https://www.ciat.org/ciatblog-algorithms-biases-and-discrimination-in-their-use. The Algorithmic Accountability Act addresses biases in artificial intelligence: Jones Day, Proposed Algorithmic Accountability Act of 2019. http://www.jonesday.com; https://www.jonesday.com/en/insights/2019/06/proposed-algorithmic-accountability-act. Bill text: https://www.wyden.senate.gov/imo/media/doc/Algorithmi%20Accountability%20Act%20of%202019%20Bill%20Text.pdf. Accessed 23 Jan 2020.

2. Allsop, C.J.: Technology and the future of the judiciary. Federal Court of Australia. https://www.fedcourt.gov.au/digital-law-library/judges-speeches/chief-justice-alls. Accessed 26 Mar 2019.

3. Cath, C.: Governing artificial intelligence: ethical, legal and technical opportunities and challenges. Philosophical Transactions of the Royal Society A 376(2133), 20180080 (2018). https://doi.org/10.1098/rsta.2018.0080

4. Caliskan, A., Bryson, J.J., Narayanan, A.: Semantics derived automatically from language corpora contain human-like biases. Science 356(6334), 183–186 (2017).

5. https://science.sciencemag.org/content/356/6334/183 (2017). Accessed 4 Jul 2020.

6. Carlson, A.M.: The need for transparency in the age of predictive sentencing algorithms. Iowa Law Review 103(1) (2017). https://ilr.law.uiowa.edu/assets/uploads/Uploads/ILR-103-1-Carlson.pdf

7. Starr, S.B.: Evidence-based sentencing and the scientific rationalization of discrimination. Stan. Law Rev. 66(4), 815–816 (2014).

8. Elegido, J.M.: The ethics of price discrimination. Business Ethics Quarterly 21(4), 633 (2011). Accessed 20 Aug 2023.

9. Following complaints of gender discrimination, the Apple Card came under investigation; a well-known software engineer claimed on Twitter that the credit card was "sexist" because it discriminated against women applying for credit. https://www.nytimes.com/2019/11/10/business/Apple-credit-card-investigation.html. Accessed 9 Jun 2023.

10. Manheim, K., Kaplan, L.: Artificial Intelligence: Risks to Privacy and Democracy, p. 113 (2019). https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3273016; Wickersham, G., et al.: IBA Global Employment Institute, Artificial Intelligence and Robotics and Their Impact on the Workplace, p. 10 (2017).

11. Big Data: A Report on Algorithmic Systems, Opportunity, and Civil Rights. Executive Office of the President of the United States. https://permanent.fdlp.gov/gpo90618/2016_0504_data_discrimination.pdf (2016). Accessed 16 Sep 2020.

12. Borgesius, F.Z.: Discrimination, artificial intelligence, and algorithmic decision-making. Council of Europe. https://rm.coe.int/discrimination-artificial-intelligence-and-algorithmic-decision-making/1680925d

13. FRA: In brief—big data, algorithms, and discrimination. European Union Agency for Fundamental Rights. https://fra.europa.eu/en/publication/2018/brief-big-data-algorithms-and-discrimination (2018).

14. Miller, C.C.: Hidden bias: when algorithms discriminate. The New York Times. https://www.nytimes.com/2015/07/10/upshot/when-algorithms-discriminate.html (2015).

15. Electronic Privacy Information Center: EPIC—Algorithms in the criminal justice system: pre-trial risk assessment tools. Epic.org. https://epic.org/algorithmic-transparency/crim-justice/ (2014).

16. State v. Loomis, 881 N.W.2d 749 (Wis. 2016). https://law.justia.com/cases/wisconsin/supreme-court/2016/2015ap000157-cr.html. Accessed Jan 2025.

17. How We Analyzed the COMPAS Recidivism Algorithm. ProPublica (2016). https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm. Accessed Jan 2025.

18. Johnson, R.C.: Overcoming AI bias with AI fairness. Commun. ACM. https://cacm.acm.org/news/233224-overcoming-ai-bias-with-ai-fairness/fulltext. Accessed 16 Sep 2020.

19. Quantification in criminal courts. Medium. https://medium.com/ab-malek/quantification-in-criminal-courts-d. Accessed 22 Jan 2021.

20. McSherry, B.: Risk assessment, predictive algorithms and preventive justice. In: Pratt, J., Anderson, J. (eds.) Criminal Justice, Risk and the Revolt Against Uncertainty. Palgrave Studies in Risk, Crime and Society. Palgrave Macmillan, Cham (2020). https://doi.org/10.1007/978-3-030-37948-3_2

21. Article 53 of the Constitution stipulates that citizens are equal before the law and have equal rights, freedoms, and public duties. There shall be no discrimination between them based on religion, belief, gender, origin, race, color, language, disability, social status, political affiliation, geography, or any other reason. Discrimination and incitement to hatred are crimes punishable by law. The state is obligated to take the necessary measures to eliminate all forms of discrimination, and the law regulates the establishment of an independent commission for this purpose.

22. In a recent judicial precedent of the Supreme Constitutional Court, in the public session held on Saturday, September 2, 2023 AD, corresponding to Safar 17, 1445 AH, in the case registered in the Supreme Constitutional Court's docket under No. 70 for the 43rd judicial year "Constitutional", it ruled that: "Whereas the Constitution has adopted, pursuant to the text of Article 4 thereof, the principle of equality, as it is - along with the principles of justice and equal opportunities - a basis for building society and preserving its national unity, and in confirmation of this, the Constitution was keen in Article 53 thereof to guarantee the achievement of equality for all citizens before the law in rights, freedoms and public duties, without discrimination between them for any reason, but this does not mean - according to what has been established by the jurisprudence of this Court - that their categories, despite the differences in their legal positions, are treated with equal legal treatment. Likewise, this principle is not based on opposing all forms of discrimination, as some of them are based on objective foundations and do not therefore involve a violation of the text of Articles 4 and 53."

23. Article 161 bis of the Egyptian Penal Code, as amended by Law No. 126 of 2011, stipulates that anyone who commits an act, or refrains from an act, that would cause discrimination between individuals or against a group of people based on gender, origin, language, religion, or belief, where this discrimination results in undermining the principle of equal opportunities or social justice, or in disturbing public peace, shall be punished by imprisonment and a fine of not less than fifty thousand pounds and not exceeding one hundred thousand pounds, or by one of these two penalties. The penalty shall be imprisonment if the crime referred to in the first paragraph of this article is committed by a public employee, public worker, or any person charged with public service. Article 176 of the Egyptian Penal Code, as amended by Law No. 197 of 2006, stipulates that anyone who incites, by any of the aforementioned methods, discrimination against a group of people based on gender, origin, language, religion, or belief shall be punished, if such incitement is likely to disturb public peace.

24. https://fra.europa.eu/en/law-reference/european-convention-human-rights-article-47

25. Chander, A.: The racist algorithm? Michigan Law Review 115(6), 1023 (2017). http://michiganlawreview.org/wp-content/uploads/2017/04/115MichLRev1023_Ch

26. Christin, A., Rosenblat, A., Boyd, D.: Courts and predictive algorithms. Data & Civil Rights: A New Era of Policing and Justice. https://www.law.nyu.edu/sites/default/files/upload_documents/Angele%20Christin.pdf (2015). Accessed 18 Sep 2020.

27. Borgesius, F.Z.: Discrimination, artificial intelligence, and algorithmic decision-making. Council of Europe. https://rm.coe.int/discrimination-artificial-intelligence-and-algorithmic-decision-making/1680925d

28. AI-enabled anti-Black bias. Thomson Reuters. https://www.thomsonreuters.com/en-us/posts/legal/ai-enabled-anti-black-bias/. Accessed Jan 2025.

29. AI bias in resume screening by race and gender. University of Washington News (31 Oct 2024). https://www.washington.edu/news/2024/10/31/ai-bias-resume-screening-race-gender/. Accessed Jan 2025.

30. Corbett-Davies, S., et al.: Algorithmic decision making and the cost of fairness. In: Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining. https://5harad.com/papers/fairness.pdf (2017). Accessed 18 Sep 2020.

31. Crawford, K.: The hidden biases in big data. Harvard Business Review. https://hbr.org/2013/04/the-hidden-biases-in-big-data (2013). Accessed 23 Jul 2020.

32. Danziger, S., Levav, J., Avnaim-Pesso, L.: Extraneous factors in judicial decisions. Proceedings of the National Academy of Sciences 108(17), 6889–6892 (2011). https://www.pnas.org/content/108/17/6889. Accessed 2 Feb 2020.

33. Dieterich, W., et al.: COMPAS risk scales: demonstrating accuracy equity and predictive parity. Technical Report, Northpointe Inc. https://go.volarisgroup.com/rs/430-MBX-989/images/ProPublica_Commentary_Final_070616.pdf (2016). Accessed 18 Sep 2020.

34. Dignum, V.: On bias, black boxes and the quest for transparency in artificial intelligence. Delft Design for Values Institute. https://www.delft-designforvalues.nl/2018/on-bias-black-boxes-and-the-quest-for. Accessed 22 Apr 2020.

35. Dignum, V.: Responsible Artificial Intelligence: How to Develop and Use AI in a Responsible Way, p. 59. Springer (2019).

36. Dressel, J., Farid, H.: The accuracy, fairness, and limits of predicting recidivism. Sci. Adv. 4(1), eaao5580 (2018). https://advances.sciencemag.org/content/advances/4/1/eaao5580.full.pdf. Accessed 22 Jan 2021.

37. Dzindolet, M.T., et al.: The role of trust in automation reliance. International Journal of Human-Computer Studies 58(6), 697–718 (2003).

38. Eckhouse, L.: Opinion: Big data may be reinforcing racial bias in the criminal justice system. The Washington Post. https://www.washingtonpost.com/opinions/big-data-may-be-reinforcing-racial-bias-in-the-criminal-justice-system/2017/02/10/d63de518-ee3a-11e6-9973-c5efb7ccfb0d_story.html (2017).

39. Electronic Privacy Information Center: EPIC—Algorithms in the criminal justice system: pre-trial risk assessment tools. Epic.org. https://epic.org/algorithmic-transparency/crim-justice/ (2014).

40. European Commission for the Efficiency of Justice (CEPEJ): European Ethical Charter on the Use of Artificial Intelligence (AI) in Judicial Systems and Their Environment (2018). https://www.coe.int/en/web/cepej/cepej-european-ethical-charter-on-the-use-of-artificial-intelligence-ai-in-judicial-systems-and-their-environment. Accessed 22 Jan 2021.

41. FRA: In brief—big data, algorithms, and discrimination. European Union Agency for Fundamental Rights. https://fra.europa.eu/en/publication/2018/brief-big-data-algorithms-and-discrimination (2018).

42. Flores, A.: False positives, false negatives, and false analyses: a rejoinder to "Machine bias: there's software used across the country to predict future criminals. And it's biased against Blacks." http://www.crj.org/assets/2017/07/9_Machine_bias_rejoinder.pdf (2017). Accessed 22 Jan 2021.

43. Friedman, B., Nissenbaum, H.: Bias in computer systems. ACM Trans. Inf. Syst. (TOIS) 14(3), 330–347 (1996).

44. Global Legal Monitor: Netherlands—court prohibits the government from using AI software to detect welfare fraud. https://www.loc.gov/law/foreign-news/article/netherlands-court-prohibits-governments-use-of-ai. Accessed 22 Jan 2021.

45. It is worth noting that the Egyptian Personal Data Protection Law No. 151 of 2020 defines personal data in Article 1 as: "Any data relating to an identified natural person, or to one who can be identified directly or indirectly by linking this data to any other data, such as a name, voice, image, identification number, or online identifier, or data that identifies a psychological, health, economic, cultural, or social identity."

46. Veale, M., et al.: Fairness and accountability design needs for algorithmic support in high-stakes public sector decision-making, p. 4. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (ACM Press). doi: 10/ct4s.

47. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).

48. Ibid.

49. Selbst, A.D., Powles, J.: Meaningful information and the right to explanation. International Data Privacy Law 7(4), 233–242 (2017). https://doi.org/10.1093/idpl/ipx022 (last visited 1/5/2020).

50. http://julkaisut.valtioneuvosto.fi/bitstream/handle/10024/160

51. Deloitte and Reform: The State of the State 2018–19 (2019).

52. https://www2.deloitte.com/content/campaigns/uk/thestate-of-the-state/the-state-of-the-state/the-state-of-the-state.htm

53. Decreto Legislativo 10 agosto 2018, n. 101 [Legislative Decree No. 101 of 10 August 2018]. http://www.gazzettaufficiale.it/atto/serie_generale/caricaDettaglioAtto/originario?atto.dataPubblicazioneGazzetta=20180904&atto.codiceRedazionale=18G00129&elenco30giorni=true

54. Lege 190/2018 privind măsuri de punere în aplicare a Regulamentului (UE) 2016/679 [Law 190/2018 on measures implementing Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016]. https://www.senat.ro/legis/PDF/2018/18L294FP.pdf

55. http://www.riksdagen.se/sv/dokument-lagar/dokument/svensk-forfattningssamling; https://www.retsinformation.dk/Forms/R0710.aspx?id=201319.

56. http://prawo.sejm.gov.pl/isap.nsf/download.xsp/WDU20180001000/O/D20181000.pdf.

57. https://likumi.lv/ta/id/300099-fizisko-personu-datu-apstrades-likums.

58. https://iapp.org/media/pdf/resource_center/Spanish_data-protection-law.pdf.

60. Kaminski, M.: The right to explanation, explained (2018). See also Bayamlioglu, E.: Contesting automated decisions. European Data Protection Law Review 4(4), 433–46 (2018). https://doi.org/10.21552/edpl/2018/4/6.

61. Office of Science and Technology Policy. https://www.whitehouse.gov/ostp/