Algo(r)etica e immigrazione

Fabio Ciracì
2024-01-01

Abstract

**Algo(r)etica e immigrazione** by Fabio Ciracì explores the intersection of algorithmic ethics and immigration, particularly focusing on the implicit biases embedded within automated classification systems. This study, part of the "G.E.A. - Global Green Generative and Equal Educational Activities" project, examines how ostensibly neutral algorithms can perpetuate institutional discrimination. Examples include insurance risk assessments and predictive analytics used in various societal contexts. The author highlights the inherent context-dependence and potential prejudice in algorithmic decision-making, emphasizing the need for critical oversight and transparent practices. Ciracì discusses the systemic racism that can be codified in algorithms, especially within migration processes, where decisions influenced by biased algorithms can significantly impact the lives of migrants and refugees. The report underscores the necessity of scrutinizing these technologies to prevent discriminatory practices and ensure the ethical use of AI. The text concludes with a call to transition from "algoretica" (ethics based on algorithms) to "algoetica" (ethics rooted in compassion and empathy), advocating for transparency, accountability, and the safeguarding of human rights in the application of advanced digital technologies.
Files in this product:

28766-146544-1-PB.pdf
  Open access
  Description: pdf volume
  Type: publisher's version
  License: public domain
  Size: 6.61 MB
  Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11587/521986