Peer-reviewed, open access
  • The moon, the ghetto and artificial intelligence: Reducing systemic racism in computational algorithms
    Fountain, Jane E.

    Government Information Quarterly, April 2022, Volume 39, Issue 2
    Journal Article

    Computational algorithms, and the automated decision making systems that include them, offer the potential to improve public policy and organizations. But computational algorithms based on biased data encode those biases into algorithms, models, and their outputs. Systemic racism is institutionalized bias with respect to race, ethnicity, and related attributes. Such bias is located in data that encode the results and outputs of decisions that have been discriminatory, in procedures and processes that may intentionally or unintentionally disadvantage people based on race, and in policies that may discriminate by race. Computational algorithms may exacerbate systemic racism if they are not designed, developed, and used (that is, enacted) with attention to identifying and remedying bias specific to race. Advancing social equity in digital governance requires systematic, ongoing efforts to assure that automated decision making systems, and their enactment in complex public organizational arrangements, are free from bias.

    Highlights:
    • Computational algorithms are powerful tools but may replicate biases.
    • Biases in underlying data, including systemic racism, carry over into algorithms.
    • Automated decision making systems that discriminate harm people.
    • Careful scrutiny of data, processes, variables, and algorithms may reduce bias.
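    The "careful scrutiny" of outputs that the highlights call for can be made concrete with a group-level audit of automated decisions. The sketch below is illustrative only and is not drawn from the article; the decision log, group labels, and the four-fifths cutoff are hypothetical assumptions used to show one simple disparity check.

    ```python
    # Minimal sketch (not from the article): auditing automated decisions for
    # group-level disparities in outcomes. All data and thresholds are hypothetical.

    from collections import defaultdict


    def selection_rates(decisions):
        """Share of favorable decisions per group.

        `decisions` is a list of (group, outcome) pairs, where outcome is
        1 for a favorable automated decision and 0 otherwise.
        """
        totals = defaultdict(int)
        positives = defaultdict(int)
        for group, outcome in decisions:
            totals[group] += 1
            positives[group] += outcome
        return {g: positives[g] / totals[g] for g in totals}


    def disparate_impact_ratio(rates):
        """Ratio of the lowest to the highest group selection rate.

        Values well below 1.0 flag a disparity worth investigating; the
        commonly cited 0.8 ("four-fifths") cutoff is used here purely as
        an illustrative threshold, not as a legal or policy standard.
        """
        return min(rates.values()) / max(rates.values())


    if __name__ == "__main__":
        # Hypothetical decision log: (group label, automated decision).
        log = [("A", 1), ("A", 1), ("A", 0), ("A", 1),
               ("B", 0), ("B", 1), ("B", 0), ("B", 0)]
        rates = selection_rates(log)
        ratio = disparate_impact_ratio(rates)
        print("Selection rates:", rates)
        print("Disparate impact ratio: %.2f" % ratio)
        if ratio < 0.8:
            print("Flag for review: possible systemic bias in outcomes.")
    ```

    An audit like this only surfaces disparities in outputs; as the abstract argues, remedying bias also requires examining the data, procedures, and policies behind those outputs.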