Glossary: Technological Neutrality
Neutrality is defined as “the freedom of individuals and organizations to choose the most appropriate technology adequate to their needs and requirements for development, acquisition, use or commercialization, without knowledge dependencies involved as information or data” (Ríos, 2013; see also Carrillo, 2018). The assumption behind neutrality is that technology can be neutral because it is designed to be so. In practice, however, neutrality is difficult to achieve: the programmers who design “neutral technology” are often hidden behind their algorithms. A programmer with limited knowledge of the code, for example, is confined to their own viewpoint and cannot draw on the knowledge of other programmers. In such cases, the software reflects biases rooted in the programmers’ personal experiences, resulting in a lack of both neutrality and interconnection (Duster, 2019).

Another concern with “neutral” data arises when a technology designed by one party for a specific purpose is used by another party for a completely different purpose (Dalton and Thatcher, 2014). Because a technology can be repurposed toward different ends by whoever has access to it, it cannot be completely neutral, regardless of the original intentions behind it. Technology is always subject to the interpretation of the user, which removes whatever neutrality it once held (CJ Bruns). It is also important to note that a robot learns much as a human does: through practice, past robot interactions, and human user experience. Regardless of the intentions of the humans who create such technology, its consequences can have a discriminatory impact, especially on the most powerless in society. Ultimately, recognizing that technological neutrality does not exist allows us to better address the consequences of biased technology.
Carrillo, Azahara Benito. “What Is Technological Neutrality?” Blog de Viafirma, 21 Feb. 2018.
Dalton, Craig, and Jim Thatcher. “What Does a Critical Data Studies Look Like, and Why Do We Care?” Society & Space, 12 May 2014.
Duster, Troy. Foreword. Captivating Technology. Duke University Press, 2019.
Student Editors: Matthew Der, Haoyuan Chen, and Jia-Lin Chen. We would like to thank additional student editors, who wish to remain anonymous, for their contributions.