
Identity

'Identity' illustration by Johnny Lighthands, Creative Commons Attribution-ShareAlike 4.0 International.

The pillar of identity addresses the social character of data. It aims to do so by problematising the construction and categorisation of data, which is shaped by the social, cultural, and historical contexts from which it is derived.

The examination of identity in this framework considers three core dimensions: interrogating, understanding, and critiquing harmful categorisations; challenging reification and erasure; and combatting harms of representation. The following sections explore each of these elements in detail, offering a framework for investigating concerns of identity in the context of data and data-intensive technologies.


Interrogate, understand, and critique harmful categorisations

The construction and categorisation of data, particularly when it is about people, is a fundamentally social activity that is undertaken by humans whose views of the world are, in part, the product of cultural contexts and historical contingencies. As such, the construction and categorisation of data is shaped by the sociocultural conditions and historical contexts from which it is derived. The social character of data, coupled with the sorting and clustering that proceeds from its cleaning and pre-processing, can lead to categorisations that are racialised, misgendering, or otherwise discriminatory. This can involve the employment of binary categorisations and constructions—for example, gender binaries (male/female) or racial binaries (white/non-white)—that are oriented to dominant groups and that ought to be critically scrutinised and questioned. Data justice calls for examining, exposing, and critiquing histories of racialisation and discriminatory systems of categorisation reflected in the way data is classified, as well as the social contexts underlying the production of these classifications.

Illustrative example: Feminist Internet

Although the internet has allowed individuals and communities across the globe to interact and exchange information, the rise of online platforms has been accompanied by a plethora of incidents of misogyny, sexism, hate, and abuse. The algorithms that have come to define contemporary cyberspace have also contributed to discriminatory practices, from biased image recognition to inadequate monitoring, that endanger many individuals, particularly women, trans people, and people of colour. Owing to historical exclusion and contemporary societal practices, many vulnerable groups still lack access to the internet, which has, in many cases, allowed transatlantic male perspectives and objectives to predominate [1].

The Feminist Principles of the Internet arose in reaction to this environment, setting out 17 principles organised into five main categories: Access, Movements, Economy, Expression, and Embodiment [1]. This framework is aimed at enabling women's movements to navigate technology-related issues. Similarly, the Feminist Internet collective was established in London to promote equity of rights, freedoms, privacy, and data protection irrespective of race, class, gender, gender identity, age, beliefs, or abilities. The collective works towards providing tools and a space to engage in critical creative practices to address issues including online abuse and harassment, anti-transgender media representation, and censorship, amongst others.

Challenge reification and erasure

In the construction and categorisation of data, system designers and developers can mistakenly treat socially constructed, contested, and negotiated categories of identity as fixed and natural classes. When this happens, the way that these designers and developers categorise identities can become naturalised and reified. This can lead to the inequitable imposition of fixed attributes to classify people who do not ascribe to these categorisations or who view them as fluid and inapplicable to the way they identify or regard themselves.

Illustrative example

For instance, the designers of a data system may group together a variety of non-majority racial identities under the category of “non-white”, or they may record gender only in terms of binary classification and erase the identity claims of non-binary and trans people.
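
As a purely hypothetical sketch of how such erasure can be baked into a data model, the Python snippet below contrasts a schema that forces gender into a fixed binary with one that stores a person's own self-description. The class and field names are illustrative assumptions, not taken from any system discussed here.

from dataclasses import dataclass
from enum import Enum
from typing import Optional

class BinaryGender(Enum):
    # A rigid, pre-fixed binary: anyone whose identity falls outside
    # these two values simply cannot be recorded accurately.
    MALE = "male"
    FEMALE = "female"

@dataclass
class RigidRecord:
    name: str
    gender: BinaryGender  # non-binary and trans identity claims are erased at entry

@dataclass
class SelfDescribedRecord:
    name: str
    gender_self_described: Optional[str]  # the person's own words; disclosure is optional

# The rigid schema has no way to represent this person's identity claim;
# the self-described schema records it as stated.
person = SelfDescribedRecord(name="Alex", gender_self_described="non-binary")

The point of the sketch is not that a free-text field resolves the issue, but that the choice of representation is itself a design decision that determines whose identities a data system is able to acknowledge.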

In a similar way, designers and developers can produce and use data systems that disparately injure people who possess unacknowledged intersectional characteristics of identity which render them vulnerable to harm, but which are not recognised in the bias mitigation and performance testing measures taken by development teams.

Illustrative example

For instance, a facial recognition system could be trained on a dataset that is primarily populated by images of white men, thereby causing the trained system to systematically perform poorly for women with darker skin. If the designers of this system have not taken into account the vulnerable intersectional identity (in this case, women with darker skin) in their bias mitigation and performance testing activities, this identity group becomes invisible, and so too do injuries done to its members [2].
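
The mechanism that keeps such injuries invisible is aggregate-only evaluation: a single headline accuracy figure can look acceptable while one intersectional subgroup fares far worse. The Python sketch below, written under purely hypothetical assumptions (the records, group labels, and function name are illustrative placeholders, not drawn from the study cited above), shows how disaggregating the same predictions by intersectional subgroup makes that disparity visible.

from collections import defaultdict

def disaggregated_accuracy(records):
    """Report accuracy overall and per intersectional subgroup.

    Each record is a dict with hypothetical keys: 'predicted',
    'actual', 'gender', and 'skin_tone'.
    """
    overall = [0, 0]                          # [correct, total]
    per_group = defaultdict(lambda: [0, 0])   # (gender, skin_tone) -> [correct, total]

    for r in records:
        correct = int(r["predicted"] == r["actual"])
        overall[0] += correct
        overall[1] += 1
        group = (r["gender"], r["skin_tone"])  # the intersectional key
        per_group[group][0] += correct
        per_group[group][1] += 1

    print(f"Overall accuracy: {overall[0] / overall[1]:.0%}")
    for group, (c, t) in sorted(per_group.items()):
        print(f"  {group}: {c / t:.0%} (n={t})")

# Hypothetical evaluation records: the overall figure looks tolerable,
# but the (female, darker) subgroup is misclassified every time.
records = [
    {"predicted": "match", "actual": "match", "gender": "male",   "skin_tone": "lighter"},
    {"predicted": "match", "actual": "match", "gender": "male",   "skin_tone": "lighter"},
    {"predicted": "match", "actual": "match", "gender": "female", "skin_tone": "lighter"},
    {"predicted": "match", "actual": "match", "gender": "male",   "skin_tone": "darker"},
    {"predicted": "none",  "actual": "match", "gender": "female", "skin_tone": "darker"},
    {"predicted": "none",  "actual": "match", "gender": "female", "skin_tone": "darker"},
]

disaggregated_accuracy(records)

If the evaluation stopped at the overall figure, the subgroup's complete failure would never surface, which is precisely the invisibility described above.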

Focus on how struggles for recognition can combat harms of representation

Struggles for the rectification of moral injuries to identity claims that are suffered at the hands of discriminatory data practices should be understood as struggles for recognitional justice—struggles to establish the equal dignity and autonomy, and the equal moral status, of every person through the affirmation of reciprocal moral, political, legal, and cultural regard.

Illustrative example: First Nations Technology Council (FNTC), Canada

While Indigenous communities account for 4.9% of Canada's population, they hold only 1.9% of ICT jobs. Colonial practices have perpetuated and deepened digital divides, and such gaps continue to manifest in areas of representation, rights, and respect as these relate to Indigenous communities' ability to access digital technologies and to seize opportunities to develop and use them [3]. The First Nations Technology Council (FNTC) is an Indigenous-led non-profit organisation that works to promote a robust Indigenous Innovation Ecosystem by connecting all 204 First Nations communities across British Columbia (BC) in Canada. The driving force behind the FNTC is the belief that intersectionality and diversity in innovative technology environments are critical for progress for all in Canada. The Council works primarily within the mandates of digital skills development, connectivity, information management, and technical support and services.

The FNTC has published extensive research alongside the launch of education programmes, most notably its flagship Foundations and Futures in Innovation and Technology (FiT) project, which guides students through their digital skills development journey [4]. Divided into two streams, Foundations and Futures, the project implements programmes that cater to Indigenous individuals at every stage of their digital skills development. Student support is provided through living allowances, tuition fees, and a Digital Elder-in-Residence. The Foundations curriculum has been designed by Indigenous specialists to ensure safe and navigable access, while the advanced Futures programme was designed by industry and other partners.


  1. Feminist Internet. (n.d.). What is a feminist internet? ComputerAid. Retrieved 10 March 2022 from https://www.computeraid.org/about-us/blog/what-feminist-internet 

  2. Buolamwini, J., & Gebru, T. (2018, January). Gender shades: Intersectional accuracy disparities in commercial gender classification. In Conference on fairness, accountability and transparency (pp. 77-91). PMLR. https://proceedings.mlr.press/v81/buolamwini18a.html 

  3. Pierre, D. (2022, February 7). Why The Time For Indigenous-led Innovation In Tech Is Now, And How To Support It. First Nations Technology Council. https://technologycouncil.ca/2022/02/07/why-the-time-for-indigenous-led-innovation-in-tech-is-now/ 

  4. First Nations Technology Council. (n.d.). Education Programs – First Nations Technology Council. Retrieved 10 March 2022 from https://technologycouncil.ca/education/