Power¶
‘Power’ illustration by Johnny Lighthands, Creative Commons Attribution-ShareAlike 4.0 International.
The first pillar, power, refers to the levels at which power operates and manifests in the collection, analysis, and use of data in the world. It provides a basis from which to examine power and to raise critical awareness of its importance, presence, and influence.
This pillar guides the examination of power through three main considerations: interrogating and critiquing power, challenging power, and empowering people. The following sections explore each of these elements in detail, offering a framework for investigating the manifestation of power in the context of data and data-intensive technologies, such as AI.
For a brief overview of the power pillar, take a look at the infographic video below.
Interrogate and critique power¶
Power dynamics are present and manifest in many different places and in a variety of ways. It is therefore crucial to:
- understand the levels at which power operates in data innovation ecosystems;
- understand how power manifests and materialises in data innovation ecosystems; and
- question power asymmetries at their sources and raise critical awareness of their effects, presence, and influence.
Understand the levels at which power operates in data innovation ecosystems¶
Identifying and reflecting on the levels at which power operates is critical for critiquing power dynamics present in data innovation ecosystems. The following diagram aims to guide this interrogation, by identifying the different levels of power operating in the collection and use of data in the world.
Understanding the levels at which power operates in the collection and use of data.
The geopolitical level¶
For example, high-income nation-states and transnational corporate actors can control access to technological capabilities and pursue their own interests on the global stage. In doing this, they can exercise significant influence on which countries or regions are able to access and develop digital and data processing capacities.
The infrastructural level¶
For example, States and large corporations can decide which impacted communities, domestically and globally, are able to access the benefits of connectivity and data innovation, and they can control the provision of essential digital goods and services that directly affect the public interest.
The organisational and political level¶
For example, governments and companies can control data collection and use in intrusive and involuntary ways—especially where the public have no choice but to utilise the services they provide or must work in the environments they manage and administer.
The policy, legal, and regulatory level¶
For example, large international standards bodies, transnational corporations, trade associations, and nation-states can exercise disproportionate influence in setting the international policies, standards, and regulations that govern digital goods and services and data innovation. Large corporations can also exercise their power to lobby governments against regulatory oversight.
The cultural level¶
For example, power can operate through the way that large tech companies use relevance-ranking, popularity-sorting, and trend-predicting algorithms to sort users into different, and potentially polarising, digital publics or groups.
The psychological level¶
For example, tech companies can use algorithmically personalised services to curate the desires of targeted data subjects. This can allow for the control or manipulation of consumer behaviour but also play an active and sometimes damaging role in identity formation, mental well-being, and personal development.
Illustrative example: Homeland Card - Identity Document, Venezuela
Venezuela’s Homeland Card (Carnet de La Patria) is a national ID card that doubles as a digital payment system and is used by the Venezuelan government to provide citizens with access to food, healthcare, pensions, and other social benefits. Citizens are incentivised to enrol via rewards such as bonuses, and over half of the population has enrolled in the programme.
The Homeland Card has been critiqued by activists, who stress that it is being used as a surveillance tool with the aid of digital telecommunications corporations. It has been reported that the card is linked to databases storing cardholders’ personal information, including medical history, social media presence, residential address, and political party membership. Because the card is used in the context of a humanitarian emergency in which most citizens depend on benefits for survival, opponents have raised concerns that citizens’ information is being used to exclude individuals from vital services on the basis of their behaviours and political affiliations, and that the ID system serves as a method of control and electoral coercion1.
Understand how power manifests and materialises in data innovation ecosystems¶
Power can surface in several different ways. The following diagram aims to guide the critique of the different ways power manifests and materialises in the collection and use of data in the world.
How power manifests and materialises in the collection and use of data.
Decision-making power¶
Here, an individual or organisational actor A has power over B to the extent that A can influence the preferences of B. Decision-making power is seen, for instance, in the way that government agencies collect and use data to build predictive risk models about citizens and data subjects or to allocate the provision of social services (and then act on the corresponding algorithmic outputs).
Ideological power¶
This kind of power is exercised where people’s perceptions, understandings, and preferences are shaped by a system of ideas or beliefs in a way which leads them—frequently against their own interests—to accept or even welcome their place in the existing social order and power hierarchy. For example, the priorities of “attention capture” and “screen-time maximisation”, that are pursued by certain social media and internet platforms, can groom users within the growing ecosystem of compulsion-forming reputational platforms to embrace the algorithmically manufactured comforts of life-logging, status-updating, and influencer-watching all while avoiding confrontation with realities of expanding inequality and social stagnation.
Normalising power¶
Normalising power manifests in the way that the ensemble of dominant knowledge structures, scientifically authoritative institutions, administrative techniques, and regulatory decisions work in tandem to maintain and ‘make normal’ the status quo of power relations. Where tools of data science and statistical expertise come to be used as techniques of knowledge production that claim to yield a scientific grasp on the inner states or properties of observed individuals, forms of normalising or disciplinary power can arise. Data subjects who are treated merely as objects of prediction or classification and who are therefore subjugated as objects of authoritative knowledge become sitting targets of disciplinary control and scientific management.
Agenda-setting power¶
Here, an individual or organisational actor A has power over B to the extent that A sets the agenda that B must then fall in line with, by virtue of A’s control over the terms of engagement that set the practical options within A’s sphere of influence and interest. Agenda-setting power means that A can shoehorn the behaviour of B into a range of possibilities that A finds acceptable, tolerable, or desirable. This kind of power is explicit, for example, in practices of regulatory capture, where large tech corporations secure light-touch regulation through robust lobbying and legal intervention.
Illustrative example: Workplace surveillance
With the proliferation of productivity tools has come the rise of workplace surveillance. When employed by companies, these apps can pose serious challenges to data justice. For instance, Crossover, a talent management company, produced a productivity tool called WorkSmart, which takes screenshots of employees’ workstations and produces ‘focus scores’ and ‘intensity scores’ based on their keystrokes and app use. The producers of similar workplace surveillance software often cite the prevention of insider trading, sexual harassment, and inappropriate behaviour as the primary motivations behind these types of apps2.
Other workplace surveillance apps include Wiretap, which monitors workplace chat forums for threats, intimidation, and other forms of harassment, as well as Digital Reasoning, which ‘searches more for subtle indicators of possible wrongdoing, such as context switching’, e.g., switching off a workplace app like Slack to use encrypted apps like Signal instead2. In an explainer piece by Mateescu and Nguyen3, various other types of employee monitoring and surveillance technologies are detailed including behavioural prediction and flagging tools, biometric and health data tracking, remote monitoring and time-tracking, and gamification and algorithmic management. The authors also explore the range of harms these technologies can exact. These include the augmentation of biased and discriminatory workplace practices, the creation of power imbalances between employees and their managers/organisation, and the decrease in workers’ autonomy and agency. Additionally, the authors point out that the use of granular digital tracking and surveillance apps is often motivated by employers’ desire to bolster ‘cost-cutting’ practices surrounding worker pay, benefits, and standards.
Another type of workplace surveillance app closely related to these is the fitness tracker. Biometric wearables are becoming more common across company wellness programmes. While these technologies are often credited with helping to decrease company health insurance premiums34, the data gathered by some fitness apps can reveal physical location and, occasionally, sensitive information such as family medical histories and diets. In one case, an employer incentivised employees with a US$1-a-day gift card to use Ovia, a pregnancy-tracking app, allowing the company to see aggregated health data collected via the app5. While the reasons that employers cite for using apps like these range from boosting employee well-being to decreasing overall company healthcare spending, there remain significant risks of data reidentification and intrusive tracking, among others5.
Increasing workplace surveillance and monitoring has also been accompanied by higher expectations for employee outputs and quotas, and meeting these objectives has led to numerous instances of strain and injury in labour-heavy industries. For example, Amazon, the second largest employer in the United States, has monitored and evaluated employees through ADAPT, a proprietary software system that not only evaluates employee productivity but also automates termination6. This automated management structure continues to operate despite thousands of injury reports and medical advice warning against such strenuous working conditions.
Understand power at its sources and raise critical awareness of its presence and influence¶
Interrogations of where and how power operates are first steps in a longer journey of questioning and critical analysis. An active awareness of power dynamics in data innovation ecosystems should also lead to further questions:
- What are the interests of those who wield power or benefit from existing social hierarchies?
- How do these interests differ from other stakeholders who are impacted by or impact data practices and their governance?
- How do power imbalances shape the distribution of benefits and risks among groups that possess varying levels of power?
- How do power imbalances result in potentially unjust outcomes for marginalised, vulnerable, or historically discriminated against groups?
Challenge power¶
Mobilise to push back against societally and historically entrenched power structures and to work toward more just and equitable futures. While the questioning and critiquing of power are essential dimensions of data justice, its purpose of achieving a more just society demands that the unequal power dynamics which harm or marginalise impacted individuals and communities be challenged and transformed.
Illustrative example: Haki na Sheria (HSI), Kenya
Established in 2010, the Haki na Sheria Initiative (HSI) is an NGO in Kenya. Driven by the inequalities faced by marginalised communities in Northern Kenya, HSI seeks to end systematic discrimination and empower communities to ‘understand, demand, and effectively claim their human rights and obligations’7. Its work focuses on issues of environmental justice, gender, and statelessness.
Kenyan Somalis, Nubians, and other minoritised communities have long faced structural obstacles when applying for government IDs. Unlike other Kenyans, they are often required to provide additional supporting documents or go through lengthy vetting processes. Not only does this deprive them of a smooth application process (and, in most cases, of Kenyan citizenship documents), but also of the services and opportunities these documents enable, such as access to schools, work, travel, public services, and the opening of bank accounts8. In addition, the government has recently introduced the National Integrated Identity Management System (NIIMS), which aims to digitise identity management. However, because proof of identity is required to obtain a digital ID, marginalised communities suffer further discrimination and exclusion9.
In response to the far-reaching harmful impacts of digital IDs, HSI has raised awareness of the adverse effects of NIIMS on already marginalised communities, including the lack of proper safeguards to protect the rights of minoritised communities. HSI has also challenged the legality of the system, and in 2013 it created a platform through which community members can receive advice on how to obtain citizenship documentation. The HSI paralegal team additionally provides people with information on how to ‘navigate government bureaucracy’9 and demand justice and equality.
Empower people¶
People must be empowered to marshal democratic agency and collective will to pursue social solidarity, political equity, and liberation. It is common to think about power primarily in a negative way, that is, in terms of unjust exercise of control (abuses of power), coercion, or self-interested influence. But this tells only half the story. Power can also be understood, more positively, in terms of its role in attaining social goods, for example through democracy or collective action. A response within the data justice movement to problematic imbalances of power is empowerment. Empowerment can be understood in terms of concerted and collective use of power through strategic action. When people and communities come together in the shared pursuit of social justice through mutually beneficial practices of knowledge sharing, pedagogy, deliberation, collaboration, dialogue, action, and resistance, power becomes constructive and opens transformative possibilities for the advancement of data justice, social solidarity, and political equity.
Illustrative example: Pollicy, Uganda
Pollicy is a feminist collective that brings together data scientists, technologists, creatives, and academics to improve data practices and craft better life experiences. Their work emerges in response to the domination of African markets by large foreign technology companies and to policies implemented by some governments that, according to Pollicy, have undermined freedom of expression, digital inclusion, and access to information and markets. Their mission is to improve data literacy among different stakeholders, promote the responsible use of data within civil society organisations and government agencies to improve service delivery, and foster debate about how to use data in an ethical and responsible way. Underlying their work is the conviction that digital futures need to be re-imagined to consider the needs of traditionally marginalised groups, rather than trying to fit those groups into existing frameworks. They therefore envision processes of data collection, analysis, and release in which traditionally marginalised groups are consulted and have a seat at the table.
Pollicy is currently undertaking projects that span research, training, workshops, and toolkits. It has explored, for instance, digital extractivism in Africa. Within the project “Automated Imperialism, Expansionist Dreams”, they documented existing and potential policy responses to a set of problems, ranging from tech companies’ hiring of digital workers in African countries amid unequal power dynamics and a lack of labour protections, to the collection of users’ data through zero-rating and other communication solutions and its exploitation for profit. The report also provided recommendations on how to address the root causes of these problems. Pollicy has also developed a project titled “Afro Feminist Data Futures”, which approaches the production, sharing, and use of gender data as a way to empower feminist movements in sub-Saharan Africa, exploring how this knowledge can be translated into actionable recommendations for technology companies sharing non-commercial datasets10.
1. Berwick, A. (2018, November 14). A new Venezuelan ID, created with China’s ZTE, tracks citizen behavior. Reuters. https://www.reuters.com/investigates/special-report/venezuela-zte/
2. Solon, O. (2017, November 6). Big Brother isn’t just watching: Workplace surveillance can track your every move. The Guardian. https://www.theguardian.com/world/2017/nov/06/workplace-surveillance-big-brother-technology
3. Mateescu, A., & Nguyen, A. (2019). Explainer: Workplace Monitoring & Surveillance. Data & Society. https://datasociety.net/wp-content/uploads/2019/02/DS_Workplace_Monitoring_Surveillance_Explainer.pdf
4. Bort, J. (2014, July 8). This Company Saved A Lot Of Money By Tracking Their Employees With Fitbits. Business Insider. https://www.businessinsider.com/company-saved-money-with-fitbits-2014-7
5. Harwell, D. (2019, April 10). Is your pregnancy app sharing your intimate data with your boss? The Washington Post. https://www.washingtonpost.com/technology/2019/04/10/tracking-your-pregnancy-an-app-may-be-more-public-than-you-think/?arc404=true
6. For a more in-depth evaluation of physical injury and harms at Amazon’s fulfilment centres, read: Evans, W. (2019, November 25). Ruthless Quotas at Amazon Are Maiming Employees. The Atlantic. https://www.theatlantic.com/technology/archive/2019/11/amazon-warehouse-reports-show-worker-injuries/602530/
7. See Haki na Sheria at http://hakinasheria.org/
8. Bashir, Y. (2020, November 18). Viewpoint: Using Community Paralegals to Promote Inclusion in Digital IDs in Kenya. Good ID. https://www.good-id.org/en/articles/power-to-the-people-using-community-paralegals-to-promote-inclusion-in-digital-ids-in-kenya/
9. Wired Opinion. (2020, February 5). Digital IDs Make Systemic Bias Worse. Wired. https://www.wired.com/story/opinion-digital-ids-make-systemic-bias-worse
10. Iyer, N., Achieng, G., Borokini, F., Ludger, U., & Syabani, Y. (2021). Automated Imperialism, Expansionist Dreams: Exploring Digital Extractivism in Africa. Pollicy. https://archive.pollicy.org/wp-content/uploads/2021/06/Automated-Imperialism-Expansionist-Dreams-Exploring-Digital-Extractivism-in-Africa.pdf