Women of Uganda Network (WOUGNET) organized a side session on May 6th, 2022, during the online Global Digital Development Forum (GDDF) 2022 to share perspectives on the interaction of data and social justice, informed by the data justice research conducted in Bolivia, Cameroon, and Uganda by Internet Bolivia Foundation, AfroLeadership, and WOUGNET respectively, with financial support from the Alan Turing Institute (ATI) and the International Center for Expertise in Montréal on Artificial Intelligence (CEIMIA). The conversation emphasized privacy and social justice in terms of access to, visibility of, and representation in the data used in the development of artificial intelligence (AI)/machine learning (ML) systems, as well as citizens' participation in decision-making and the management of databases.
This session was submitted by Sandra Aceng, WOUGNET’s Program Manager, and facilitated by Peace Oliver Amuge, WOUGNET’s Executive Director.
The Definition of “Data Justice” in Different
Regions
The session panelists included Eliana Quiroz, Research Director at Internet Bolivia Foundation; Louis Fendji (Ph.D.), Research Director on Innovative Technologies at AfroLeadership; and Isaac Amuku, Research Specialist at the Women of Uganda Network, who shared their definitions of data justice in Africa and Latin America.
Louis Fendji informed the audience that data justice refers to access to the means to produce and control data. The challenge, however, lies in how people are represented in data. He emphasized that data justice can carry different meanings from one region to another, citing the diversity within sub-Saharan Africa as a source of those differences.
Isaac Amuku defined data justice in terms of the forces (social, historical, economic, and political) that inform data collection, analysis, and use. These forces may influence how people are treated in terms of fairness, fair play, equity, peace, genuine respect for people, rightfulness, and lawfulness.
Eliana Quiroz shared that “data justice” has no easy translation into Spanish, even though people live with several data injustices daily. This makes research on the topic difficult. To build common ground with people and talk about data justice from that point, using examples proved to be the best strategy.
The Interaction of Data and Social Justice
including Data Justice Research Findings
The speakers gave their perspectives on the interaction of data and social justice, including the data justice research findings, with emphasis on privacy and social justice in terms of access to, visibility of, and representation in the data used in the development of AI/ML systems, as well as citizens' participation in decision-making and the management of databases.
The understanding of data justice can vary from region to region. Louis Fendji (Ph.D.) informed participants that data in the Cameroonian context is mainly collected by big tech companies, because the government lacks national strategies to collect, produce, and control the data produced in the country, and because Cameroon lacks laws on data, especially on data protection and data privacy. Cameroon has laws on electronic communications and on consumers' use of services, but no law on privacy or data collection. As a result, those who hold data have the power to do whatever they like with it. There is therefore a power imbalance in the Cameroonian context, and developers should be aware of how data can harm the people who use their systems. Additionally, people in the conflict-affected parts of Cameroon do not have the power to manage data.
In terms of the participation pillar, data can be harmful to people where compliance is concerned, because developers do not have enough funds to understand the perspectives of marginalized communities on data; the process is expensive, so developers tend to simply collect the data and drop it.
The government's platform for passport issuance leaves some communities out because their identities are missing. There is a need to make sure data is handled fairly so that it benefits all communities.
Eliana Quiroz mentioned that the legal authority, for example the communications authority, accepted reform of the unethical and legal practices of internet service providers (ISPs). “We found power imbalance based on the knowledge of technology because if we want to say something as users, people are not involved in the ecosystem of governance, hence limited information and knowledge,” Quiroz added. This is because the government frames the debate as technical and inaccessible, making it hard for people to understand. People are not included in decision-making about data management. We have been putting pressure on the government to open up data, but there are gaps due to limited skills to collect, clean, analyze, and present data. There is violence against lesbian couples, yet the existing law cannot be applied because it covers violent crimes committed by a man against a woman but not violence between two women.
Isaac Amuku spoke about the research by WOUGNET, which aimed to broaden the understanding of data justice based on six pillars (power, equity, access, participation, identity, and knowledge) among policymakers, impacted communities, and developers in urban, peri-urban, and rural areas of Uganda.
In the context of Uganda, there is the Data Protection and Privacy Act, 2019; however, Non-Consensual Intimate Images (NCII) and Online Gender-Based Violence (OGBV) keep happening, and no action is taken because people in power do not act, hence the power imbalance. Many of the people interviewed were moderately aware of data justice, and policymakers need to engage with and understand everyone's perspectives so that these can be integrated into policy.
In terms of the equity pillar, it is usually the government that collects data, and there is no clarity on where the collected data goes. For instance, registration for a National Identity (ID) card is mandatory for everyone in Uganda in order to access services, yet data collection in Uganda is shaped by culture and norms. This affects LGBTIQ communities, who may be denied certain services for being LGBTIQ or for having missing data. Persons with disabilities (PWDs) also face a lot of discrimination in terms of access to information and lack of representation when accessing services.
In terms of the access pillar, there is no transparency around surveillance carried out by the state, because no one has access to the information collected from citizens.
In terms of the identity pillar, many people's identities are collected but not reflected. For instance, refugees in Uganda do not know where their data goes once it is collected.
In terms of the participation and knowledge pillars, women do not participate, and various stakeholders interviewed said they do not take part in the process of data collection.
Read more: Advancing Data Justice Research Outputs by 12 Policy Pilot
Partners.
Question and Answer Session
A question and answer session followed, with participants putting questions to the speakers. Participants asked:
In this era of datafication, I’m worried about Artificial Intelligence algorithms. Tech companies are using AI algorithms to increase productivity, but these algorithms sometimes exclude minorities and marginalized communities. How can we influence tech giants and tech professionals to develop unbiased AI algorithms in the fight for social justice?
Eliana Quiroz: There are very few apps in local languages, and elderly people need support from younger relatives; otherwise they do not benefit from information on the internet, yet they still suffer from data collection. For example, Facebook collects data from both users and non-users, and elderly people receive no benefit from the data collected; the same is true of the government, which collects data from citizens who do not benefit from it.
Article 3: Under the principles of data protection in the Data Protection and Privacy Act, 2019: “(1) A data collector, processor, or controller or any person who collects, processes, holds, or uses personal data shall … (c) collect, process, use or hold adequate, relevant and not excessive or unnecessary personal data.” Do you really see this being applied when it comes to data in the Ugandan context?
Isaac Amuku: The Ugandan law is okay and sets minimum standards on data collection, but there is misuse of power and weak participation in data collection, such as in the passport and National ID processes. There is a need for continued conversation on data protection and privacy to define data collection standards in Uganda.
A participant urged the need to bear in mind the universality of human rights and to make data justice inclusive. In the Ugandan context, there is misuse of data during data collection, which is a violation of rights. He added that means and ways to mitigate the effects of all data collection processes are essential, because intersectionality needs to be analyzed in data collection and planning: a lot of data is left uncollected, which affects the planning of the country.
A female participant added that most
of the information collected by these companies can be easily used by third
parties to acquire people's personal data.
Louis Fendji (Ph.D.) said that data justice is about the representation of minorities and marginalized communities in data. However, there is a need to ask ourselves why these communities are not included. Telecom companies are present in areas where they can get a return on their investments, since they are profit makers. Including these communities is expensive, especially for developers, who sometimes lack the funds to include marginalized communities in AI algorithm systems. Nevertheless, developers need to be educated about the dangers of excluding these communities, and national strategies should be designed to make policymakers aware of the need for marginalized communities to be represented in data and to have the power and ability to create that awareness.
YouTube Video: Introducing Data Justice
By Sandra Aceng - Manager Information Sharing and Networking