REFLECTIONS

authenticity & equity


The Implications of Authentication of Media to Support the Discourse of Lived Experiences by Racialized Communities

By Muriam Fancy

As a researcher at Starling Lab, I am constantly questioning how our technology navigates the ethical dimensions of authenticating data. I conduct my work with the mission of understanding the social implications of how technology can be designed and deployed as a tool to better represent racialized communities, while also noting the potential harms it can produce.

 

I continue to wonder how principles of equity and inclusion can become the standard for developing the narrative of truth, especially in a world where other emerging technologies are not engineered with similar principles and have the potential to create a variety of harms.

 

What I have come to understand as the basis of this standard is recognizing the lived experiences of racialized communities, so that their perspectives are authentically included in weaving together the pieces of truth that reflect historic events.

 

Structural frameworks in media, policy, and law have created parameters for what they define as the version of the truth that “should” be told.

 

This mechanism perpetuates structural inequalities that disproportionately impact racialized communities: their lived experiences are not only left unheard, but also not effectively archived or included in the holistic narrative needed to understand the ramifications of a historic event.

I would argue that one mechanism to combat this form of inequality is the authentication of media captured by racialized communities, with the goal of including the voices and perspectives of those who lived through these historic events.

 

At Starling Lab, we have developed a prototype technology that uses blockchain to authenticate image and video data. Our ability to verify the authenticity of captured media is vital to piecing together a discourse of truth by affirming witnesses’ perspectives on present or past global events. We recognize that this lack of inclusion perpetuates a form of digital inequality that disproportionately impacts racialized communities.
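The core idea behind authenticating captured media can be sketched as content fingerprinting: a cryptographic hash of the file is recorded at capture time (for example, anchored to a blockchain), and any later copy can be checked against that record. The sketch below is a minimal illustration of this principle, not Starling Lab’s actual implementation; the function names are hypothetical.

```python
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    """Return a SHA-256 digest that uniquely identifies this exact media file."""
    return hashlib.sha256(media_bytes).hexdigest()

def verify(media_bytes: bytes, registered_digest: str) -> bool:
    """Check the media against the digest registered (e.g. on a blockchain) at capture time."""
    return fingerprint(media_bytes) == registered_digest

# Any alteration to the media, however small, changes the digest.
original = b"...raw image bytes captured by the witness..."
digest_at_capture = fingerprint(original)

assert verify(original, digest_at_capture)             # untouched media verifies
assert not verify(original + b"x", digest_at_capture)  # any edit fails verification
```

Because the digest is tied to the exact bytes of the file, a verifier never needs to trust an intermediary’s copy: recomputing the hash and comparing it to the registered value is enough to detect any tampering.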

 

Thus, our framework for weaving equity and inclusion into the authentication phase is one mechanism we hope will help establish the standard for the narrative of truth proposed above. I would like to note that principles of inclusion and equity run throughout the methodology of Starling’s solution architecture, and beyond the technology itself, but for the purpose of this article I will discuss their impact on data.

 

A threat to this mission is the number of sources that create false data, an issue for which we currently have few governing or regulatory mechanisms. If we cannot account for the multiplicity of false data, we cannot begin to understand who is creating that data, and thus their intention in changing the discourse of truth. An example of a source of invalid data is synthetic data generated by artificial intelligence (AI).

 

AI-generated synthetic media can create a barrier to authenticating data and can also catalyze a false narrative, propagating disinformation through invalid data. Synthetically generated media can alter original media or create new inauthentic data with the objective of distorting pieces of evidence in an attempt to present one or a series of alternative truths.

 

This data, created with the intent of disinformation, further exacerbates the systemic divide that prevents racialized communities from having their stories, perspectives, and evidence recorded in history. Circulating false data is also a means of spreading disinformation that can manifest through structural bias. This thesis is also supported by WITNESS, which finds that the harms AI presents can actively discriminate against racialized communities’ experiences and can result in the surveillance of racialized communities due to a negative and false discourse in the media.

 

This strategy of manipulating the narrative of present and/or historical events is not novel; the tactic has been employed for generations through various technologies and continues to cause harm by allowing for the potential erasure of racialized communities’ experiences from the evidentiary record. The lack of regulation in this area threatens racialized communities’ ability to exercise fundamental human rights such as freedom of expression. Through Starling’s prototype, we hope that by building both the product and the organization on principles of equity and inclusion, we can dispel conspiracies while protecting communities’ experiences by authenticating the data they provide.

 

We recognize that the technology created at Starling Lab is not a solution but a tool; it cannot solve the systemic divide and discrimination caused by existing power structures. However, Starling can verify the originality of the media provided, and thereby affirm the perspectives of racialized communities through data, in an attempt to shatter layers of doubt cast on their experiences and on the timeline of historical events.

 

It is precisely at this intersection, protecting the lived experiences of racialized communities through their data, that by authenticating media the Starling Lab prototype becomes a tool to promote human rights (a tool that is not prescribed or forced upon communities, but offered in support). I believe that making these principles the core of Starling’s methodology is Starling’s most important contribution.

 

Inclusion and equity at every stage of development, from valuing racialized communities as part of the organization’s thesis to embedding these principles in the architecture of the technology, is what makes Starling’s value profound.

 

It is by utilizing this methodology that Starling hopes to add value to the spaces and narratives carved out by racialized communities, and to truly be a tool that supports them. Again, Starling is not taking ownership of or credit for the experiences heard or felt by racialized communities; it is a tool that is continuously iterating to support their needs to tell their stories and have them heard where they choose.

 

As mentioned, the architecture of the technology implements the methodology described above. Through an open-source approach, our prototype enables trust in facts by sourcing from a plurality of experts and witnesses. Decentralization is thus a potent strategy because it allows the most diverse set of voices to help shape the narrative toward this standard of truth. Although authentication occurs through decentralization, the media and perspectives brought forward by racialized communities are, and will continue to be, owned by them. As Dotan mentions in his article in The Independent, for human rights defenders to effectively capture events from diverse views, the ability to present original media remains vital to archiving evidence for future accountability.
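One way to picture how a plurality of experts and witnesses strengthens trust is as independent attestations accumulating against the same media fingerprint: each distinct party who vouches for a file adds weight, while repeated attestations from the same party do not. This is a hypothetical sketch of that reasoning, not Starling’s actual protocol; all names here are illustrative.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Attestation:
    media_digest: str  # cryptographic hash of the media file being vouched for
    attester_id: str   # identity of the expert or witness attesting to it

def independent_attesters(attestations, media_digest):
    """Count the distinct parties who attested to this exact digest."""
    return len({a.attester_id for a in attestations if a.media_digest == media_digest})

# A shared ledger of attestations from independent parties.
ledger = [
    Attestation("abc123", "archivist-1"),
    Attestation("abc123", "journalist-2"),
    Attestation("abc123", "journalist-2"),  # a duplicate adds no extra trust
    Attestation("zzz999", "archivist-1"),
]

assert independent_attesters(ledger, "abc123") == 2
assert independent_attesters(ledger, "zzz999") == 1
```

The design choice this illustrates: because trust derives from the number of distinct, independent voices rather than from any single authority, no one gatekeeper can suppress or rewrite the record, while the communities who captured the media retain ownership of it.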

 

Verifying or authenticating data is not the permanent end state we desire to achieve. Rather, verification, as noted with our transparent open-source methodology, is a state that continues to be questioned and examined. Although we continue to analyze the evolution of our output to best serve the needs of racialized communities, the positive outcomes that arise from authenticated media do allow the data provided by racialized communities to contribute to the larger discourse of piecing together the truth of a historical or present event. We hope that by being transparent in our processes we can protect the veracity of the evidence provided by racialized communities.

 

Muriam Fancy is a researcher at Starling Lab. She is finishing her Master of Global Affairs at the University of Toronto, with a background in security and technology policy. She is also a deep tech ethics researcher with the goal of understanding and analyzing the social implications of technology and the harms it can produce.