Whose data is it anyway? Indigenous voices call for accountability and data sovereignty in AI

Calgary Innovation Week runs from Nov. 13-21, 2024.


By Abigail Gamble
November 18, 2024
DIGITAL JOURNAL

Derek Eiteneier, Hayden Godfrey, Natasha Rabsatt, and Renard Jenkins (left to right) spoke at an Indigitech Destiny Innovation Summit panel on Monday, Nov. 18. — Photo by Jennifer Friesen, Digital Journal

“Any usage of data that does not support data sovereignty, that does not support our economic reconciliation, does not support our interests, constitutes data genocide,” said Hayden Godfrey, director of Indigenous relations at Eighth Cortex.

Godfrey was on stage at Platform Calgary Innovation Centre this morning for the Indigitech Destiny Innovation Summit talking about data sovereignty and how to ethically integrate Indigenous knowledge, language and cultural data into artificial intelligence (AI) systems.

He was joined at the Calgary Innovation Week event by fellow panelists Natasha Rabsatt, co-founder of If These Lands Could Talk, Renard Jenkins, president and CEO of I2A2 Technologies, Studios & Labs, and Derek Eiteneier, CEO of Outdoor Gala.

Together, they explored how to protect Indigenous cultures and what steps are necessary to build AI systems that foster inclusion rather than exploitation.

For Jenkins, AI tech itself isn’t what presents the greatest challenge to inclusion, but rather the people and power structures behind it.

“We should not be so concerned about the technology [of AI], but we should be concerned about who’s wielding the technology and who’s controlling the technology, and where that center of power comes in, with the technology,” he said.

Renard Jenkins is the president and CEO of I2A2 Technologies, Studios & Labs. — Photo by Jennifer Friesen, Digital Journal

Here’s what data sovereignty means and why it matters so much

Data sovereignty ensures that data remains under the control of the communities it represents, a concept Jenkins sees as fundamental to ethical use of AI.

“One of the key things that we have to pay attention to is what data is being used for the foundational model of whichever AI system that you’re using,” he explained.

“That’s where we have the biggest opportunity right now to make sure that our foundational models look like the world that we live in — instead of looking like sometimes the individuals or the groups that actually build the models.”

Adding to the discussion, Eiteneier noted that often with AI tools “there’s bias in the overall data, or it’s missing data altogether.”

This incomplete picture can lead to misinformation or skewed representations — especially of minority or marginalized communities — if not carefully addressed.

“When we’re looking at Native and Indigenous communities, I think there is a lot of apprehension around how these technologies can actually be used, and be representative of cultures that were not historically [well] represented,” Rabsatt noted as well.

But she also emphasized the need to balance protecting cultural sovereignty with embracing AI’s potential.

“I think if we see AI as a tool to augment our intelligence, and to automate, I think we can do something positive with that — with mindfulness, of course — and working together with other people and communities that have the same values.”

Natasha Rabsatt is the co-founder of If These Lands Could Talk. — Photo by Jennifer Friesen, Digital Journal

What the path forward for data sovereignty protection could look like

When it comes to establishing data sovereignty policies, Rabsatt highlighted the importance of determining what information should remain private and what should be shared.

“Especially with culturally sensitive information, it’s about asking: ‘What is it we don’t want in there? What shouldn’t be open source?’ Then we decide what information we do want to input, ensuring that it creates economic advantages for our community,” she said.

Jenkins emphasized the need for global collaboration in building ethical AI systems, warning against centralized power in the hands of, say, a few large companies.

“There are literally about seven or eight large language models that the majority of the artificial intelligence tools … are actually built upon. At this time, we do not have access to how those models were built, whose data was used for those models,” he explained.

That said, he also brought up some of the challenges of figuring out how to gain access to — and establish remuneration models for — culturally specific AI data.

— Photo by Jennifer Friesen, Digital Journal

“If we go into a regulated state where all of a sudden, individuals are forced to have to reveal what’s in their models, they’re forced to actually compensate the individuals whose IP has been utilized, we may see a lot of these models actually implode, because the cost will be much higher than what they could actually sustain.”

From a regulatory perspective, Godfrey proposed the creation of a binding code of ethics to ensure that AI developers respect Indigenous sovereignty and approach their work with transparency and accountability.

“I would like to see the development of a code of ethics that tech professionals need to abide by, not optionally, but have a mandate to abide by in interacting with this data,” he said.

“We need to ensure that technology is aligned with Indigenous values, that it serves as a tool for justice and reconciliation rather than exploitation. And that starts with respecting sovereignty, one ethical choice at a time.”
