OTTAWA -- The Assembly of First Nations is warning it could take the Liberal government to court over its proposed privacy and artificial intelligence bill.
And former tech executive Jim Balsillie is telling MPs studying the bill that he considers the legislation "anti-democratic."
The government has already been criticized for failing to consult widely enough on Bill C-27, which updates privacy laws and introduces the Artificial Intelligence and Data Act.
Balsillie, the former co-CEO of BlackBerry pioneer Research In Motion, says the government did no public consultations and relied too heavily on feedback from industry rather than civil society.
Indigenous leaders said First Nations weren't consulted at all.
"As a result, the minister did not hear First Nations, does not understand First Nations, and it shows in the legislation," the Assembly of First Nations said in a brief submitted to the House of Commons industry committee.
It said the bill infringes on the rights of First Nations, including on data sovereignty, and that litigation is "likely" if the government doesn't meet its obligations.
The committee also heard from Christelle Tessono, a tech policy researcher at the University of Toronto, who said the bill doesn't address human rights risks that AI systems can cause.
She said at a minimum, the preamble to the bill should "acknowledge the well-established disproportionate impact these systems have on historically marginalized groups," such as Indigenous Peoples, people of colour, members of the LGBTQ2S+ community and economically disadvantaged individuals.
During his testimony, Balsillie outlined some of what he called "countless" incidents of harm by AI systems. He said that includes cases where they have facilitated housing discrimination, made racist associations, shown job postings to men but not women and recommended longer prison sentences for visible minorities.
The Assembly of First Nations also said it has concerns about AI, including racial profiling.
"First Nations have been treated as criminals when they try to open bank accounts and they have been subject to racial profiling in the health sector, by police, and government officials," it said in its brief.
"Imagine the potential for such abuse to continue or even worsen when biased and prejudiced individuals and organizations are building AI systems that will implicate First Nations."
The bill does little to reassure First Nations, it said.
Balsillie said the bill needs to be sent back to the drawing board.
"Rushing to pass legislation so seriously flawed will only deepen citizens' fears about AI because AIDA merely proves that policymakers can't effectively prevent current and emerging harms from emerging technologies."
This report by The Canadian Press was first published Feb. 14, 2024.