The Globe and Mail, with Petra Molnar, 31 May 2018
The Canadian government is hoping to use artificial intelligence to inform how it mounts legal challenges to immigration and refugee claims as part of a new pilot program, set to move ahead in June.
Ottawa plans to use the emerging technology to reduce the need for government lawyers to perform their own legal research, a costly and time-consuming process. But there are already concerns that the nuanced and difficult nature of many refugee and immigration claims may be lost on government computer systems, with potentially serious human-rights implications.
And the government admits that it eventually hopes to use artificial intelligence and machine learning to help determine refugee applications themselves.
The plan is laid out in a request for information issued to industry in April.
The request for information asks companies to submit their own technological solutions for a joint pilot project, run by Immigration, Refugees and Citizenship Canada and Justice Canada, to “support case law and legal research, facilitate trend analysis in litigation, predict litigation outcomes, and help develop legal advice and assessments,” according to a government spokesperson.
The government hopes that, if the pilot is a success, front-line immigration officials in Canada and abroad could use this technology “to aid in their assessment of the merits of an application before decisions are finalized.”
Petra Molnar, a researcher at the University of Toronto’s International Human Rights Program who specializes in immigration law, told The Globe and Mail there is a large, outstanding question regarding what, exactly, the point of this program is. “Is it money saved? Time saved? Applications accepted or denied?” she said.
Lex Gill, a research fellow with the Citizen Lab at the Munk School of Global Affairs who specializes in artificial intelligence and public policy, said replacing human work with algorithms, especially if the data that goes into them is itself flawed, will always pose potential human-rights concerns. “There is the risk that instead of improving the situation, these technologies may just entrench or encourage unfair or discriminatory practices,” she said.
But while the debate around those risks continues, Ottawa is already preparing to roll out the pilot project, which will apply specifically to those applying for pre-removal risk assessments and for immigration status on humanitarian and compassionate grounds – two types of immigration claims designed to protect people with strong ties to Canada and those who could face risks to their lives if deported.
“These applications are often the very last resort for extremely deserving cases,” Ms. Molnar said. “Errors can have serious consequences for their lives, including deportations back to danger, prolonged family separation and serious safety concerns.”
She added that these systems may further the “asymmetry” of a legal system that is already largely stacked against refugee claimants. There is growing worry that legal aid and support for refugees isn’t keeping pace with the influx of requests, with the Canadian Bar Association calling the current situation a “crisis” in a 2017 letter to the federal Justice Minister.
A spokesperson for Immigration, Refugees and Citizenship Minister Ahmed Hussen said the new tool being sought by his department would provide “new insights” and promote efficiency.
He stressed that the department is “only gathering information at this stage” and that “any tool the department acquires or develops would be tested and piloted before implementation to ensure it is working as intended.”
The spokesperson did not answer specific questions as to whether the government has conducted analysis thus far of the risks of this technology, or whether specific regulations or guidelines were being produced to govern how it would be used.
Ms. Gill did see positive signs in the documents, noting that the request for information asks potential partners on the project how the algorithms can “be developed to ensure biases or potential biases are not introduced,” and how they could ensure transparency in the algorithms.
The government set a deadline of June 7 for responses to its request for information, after which contracts for the technology may be awarded.