Are Decision-Makers Being Replaced with AI?

The Federal Government’s Directive on Automated Decision-Making (the “Directive”) was revamped in April 2023. Originally introduced in 2019, the Directive aims to improve the delivery of administrative decisions through artificial intelligence (“AI”), with the goal of using automated systems both to make and to support administrative decision-making.[1] But even with its revamp, the Directive does not necessarily mean that important administrative decisions are being made solely by AI – yet.

For now, AI in administrative decision-making appears limited to peripheral processes rather than the decisions themselves. For example, Chinook is a program used by Immigration, Refugees and Citizenship Canada (“IRCC”) to process immigration files more efficiently. Chinook presents information from IRCC’s Global Case Management System in a more user-friendly way, aiming to increase user productivity. It is not an AI decision-maker; rather, it is a tool designed to consolidate information in one place.[2]

While Chinook does not (yet) make decisions for IRCC, its characterization as an AI “decision-maker” has raised concerns for visa applicants affected by the program. In Haghshenas v. Canada (Citizenship and Immigration), 2023 FC 464, a visa applicant raised concerns about the reliability and efficacy of the software, arguing that a decision involving Chinook cannot be considered “reasonable” until it is explained how machine learning replaced human input and how that replacement affected visa application outcomes. The Federal Court rejected this argument, stating, “whether a decision is reasonable or unreasonable will determine if it is upheld or set aside, whether or not artificial intelligence was used.”

The Court further stated that the use of AI was irrelevant because a visa officer ultimately made the decision.[3] Similar findings were made in cases raising the same issue, such as Khorasgani v. Canada (Citizenship and Immigration), 2023 FC 1581 and Kumar v. Canada (Citizenship and Immigration), 2024 FC 81. For now, the Federal Court does not appear concerned about the implications of AI for procedural fairness or administrative decision-making generally: programs like Chinook only organize information and do not currently decide the fate of applicants.

The implementation of AI into administrative decision-making processes will not go untested, or unnoticed by those affected. The Directive sets out several requirements that must be met before an automated decision system is implemented. For example, a program must undergo an Algorithmic Impact Assessment before use, and notice must be provided that a decision is being made by an automated decision system. Non-compliance with the Directive’s requirements may be penalized under the enforcement provisions of the Financial Administration Act, R.S.C. 1985, c. F-11.[4]

Hearing that vital, life-changing decisions can be made by AI may sound concerning to those affected; however, the aim of the Directive is to improve service delivery. This could mean faster, more thorough decisions, with AI assembling in minutes information that would take one individual weeks to gather. There are potential downsides, however. While AI decisions may be more streamlined and consistent, AI bias, the phenomenon whereby AI perpetuates human bias, could also raise procedural fairness concerns.[5]

The introduction of AI into more administrative decision-making processes may lead to many improvements, but it is likely to face a few hiccups once decision-makers begin to rely solely on AI to decide outcomes. For now, we will need to wait and see. Only time will tell how the courts will assess the reasonableness and procedural fairness of decisions made solely by AI decision-makers.

If you have questions about administrative processes generally, please contact Miranda Wardman at [email protected] or by phone at 250-869-1278.

[1] https://www.tbs-sct.canada.ca/pol/doc-eng.aspx?id=32592
[2] https://www.canada.ca/en/immigration-refugees-citizenship/corporate/transparency/committees/cimm-feb-15-17-2022/chinook-development-implementation-decision-making.html
[3] Haghshenas v. Canada (Citizenship and Immigration), 2023 FC 464 at paras 22-24, 28.
[4] https://www.tbs-sct.canada.ca/pol/doc-eng.aspx?id=32592
[5] https://www.ibm.com/blog/shedding-light-on-ai-bias-with-real-world-examples/

The content made available on this website has been provided solely for general informational purposes as of the date published and should NOT be treated as or relied upon as legal advice. It is not to be construed as a representation, warranty, or guarantee, and may not be accurate, current, complete, or fit for a particular purpose or circumstance. If you are seeking legal advice, a professional at Pushor Mitchell LLP would be pleased to assist you in resolving your legal concerns in the context of your particular circumstances.

It is prohibited to reproduce, modify, republish, or in any way use content from this website without express written permission from the Chief Operating Officer or the Managing Partner at Pushor Mitchell LLP. Third party content that references this publication is not endorsed by Pushor Mitchell LLP and in no way represents the views of the firm. We do not guarantee the accuracy of, nor accept responsibility for the content of any source that may link, quote, or reference this publication.

