Jul 02, 2025

When AI Speaks: How Ghanaian Courts Might Decide Moffatt v Air Canada under the Electronic Transactions Act

Introduction: The Chatbot That Spoke Out of Turn

Picture this: you’re grieving the death of a loved one and urgently need to fly home. You visit your airline’s website and, like many of us in the digital age, you engage with the friendly chatbot who pops up on your screen. You ask about bereavement fares, and the AI politely informs you that you can buy a regular ticket and claim a refund later. Relieved, you purchase your ticket—only to discover later that the airline’s real policy doesn’t allow such refunds.

This isn’t fiction. This is precisely what happened in Moffatt v Air Canada, a real-life legal drama decided in 2024 by the British Columbia Civil Resolution Tribunal (CRT). The chatbot’s misstatement cost the passenger, Mr. Moffatt, hundreds of dollars. Air Canada tried to argue that its AI chatbot was a separate entity and that it shouldn’t be held responsible for the bot’s bad advice. The tribunal rejected that notion entirely and held the airline liable.

Now, let’s bring this story home. Suppose a Ghanaian airline used an AI chatbot and the same misrepresentation occurred under Ghanaian law. How would our courts handle it? Would Ghana’s legal system hold the airline liable, or would it accept the defence that a chatbot is somehow legally separate from its creator?

Buckle up: we’re about to explore how Ghana’s Electronic Transactions Act, 2008 (Act 772) stands uniquely prepared for the age of AI misadventures—and why the Ghanaian legal landscape is surprisingly progressive in addressing the liabilities that AI-powered solutions might trigger.

The Canadian Case: Moffatt v Air Canada in Brief

Let’s first zoom in on the facts.

  • Mr. Moffatt asked Air Canada’s website chatbot about bereavement fares.
  • The chatbot responded inaccurately, advising him he could book a regular ticket and claim a partial refund afterwards.
  • Air Canada’s actual policy required bereavement fares to be arranged in advance.
  • Mr. Moffatt relied on the chatbot, bought his ticket, and was later denied the refund.
  • Air Canada argued the chatbot was “a separate legal entity.”
  • The CRT rejected that defence and held Air Canada liable for the misinformation.

The CRT’s reasoning was simple but powerful: a company is responsible for representations made through its digital systems, AI or otherwise. The tribunal declared that consumers have the right to rely on information they receive from official business channels—even if those channels are automated. It’s a case destined to be a tech-law classic.

Ghana’s Legal Framework: The Electronic Transactions Act, 2008 (Act 772)

Ghana’s Electronic Transactions Act, 2008 (Act 772) predates the explosive rise of AI, but its language is broad and forward-looking. It defines an “electronic agent” as:

“a computer programme or an electronic or other automated means used independently to initiate an action or respond to electronic records or performances in whole or in part, in an automated transaction.”

Let’s pause here. That language is beautifully open-ended. It doesn’t limit the definition to basic rule-based systems. It comfortably includes modern AI chatbots, machine learning algorithms, and other smart technologies that “initiate” actions or “respond” autonomously.

So, in Ghana, Air Canada’s chatbot would undoubtedly qualify as an electronic agent. The real question becomes: if a chatbot misleads a customer, who bears the legal consequences?

The Ghanaian Scenario: Air Ghana’s Hypothetical Chatbot

Let’s create a hypothetical:

  • Air Ghana deploys an AI chatbot on its website.
  • A passenger, Ms. Adjoa, inquires about bereavement fares.
  • The chatbot erroneously tells her she can book a normal fare and get a refund later.
  • Ms. Adjoa buys a ticket based on this advice.
  • Later, Air Ghana refuses the refund, citing its real policy requiring prior arrangement.

Would Air Ghana escape liability in Ghana’s courts by claiming, like Air Canada, that the chatbot was an independent actor?

The Law in Action: Section 116 and the Principle of Imputed Intent

Under Section 116 of Act 772, Ghanaian law is crystal clear:

“A person who uses any electronic medium or any electronic agent whether in part or in whole is deemed to intend to cause or contribute to causing the event which results from the use or intervention of the electronic medium or agent.”

That’s powerful. It means that when a company uses a chatbot, the law presumes that the company intends the outcomes generated by the chatbot’s communications. Unlike Air Canada’s argument, Ghanaian law doesn’t allow companies to hide behind the “independence” of their electronic agents.

In Ms. Adjoa’s case, Air Ghana would be deemed to have intended the chatbot’s representation about refunds. That’s enough to put the airline on the hook for any resulting loss.


Transparency and Access: Section 17(2) and (3)

But there’s even more in Act 772 to protect consumers like Ms. Adjoa. Under Section 17(2):

“A party interacting with an electronic agent to make an agreement is not bound by the terms of the agreement unless the terms were capable at first of being accessed by the party prior to the formation of the contract.”

So if Air Ghana’s real bereavement policy wasn’t visible or easily accessible when Ms. Adjoa bought her ticket, the airline couldn’t enforce the contract terms that denied her a refund.

Moreover, Section 17(3) adds:

“An electronic contract is not valid where an individual interacts directly with the electronic agent and has made a material error… and the electronic agent did not provide that person with an easy opportunity to prevent or correct the error.”

Here, the chatbot’s bad advice creates a “material error.” If Air Ghana’s website didn’t offer Ms. Adjoa a clear way to verify the policy or correct her misunderstanding, the contract might be invalid altogether.

In short, Ghanaian law has built-in safeguards for consumers dealing with chatbots and AI systems. That’s remarkably forward-thinking for legislation passed in 2008.

An Analogy Closer to Home: The Automated Teller Machine (ATM)

To grasp how Ghanaian courts might view this, consider an analogy closer to home: the ATM. Suppose an ATM displayed a wrong balance and dispensed more cash than it should have. Banks can’t argue the ATM acted independently and disclaim responsibility. The machine, while automated, is part of the bank’s operational structure. Courts would unquestionably hold the bank liable for the machine’s mistake.

The chatbot functions in precisely the same way. It’s merely a digital extension of the company. In legal terms, it’s the company’s “electronic hand.” Ghanaian courts would likely take that view without hesitation.

A Progressive Legal Framework

Far from lagging behind, Ghana’s legal system is strikingly well-equipped for AI compared with some other jurisdictions. The Electronic Transactions Act was crafted to foster trust and legal certainty in digital transactions. Its broad definitions and consumer safeguards show remarkable foresight. In Moffatt’s Canadian case, the CRT leaned heavily on fairness and consumer reliance principles. Ghanaian law goes further by codifying similar principles directly into statutory language. Where Canada’s decision was innovative, Ghana’s law is structurally ready.

Moreover, Ghana’s courts have demonstrated a willingness to engage with technological disputes. For example, in Atuguba & Associates v Scipion Capital (UK) Ltd and Holman Fenwick Willan LLP [2019] Civil Appeal No. J4/04/2019, the Supreme Court recognised the enforceability of electronic evidence, showing an appetite for integrating technology into legal processes. It’s reasonable to predict the judiciary would handle AI liability with similar pragmatism.

Predicting the Outcome: Ms. Adjoa Wins

Putting all these provisions together, here’s how a Ghanaian court would likely rule in our hypothetical:

  1. The chatbot is legally an electronic agent under section 144.
  2. Air Ghana is deemed to intend the chatbot’s statements under section 116.
  3. The airline can’t enforce hidden terms if they weren’t accessible under section 17(2).
  4. Any contract formed based on the chatbot’s error could be invalid under section 17(3).
  5. Air Ghana would be liable for Ms. Adjoa’s financial loss, just like Air Canada was held liable in Canada.

The Future Is Now

The moral of the story is simple: AI systems are powerful tools—but they’re also legal minefields if not properly managed. Businesses can’t deploy chatbots, virtual assistants, or other electronic agents and then disclaim responsibility for their errors. In Ghana, the law already recognises this truth.

The Electronic Transactions Act, 2008, remains a shining example of technology-neutral drafting that gracefully accommodates new digital realities like AI. It may have been passed when iPhones were still a novelty, but its principles resonate loudly in the age of chatbots and machine learning. In the world of tech law, Ghana’s message is clear: when your AI speaks, it’s you speaking. And for consumers, that’s a reassuring note in an increasingly automated world.


Footnotes

  1. Electronic Transactions Act, 2008 (Act 772), s. 144.
  2. Ibid., s. 116.
  3. Ibid., s. 17(2).
  4. Ibid., s. 17(3).
  5. Atuguba & Associates v Scipion Capital (UK) Ltd and Holman Fenwick Willan LLP [2019] Civil Appeal No. J4/04/2019 (SC).