Comptroller-General v. Emotional Perception AI Appeal: Hearing update
16 May 2024
Our highlights and takeaways from this highly anticipated appeal hearing

Continuing our series of articles on AI-based inventions and the dispute between Emotional Perception AI Limited and the Comptroller-General of Patents, Designs & Trade Marks, the highly anticipated appeal hearing has now taken place. This crucial case, which could redefine the scope of patentability for Artificial Neural Networks (ANNs) in the UK, was the subject of extensive deliberations on 14th and 15th May at the UK Court of Appeal before Lord Justice Arnold, Lord Justice Birss and Lady Justice Nicola Davies.

With the judgment reserved for now, the Tech team here at Carpmaels & Ransford has been considering the key issues and discussions from the hearing. In this article, Robert Mitchell, Thomas Nichols and Laura Johnson present their personal highlights and takeaways.

What is a computer? What is a program for a computer?

The first ground of appeal raised by the Comptroller was that the lower court was wrong to hold that the “program for a computer…as such” exclusion was not engaged. Arnold LJ observed that this ground turned on a matter of law – specifically, the meaning of “computer” and “program” within Section 1(2) of the Patents Act 1977, and whether these terms were restricted to “if-then” architectures or encompassed any kind of machine for processing information.

The Comptroller argued that a trained ANN could be conceptually decoupled into two separate things – first, the generic ANN itself (i.e., a specific structure of nodes and edges arranged in layers); and second, the set of weights and biases acquired by training that enables the ANN to perform its function. The Comptroller acknowledged that the architecture of a generic ANN differed from that of a conventional digital computer, but asserted that it was nevertheless a valid form of computer (regardless of implementation) by virtue of its function: it takes input data, processes it via some mathematical manipulation, and outputs the calculated result. The Comptroller then asserted that the set of weights and biases in a trained ANN was “a set of instructions that make a generic computer perform a particular task”, and therefore amounted to a program for a computer. As support for this position, the Comptroller pointed to the wide range of different types and architectures of computer that exist (digital or analogue, 8-bit or 32-bit, classical or quantum), noting that the word “computer” has always been a broad term. On that basis, the Comptroller rejected the argument that an ANN could not be a computer merely because it does not use a conventional sequential imperative programming language taking the form of serial, logical, “if-then” type statements written by a human programmer that define exactly what the computer should do.
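To make the Comptroller’s proposed decoupling more concrete, the short Python sketch below (our own illustration, not taken from the case papers or the patent application) separates a generic ANN structure from the set of weights and biases that determines what that structure actually does; the layer sizes and the random stand-in weights are purely hypothetical.

```python
import numpy as np

# The "generic ANN": a fixed arrangement of nodes and edges in layers.
# Nothing in this structure determines what task the network performs.
LAYER_SIZES = [4, 8, 3]  # hypothetical layer sizes, for illustration only

def run_ann(inputs, weights, biases):
    """Feed-forward pass: the structure is fixed; the behaviour comes
    entirely from the supplied weights and biases."""
    activation = np.asarray(inputs, dtype=float)
    for w, b in zip(weights, biases):
        activation = np.tanh(w @ activation + b)
    return activation

# The alleged "program": a set of weights and biases. Here they are random
# stand-ins for values that would normally be acquired by training.
rng = np.random.default_rng(seed=0)
weights = [rng.standard_normal((n_out, n_in))
           for n_in, n_out in zip(LAYER_SIZES[:-1], LAYER_SIZES[1:])]
biases = [rng.standard_normal(n_out) for n_out in LAYER_SIZES[1:]]

print(run_ann([0.1, 0.2, 0.3, 0.4], weights, biases))
```

On the Comptroller’s analysis, swapping in a different set of weights and biases changes what the unchanged structure computes, much as loading a different program changes what a conventional computer does. Emotional Perception’s answer to this, set out below, is that such a division is artificial.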

In response, Emotional Perception argued that this definition of “computer” was far too broad, that it ignored the meaning of the term in common parlance, and that it encompassed entities which the Act was never intended to exclude (such as sextants, mechanical adding machines, hardware filters for mobile phones, slide rules, and even human beings). The term “computer”, when construed correctly (they said), would exclude an ANN since the latter has no central intelligence unit configured to sequentially execute instructions. Even if an ANN could be considered a computer, they argued, the weights and biases could not be construed as a “program”, since definitions of “computer program” all refer to “instructions” that make a computer do something. Emotional Perception argued that “instructions” implied an element of imperative or command to the computer that is lacking in an ANN: an “instruction”, they said, is not the same as a component which just happens to have a downstream effect on something else. Emotional Perception also argued against the conceptual division of an ANN into its structure and its “program” of weights and biases, which they said was artificial, since a real ANN is a complex electronic device with an intermingled overall structure that operates as a composite whole.

The discussion highlights the lack of an established definition of “computer” or “program for a computer” for the purposes of the Patents Act 1977. Such a definition is challenging to articulate, particularly given how significantly computer technology has evolved since the Act came into force. It is interesting to consider how a fair balance might be struck that accommodates future computing paradigms and architectures without unfairly excluding a technical device merely because it is “configurable” or “programmable”.

Does the mathematical method exclusion apply to artificial neural networks?

The third ground of appeal raised by the Comptroller was that the mathematical method exclusion should have been considered by the lower court. The Comptroller proposed that, as an alternative to the trained ANN being excluded from patentability as a program for a computer, it should be excluded as a mathematical method. In particular, the Comptroller suggested that, if the trained ANN could be decoupled from the wider system in which it sits, then what remained was a wholly mathematical method. In essence, the mathematical method exclusion is a further exclusion to consider should the Court of Appeal decide that the ANN could be decoupled from the underlying software implementing it in such a way that it is not a program for a computer. Indeed, Arnold LJ himself referred in the hearing to the recent European Patent Office (EPO) decision Mitsubishi (T 0702/20), in which the EPO found an ANN to be a mathematical method.

Emotional Perception suggested that the claimed ANN was not a mathematical method, even though it could be described mathematically. It also argued that, in contrast to Mitsubishi, the claimed ANN has a practical application and is therefore not a mathematical method as such.

The discussions surrounding this ground show that decisions of the EPO remain highly relevant authority in the UK. Indeed, earlier in the hearing Arnold LJ asked the Comptroller whether the UK Intellectual Property Office takes the same view as the EPO of what is technical (i.e. not excluded), a question the Comptroller answered in the affirmative.

Is the requirement for a technical purpose for excluded matter fair?

The fourth ground of appeal concerned whether the claimed invention made a technical contribution that would take it outside the exclusion defined by Section 1(2) of the Patents Act 1977. In other words, does the claimed invention relate to a program for a computer and/or a mathematical method “as such”?

Central to this discussion was the Aerotel test, which both the Comptroller and Emotional Perception agreed remains the appropriate test for determining whether an invention relates solely to excluded matter. There was some discussion about the nature of the “actual contribution” of the claimed invention, which forms the second step of the Aerotel test. Although the Comptroller and Emotional Perception agreed that the actual contribution was the “sending of an improved recommendation message”, this was questioned by Arnold LJ, who suggested that the actual contribution might instead lie in the training process of the ANN rather than in the output it provides. This demonstrates how challenging it can be, in practical terms, to determine the “actual contribution” under the Aerotel test.

Viewing the actual contribution as sending an improved recommendation message, the Comptroller contended that this was insufficient to take the claimed invention out of the exclusion, as a “technical purpose” beyond merely sending a message or alert was necessary. They noted that the recommended file was only “better” in an aesthetic sense, not a technical one, aligning with EPO case law in Yahoo (T 0306/10). Emotional Perception argued that the mere act of sending a recommendation message took the claim outside of the exclusion, referring to the nexus between the computer program and its underlying hardware.

When discussing whether the actual contribution might lie in the training of the ANN, the Comptroller admitted that a clever computer program is nonetheless still a computer program, and that without a further technical purpose the exclusion is not avoided. The Comptroller explained that if the technology were analogue electronics, such as a 1950s jukebox that also provided improved recommendations, the requirement for a technical purpose (in the form of a technical output) would not arise, but that it is a policy decision of the legislature that the requirement does arise for a computer program producing the same output. This raises interesting questions about whether the technical purpose requirement for inventions involving excluded matter is fair, and whether the UK Intellectual Property Office’s established principle of “substance not form” is truly being followed.

Conclusion

After two days of intense discussions at the hearing, we await the forthcoming judgment with keen interest. Our analysis today has highlighted some of the pivotal moments and key insights that emerged from the proceedings. As always, we will report on the judgment once it is delivered and analyse how it might affect the patentability of ANNs and related computer technology in the UK.