‘Godfather of AI’ among duo awarded Nobel Prize in Physics

Published October 9, 2024 Updated October 9, 2024 11:20am

Geoffrey Hinton

PARIS: The Nobel Prize in Physics was awarded to two scientists on Tuesday for discoveries that laid the groundwork for artificial intelligence (AI) used by hugely popular tools such as ChatGPT.

British-Canadian Geoffrey Hinton, known as a “godfather of AI,” and US physicist John Hopfield were given the prize for “discoveries and inventions that enable machine learning with artificial neural networks,” the Nobel jury said. But what are those, and what does this all mean? Here are some answers.

What are neural networks and machine learning?

Mark van der Wilk, an expert in machine learning at the University of Oxford, said an artificial neural network is a mathematical construct “loosely inspired” by the human brain. Our brains have a network of cells called neurons, which respond to outside stimuli — such as things our eyes have seen or ears have heard — by sending signals to each other.

When we learn things, some connections between neurons get stronger, while others get weaker. Unlike traditional computing, which works more like reading a recipe, artificial neural networks roughly mimic this process.

The biological neurons are replaced with simple calculations sometimes called “nodes” — and the incoming stimuli they learn from are replaced by training data. The idea is that this could allow the network to learn over time — hence the term machine learning.
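To make the idea concrete, the toy Python sketch below (an illustration of the general principle only, not a reconstruction of the laureates’ work) trains a single artificial “node” on a handful of examples, nudging its connection weights up or down each time it sees the training data.

```python
# A single artificial "node": a weighted sum of inputs passed through a
# threshold, with the connection weights adjusted as it sees training data.
# Purely illustrative; not how the prize-winning systems were built.
import numpy as np

weights = np.zeros(2)   # the "connections", strengthened or weakened by learning
bias = 0.0

# Toy training data: learn the logical AND of two inputs.
inputs  = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
targets = np.array([0, 0, 0, 1])

for _ in range(50):                       # repeated exposure to the data
    for x, t in zip(inputs, targets):
        prediction = 1 if weights @ x + bias > 0 else 0
        error = t - prediction
        weights += 0.1 * error * x        # strengthen or weaken connections
        bias    += 0.1 * error

print([1 if weights @ x + bias > 0 else 0 for x in inputs])   # -> [0, 0, 0, 1]
```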


What did Hopfield discover?

But before machines could learn, another human trait was needed: memory.

Ever struggle to remember a word? Consider the goose. You might cycle through similar words — goon, good, ghoul — before striking upon goose.

“If you are given a pattern that’s not exactly the thing that you need to remember, you need to fill in the blanks,” van der Wilk said. “That’s how you remember a particular memory.” This was the idea behind the “Hopfield network” — also called “associative memory” — which the physicist developed back in the early 1980s.

Hopfield’s contribution meant that when an artificial neural network is given something that is slightly wrong, it can cycle through previously stored patterns to find the closest match.
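The mechanism can be sketched in a few lines of Python. In the toy below (the patterns and sizes are made up for illustration), two patterns are stored in a weight matrix; when a corrupted version of one is presented, repeated updates pull it back to the closest stored memory.

```python
# Toy Hopfield-style associative memory: store patterns, then recover the
# closest one from a corrupted probe. Illustrative only.
import numpy as np

patterns = np.array([
    [ 1,  1,  1, -1, -1, -1],   # stored memory A
    [-1, -1,  1,  1,  1, -1],   # stored memory B
])

# Hebbian storage: connections between units that agree across patterns get stronger.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)

probe = np.array([1, -1, 1, -1, -1, -1])   # memory A with one unit flipped

state = probe.copy()
for _ in range(5):                          # repeated updates "fill in the blanks"
    state = np.where(W @ state >= 0, 1, -1)

print(state)   # -> [ 1  1  1 -1 -1 -1], the closest stored pattern
```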

This proved a major step forward for AI.

What about Hinton?

John Hopfield

In 1985, Hinton revealed one of his own contributions to the field: the Boltzmann machine.

Named after 19th-century physicist Ludwig Boltzmann, the concept introduced an element of randomness. This randomness is ultimately why today’s AI-powered image generators can produce endless variations on the same prompt.

Hinton also showed that the more layers a network has, “the more complex its behaviour can be”.

This in turn made it easier to “efficiently learn a desired behaviour”, French machine learning researcher Francis Bach said.
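What that randomness looks like in practice can be suggested with the kind of stochastic unit a Boltzmann machine is built from: each unit switches on with a certain probability rather than deterministically, so the same input can give a different answer on every run. The weights below are arbitrary, chosen only for illustration.

```python
# Stochastic hidden units: the same visible input can produce different
# hidden activations each time, because each unit is sampled from a probability.
# Weights and sizes are made up for illustration.
import numpy as np

rng = np.random.default_rng()

def sample_hidden(visible, W, b):
    prob_on = 1.0 / (1.0 + np.exp(-(W @ visible + b)))   # probability each unit turns on
    return (rng.random(prob_on.shape) < prob_on).astype(int)

W = np.array([[ 0.5, -0.3],
              [-0.2,  0.8],
              [ 0.1,  0.4]])        # 3 hidden units, 2 visible units
b = np.zeros(3)
visible = np.array([1, 0])          # the same input every time

for _ in range(3):
    print(sample_hidden(visible, W, b))   # typically differs from run to run
```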

Despite these ideas being in place, many scientists lost interest in the field in the 1990s.

Machine learning required enormously powerful computers capable of handling vast amounts of information. It takes millions of images of dogs and cats for these algorithms to learn to tell one from the other. So it was not until the 2010s that a wave of breakthroughs “revolutionised everything related to image processing and natural language processing,” Bach said.

From reading medical scans to directing self-driving cars, forecasting the weather to creating deepfakes, the uses of AI are now too numerous to count.

But is it really physics?

Hinton had already won the Turing Award, which is considered the Nobel of computer science. But several experts said his Nobel win in physics was well deserved, since it was physics that started science down the road that would lead to AI.

French researcher Damien Querlioz pointed out that these algorithms were originally “inspired by physics, by transposing the concept of energy onto the field of computing”.

Van der Wilk said the first Nobel “for the methodological development of AI” acknowledged the contribution of the physics community, as well as the winners.

And while ChatGPT can sometimes make AI seem genuinely creative, it is important to remember the “machine” part of machine learning. “Ultimately, everything in AI is multiplications and additions,” van der Wilk emphasised.
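That point can be seen in a couple of lines: one “layer” of a neural network is literally a matrix multiplication followed by an addition (the numbers here are invented for illustration).

```python
# One network layer = multiplications, then additions. Numbers are made up.
import numpy as np

x = np.array([0.2, 0.7])                 # input
W = np.array([[0.1, 0.5],
              [0.3, 0.4],
              [0.6, 0.2]])               # "learned" weights (invented here)
b = np.array([0.1, 0.0, 0.2])            # "learned" biases (invented here)

print(W @ x + b)                         # -> approximately [0.47 0.34 0.46]
```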

Published in Dawn, October 9th, 2024
