Geoffrey Hinton

PARIS: The Nobel Prize in Physics was awarded to two scientists on Tuesday, for discoveries that laid the groundwork for artificial intelligence (AI) used by hugely popular tools such as ChatGPT.

British-Canadian Geoffrey Hinton, known as a “godfather of AI,” and US physicist John Hopfield were given the prize for “discoveries and inventions that enable machine learning with artificial neural networks,” the Nobel jury said. But what are those, and what does this all mean? Here are some answers.

What are neural networks and machine learning?

Mark van der Wilk, an expert in machine learning at the University of Oxford, said an artificial neural network is a mathematical construct “loosely inspired” by the human brain. Our brains have a network of cells called neurons, which respond to outside stimuli — such as things our eyes have seen or ears have heard — by sending signals to each other.

When we learn things, some connections between neurons get stronger, while others get weaker. Unlike traditional computing, which works more like reading a recipe, artificial neural networks roughly mimic this process.

The biological neurons are replaced with simple calculations sometimes called “nodes” — and the incoming stimuli they learn from are replaced by training data. The idea is that this could allow the network to learn over time — hence the term machine learning.
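For readers who want a concrete picture, the node-and-training-data idea above can be sketched in a few lines of Python. This is an illustrative toy (a classic perceptron-style rule, not any laureate's actual code): a single “node” weights its inputs, sums them, and fires if the sum crosses zero, and “learning” nudges the weights whenever its answer is wrong.

```python
import numpy as np

# One artificial "node": weighted sum of inputs, fire if the sum is positive.
def node(weights, inputs):
    return 1 if weights @ inputs > 0 else 0

# Training data: learn the logical AND of two inputs (first input is a bias of 1).
data = [((1, 1, 1), 1), ((1, 1, 0), 0), ((1, 0, 1), 0), ((1, 0, 0), 0)]

w = np.zeros(3)
for _ in range(10):                 # pass over the training data a few times
    for x, target in data:
        x = np.array(x)
        error = target - node(w, x)
        w += 0.5 * error * x        # strengthen or weaken connections

print([node(w, np.array(x)) for x, t in data])  # → [1, 0, 0, 0]
```

After a handful of passes the weights settle and the node reproduces the training targets — connections that helped were strengthened, those that hurt were weakened, just as the article describes.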


What did Hopfield discover?

But before machines could learn, another human trait was needed: memory.

Ever struggle to remember a word? Consider the goose. You might cycle through similar words — goon, good, ghoul — before striking upon goose.

“If you are given a pattern that’s not exactly the thing that you need to remember, you need to fill in the blanks,” van der Wilk said. “That’s how you remember a particular memory.” This was the idea behind the “Hopfield network” — also called “associative memory” — which the physicist developed back in the early 1980s.

Hopfield’s contribution meant that when an artificial neural network is given something that is slightly wrong, it can cycle through previously stored patterns to find the closest match.

This proved a major step forward for AI.
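The fill-in-the-blanks recall that van der Wilk describes can be sketched as a tiny Hopfield-style network. This is a simplified illustration, not Hopfield's original formulation: patterns are vectors of +1/−1, the weights are set by the Hebbian rule, and recall repeatedly updates each unit until the state settles on the closest stored pattern.

```python
import numpy as np

# Store patterns as an associative memory (Hebbian rule, no self-connections).
def train(patterns):
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)
    np.fill_diagonal(w, 0)
    return w / len(patterns)

# Recall: repeatedly update each unit until the state stops changing.
def recall(w, state, steps=10):
    s = state.copy()
    for _ in range(steps):
        for i in range(len(s)):
            s[i] = 1 if w[i] @ s >= 0 else -1
    return s

stored = np.array([[1, -1, 1, -1, 1, -1]])        # one remembered pattern
w = train(stored)
noisy = np.array([1, -1, -1, -1, 1, -1])          # same pattern, one bit flipped
print(recall(w, noisy))                           # settles back to the stored pattern
```

Given the corrupted input, the network “cycles through” its dynamics and lands on the nearest stored memory — goon and ghoul give way to goose.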

What about Hinton?

John Hopfield

In 1985, Hinton revealed his own contribution to the field — or at least one of them — called the Boltzmann machine.

Named after 19th-century physicist Ludwig Boltzmann, the concept introduced an element of randomness. This randomness is ultimately why today’s AI-powered image generators can produce endless variations of the same prompt.
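The element of randomness can be illustrated with a single stochastic unit — a much-simplified sketch, not Hinton's full Boltzmann machine. Instead of firing deterministically, the unit fires with a probability determined by its input, so the same input can yield different outputs on different runs.

```python
import math
import random

# A stochastic unit: fires with probability given by a sigmoid of its input.
# Higher "temperature" means more randomness; lower means more deterministic.
def stochastic_unit(total_input, temperature=1.0):
    p_fire = 1.0 / (1.0 + math.exp(-total_input / temperature))
    return 1 if random.random() < p_fire else 0

random.seed(0)
samples = [stochastic_unit(0.5) for _ in range(20)]
print(samples)  # a mix of 0s and 1s; a different seed gives a different mix
```

Run it twice and you get two different answers to the same input — the same underlying reason an image generator gives endless variations of one prompt.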

Hinton also showed that the more layers a network has, “the more complex its behaviour can be”.

This in turn made it easier to “efficiently learn a desired behaviour”, French machine learning researcher Francis Bach said.
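The point about layers can be made concrete with a textbook example (hand-wired here for illustration, with weights chosen by hand rather than learned): no single threshold node can compute the XOR of two inputs, but two layers of the very same nodes can.

```python
import numpy as np

# A layer of threshold nodes: weighted sums plus biases, fire where positive.
def layer(weights, biases, inputs):
    return (weights @ inputs + biases > 0).astype(float)

# Hidden layer computes OR-like and AND-like features; output fires for
# "OR and not AND" — which is exactly XOR.
w_hidden = np.array([[1.0, 1.0],     # OR-like node
                     [1.0, 1.0]])    # AND-like node
b_hidden = np.array([-0.5, -1.5])
w_out = np.array([[1.0, -1.0]])
b_out = np.array([-0.5])

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    h = layer(w_hidden, b_hidden, np.array(x, dtype=float))
    y = layer(w_out, b_out, h)
    print(x, int(y[0]))  # → prints the XOR of the two inputs
```

Adding the hidden layer lets the network express a behaviour a single node cannot — a small instance of the general point that depth buys complexity.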

Despite these ideas being in place, many scientists lost interest in the field in the 1990s.

Machine learning required enormously powerful computers capable of handling vast amounts of information. It takes millions of images of dogs for these algorithms to be able to tell a dog from a cat. So it was not until the 2010s that a wave of breakthroughs “revolutionised everything related to image processing and natural language processing,” Bach said.

From reading medical scans to directing self-driving cars, forecasting the weather to creating deepfakes, the uses of AI are now too numerous to count.

But is it really physics?

Hinton had already won the Turing Award, considered the Nobel of computer science. But several experts said this was a well-deserved Nobel win in physics — the field that first set science on the road that would lead to AI.

French researcher Damien Querlioz pointed out that these algorithms were originally “inspired by physics, by transposing the concept of energy onto the field of computing”.

Van der Wilk said the first Nobel “for the methodological development of AI” acknowledged the contribution of the physics community, as well as the winners.

And while ChatGPT can sometimes make AI seem genuinely creative, it is important to remember the “machine” part of machine learning. “Ultimately, everything in AI is multiplications and additions,” van der Wilk emphasised.

Published in Dawn, October 9th, 2024
