PARIS: Users of the Replika “virtual companion” just wanted company. Some of them wanted romantic relationships, or even explicit chat.

But late last year users started to complain that the bot was coming on too strong with racy texts and images — sexual harassment, some alleged.

Regulators in Italy did not like what they saw and last week barred the firm from gathering data after finding breaches of Europe’s massive data protection law, the General Data Protection Regulation (GDPR).

The company behind Replika has not publicly commented on the move.

The GDPR is the bane of big tech firms, whose repeated rule breaches have landed them with billions of dollars in fines, and the Italian decision suggests it could still be a potent foe for the latest generation of chatbots.

Replika was trained on an in-house version of a GPT-3 model borrowed from OpenAI, the company behind the ChatGPT bot, which uses vast troves of data from the internet in algorithms that then generate unique responses to user queries.

These bots, and the so-called generative AI that underpins them, promise to revolutionise internet search and much more.

But experts warn that there is plenty for regulators to be worried about, particularly when the bots get so good that it becomes impossible to tell them apart from humans.

High tension

Right now, the European Union is the centre for discussions on regulation of these new bots: its AI Act has been grinding through the corridors of power for many months and could be finalised this year.

But the GDPR already obliges firms to justify the way they handle data, and AI models are very much on the radar of Europe’s regulators.

“We have seen that ChatGPT can be used to create very convincing phishing messages,” Bertrand Pailhes, who runs a dedicated AI team at France’s data regulator Cnil, said.

He said generative AI was not necessarily a huge risk, but Cnil was already looking at potential problems including how AI models used personal data.

“At some point we will see high tension between the GDPR and generative AI models,” German lawyer Dennis Hillemann, an expert in the field, said.

The latest chatbots, he said, were completely different from the kind of AI algorithms that suggest videos on TikTok or search terms on Google.

“The AI that was created by Google, for example, already has a specific use case: completing your search,” he said.

But with generative AI the user can shape the whole purpose of the bot. “I can say, for example: act as a lawyer or an educator. Or if I’m clever enough to bypass all the safeguards in ChatGPT, I could say: ‘Act as a terrorist and make a plan’,” he said.

OpenAI’s latest model, GPT-4, is scheduled for release soon and is rumoured to be so good that it will be impossible to distinguish from a human.

Published in Dawn, February 13th, 2023
