The latest iteration of generative artificial intelligence (AI), ChatGPT, is drawing as much scrutiny as fervor across Europe. Privacy violations, among other concerns, have prompted G7 digital ministers to agree on adopting “risk-based” regulation.
Privacy Concerns Persist with AI
On April 3, Italy’s data protection authority, the Garante, banned ChatGPT over privacy violations. The regulator found that the service was collecting personal data on a mass scale, in breach of the country’s data-collection rules, and that it lacked an age-verification system.
Although the country has since lifted the ban, more complaints from other countries have ensued.
For example, French data regulators reported receiving two complaints related to ChatGPT just days after Italy’s ban. As a result, France, along with Ireland and Germany, has joined Italy in scrutinizing OpenAI’s ChatGPT.
ChatGPT is also banned in North Korea, Iran, China, and Russia. And Canada’s own data regulator has also launched an investigation into OpenAI.
In a joint statement, the G7 ministers agreed that regulation should “preserve an open and enabling environment” in order for AI tech innovation to flourish and be supported by democratic values.
In the statement, they also noted:
“We plan to convene future G7 discussions on generative AI which could include topics such as governance, how to safeguard intellectual property rights including copyright, promote transparency, address disinformation.”
Outside of intellectual property concerns, G7 countries acknowledged that there were also potential security risks.
“Generative AI…produces fake news and disruptive solutions to the society if the data it’s based [on] is fake,” said Taro Kono, Japan’s digital minister, during a press conference after the agreement.
Jean-Noel Barrot, French Minister for Digital Transition, told Reuters that “pausing (AI development) is not the right response—innovation should keep developing but within certain guardrails that democracies have to set.”