UNESCO on Thursday published its first guidance on the use of generative AI (GenAI) in education, urging governmental agencies to regulate the technology, including by protecting data privacy and setting an age limit for users.
Launched by Microsoft-backed OpenAI in November, the GenAI chatbot ChatGPT has become the world's fastest-growing app to date, and its emergence has prompted the release of rivals such as Google's Bard.
Students have also taken a liking to GenAI, which can generate anything from essays to mathematical calculations with just a few lines of prompts.
“We are struggling to align the speed of transformation of the education system to the speed of the change in technological progress and advancement in these machine learning models,” Stefania Giannini, assistant director-general for education, told Reuters.
“In many cases, governments and schools are embracing a radically unfamiliar technology that even leading technologists do not claim to understand,” she said.
Among a series of guidelines in a 64-page report, UNESCO stressed the need for government-sanctioned AI curricula in school education as well as in technical and vocational education and training.
“GenAI providers should be held responsible for ensuring adherence to core values and lawful purposes, respecting intellectual property, and upholding ethical practices, while also preventing the spread of disinformation and hate speech,” UNESCO said.
It also called for preventing the use of GenAI where it would deprive learners of opportunities to develop cognitive abilities and social skills through observation of the real world, empirical practices such as experiments, discussions with other humans, and independent logical reasoning.
China has already formulated rules on GenAI, and the European Union's AI Act is likely to be approved later this year, but other countries are far behind in drafting their own AI laws.
The Paris-based agency also sought to protect the rights of teachers and researchers and the value of their practices when using GenAI.
© Thomson Reuters 2023