AI will be essential in education — but do kids know how to use it?

2024-02-29 21:57

By Savannah Fortis

As AI becomes increasingly prevalent in education, concerns arise over children’s proficiency in navigating its complexities and potential pitfalls.

Artificial intelligence (AI) is everywhere: it can rewrite famous novels, compose songs and create realistic videos. While older generations fear its takeover, the youth of today are embracing the technology.


A November 2023 study from the Pew Research Center in the United States revealed that nearly one in five teenagers aged 13 to 17 who have heard of the AI chatbot ChatGPT have used it to help with schoolwork, which amounts to roughly 13% of all teens in the United States.


The same study showed that seven out of 10 teens said it is okay to use the chatbot to research something new or explore a topic. However, this raises concerns, particularly when it comes to AI producing misleading or false information.


This was highlighted in the recent example of Google’s AI chatbot Gemini producing “woke” and inaccurate depictions of historical scenes, for which Google apologized.


There is no going back from introducing AI to the next generation. However, questions remain about the best AI practices for youth, particularly regarding education.


Cointelegraph spoke with Brandon Da Silva, the CEO of ArenaX Labs, to better understand how AI can be implemented productively and safely into youth education.

AI boosts tech-savviness


ArenaX Labs recently released AI Arena, a player-vs-player fighter game in which players train AI models to battle each other autonomously with the goal of using play to boost AI literacy.


Da Silva said teaching young people how to train or program AI holds significant importance beyond simply asking questions when using tools like ChatGPT.


“If you’re using ChatGPT and it starts to give you a weird answer, it’s important to understand why,” he explained. “Otherwise, some people might start to think that what ChatGPT tells them is basically the gospel, without concerns that it might not be correct.” He added:

“It’s essential to understand the limitations of these tools, because you can only truly understand where, why, and how they might go wrong when you know how they work.”


He said kids who begin to interact with AI at a young age will likely become a lot more “tech-savvy” than their peers who do not:

“We believe that AI will transform society and that it will be part of everyone’s lives going forward, and because of that, it’s important that people get familiar with it starting from a young age.”


Related: OpenAI accuses New York Times of hacking AI models in copyright lawsuit


Da Silva drew a parallel with people who grew up learning how to program from a young age. “Many of these people were better at programming as high schoolers than some full-time employees who have done programming for 10 years as adults,” he said.


However, the issue is a multifaceted one; “there are both benefits and risks,” he continued. The tendency for kids to become “glued” to technology like iPads and phones could carry over into their AI usage if not monitored.


He also pointed to the risk of AI’s inherited bias, as seen with Google’s Gemini.

“This type of thing can be very dangerous. As adults, it’s easier to recognize bias. But as a child, you don’t know.”

AI in education


This is where proper AI education and attention from educators who are using AI come into play. 


Similar to the way people will need to develop the discernment skills to spot deepfakes, Da Silva said we will all need to learn how to ask important questions about bias when AI is involved, starting with the youth. He said:

“Educators need to emphasize the importance of critical thinking skills when it comes to student interactions with AI.”


Da Silva said it will also be important to consider the user when teaching how to interact with AI; there is no “one-size-fits-all solution.” He said having different “levels” of communication with AI for different kinds of learners is important.


Another important aspect for educators and those dealing with youth interactions with AI is the emotional relationship one can develop with an AI. Recent research from the Digital Wellness Lab found that children can form “parasocial relationships,” or one-way emotional attachments, with AI-enabled digital assistants.


It cited a study of children aged six to 10, in which 93% of participants said that a “digital assistant” they were familiar with was smart, with 65% responding that the device could be a friend.


Da Silva said:

“Developing an emotional connection with an AI can help students become more invested in their learning. At the same time, having an emotional connection with something like an AI has a risk of believing what it says more than one should.”


In such cases, objectivity could play a lesser role because that kind of connection can lead one to feel that the AI is a trusted authority that doesn’t need to be fact-checked — especially among youth. 


It is a critical moment for society as AI’s evolution continues at light speed while humans still try to wrap their heads around the technology itself. However, this is a moment for today’s youth to safely learn and engage with a tool that will most likely shape their future. 


Magazine: Google to fix diversity-borked Gemini AI, ChatGPT goes insane: AI Eye
