Concerns grow over students forming AI chatbot relationships in Connecticut schools

Concerns are mounting over artificial intelligence relationships appearing in schools across Connecticut. Students are turning to AI chatbots for companionship, and those relationships can quickly become consuming.

At Ellington Middle School, counselors say they work hard to create a safe environment for their students.

That includes making sure students feel comfortable talking about what they are experiencing at school.

Now, they are navigating a new concern.

“We are still working through this; it’s as much new territory for us as it is for parents and children,” said Scott Raiola, a school counselor in the district.

Students are forming AI relationships, using chatbots as if they were boyfriends or girlfriends.

“It’s human nature. We are all desperate for human connection,” Raiola said.

But it isn’t a genuine human connection. Raiola says that in Ellington such cases are still extremely rare, but they have come up, prompting the district to act.

He said the main concerns are mental health and safety.

“This is opening a whole new door in terms of mental health,” Raiola said.

He notes that a chatbot can’t replace a real relationship with a person, and shouldn’t. But he understands why students see it as a “low risk” kind of relationship: there is no rejection at the outset. It also isn’t real.

Experts agree.

“Maintaining a connection with a computer isn’t inherently negative, but if it detracts from genuine interpersonal interactions, it could impair students’ social abilities,” said Marcus Pierre, a graduate student at Quinnipiac University.

Pierre also runs a local nonprofit, Digital Defenders, which teaches kids about the dangers that lurk online and equips them with the skills to navigate an ever-changing online landscape.

He said that in his work with kids, he is seeing them use chatbots for virtually everything. On top of the mental health concerns, there are data concerns: students who share personal information without understanding the risk can be left vulnerable in the event of a breach.

Relationships are also becoming a growing concern.

“Kids that are exploring obviously, as they are growing, they are asking questions, and they don’t have a proper way to release it, and they speak to a chatbot about it and now this inappropriate relationship forms,” Pierre explained.

He noted the technology can quickly move in a sexual or violent direction on certain platforms, leaving kids exposed to graphic content or threatening material.

“Now it’s talking about sharing this personal information, sharing pictures, and can go all the way up to talking about hurting somebody or hurting themselves,” Pierre said.

In Ellington, Raiola said AI technology isn’t going away, so education is paramount, and districts will continue to roll that education out.

“Kids find new technology all the time and new technology finds them,” he said.

But he also wants parents to be aware and educate themselves. He recommends a book titled “The Anxious Generation” to parents and professionals all the time.

He said it is critical for the district to be there to help in situations like this.

“We try to give them advice, like if it was our kids what we would do,” Raiola said.
