‘Domesticate AI’: How Boston higher ed is responding to the advancing technology

Ahead of the 2023-24 school year, local colleges weigh in on the opportunities and risks of artificial intelligence for students and teachers

As artificial intelligence technology advances, Boston universities are preparing to tackle AI within the classroom — but not in the way many may assume.

Despite students' ever-increasing ability to find answers at the click of a button, experts say AI can do more to enhance a college education than to hinder it.

Newly released software such as ChatGPT has earned student usage of AI a bad reputation, and many assume the technology is a threat to education as we know it. Despite this, experts say the technology shouldn't be feared; it should be embraced.

Kosmas Karadimitriou, president of Nova Science Ventures, who has worked with AI for over 20 years, says AI is everywhere "whether we know it or not."

"AI is being used all around us now, from influencing what kind of news we see in our news feeds, to what products we buy, when big websites make recommendations for products," he said. "[It controls] all kinds of systems."

This influence has quickly made its way into classrooms all over the world, demanding a re-examination of traditional educational practices. With AI comes newfound risks and opportunities, and a need for out-of-the-box solutions that benefit teachers and students alike.

"[Colleges] definitely need to address the current changes and the fact that the students are using AI," Karadimitriou said. "Whether they want it or not, whether it is forbidden or allowed, they will be using it."

What are the risks of AI in the classroom?

In many regards, AI can be seen as a threat to higher education. With easy-to-access tools such as MATLAB or ChatGPT, finding answers is easier than ever. For administrators, this raises concerns about plagiarism and academic dishonesty.

"[AI] leads to the big problem— the biggest problem— which is this reduced autonomy for the students and critical thinking," Karadimitriou said.

In a statement, one local college expressed the need to brainstorm new ways to promote academic integrity amongst students.

"The emergence and power of AI has initiated deep conversation about how the technology can and will impact higher education," Emerson College told NBC Boston in a statement. "There are several positive attributes associated with the use of AI, however, new policies are likely needed to ensure it is being used ethically, and appropriately, in educational settings."

Similarly, Dr. Memo Ergezer, associate professor in Wentworth's School of Computing and Data Science, added that the ever-advancing nature of AI will require a new set of guidelines for the student body.

"In the case of AI-based tools, the effects can be quick, wide-reaching, and hard to regulate, as seen with psychological profile-based political ads published by Cambridge Analytica," Dr. Ergezer said in a statement. "Therefore, it is of utmost importance to work with our students not only to understand the theory/power of AI, but also its ethical ramifications."

"I'm one of the supporters of the development of regulations and guidelines for AI technologies to promote responsible innovation, build trust, and ultimately contribute to the betterment of humanity," he added.

How can administrators promote academic integrity?

One potential solution to the fear that students will plagiarize their way through college is to flip the classroom entirely.

"One solution, I would say, is the idea of flipped classroom or inverted classroom, in which professors assign the work of learning at home," Karadimitriou said. "The students learn at home, but then they go to the classroom and that is where they do their work."

In Karadimitriou's expert opinion, however, rules and regulations will only go so far. Ethical use of AI will require a student's desire to independently learn. It is in the student's best interest to actively participate in their education rather than use AI as a substitute.

"AI can give them the initial push, but after that, [students] have to make sure they don't completely rely on AI," he said. "At the end of the day, that's why they go to college: to learn and to acquire skills. And if they outsource all this to AI, then what do they go to college for? Why do they pay so much money?"

This, he says, is a skill that students will take beyond the classroom and into their professional lives.

"This is the world that they will enter— a world that is going to be full of AIs all around us," Karadimitriou said. "And they need to be able to work and live in this world effectively. Be aware of all the AI tools, maybe use them to some extent, but also make sure that they do what they're supposed to do in college."

What opportunities does AI bring to colleges?

According to Azer Bestavros, associate provost for computing and data sciences at Boston University, the opportunities with AI are multi-faceted.

“So there's two questions," he said in an interview with NBC Boston. "One is how is AI changing how we teach? And the other question is how do you teach AI?” 

Within the classroom, AI has the potential to offer an invaluable ability: students can now access one-on-one lessons at their fingertips.

“[With AI], we have a personalized teacher for every student," Karadimitriou said. "No matter what day or time it is, if [the students] have a question, if they need help with homework, and so on, they can turn to AI, which can answer any question they have, provide feedback, even track their progress when they're doing work.” 

Though some may view this possibility as a threat to professors, Bestavros challenges this mindset, defending AI as a tool to benefit education as we know it, not destroy it.

"We should start getting used to AI being part of the workforce," he said. "We have to learn how to use AI as both a tool and a collaborator. It really is about the combination of the human and AI, so now the students will learn how to do that."

How can schools adapt to integrate AI?

To do this, colleges must set out to "teach AI" not just to students, but to professors as well.

"One idea is for the colleges to have seminars with the faculty to make them aware of what the students have available to them," said Karadimitriou. "It could be the case that students know about all these tools and use them, and the professor is totally unaware that [they] even exist. They need to be at least at the same playing field, so to speak."

At Boston University, this idea manifested itself in a student-designed policy which has been adopted by the Computing and Data Sciences Department. The policy details ground rules for student and professor AI usage, emphasizing a commitment to innovation and ethics.

"At BU, we saw [the need for universal training], and [the policy] gives us the space to do this, where both the faculty and the students now are on board," Bestavros said. "We decided that [AI] is something to embrace, and to use actually to elevate the game for everybody. It's about making teachers better teachers, and making students better students."

So what will come of AI at universities in the near future, and how will education change with the advancement of the technology?

In his call for colleges to relinquish their fear of AI, Bestavros points to the book "Program or Be Programmed" by Douglas Rushkoff.

"This book is actually perfect because it says the following: 'you have a choice,'" he said.

"[Students] have four years, to subjugate AI, domesticate AI. Make it like your pet, as opposed to the one to fear. It's something [they] can control," Bestavros added. "Domesticate AI, or be domesticated by it."
