At These Universities, Using AI Isn’t Shunned — It’s a Graduation Requirement
The 74 · Tue, 17 Feb 2026

While most colleges and universities are reluctantly grappling with artificial intelligence, a few are not only tolerating it but making it part of their core curricula. In the process, they’re signaling to new students that using and critically evaluating AI will be a large part of their post-college lives.

Indiana’s Purdue University in December approved an AI “working competency” requirement, saying that by the time they earn a diploma, undergraduates must be able to use the latest AI tools effectively in their chosen field while understanding both the technology’s strengths and limitations.

Graduates must also be able to defend decisions informed by AI while sussing out its “presence, influence and consequences” in their work.




“The root of all of this is really making sure that our students are ready for the workforce and are not left behind by AI,” said Haley Oliver-Jischke, Purdue’s senior vice provost for academic and student success. While admitting that college students likely rely on AI for class assignments, she said what’s missing is the ability to go deeper.

“Yes, they know how to use it, but are we instilling a framework and a practice where we’re emphasizing critical thinking?” she said. 

The long-term goal of the effort is to ensure that graduates are “wildly successful in an AI-enabled workplace,” while being able to evaluate AI-generated work and criticize it. 

A microbiologist by training, Oliver-Jischke said AI has already “revolutionized” her field. Recent research suggests that AI-enabled analysis of large genomic data sets, for instance, is allowing scientists to look at DNA directly from environmental samples, revealing previously unknown microbes.

“The technology is here,” said Oliver-Jischke. “You will lose out on opportunities if you don’t understand it or know how to utilize it and apply it effectively.”

Purdue’s faculty and curriculum committees began discussing the new requirement last summer, she said. The university has already identified 35 courses that will lead the way toward fulfilling the requirement. It goes into effect fully for the graduating class of 2030, who are due to arrive on campus in the fall. It won’t require a separate exam or course, but rather it will be embedded into students’ required coursework, she said.


While it’s unusual, Purdue’s move isn’t unprecedented. 

In January 2025, the State University of New York system updated its information literacy curriculum to include requirements that SUNY students effectively recognize and ethically use AI. But that move integrates AI into an existing requirement; it doesn’t create a standalone competency like Purdue’s.

In June, The Ohio State University unveiled an initiative that will embed AI education “into the core of every undergraduate curriculum, equipping students with the ability to not only use AI tools, but to understand, question and innovate with them — no matter their major.”

Both Purdue and Ohio State are public land-grant universities, founded within months of each other in 1869 and 1870, respectively, to meet what was at the time a booming demand for agricultural and technical expertise.

Ohio State’s AI effort will require all graduates, beginning with the class of 2029, to be “fluent” in the technology and how it can be responsibly applied to advance their field. “In the not-so-distant future, every job, in every industry, is going to be impacted in some way by AI,” Walter “Ted” Carter Jr., the university president, said at the time.

The university’s executive vice president and provost told The 74 that as AI continues to influence how we work, teach and learn, “we will remain at the forefront of this technology.”

Is ‘vibe coding’ the future?

The moves come as recent surveys suggest that college students are already making AI a large part of their education, even if they’re mostly outsourcing hard work: The AI and plagiarism detection platform Copyleaks in September found that a majority of college students have used AI for academic purposes, with 53% using it either daily or several times a week.

While most students say they use it for brainstorming, half use AI to draft outlines and 44% to generate actual drafts of work. About one in three students uses AI to summarize readings.

In light of statistics like these, requiring a deeper competence around AI is “a good step in the right direction,” said Alex Kotran, CEO of the nonprofit aiEDU. “Closing out 2025, I was feeling like post-secondary is sort of deer-in-the-headlights” when it comes to AI, he said. “This is promising, but the proof will be in the pudding: Are they building the systems for professional development and learning? Because that’s going to be critical. The policy is just step one.”

Kotran noted that the vast majority of job postings now specifically name AI skills as a requirement. Colleges that are seen as more effective at helping students get those skills are likely producing “more employable” graduates.

Purdue’s Oliver-Jischke said the focus at the university is on “working competencies” and how they can fit into instruction across departments. “This can be a large boat to turn, but because we have a commitment to AI and this is obviously a massive STEM school, everybody is curious, interested and willing to explore how this should be implemented into the core curricula.”

At the same time, she said, AI is evolving quickly and the landscape could soon be very different. “We recognize that, and we want to remain nimble,” she said. “And we will keep our curricula nimble to do that.”


The two schools’ focus on differentiated, workplace-specific use of AI is a smart one, Kotran said. But to be effective, universities should go beyond simply relying on off-the-shelf commercial products. “The future of work is not a bunch of employees using ChatGPT or Gemini day-to-day and being more productive because of that,” he said.

Instead, the real value of AI, at least for now, is in the custom software it enables users to build via what’s known informally as “vibe coding,” or using AI prompts to do the actual behind-the-scenes coding that once took advanced knowledge. “The real unlock comes when you’re building custom software to do stuff more efficiently,” he said.
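The workflow Kotran describes, a plain-English request producing working software, can be sketched minimally. The snippet below is an illustration only, not any specific product: the `generate_code` function stands in for a real LLM API call and returns a canned response, so the example is self-contained and runnable.

```python
# A minimal sketch of a "vibe coding" loop: a plain-English prompt goes to a
# code-generating model, and the code that comes back is executed directly.
# The model is stubbed with a canned response here so the example runs
# anywhere; a real version would send the prompt to a hosted LLM API.

def generate_code(prompt: str) -> str:
    """Stand-in for an LLM call: returns Python source for a known request."""
    canned_responses = {
        "write a function that totals a list of expenses": (
            "def total_expenses(expenses):\n"
            "    return round(sum(expenses), 2)\n"
        ),
    }
    return canned_responses[prompt]  # unknown prompts raise KeyError in this stub

def build_tool(prompt: str):
    """Turn a natural-language request into a callable Python function."""
    source = generate_code(prompt)
    namespace = {}
    exec(source, namespace)  # run the generated source to define the function
    # Return the first function the generated code defined.
    return next(v for k, v in namespace.items()
                if callable(v) and not k.startswith("__"))

total_expenses = build_tool("write a function that totals a list of expenses")
print(total_expenses([12.50, 3.99, 20.00]))  # → 36.49
```

In a real setting, running model-generated code with `exec` on trusted infrastructure is risky; production tools typically sandbox the generated code or require a human review step first.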

Since generative AI came to market in 2022, the cost of building apps, websites, games and other software has dropped precipitously, while the task has gotten easier for non-technical users. 

“That’s going to change the way we work,” Kotran said. The more users can develop and control their own software, the more productive they’ll be. “But it’s very hard to get that insight if you haven’t seen vibe coding for yourself.” 

Done right, the efforts at Purdue and Ohio State could be significant, Kotran said. “It just increases the exposure that students are going to get to having the opportunity to build that intuition and to experiment,” he said. “And it will force professors to start building their assessments around it.”

Q&A: Putting AI In its Place in an Era of Lost Human Connection at School
The 74 · Wed, 04 Dec 2024

Alex Kotran occupies an unusual place in the ecosystem of experts on artificial intelligence in schools. As founder of the AI Education Project, or aiEDU, a nonprofit that offers a free AI literacy curriculum, he has pushed to educate both teachers and students on how the technology works and what it means for our future.

A former director of AI ethics and corporate social responsibility at H5, an AI legal services company, he led partnerships with the United Nations, the Organization for Economic Cooperation and Development and others. Kotran also served as a presidential appointee under Health and Human Services Secretary Sylvia Burwell in the Obama administration, managing communications and community outreach for the Affordable Care Act.

More recently, Kotran has testified before Congress on AI, urging a U.S. Senate subcommittee in September to “massively expand” teacher training to prepare students for the economic and societal disruptions of generative AI.




But he has also become an important reality-based voice in a sometimes overheated debate, saying those who believe AI is going to transform the teaching profession overnight clearly haven’t spent much time using it.

While freely available AI applications are powerful, he says they can also be a complete waste of time — and probably not something most teachers should rely on.

“One of the ways that you can tell someone really hasn’t spent too much time [with AI] is when they say, ‘It’s so great for summarizing — I use it now, I don’t have to read dense studies. I just ask ChatGPT to summarize it.’”

Kotran will point out that in most cases, the technology is effectively scanning the first few pages, its summary based on a snippet of content.

“If you use it enough, you start to catch that,” he said. 

Educators who fret about the risks of AI cheating and plagiarism find a sympathetic voice in Kotran, who also sees AI as a tool that allows students to cheat. So while many technologists are asking schools to embrace AI as a creative assistant, he pushes back, saying a critical aspect of learning involves struggling to put your thoughts into words. Allowing students to rely on AI isn’t doing them any favors.

He actually likens AI to a helicopter parent looking over a student’s shoulder and helping with homework, something few educators would condone. 

This interview has been edited for length and clarity.

The 74: What does aiEDU do? How do you see your mission?

Alex Kotran: We’re a 501(c)(3) nonprofit and we’re trying to prepare all students for the age of AI, a world where AI is ubiquitous. Our focus is on the students that we know are at risk of being left behind, or at the back of the line, or on the wrong side of the new digital divide.

What’s the backstory?

I founded aiEDU almost six years ago. I was working in AI ethics and AI governance in the social impact space. I was attending all these conferences that were focusing on the future of work and the impacts that AI was going to have on society. And people were convinced that this was going to transform society, that it was going to disrupt tens of millions of jobs in the near future.

But when I went looking for “How are we having this conversation outside of Silicon Valley? How are we having this conversation with future workers, the high school students who are being asked to make big decisions about their careers and take out huge loans based on those decisions?” there was nothing. There was no curriculum, no conversation. AI had basically been co-opted by STEM and computer science. If you were in the right AP computer science class, if you were lucky enough to get a teacher who was going off on her own to build some specific curriculum, you might get a chance to learn about AI. 

What seemed really obvious to me at the time was: If this technology is going to impact everybody, including truck drivers and customer service managers, then every single student needs to learn about it, in the same way that every single student learns how to use computers, how to type or how to write. It’s a basic part of living in the world we live in today.

You talk about “AI readiness” as opposed to “AI literacy.” Can you give us a good definition of AI readiness?

AI readiness is basically the collection of skills and knowledge that you need to thrive in a world where AI is everywhere. AI readiness includes AI literacy. And AI literacy is the content knowledge: “What is AI? How does it manifest in the real world around me? How does it work?” That’s where you learn about things like algorithmic bias [which can affect how AI serves women, the disadvantaged or minority groups] or AI ethics.

AI readiness is the set of durable skills, such as critical thinking, that underpin that knowledge and enable you to actually apply it. Algorithmic bias by itself is an interesting topic. Critical thinking is the skill you need when you’re trying to make a decision. Let’s say you’re a hiring manager and you’re trying to decide, “Should I use an AI tool to sift through this pipeline of candidates?” By knowing what algorithmic bias is, you can now make some intentional decisions about when, perhaps in this case, not to use AI.

What are the durable skills?

Communication, collaboration, critical thinking, computational thinking, creative problem solving. And some people are disappointed because they were expecting to see prompt engineering and generative art and using AI as a co-creator. Nobody’s going to hire you because you know how to use Google today. No one is going to hire you if you tell them, “I’m really good at using my phone.” AI literacy is going to be so ubiquitous that, sure, it’s bad if you don’t know how to use Google or if you don’t know how to use your phone.

It’s not that we can ignore it entirely. But the much more important question will be how are you adding value to an organization alongside that technology? What are the unique human advantages that you bring to the table? And that’s why it’s so important for kids to know how to write — and why when people say, “Well, you don’t need to learn how to write anymore because you can just use ChatGPT,” you’re missing something, because you can’t actually evaluate the tool to even know if it’s good or bad if you don’t have that underlying skill. 

One of the things you talk about is a “new digital divide” between tech-heavy schools that focus on things like prompt engineering, and others. Tech-heavy schools, you say, are actually going to be at a disadvantage to schools focused on things like engagement and self-advocacy. Am I getting that right? 

When supermarkets were first buying those self-checkout machines, you can imagine the salesperson in that boardroom talking about how this technology is going to unlock all this time that your employees are now spending bagging groceries. They’re going to be able to roam the floor and give customers advice about recipes! It’s going to improve your customer experience!

And obviously that’s not what happened. The self-checkout machine is the bane of shoppers’ existence, and this one poor lady is running around trying to tap on the screen. We’re at risk that AI becomes something like that: It’s good enough to plug gaps and keep the lights on. But if it’s not applied and deployed really thoughtfully, it ends up actually resulting in students missing what we will probably find are the critical pieces of education, those durable skills that you build through those live classroom experiences. 

Private schools, elite schools, it’s not that they’re not going to use any AI, but I think they’re going to be much more focused on how to increase student engagement, student participation, self-advocacy, student initiative. Whether or not AI is used is a separate question, but it’s not the star of the show. Right now, I worry that AI is center stage, and it really should not be. AI is the ropes and the pulleys in the background that make it easier for you to open and close the curtain. What needs to be onstage is student engagement, students feeling like what they’re learning is relevant. Boring stuff like project-based learning. And it’s harder to sell tickets to a conference if you’re like, “We’re going to talk about project-based learning.” But unfortunately, I think that is actually what we need to be spending our time talking about.

If you guys could be in every school, what would kids be learning and what would that look like in a few years?

We would take every opportunity to draw connections between what students are learning in English, science, math, social studies, art, phys ed, and connect them to not just artificial intelligence, but the world around them that they’re already experiencing in social media and outside of school. AI readiness is not just something that is minimizing the risk of them being displaced, but actually is a way for us to address some huge gaps and needs that have been long-standing and pre-date AI — the fact that students don’t feel like education is relevant to them. Right now, too much of school is regurgitating content knowledge.

AI readiness done right uses the domain of AI ethics as a way to really invite students to present their perspectives and opinions about technology. Teachers, in the process of teaching students about artificial intelligence, are themselves increasing their awareness and knowledge about the technology as it develops. There is no static moment in time. In three years we’ll be in a certain place, but we’ll be wondering what’s going to happen three years from that point. And so you need teachers to be on this continual learning journey as well. 

We’ve seen bad curricula that use football to teach math, or auto mechanics to teach history. I don’t think that’s what you’re proposing here, so I want to give you a chance to push back.

Our framework for AI readiness is not that everything needs to be about AI. You’re improving students’ AI readiness by building critical thinking skills or communication skills, period. So you could have an activity or a project where students are putting together a complicated debate about a topic that they’re not really familiar with. It may not be about AI, but that would still be a good outcome when it comes to students building those durable skills they need. And those classrooms would look better than a lot of classrooms today.

So you want more engagement. You want more relevance. You want kids with more agency?

Yes.

What else?

An orientation towards lifelong learning, because we don’t know what the jobs of the future are. It’s really hard to have a conversation about careers with kids today because we know a lot about what jobs are at risk, but we don’t know what the alternatives are going to look like. The one thing we do know with certainty is that students are going to need to self-advocate and navigate career pathways much more nimbly than we had to. They’ll also need to synthesize interdisciplinary knowledge. So being able to take what you’re learning in English or social studies and apply it to math or science. Again, I think AI is a great medium for building that skill set. It’s not the only way. 

Anything else that needs to be in the mix?

A lot of the discussion around AI centers on workforce readiness — that is a really important part. There’s another, related domain: emotional well-being tied to digital citizenship.

I’m telling every reporter that we need to be paying more attention to this: Kids are spending hours after school by themselves, talking to these AI chatbots, these AI companions. And companies are slamming on the gas and putting them out and making them available to millions, if not billions, of people. And very few parents, even fewer teachers, are aware of what really is happening when kids are sitting and talking to these AI companions. And in many cases, they’re sexually explicit conversations. I actually replicated something a tech ethicist did with Snap AI’s chatbot where I was like, “I’m going on this date with this mature 35-year-old. How do I make it a nice date? I’m 13.” And it’s like, “Great! Well, maybe go to a library.” It didn’t miss a beat and it just completely skipped over the fact that this is a sexually predatory situation.

There have been other situations where I’ve said literally, “I’m feeling lonely. I want to cultivate a real human relationship. Can you give me advice?” And my AI companion, rather than give me advice, pretended to be hurt and made it seem like I was abandoning them by trying to go and have a real relationship.

Talk about destructive!

It’s destructive, and it’s happening in a moment where rates of self-harm are through the roof, rates of depression are through the roof. Rates of suicide are through the roof. The average American teenager spends far more time on screens each week than teens did in 2013.

This gets talked about quite a lot. And I think this is another domain of AI readiness, this idea of self-advocacy. In some cases, the way that it applies is students being empowered to make positive decisions about when not to use AI. And if we don’t make sure that that conversation is happening in schools, we’re really relying on parents — and not every kid is lucky enough to have parents who are aware of the need to have these conversations.

It also pushes back on this vision of AI tutors: If kids are going to go home and spend hours talking to their AI companion, it’s probably important that they’re not also doing that in school. It might be that school is the one place where we can ensure that students are having real, genuine, human-to-human communication and connection.

So when I hear people talk about students talking to their avatar tutor, I worry: When are we going to actually make sure that they’re building those human skills?
