Deep Dive

Please Use Responsibly: AI in Literacy

By Lance Pauker
Posted March 19, 2025

Just as a chalkboard once revolutionized the classroom, Generative AI (GenAI) is the latest in a long line of technologies that seek to upend the educational landscape.

Francine Falk-Ross, PhD, Chair of the School of Education on the New York City Campus, specializes in literacy development for all ages. Professor Peter McDermott, PhD, also specializes in literacy and has conducted and presented research on incorporating technology into reading lessons. As educators, both professors underscore the importance of not shying away from, but rather understanding, the ways in which technologies like GenAI have affected and will affect teaching and learning in the years to come.

Through their own experimentation with GenAI, they stress the importance of responsible and effective use, so that the technology empowers students and educators rather than serving as an intellectual hindrance. In the Q+A below, we discuss how AI may threaten the ability to develop expertise through rigorous research and citation evaluation, while also identifying the ways in which GenAI has provided, and can continue to provide, immense benefits to the learning and teaching experience.

You both have presented research regarding AI in Literacy. Can you briefly discuss the content of this work?

Peter McDermott: Fran and I just had a paper accepted for publication in The Middle School Journal on ways teachers can effectively use AI. We discuss the differences between using AI and Googling, and the importance of writing descriptive prompts.

For teachers, it's critical to learn how to write good prompts that accurately describe important specifics: one's school, student population, learning needs, and goals and objectives. Then, beyond the prompt, teachers need to be able to critically analyze what AI produces.

What are your general thoughts on the introduction of GenAI into the classroom?

Fran Falk-Ross: Used effectively, GenAI can really improve the classroom experience and be very supportive of diverse student populations and students with learning disabilities.

But at the same time, a good teacher needs to know how to think on their feet. And as we stand and teach, we don't always have AI at our fingertips. If you have time to sit down and write something, you can rely more on AI. But in terms of building skills in relation to research and critical thinking, you need a foundation that empowers you to create your own ideas and analyze existing ideas. You need to build your own expertise.

In what ways have you incorporated AI into the classroom?

McDermott: I've been experimenting with different assignments in my grad classes at Pace. One assignment is to write a half-page argument essay about a topic. The students write it, then ask ChatGPT to write an argument about the same topic. I have students compare what they produced with what ChatGPT produced. It's an interesting exercise that helps students think about how AI can be used. Some students are surprised that AI can be quite good.

In another assignment, I ask my students to upload middle school and high school student writing samples to ChatGPT and ask the AI to analyze the writing, identify patterns of errors, and correct it. What AI produces in this case is also very good. Taking the second step, and having AI explain these edits, can be very useful for education students.


In working with GenAI, have you had any experiences or observations that you have found concerning?

McDermott: There's a lot of research, from organizations such as the International Literacy Association, for example, that essentially says, "AI is good, but you have to use it with a critical eye."

When I use ChatGPT, I'll often ask it to give me some recent citations about a topic. Last month, when I did this, it gave me a citation from a journal I didn't recognize, so I searched and searched for the journal. It turns out the journal doesn't exist; it was an AI hallucination.

Falk-Ross: GenAI doesn't provide references unless prompted, which means that students reliant on the technology will not become familiar with the history of a scholarly argument or the established reasoning behind it. In class they'll get an overview of research principles, but that won't carry over to assignments outside the classroom. In my view, AI output should be complemented by a primary source, so that users know where the information came from.

A student might not know the origin of an argument or a fact, or which research articles are seminal pieces. These are very important things for teachers to know, so they can read more, learn more, and pass on a model of research and critical thinking to their students.

McDermott: You could ask AI to cite its sources, but you still need that critical eye to judge whether the citations are authentic.


In what ways have you seen students adopt the technology? How has student writing changed since the introduction of GenAI?

Falk-Ross: Using a tool like ChatGPT can be part of the writing process, and in many cases it is helpful for clarifying ideas. But there should be an editing process. You might get a useful model and good vocabulary suggestions from GenAI, but you need to reconstruct the text as your own.

McDermott: This is one of the reasons it's important to teach students how to use AI effectively: to have an ongoing dialogue with it. You can ask AI to produce something, then use your critical eye to evaluate that writing and ask it to revise, being descriptive and specific about how you want the writing revised. The process of using ChatGPT can then become collaborative and discussion-like.

Falk-Ross: There is also the issue of students passing off what AI writes as their own work. Even a few years ago, if a student submitted something that didn't seem like their own, I could just throw it into Google and see if it was plagiarized, because the source text was referenced somewhere. Now, it's much harder to figure out where the writing came from.

McDermott: There is a category of research on "AI-resistant assignments." We can develop assignments where students must use their personal life experience and history as research; that's one way to overcome issues of plagiarism.

Falk-Ross: Plagiarism aside, I do think students lose the ability to develop expertise on their own and to independently understand the process of constructing arguments, which is especially important when teaching a diverse population. You need to make things work for the student population you're teaching, but also within the school's limits; there may be certain initiatives important to the school, and this is a complex process.

What are your overall thoughts on this new academic normal? How can we balance the clear benefits of AI with some of its pitfalls?

Falk-Ross: Peter and I can read what AI produces and judge the quality of the output: what's good, and what might be inaccurate or out of context. But students who are unable to look up information further, or who don't understand the importance of seeing where it came from, can ultimately lose this important skill, and with it the scholarly model that academia has historically been built upon.

Those skills can fall by the wayside, and they are critically important. But GenAI does organize information extremely well and can provide useful facts. It's important that users both learn from GenAI and understand its limitations.

McDermott: We really need to prepare teachers to use it well. I think there are more benefits than disadvantages, but teachers must be the critical decision-makers and leaders in its use.

It’s the responsibility of educators to understand how to effectively use these technologies.
