
Every time I facilitate a PBL workshop or sit alongside a teacher implementing a project, the same conversation comes up: what do we do about AI? Sometimes it comes from anxiety about students bypassing the learning process altogether. Sometimes it comes from genuine curiosity about using AI as a tool for deeper thinking. Most of the time, it is a little of both.
I have heard this question in coaching conversations, in workshop rooms, and as part of Cohort 1 of the GenerationAI Community of Practice, convened by ISTE+ASCD to do action research on meaningful AI integration in K-12 schools. What I have come to believe, across all of those contexts, is that the answer lives in the same place: the intentional design of the learning process.
The answer is not to push AI out of the classroom. It is to be more intentional than ever about designing the process that surrounds it. Across that work, three questions keep surfacing: How can AI support sustained inquiry? How can it strengthen critique and revision? And how can it help students develop the discernment to use it thoughtfully? What follows explores each through two lenses: what teachers can design and scaffold, and how students can engage with AI as an intentional thought partner.
How Can AI Support Sustained Inquiry During the Learning Process?

Sustained inquiry, one of the Gold Standard Project Design Elements, asks students to engage in a rigorous, extended process of posing questions, finding resources, and applying information. It's one of the most demanding elements to facilitate well in PBL because teachers must resist the pull toward efficiency and intentionally design checkpoints that slow the process down enough for real thinking to happen. In an age where AI can surface answers in seconds, the teacher's most important job may be protecting the productive struggle before it disappears. Used intentionally, AI can be a thought partner that deepens inquiry. Used without that intention, it can do the thinking for students before they ever get the chance.
Teacher Lens
One of the most practical things a teacher can do is sequence when AI enters the project. Consider launching with a firsthand experience, like a community walkthrough, an expert conversation, or a hands-on investigation, before any AI tools are introduced. When students arrive at the research phase with genuine observations and their own questions already forming, they bring real thinking to the table and can ask AI to push it further rather than do it for them. A student investigating water quality might tell AI, “I think the problem is runoff from the parking lot near our creek. What evidence should I look for, and what am I not considering?” The inquiry stays theirs. AI just makes it go deeper.
A mid-project Need to Know checkpoint reinforces this: ask students what they now understand, what still needs answering, and what new questions have surfaced. That creates the right conditions to introduce AI thoughtfully, after students have developed their own thinking, not before. As the PBLWorks Using AI to Support PBL Planning guide puts it, AI should deepen student research, not replace it.
Student Lens
What we want is for students to bring their thinking to AI, not the other way around. The prompt matters, so coach students to open with what they already know. Instead of “What is food insecurity?” a student researching local food access might ask: “Here is what my survey and interview data indicate. What patterns am I missing, and what questions should I still be asking?” In a history project, students can use AI to surface their own fallacies and challenge their perspectives. Instead of asking for answers, they ask deeper questions: “Here is my theory on why the war in Southeast Asia happened. What am I missing, and how would someone with a different perspective argue against me?” That reframe changes everything, turning AI from a shortcut into a genuine thought partner. When students document those exchanges in their project portfolios, that trail of thinking becomes visible evidence of real sustained inquiry.
How Can AI Support Critique and Revision Throughout a Project?

Critique and revision is the element of Gold Standard PBL that most closely mirrors how professionals actually work. Writers work with editors. Engineers test prototypes and rebuild them. When students experience this cycle in a project, they learn that strong ideas get stronger through honest feedback and the willingness to revise. AI fits naturally into that cycle. Used intentionally, it gives students one more source of honest feedback and one more opportunity to revise before submitting final products.
Teacher Lens
One of my favorite milestone moves is simple but powerful. Before any peer critique or gallery walk, ask students to self-critique using AI first. Give them a structure to work with. An English student drafting a literary analysis might use the Lenses Protocol as a guide, prompting AI to review their work through specific lenses: “Review my argument through the lens of evidence and audience. Where is my reasoning strong, and where does it need improvement?” A math student presenting a problem-based solution might use Stars and Stairs: “What is working well in my process and explanation, and what is one specific next step that would make my reasoning clearer?” Students bring that AI feedback into the peer session alongside input from classmates and the teacher. When students weigh three sources of feedback and decide what to act on, they are doing exactly what PBL is designed to develop.
Student Lens
Here is a practice worth building into every project. Ask students to keep a revision log tracking feedback from all sources (peers, experts, teachers, and AI) and what they decided to do with it: what they kept, what they changed, and what they pushed back on. A student might log it this way: “AI flagged that my evidence was thin in the second section. I checked three additional sources and agreed, so I revised. My teacher said my conclusion was strong, so I kept it.” That record of decision-making tells a story of intellectual growth that is entirely the student's own.
How Can AI Support Students in Developing Discernment and Critical Thinking?
No AI tool can teach a student discernment. That skill develops through real experience, not shortcuts. Students have to learn when to trust information, when to question it, and when to push back on it. PBL is designed to provide exactly that. Students build discernment by investigating real problems and defending real ideas. Those are the conditions that build the kind of thinking no AI tool can replicate.
Teacher Lens
Source evaluation should not be a one-time lesson at the start of a project. Build it in as an ongoing milestone practice. Give students a structure to work with, like Circle of Viewpoints with AI: “Here is what AI told me about this data. Whose perspective is represented here, whose is missing, and how might that change the conclusion?” As another example, a science student investigating climate data might apply the SIFT protocol before accepting any AI summary: stopping to investigate the source, finding better coverage from primary research, and tracing claims back to their origin before drawing conclusions. As the PBLWorks Using AI to Support PBL Planning guide reminds us, students should use AI to enhance their research process, not replace their own evaluation of it. When that practice is woven throughout the project, the intellectual work stays where it belongs, with the student.

Student Lens
The best PBL classrooms treat reflection as a rhythm, not a closing activity. In an AI-integrated project, that rhythm needs to include honest questions about the thinking students actually did. Coach them to ask themselves throughout the project: “What did I figure out on my own?” “Where did I lean on AI when I should have kept working myself?” “What did I learn that I could not have learned by just accepting the AI’s answer?” When reflection is integrated into the process rather than added at the end, students develop the awareness to know the difference between the tool doing the work and them doing the thinking.
One of the clearest examples of this came from my own GenerationAI action research project. I launched a student-led Help Desk where students managed device repairs, earned technical certifications, and then co-taught Google NotebookLM to their peers during library classes. What struck me most was not the technical skill they developed. It was the awareness that came with being the teacher. When students had to explain how to use an AI tool responsibly to someone else, they had to first get honest about what they actually understood versus what they had just been clicking through. That is the kind of reflection we want to build into every PBL project.
The Process Is Where the Learning Lives
The teachers I have worked alongside who are navigating this well all share something in common, and it is not a perfect AI policy. It is a well-designed process. When sustained inquiry is genuine, when critique and revision are built into the rhythm of a project, and when reflection is a regular practice rather than a closing activity, AI becomes one more resource students work with. Not a threat to the learning, and not a replacement for it.
The final product will always be the most visible part of a PBL experience. But what students carry with them is what gets built in the process: the questions they learned to ask, the feedback they learned to weigh, the thinking they learned to trust. That is where our work as educators matters most. And in the age of AI, it matters more than ever.
