Episode 1103

Published on: 11th Nov 2025

Exploring the Human in the Loop: The Role of AI in Education

The salient point of our discussion centers on the imperative of integrating artificial intelligence (AI) within the educational sphere in a manner that remains profoundly human-centered. We explore the concept of the 'human in the loop,' elucidating how technology, rather than supplanting educators, should enhance their capacity to engage with students on a meaningful level. Today, we are privileged to converse with Dr. Janice Gobert, a distinguished cognitive scientist whose expertise in inquiry-based intelligent tutoring systems provides invaluable insights into this vital intersection of AI and education. As we navigate the complexities of future-ready learning, it becomes evident that true innovation must not only harness technological advancements but also cultivate an inclusive environment where every student is afforded dignity, agency, and a sense of belonging. Join us as we delve into how AI can empower educators and foster an educational landscape that prioritizes the unique needs and aspirations of all learners.

Additional Information

The conversation centers on the profound implications of integrating artificial intelligence (AI) into educational environments, particularly through the lens of the human in the loop concept. Dr. Janice Gobert, a distinguished cognitive scientist, elucidates how AI can enhance educational experiences by supporting teachers rather than replacing them. The discussion emphasizes the necessity of empowering educators with actionable data that allows them to foster meaningful connections with students, thereby ensuring that those who typically struggle do not fall through the cracks. Through the innovative platform developed by Dr. Gobert's team, educators receive real-time insights into student performance, enabling them to provide tailored support and address individual learning needs effectively. This human-centered approach is posited as essential for cultivating a classroom environment where every student can thrive and realize their potential, challenging the prevailing notion that technology detracts from personal interaction in education.

Takeaways:

  • The concept of the 'human in the loop' is pivotal in leveraging AI for education, ensuring that technology supports rather than supplants human connection.
  • Supervised machine learning is crucial for ethical AI use, preventing biases that can arise from relying solely on historical data for student assessments.
  • Future-ready schools must prioritize the cultivation of hope and agency among students, fostering an environment where they can thrive and feel supported.
  • AI should serve to enhance teacher-student relationships by providing real-time data, enabling educators to intervene promptly when students exhibit signs of disengagement.
  • Innovative educational practices must be rooted in humanity, focusing on the individual needs of students rather than merely delivering content through technology.
  • The integration of AI in classrooms should aim to empower both students and teachers, creating a collaborative learning experience that honors diverse identities and learning styles.

Learn More about Dr. Gobert's Work

InqITS (Inquiry Intelligent Tutoring System) is an online educational environment for science. InqITS puts students in control of their own learning by letting them lead authentic inquiry experiences through NGSS-aligned virtual labs. Check out their website: https://www.inqits.com/

Join The Wheelhouse Company!

If you’re a like-minded educator who believes the future of learning must stay human-centered, we’d love for you to stay connected.

Follow Students Matter, LLC on Instagram or LinkedIn — or find any of us there: Kathy Mohney, Michael Pipa, Dr. Alicia Monroe, and me, Dr. Grant Chandler.

And we’re thrilled to invite you to step inside The Wheelhouse: Below Deck at Learn Harbor — our new online space where these conversations come to life.

It’s more than a platform — it’s a community.

A free, curated, safe harbor for educators, leaders, and thinkers who want to reflect, connect, and take action together.

Inside Below Deck, you’ll find our special segment: The Wheelhouse: All Hands on Deck, extended content from today’s episode —where purpose meets possibility and learning stays joyful, collaborative, and deeply human.

  • Join us at LearnHarbor.thinkific.com and become part of this growing movement to build Future Ready Schools — where innovation is always rooted in humanity.

Until next time, remember: Keep your doors open and your hearts even wider.

Transcript
Speaker A:

What do we mean by the human in the loop?

Speaker A:

What is supervised and unsupervised AI?

Speaker A:

What's the role of AI when we think about innovation rooted in humanity?

Speaker A:

Oh, there's so much to talk about.

Speaker A:

A new episode of the Wheelhouse begins right now.

Speaker A:

The future of education depends on a radical humanization of schools.

Speaker A:

Places where hope is cultivated, opportunities are opened, and innovation is harnessed to serve humanity.

Speaker A:

Only then can we create futures worthy of each student's dreams.

Speaker A:

We begin by cultivating hope.

Speaker A:

We ensure possibilities are real and accessible, and we design futures rooted in humanity.

Speaker A:

The future is already here, and it must be deeply human centered.

Speaker A:

The Wheelhouse exists to create an inclusive community of empowered educators who believe that together we can disrupt the transactional nature of schooling and reimagine what it means to learn, lead, and belong.

Speaker A:

We envision districts, schools and classrooms where every student feels confident, capable, optimistic, well supported, and emboldened to be and to become who they're meant to be.

Speaker A:

Each episode of the Wheelhouse explores the knowledge, practices and stories that bring this vision to life.

Speaker A:

Our team, Kathy Mohney, Michael Pipa, Dr. Alicia Monroe and I, Dr. Grant Chandler, alongside our guests, take on the fundamental challenge of realizing what we most want each and every student to experience in school.

Speaker A:

Dignity, agency and belonging.

Speaker A:

In season 11, we ask a simple but profound question.

Speaker A:

Where are we going?

Speaker A:

We strive toward future ready learning that keeps humanity at its core.

Speaker A:

Under the theme Future Ready Innovation rooted in humanity, we affirm that future readiness is not defined by devices or data, but by our shared humanity.

Speaker A:

True innovation doesn't replace human connection, it amplifies it.

Speaker A:

A future ready school prepares students to thrive by honoring their identities, nurturing their creativity, and equipping them to engage with a changing world, all from a place of strength and purpose.

Speaker A:

Last time, we turned to one of the most human elements of all.

Speaker A:

Joy.

Speaker A:

What happens when joy isn't just an outcome of learning, but it's the engine that drives it?

Speaker A:

Today, we'll talk about the power of AI to drive innovation rooted in humanity.

Speaker A:

And what does the human in the loop mean?

Speaker A:

Today's guest is Dr. Janice Gobert, a cognitive scientist with 25 plus years of experience and the lead visionary on InqITS, the Inquiry Intelligent Tutoring System, a virtual learning and assessment platform for science.

Speaker A:

Dr. Gobert is a full professor of Educational Psychology and Learning Sciences at Rutgers Graduate School of Education.

Speaker A:

She's also the lead inventor on three patents on InqITS, its underlying algorithms and methodology, and she also holds three other patents for eye-tracking technology.

Speaker A:

She recently received an Innovation Award for InqITS from her alma mater, the University of Toronto.

Speaker A:

You're listening to season 11, episode three of the Wheelhouse, where we're joined by Dr. Janice Gobert for a powerful conversation on using AI to support innovation rooted in humanity.

Speaker A:

Let's dive in.

Speaker A:

Good morning.

Speaker A:

Welcome back to the wheelhouse.

Speaker A:

I'm Dr. Grant Chandler and I am sitting here with amazing people.

Speaker A:

So let's start with the Wheelhouse team.

Speaker A:

Kathy Mohney, Michael Pipa and Dr. Alicia Monroe.

Speaker A:

Good morning.

Speaker B:

Good morning.

Speaker C:

That one was perfect.

Speaker D:

Come back.

Speaker A:

They don't even practice that, but boy, you can tell how intentional they are when they do that.

Speaker A:

It is super cool to have you back in the Wheelhouse.

Speaker A:

This season is off to a great start.

Speaker A:

In episode one, we really formally introduced our theme of future ready innovation rooted in humanity.

Speaker A:

Last week's episode, which dropped on election day, was about harnessing joy-filled learning and talking about joy as the engine that drives everything.

Speaker A:

And that was a really good conversation as well.

Speaker A:

And today I think this is just a natural extension.

Speaker A:

We're going to talk about technology and AI and how that fits into this idea of innovation rooted in humanity.

Speaker A:

And we are thrilled to welcome to the space the distinguished professor from Rutgers University, Dr. Janice Gobert.

Speaker A:

Good morning and welcome our new dear friend to the Wheelhouse.

Speaker A:

Hi.

Speaker B:

Hi.

Speaker B:

Thank you so much for having me.

Speaker B:

I'm so thrilled to be here.

Speaker A:

We're really going to talk a lot about what it means when we use technology to see and to know and to empower each and every single student, especially those students who often are missed and fall through the cracks.

Speaker A:

So we're really excited to have you here.

Speaker A:

I just want to start just kind of in general.

Speaker A:

I think a lot of people think that when you start talking about technology, what you're talking about is simply putting children on devices, and the image they conjure up is so far from human centered learning.

Speaker A:

Can you.

Speaker A:

Let's just dispel that myth right for a moment.

Speaker A:

Let's just talk big picture.

Speaker A:

How does technology empower humanity?

Speaker A:

Janice, help us get this conversation started.

Speaker B:

So I think, you know, good tech is designed to support teachers and students, but we'll talk about teachers first: to relieve them of onerous tasks so they can focus on the individual connections with individual students.

Speaker B:

And if you, for example, do teachers' grading for them, they're able to get that grading done, take the data they're getting in, in the case of the platform that we've developed in my group, and support the learner one on one so that he or she does not fall behind.

Speaker A:

You know, we hear a lot about this, and I was a teacher for 21 years, and we know that the list of tasks on teachers' plates is plentiful.

Speaker A:

So this idea of helping with some of those teacher tasks so that we have more time to engage with our children, I think this is profound.

Speaker B:

Yeah.

Speaker B:

And I think, you know, a few years ago the worry was, oh, AI is going to steal my job.

Speaker B:

But you know, I think that, well, the jury's still out on that.

Speaker B:

But I think that most parents would want a teacher supporting a student, with the technology giving the teacher this actionable information so that he or she can support the student.

Speaker B:

Because, you know, kids spend a lot of time in schools and those teacher relationships are really key.

Speaker B:

I think every single person can reflect on a teacher with whom they had a very special relationship, who has kind of impacted who they are, the decisions they've made, you know, professionally or personally, et cetera.

Speaker B:

I don't know one person who I've ever spoken to and asked this question to, you know, do you have a teacher that's influenced your life in a significant way?

Speaker B:

Almost everybody, I think everybody I've ever asked, has said that is the case.

Speaker B:

And so we need to make sure that our technology is giving teachers the ability and the actionable information so that the teacher can go over to the student and help them in real time as they need, also so that the teacher has this information about how the student might be performing a little bit differently.

Speaker B:

So in the context of my platform, which is called InqITS, it's a science platform, an AI-based science platform.

Speaker B:

So giving the teacher this actionable information about how they're doing well and how they're not doing well gives them the ability to go over and say, hey, you're really good at doing X, Y, Z.

Speaker B:

Now you just need a little bit of help doing this.

Speaker B:

Right.

Speaker B:

That's very empowering to the student to say, wow, I'm actually doing this really well and I just need support on this last aspect of the science experiment, for example.

Speaker B:

And so the teacher, what we call in computer science a human in the loop, has this information, this real time, fine grained data that he or she can use to make pedagogical decisions on the fly.

Speaker B:

And they don't have to go over to the student, kind of hover over them, and try to figure out where the student's struggling or how the student's struggling.

Speaker B:

So what the system does is using AI, it's assessing the student in real time, aligned to the next generation science standards, which many states have adopted.

Speaker B:

The other thing that I think is important for your interests with respect to this notion of humanity is that if a student is not performing the way that they typically do, we have an alert called slow progress alert.

Speaker B:

And this has a lot of AI built into it because there's a number of ways in which it's not just that they're not progressing quickly.

Speaker B:

It can have some disengagement detection under the hood.

Speaker B:

So if a student seems to not be performing the way that they normally do or typically do, the teacher can go over and say, hey, you know, Sally, Hey, George, you know, is everything okay today?

Speaker B:

And that's where the human in the loop provides this contextual information and data that, you know, the teacher can utilize to ask those students, like, hey, is everything okay?

Speaker B:

You know, and that I think is extremely powerful.

Speaker B:

So in addition to getting this information about these standards and how a student is performing and where they need help, the teacher can go over and ask them, hey, you know, is everything okay?

Speaker B:

Because you don't know.

Speaker B:

We don't know what's happened with that student.

Speaker B:

You don't know what's happened with that student.

Speaker B:

But it's an opportunity.

Speaker B:

The student doesn't have to ask for help.

Speaker B:

It's an opportunity for the student to say, you know, things are kind of rough at home or whatever.

Speaker B:

Right.

Speaker B:

I had a bad day, I got cut from the basketball team or whatever.

Speaker B:

Right.

Speaker B:

And that's that human in the loop, that emotional component: the teacher is in this wonderful relationship.

Speaker B:

And the AI supports that.

Speaker C:

Yeah.

Speaker C:

So as you're talking, Janice, it all sounds beautiful, right?

Speaker C:

It sounds like what we would hope and dream that using AI in a classroom would and should look like.

Speaker C:

Right.

Speaker C:

It's that foundation of relationships, that human to human connection.

Speaker C:

How do we go from sometimes the reality of students just being put on devices?

Speaker C:

Right.

Speaker C:

Because then that becomes, I mean, that to me is such a big leap from when you hand things to educators.

Speaker C:

Because that's what we often do.

Speaker C:

Right.

Speaker C:

Here's another thing, another tool.

Speaker C:

Use this.

Speaker C:

That's gonna do all of these things.

Speaker C:

What does that look like, to ensure that it's not just another thing and another process that students are doing, versus that level of engagement that you're talking about? That's my genuine fear and concern with these amazing tools, these opportunities, these things that we want to be able to provide.

Speaker C:

So educators have all of those things and can say, now I can focus on that human to human even more. It all sounds beautiful, but how do we even begin to get there?

Speaker B:

So I think, I'm not sure when the idea of teacher dashboards came about.

Speaker B:

It's certainly more than 10 years ago.

Speaker B:

So, you know, rather than just plunking students on yet another software platform, be it driven by AI or not, I think you really need what's called a human in the loop, as I said before.

Speaker B:

And so if the teacher is getting fine grained, actionable information, he or she can use it to support students in a way that students are not supported otherwise.

Speaker B:

Right.

Speaker B:

So I think that aspect is really important.

Speaker B:

So building out a system that combines the student platform and the student data, and then supporting the teacher in these pedagogical decisions, in addition to supporting them in detecting when, as I said, a student is maybe off task or progressing slowly, because that can be a cue to a skilled teacher to go over and ask the student what's going on.

Speaker B:

So I do think that the combination of the teacher tool, student tool is really the way to leverage that and leverage the system so that students don't fall behind.

Speaker A:

So as I was listening to both of you, two images kept coming to mind.

Speaker A:

One was, so what you're talking about is what I as a teacher would have loved, right?

Speaker A:

Which is a second set of eyes.

Speaker A:

Help me look.

Speaker A:

Help me look.

Speaker A:

Right?

Speaker A:

Help me look.

Speaker A:

Cause I got 35 little babies in my.

Speaker A:

Well, they're not little.

Speaker A:

I was a high school teacher, right.

Speaker A:

I got 35 humans in my room and I'm constantly looking.

Speaker A:

But wow, a second set of eyes would be really powerful to help me make sure I don't miss something that I don't want to miss.

Speaker A:

And then the other image that was coming to mind was, you know, what a profound tool for equity.

Speaker B:

Absolutely.

Speaker B:

Yeah.

Speaker A:

Because.

Speaker A:

Because, because a lot of our students, especially those who need us the most, don't ask for help, don't approach us.

Speaker A:

Right.

Speaker A:

For lots of reasons that we've talked about over and over again on this podcast.

Speaker A:

But wow, if I get some information, then I can reach out to them.

Speaker A:

Those were the two things that I was thinking about was second set of eyes and equity.

Speaker A:

And oh my goodness, how powerful would both of those things be for classroom teachers in the 21st century?

Speaker B:

Right.

Speaker B:

And the other thing that I want to point out is that the platform that we developed has a co-teacher functionality. So when a teacher, say, is in a classroom with many students who need special resources, for example, special education or IEPs, the system provides a co-teacher functionality.

Speaker B:

So the paraprofessional or the team teacher can get the data on students as well.

Speaker B:

So they can then kind of leverage this technology.

Speaker B:

And then it's like having, you know, all this data, but now you have two teachers who can support students.

Speaker B:

So if you're in a resource classroom with, you know, say 12 students, one teacher can take six and help them individually, and the other teacher can help another six.

Speaker B:

And this has been a game changer for special education students. It's been an absolute game changer, because you have students whose learning trajectories are longer than their peers who tend to be more typical learners.

Speaker B:

And you need a human in the loop, especially in that context.

Speaker B:

And you know, with students for whom English is not their first language, they're going to take a little longer as well. And so for our AI on the student platform, we have Spanish available now in virtually all of our labs.

Speaker B:

So we have a Spanish AI tutorial that's helping the student in real time.

Speaker B:

And then the teacher is getting this actionable information too, so the teacher can walk over and help those students.

Speaker B:

So, you know, the whole system is like an ecosystem to support a broad range of learners.

Speaker B:

And all of this was developed in an ethical and equitable way, because we used a diverse set of learners to build the AI platform and to make sure our algorithms are not biased towards one type of student or another type of student.

Speaker B:

So that's one of the things that I think people are concerned about with AI and they should be concerned about that.

Speaker B:

And another thing that I'll just tip my hat towards is the notion of what's called supervised machine learning or supervised AI.

Speaker B:

So the system is not grabbing random data off the web to make decisions about a student when they get to what's called an edge case.

Speaker B:

So the AI is, is going along kind of assessing the student in real time in our system.

Speaker B:

And if the AI says, well, does the student know this or does the student not know this?

Speaker B:

If you're not using supervised AI, if you have unsupervised AI, the system might pull IP address, which of course is a proxy for socioeconomic bracket.

Speaker B:

And that is highly, highly unethical and highly problematic, because if you're dealing with unsupervised AI, the system could say, what is the IP address, what is the zip code?

Speaker B:

And whenever I get to an edge case, I'm going to say this student doesn't know it, because I know that student is from a typically underperforming district. And because unsupervised AI has no ethics built into it, because they're just, you know, algorithms under the hood that are potentially updating themselves, it's going to say that whenever the system encounters an edge case from that zip code, that student does not know it.

Speaker B:

So the obvious problem here is that this is incredibly unethical, right?

Speaker B:

Incredibly inequitable.

Speaker B:

So this notion of supervised, we hear a lot about how do you put guardrails on AI?

Speaker B:

Well, this is one of the ways in which you put guardrails on AI.

Speaker B:

And I don't think it's malicious intent. It's just that if the algorithms are permitted to update themselves, they will, and so they cannot be permitted to.

Speaker B:

Right.

Speaker B:

And you know, what I feel very proud of in our system is that we're governed by Rutgers' IRB.

Speaker B:

And so, you know, we have to talk about these things: how our algorithms are developed and how we can ensure that they mitigate bias, etc.

Speaker B:

Right.

Speaker B:

So that's a really important thing that I think the public needs to understand.

Speaker B:

You know, you hear so much about whether AI is ethical, unethical, is it biased?

Speaker B:

Is it this, is it that?

Speaker B:

And as the community of researchers and educators who are deeply concerned about this, I think this is a conversation that needs to be brought to the fore.

Speaker A:

And this is an area that, of course, I am by no means an AI expert in.

Speaker B:

Expert.

Speaker A:

Far from it.

Speaker A:

Right.

Speaker A:

But this whole idea of supervised and unsupervised is, I think, a really important piece of the conversation for educators who maybe have my level of understanding, just enough to be dangerous.

Speaker A:

Right.

Speaker A:

Because I think that's really important to this conversation around the human in the loop and around equitably serving our students and using these tools for the greater good.

Speaker B:

Right.

Speaker B:

Because, you know, if you're looking at unsupervised AI, right, it's making judgments based on historical data.

Speaker B:

So I'll give you a very simple example.

Speaker B:

If, you know, most of the managers of businesses are white men, then when the system encounters a woman of color and the system is prompted, will this person be a good manager?

Speaker B:

Yes or no?

Speaker B:

Well, they'll say, of course not.

Speaker B:

Of course she's not going to be a good manager.

Speaker B:

It doesn't match the data that I have historically.

Speaker B:

Right.

Speaker B:

So that's a simple example.

Speaker C:

Actually, Alicia, you look like you've been holding on to something there.

Speaker D:

I'm doing the research, right.

Speaker D:

And I don't do the research in Pre K through 12 space.

Speaker D:

But I am a university professor.

Speaker D:

So I am sitting there having these conversations and it's about using technology and AI.

Speaker D:

And I love this conversation morally and ethically.

Speaker D:

And as we ground and anchor that, I truly need to understand, and I think educators need to understand that AI and technology based learning is for the purpose of supplementing, not supplanting.

Speaker D:

So when we're thinking about the human ethic and the human in the loop, love the construct of that.

Speaker D:

Right.

Speaker D:

You can't have one without the other.

Speaker D:

What I see is, I see the supplanting.

Speaker D:

So there is a replacement of the pedagogy practice and high quality instruction.

Speaker D:

Because now I'm going to sit a student on a machine and they are going to fill the voids in my practice, because I just have not built the capacity to engage with students on this level.

Speaker D:

That's what I see in Pre K through 12.

Speaker D:

In higher ed, what I see is I do see the tracking, I do see the bias, right.

Speaker D:

Whether it's one of my students coming to me.

Speaker D:

And I'm not going to name the program and say, well, because I'm of a darker hue and I have a melanated skin, a deeply melanated skin.

Speaker D:

I was tagged for possibly cheating on an exam that was proctored online.

Speaker D:

Right.

Speaker D:

So I want us to understand, I do appreciate systems that work.

Speaker D:

However, void of the human element, it could be more detrimental than good.

Speaker D:

And I wanted to really have this conversation because I am thrilled that we're in the space really engaging around this because this is a hot topic.

Speaker B:

I think you brought a couple of really important points, Alicia.

Speaker B:

One of them relates to an idea written about by somebody named Haberman.

Speaker B:

Right?

Speaker B:

It's that in good, you know, quote unquote, good high socioeconomic bracket districts, the technology is used as an enrichment and the teacher is well engaged because the class sizes are small, et cetera, et cetera.

Speaker B:

Right.

Speaker B:

Whereas in other districts there's this notion of the pedagogy of poverty, where the technology is simply used to, like, rote drill and kill kids and basically, you know, train them up.

Speaker B:

You know, basically train them up.

Speaker B:

And I think that's a problem, and hopefully it's going away with these systems that are built to address the full ecosystem.

Speaker B:

Right.

Speaker B:

The data the teacher needs, doing the real time grading, supporting the kid, giving the teacher alerts as we do so that the teacher can help individual students.

Speaker B:

The teacher can do differentiated instruction, the teacher can do whole class instruction.

Speaker B:

Because if, you know, you've got 30 kids in a classroom and 15 are struggling on the exact same thing, say running a science experiment on a simulation as they do in our platform InqITS, the teacher can't help 15 students at a time.

Speaker B:

So even if the teacher is well meaning to do so, the student is disengaging because they're waiting for the teacher.

Speaker B:

Right?

Speaker B:

So Rex, our AI tutor, can support those students as they're working through the system, and maybe they're going to figure it out with some assistance from the AI agent Rex.

Speaker B:

Right.

Speaker B:

Which is great.

Speaker B:

I mean the teacher can't always be there and students do need to develop that agency.

Speaker B:

One more thing, the other thing that I think is important is that no student wants to be the one that the teacher always has to go over to.

Speaker B:

Right?

Speaker B:

Just like a student's not going to raise their hand because hey, once they're in middle school or high school, they don't want to seem like they're uncool.

Speaker B:

They don't want to ask for help.

Speaker B:

So having a system that supports students simultaneously as it supports teachers, the teacher can kind of say, hey, I've been over to help, you know, Johnny or Sally or whatever.

Speaker B:

I've been over there already twice.

Speaker B:

I'm going to let Rex get them.

Speaker B:

Right?

Speaker B:

I'm going to see if Rex can get them, so the other kids don't see that Johnny is always the one needing help.

Speaker B:

And that is a powerful thing too, right.

Speaker B:

During COVID, the U.S. Department of Education asked us to give away our platform for free worldwide. And we did that, and we hardened our infrastructure so that we could have many, many simultaneous users, because teachers from 90 countries signed up to use the system.

Speaker B:

And we found that kids did better.

Speaker B:

Many kids, including kids from, you know, areas where typically they underperform, did better at home with InqITS and our AI tutor Rex than they ever did in the classroom.

Speaker B:

That's pretty amazing, right?

Speaker B:

We actually wrote a paper about this, an academic paper. And sometimes I think students are like, okay, well, I feel like I'm not a science or STEM learner, so I'm going to pretend that it's not important to me.

Speaker B:

When I'm in the classroom, I'm going to take this stance.

Speaker B:

Well, I'm too cool for this anyway.

Speaker B:

But when they're home by themselves, I think they're thinking, I know that I can get a good job if I get into a STEM program for college or whatever.

Speaker B:

Right.

Speaker B:

And they're re-engaging in a way that is supporting them.

Speaker B:

And we have this student report so they see how much better they did this time than they did the last time.

Speaker B:

And that, in my experience, is extremely powerful because I think that people generally want to do well.

Speaker B:

I, I think they do.

Speaker B:

I think students want to do well.

Speaker B:

And when they see, oh gee, I have improved, it's motivating to them.

Speaker C:

Yeah.

Speaker C:

And I think your mentioning of COVID was, I mean, it's exactly where I was going to go.

Speaker C:

So to Alicia's point in regards to this supplanting versus supplementing: when teachers were thrown into this space of one to one technology, many districts, especially those that tend to serve marginalized populations, didn't have those resources.

Speaker C:

And now all of a sudden they do.

Speaker C:

And then, okay, I don't, I don't really know what to do with it.

Speaker C:

And then it just becomes this, you know, supplanting of now that students are back in the classroom and you know, we can go into, well, behaviors are elevated and all of this stuff going on.

Speaker C:

It's just so much easier if I put them on a device over in their seats, and then everybody's, you know, doing their thing, this rote drill and kill, and it becomes this supplanting of this humanizing education.

Speaker C:

It's missing in a lot of spaces, because we've thrown educators into this without then continuing to fully support them and build that learning and all of those pieces to ensure that that human is in the loop.

Speaker B:

So we're assessing the full range of practices.

Speaker B:

You know, how they're forming a question, how they're collecting data, how they're interpreting the data, how they're using the mathematics to deepen their understanding of the phenomenon, which is why those computational mathematical practices were included in NGSS.

Speaker B:

And then, you know, how well they can communicate the claim, evidence, reasoning, and we're auto scoring all of that using a combination of different kinds of AI, machine learning, knowledge engineering, and natural language processing, which is a kind of a type of LLM.

Speaker B:

Right.

Speaker B:

LLMs are based on natural language processing, which is actually old technology.

Speaker B:

It's not new technology.

Speaker B:

It just became open source and widely available recently.

Speaker B:

That's why we hear so much about it.

Speaker B:

It was actually developed decades ago.

Speaker B:

It's kind of interesting, right?

Speaker D:

Yeah.

Speaker B:

But I think, you know, as you say, the writing component, because we need to communicate with each other and communication is a 21st century skill.

Speaker B:

It's emphasized in the NGSS as well as all these other practices, because that's the means by which we display what we know, in part to communicate science to the public.

Speaker B:

But it's not the only way.

Speaker B:

And so it's important to assess those competencies.

Speaker B:

And that's why we're using various kinds of machine learning and knowledge engineering that are assessing them at the mouse click level.

Speaker B:

So it's taking all their mouse clicks, their clickstream data, and aggregating it up.

Speaker B:

And then patented algorithms are assessing how well they're forming a question, collecting data, interpreting data, warranting claims, doing the mathematics, doing the writing.

Speaker B:

Now, where I feel like it's really serving those kids who fall behind, or who sometimes get missed: they're what we might call someone who's parroting, or someone who, you know, think about the bell curve, right?

Speaker B:

You have kids who can do all this math and science.

Speaker B:

But if the teacher is relying on an assessment that's only looking at their writing, like the lab report, which is what's historically been done in science, that student who can do all this math and science but is not good at communicating in words because, say, English isn't their first language, that child is being misassessed.

Speaker B:

That's not fair, that's not ethical, that's not equitable.

Speaker B:

Furthermore, how can they compete with a unilingual Anglophone?

Speaker B:

Right.

Speaker B:

And how are they going to compete for a college major in STEM if the teacher is relying on what they write?

Speaker B:

It's not fair to them, right?

Speaker B:

Or these kids.

Speaker B:

Sometimes these kids are just, like, math and science geniuses.

Speaker B:

Writing is not their strength.

Speaker B:

That's not fair to them.

Speaker B:

At the same time, you have kids who are just parroting what they've read or heard and they don't even have the data that they're talking about.

Speaker B:

So the teacher's getting this alert on the dashboard and can walk over and say, hey, you know, Billy, you've actually written a pretty good claim evidence reasoning statement.

Speaker B:

But now you need to go back and do the experiment to deepen your understanding.

Speaker B:

And this is an example of the rich ways in which teachers are using our actionable information to make data driven decisions.

Speaker B:

But also for that student who feels like, gee, you know, I'm not good at science, I'm just going to catch the buzz, I'm going to do some random clicking through the system and then I'm going to write my claim evidence reasoning statement.

Speaker B:

And they don't actually have the data and they're getting pushed along, pushed along, pushed along.

Speaker B:

They're playing the school game because that's what they feel is their way forward, their only path forward.

Speaker B:

But when you say to them, hey, you're actually really good at this piece and this piece, and they're like, what?

Speaker B:

Like, we have had so many conversations with kids, because we built InqITS working with kids from an after school program in Massachusetts.

Speaker B:

And they came to our lab once a week on a bus.

Speaker B:

And there were at the time 12 or 13 people in my group. The bus had 12 seats, and so we had 12 kids.

Speaker B:

They'd come and you know, sometimes we had them testing stuff for us, but we were working with them one on one.

Speaker B:

And sometimes I would say, hey, what are you doing in science next week?

Speaker B:

Or what are you doing right now?

Speaker B:

And I remember this one conversation, a student said, you know what we're doing?

Speaker B:

We're doing plate tectonics.

Speaker B:

I said, so what do you like about that?

Speaker B:

They're like, I don't like that topic.

Speaker B:

I don't know anything about that topic.

Speaker B:

I'm like, hey, want to be a superstar next week?

Speaker C:

Yeah.

Speaker B:

So we worked with them on the plate tectonics activity and they went to class and all of a sudden they're the one raising their hands, right?

Speaker B:

And they came back and he's like, Miss Janice, Miss Janice.

Speaker B:

I did so well.

Speaker B:

I knew all the answers.

Speaker B:

It was really cute.

Speaker B:

So kids are getting back in the STEM game because we're supporting them where they need the support and pointing out to them, hey, you actually did really well at this, this, and this.

Speaker B:

Right now you only need support on this piece and this piece. And that transparency, to the educator but also to the student, is a game changer for them.

Speaker B:

You know, those in special ed, those who are operating in a second language, those who are just not good at writing.

Speaker B:

So you need a system that can assess the full range of competencies, as expected.

Speaker B:

And we're really, really seeing that this is improving state scores, like on the state science summative, bringing up, you know, state scores beyond where schools have ever been.

Speaker B:

And you know, that's, that's fantastic.

Speaker B:

That's a real point of pride for us.

Speaker B:

Not that I think state tests are the be all and end all, but they're not going away anytime soon.

Speaker B:

They're probably going to change in nature to more AI based testing.

Speaker B:

But, you know, it's important that we give kids those competencies, because these tests are seeking to assess the competencies and 21st century skills that kids will need.

Speaker B:

And we are developing more and more important tests. I've just come from a meeting at the National Academy of Education about AI and assessment, talking about these real issues about what we will do in the age of AI with these standardized science assessments that tend to be rather low level.

Speaker B:

And you know, teachers have been kind of forced into this role of teaching to the test, because they're being judged based on whether their kids do well on these multiple choice items.

Speaker B:

So that's a whole other set of issues.

Speaker D:

Dr. Gobert, I would say that in the story you told us about the conversation with that student, you were the human in the loop in that moment, because that moment is what we've been calling our transformative moment, where something is reclaimed, something is restored, and access has been acquired.

Speaker B:

The student acquired access.

Speaker B:

Right.

Speaker B:

And you know, it's amazing, because we've had teachers tell us, like, oh, you know, I thought they knew more, but I could never understand how well they could do X, Y, Z until I had a system calibrated and instrumented to collect that data and assess kids on the full range of competencies.

Speaker B:

And nothing is more powerful.

Speaker B:

And you know, I'm kind of glass half full, but nothing is more powerful than saying to a student, hey, you actually did really well on this part, this part and this part.

Speaker B:

And they look at you like, wow, you're kidding me.

Speaker B:

And you know, it's really powerful to tell that to a student.

Speaker B:

They get back in the science game because I think early on they say, this is not for me, and they disengage.

Speaker B:

And so a system that can point out to them where they're doing really well and just the small areas in which they need support and give them that support is a real game changer for students.

Speaker A:

And that, my friends, brings us to the end of episode three.

Speaker A:

Thank you so much, Dr. Janice Gobert for joining us in the Wheelhouse today.

Speaker A:

If you want to hear more about this conversation, then I certainly would encourage you to check out our after show, the Wheelhouse All Hands on Deck.

Speaker A:

And we'll see you next week in the wheelhouse.

Speaker A:

If you'd like to hear more of this amazing conversation, then we have a second part, or an after show, called the Wheelhouse All Hands on Deck, only available at Learn Harbor.

Speaker A:

So join us in The Wheelhouse: Below Deck at learnharbor.thinkific.com, and that's a wrap on season 11, episode three of the Wheelhouse.

Speaker A:

A special thank you to today's amazing guest, Dr. Janice Gobert, professor of Learning Sciences and Educational Psychology at the Graduate School of Education at Rutgers University and the CEO and founder of InqITS, along with the Wheelhouse team, Kathy Mohney, Michael Pipa, and Dr. Alicia Monroe, for helping us navigate this season's journey toward Future Ready Schools:

Speaker A:

Innovation rooted in humanity.

Speaker A:

If you're a like minded educator who believes the future of learning must stay human centered, we'd love for you to stay connected.

Speaker A:

Follow Students Matter LLC on Instagram or LinkedIn or find any of us there.

Speaker A:

Kathy Mohney, Michael Pipa, Dr. Alicia Monroe and me, Dr. Grant Chandler.

Speaker A:

And we're thrilled to invite you to step inside the Wheelhouse Below Deck at Learn harbor, our new online space where these conversations come to life.

Speaker A:

It's more than a platform, it's a community.

Speaker A:

A free curated safe harbor for educators, leaders and thinkers who want to reflect, connect and take action together.

Speaker A:

Inside Below Deck, you'll find our special segment the All Hands on Deck extended content from today's episode where purpose meets possibility and learning stays joyful, collaborative and deeply human.

Speaker A:

Join us at learnharbor.thinkific.com and become part of this growing movement to build future ready schools where innovation is always rooted in humanity.

Speaker A:

Until next time.

Speaker A:

Remember, keep your doors open and your hearts even wider.

About the Podcast

The Wheelhouse
Where Each Student is Distinctive and Irreplaceable
The Wheelhouse exists to create an inclusive community of empowered educators who believe that, together, we can disrupt the transactional herding nature of schooling to create districts, schools, and classrooms where each student feels confident, optimistic, capable, well-supported, and emboldened to be and to become who they are meant to be.

Guiding Principles
1. We are steadfastly committed to each learner and each educator believing they are distinctive and irreplaceable.
2. We believe that educating our children should be a humanizing, relational, and transformational endeavor. All else is secondary.
3. We believe that dignity is a birthright; it is not earned. Each child deserves a future filled with open doors and unlimited possibilities. Our work is in service to this central aspiration.
4. We believe that each human life is unique and precious; as such we are compelled to remove aspects of schooling that disregard any student’s dignity.

About your hosts

Grant Chandler

Dr. Chandler is currently the president and chief executive officer of Students Matter, the producer of The Wheelhouse. Along with Kathleen M. Budge, Chandler is the author of Powerful Student Care: Honoring Each Learner as Distinctive & Irreplaceable (ASCD, 2023). Chandler brings over 35 years of practical experience as a high school teacher, building and central office administrator, higher education dean, professional learning director in an outreach department at a large research university, and technical support provider and executive coach. Since 2005, Chandler has provided technical support to over 350 districts in developing systemic approaches to solving student learning issues and was recognized by the US Department of Education as a national expert in small learning communities. He has designed and led professional learning experiences at many levels of the K-12 arena and for many different audiences and has conducted numerous workshops at national, state, and regional conferences. His consultancies include boards of education, state and regional service providers, as well as individual schools and local districts across the United States and internationally. In his spare time, he’s writing a children’s book and raises standard poodles for animal assisted activities. Contact him at grantchandler@ourstudentsmatter.org or www.linkedin.com/in/grant-a-chandler.

Katherin Mohney

Kathy Mohney continues as an inspiring voice and thinker on The Wheelhouse since she began in Season 4. Katherin is a veteran educational leader, having served as a local superintendent, a local state and federal program officer, and a technical service provider for local districts, in addition to her work as an elementary teacher, instructional coach, principal, and consultant. Kathy strongly advocates for each student, understanding that a high-quality education is the foundation for having more opportunities beyond their K-12 education. Kathy earned her Bachelor’s degree in Elementary Education from Western Michigan University and her Master’s in Educational Leadership from Michigan State University. In her spare time, Kathy enjoys spending time with her husband, daughter, son-in-law, son, and her two fur babies.

Michael Pipa

Mike is a 36-year veteran educator. Before joining the CASDA faculty, he worked as an administrator at both the high school and middle school levels. Prior to his administrative career, Mike taught English Language Arts in middle and high school, achieving National Board Certification in 2006. He has worked extensively in support of students at risk as well as led his building’s professional development efforts.

Mike has worked as an instructional and administrative coach supporting staff in several area schools.

Alicia Monroe

Alicia Monroe, EdD, is a PK–20 experienced educator, international education consultant, and career coach. She has served as a teacher, supervisor, assistant principal, principal, assistant superintendent, and adjunct professor. Her notable success in creating a culture of belonging and achievement in schools along with her expertise in developing equity and access models that frame educational opportunities for all students are the core of the ongoing professional learning and support she provides to school districts.

Dr. Monroe teaches undergraduate, graduate, and doctoral courses in Africana Studies and education at a state university. Her partnership with the Office of Accessibility Services and Center for Neurodiversity has provided for collaborative planning, mentoring, career coaching, and internship and job placement for diversability students and alumni.

Dr. Monroe is the CEO and founder of Solutions for Sustained Success, LLC. Through her private practice, she serves as national faculty for the Association for Supervision and Curriculum Development (ASCD). The whole child/whole student/whole educator framework that she was instrumental in designing is a trademark of ASCD.