Technology Essentials in Education Episode 12:
AI Ethics in Action
Host: Monica Burns
Mar 13, 2026
About the Episode
Technology Essentials in Education is your go-to podcast for practical insights on using technology to simplify your school week. Hosted by author and educator Monica Burns, Ed.D., in partnership with Jotform, this series is designed for K-12 educators, administrators, and leaders looking to make a meaningful impact. In this episode, Monica chats with Nasser Jones, founder of the nonprofit Bending the AI Curve, who shares his mission to ensure the future of technology is equitable for everyone. Together, they explore how to move beyond a reactive "policing" mindset to find proactive, strategic solutions that bridge the AI gap and prepare all students for the digital age. They discuss why minimizing "tool fragmentation" is essential for reducing teacher burnout, how to use agentic AI to identify and surface bias in school data, and how the AI Academy is empowering communities of color through culturally relevant tech resources.
Hello there, my name is Monica Burns, and welcome to Technology Essentials in Education.
Today's episode is titled AI Ethics in Action: Bending the AI Curve, with Nasser Jones.
In today's episode, I'm joined by Nasser Jones, who is the founder of the nonprofit Bending the AI Curve.
Nasser is on a mission to ensure that the future of technology is equitable for everyone.
In today's episode, we dive into this critical world of AI ethics and talk about what it means to truly prepare all students for this age of AI.
Lots of great tips, strategies, and gems in this conversation. Let's get into it.
This episode is brought to you by Jotform. Jotform provides an all-in-one solution to streamline administrative tasks, enhance community engagement, and foster innovation.
Using their no-code drag-and-drop forms and workflows, your teams can securely collect and store data, automate tasks, and collaborate on team resources.
Educational institutions are also eligible for a 30% discount on Jotform Enterprise. Head to their website to learn more at jotform.com slash enterprise slash education.
Welcome to the podcast. I am so excited to chat with you today about a really big and very important topic of AI ethics and what this means for folks inside and outside of education.
Before we get into all of that together, can you share a bit about your role? What does your day-to-day look like?
Yeah, my name is Nasser Jones, and thank you for having me on this podcast, Doc. I really appreciate it. I'm excited about being here, of course, being with a fellow educator.
My day-to-day job is I build custom apps for corporate, nonprofit, and educational institutions. I also run a national nonprofit organization called Bend the AI Curve.
Amazing. Those are all things I want to get into in our conversation today. You mentioned bending the AI curve, and that's kind of where I want to start.
Today we're talking about AI ethics and that term you use, bending the AI curve. What problem or issue are you referring to there?
Literally nine months ago, I was out of business. I spent the last two and a half years telling people AI is coming for some jobs and maybe replacing some businesses.
I had a full service marketing firm serving educational organizations, college access professionals, and such. My clients would come to me and say, I just built a new website on Wix or a brochure on Canva.
Great for them, not great for my bank account. I've been in the AI space since day one, since OpenAI launched ChatGPT, and knew what we had on our hands.
Nine months ago, I started pivoting into the AI space seriously and started looking at what I could get ahead of.
Every time I would get ahead, OpenAI or Claude would come up with something new. About six months ago, I started reading eight newsletters every day, mostly around nonprofits, education, and small business.
I started stumbling on data from MIT and Stanford that showed me a disturbing trend.
I got ahead of something called vibe coding, using AI to help you build apps, and built a whole app store. I was going to go down the traditional route: build software, get funded, run ads.
But data showed a gap in the AI space where some demographics in the US are seeing exponential growth, spending billions, while others are flatlining or declining.
We've always seen education and wealth gaps, but now we're seeing that with AI. Groups like schools, rural communities, indigenous communities, and veterans are seeing their curve flatline or go down.
We have to address this gap or we won't build the people power to take on the jobs all this private investment is creating.
That phrase bending the AI curve is a great visual for people trying to understand this concept and the disparities many see in their own work.
When you hear AI ethics, what are the first two or three ethical questions you want decision makers to ask before adopting AI tools for their learning communities?
What I'm seeing in education is a reactionary response to AI ethics or policies. People harass their IT departments thinking it's a technical issue.
The first thing to consider is am I just reacting to what I hear in the marketplace?
Second, we need to be thoughtful and apply AI ethics across the board. You can't say students shouldn't use AI but teachers and admins can, as that creates adversarial relationships.
I don't want teachers being AI policemen in the classroom. Third, we need to think about how AI can be a tool that does more good than harm.
We need to be strategic, proactive, and invite more constituents into the conversation, including board members, administrators, teachers, parents, and the community.
You need skeptics and enthusiasts at the table. Everyone along the continuum is worthy to have the conversation.
I'm glad you said that because I encourage people to hold a healthy hesitation when interacting with new technology. It doesn't mean a hard no, but a thoughtful pause.
Being proactive, transparent, and having open conversations like the teacher-student example are important layers school leaders and educators should keep in mind.
Where do you see bias show up most often in AI products? Is it data, model, user experience, or how results are used?
Bias often starts with the data we use. In education, we've seen bias in student discipline, course recommendations, and placements.
If your data is biased from the start, your outcomes will be flawed. The good news is AI can help analyze if data is biased and how that bias might show up in curriculum, discipline, or hiring.
You can use AI to develop and execute plans to address bias, and even if you fail forward, AI can adjust.
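The idea of using AI to check whether school data is biased can be grounded with a simple disparity check. The sketch below is a minimal illustration in plain Python, with made-up records and hypothetical group names rather than any real district data: it computes discipline rates per student group and a risk ratio against the overall rate. A ratio well above 1.0 is a signal for a closer, human-led review, not a verdict.

```python
from collections import Counter

# Hypothetical example records: (student_group, was_disciplined)
records = [
    ("group_a", True), ("group_a", False), ("group_a", False), ("group_a", False),
    ("group_b", True), ("group_b", True), ("group_b", False), ("group_b", False),
]

def discipline_rates(records):
    """Return the discipline rate for each student group."""
    totals, disciplined = Counter(), Counter()
    for group, flag in records:
        totals[group] += 1
        if flag:
            disciplined[group] += 1
    return {g: disciplined[g] / totals[g] for g in totals}

rates = discipline_rates(records)
overall = sum(flag for _, flag in records) / len(records)

# Risk ratio: how each group's rate compares to the overall rate.
# Values well above 1.0 flag a group for further investigation.
risk_ratios = {g: rate / overall for g, rate in rates.items()}
print(risk_ratios)
```

The same pattern extends to course placements or hiring data: pick the outcome, compute per-group rates, and compare them to the baseline before drawing any conclusions.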
We're moving past call and response AI use, where uninformed inputs lead to uninformed outputs. We need an agentic approach, solving problems from multiple angles.
Today, AI can flag issues in real time, like discipline problems early in the school year, allowing timely interventions for students and coaching for teachers.
If you start with flawed data or approach, it's not your fault; the technology is new. Another issue is fragmentation, where schools have many AI tools, creating complexity.
Teachers may have to learn five new tools, which is overwhelming. Ideally, districts should use only two or three tools, like a primary language model and maybe one additional tool.
More than three tools add unnecessary complexity. Someone should help compress the learning curve for constituents.
Investing in many tools means investing time and energy to learn and switch between them, which is very different from using just a couple of tools.
Teachers are trauma-informed educators, but they've been traumatized by constant changes and added layers of work over the last 25 years, making their job harder and less fun.
Teachers are overwhelmed, and now AI is another new thing added to their plates while they manage struggling students.
We've traumatized a generation of teachers over the last 20 years, and I want to help solve that.
When talking to educators, it's not necessarily adding something new but eventually helping take things off their plates.
Jotform lets you build forms in minutes, like student surveys and homework submissions, with free templates designed for schools and districts.
Educational institutions can get a 30% discount on Jotform Enterprise at jotform.com slash enterprise slash education.
You mentioned the number of tools someone might have access to. What are the biggest barriers to meaningful AI participation? Device access, connectivity, language, trust, time, or something else?
I talk to organizations about their slack, the thing preventing them from moving forward. Organizations drift toward complexity, not simplicity.
Often, policies prevent adopting helpful tools. For example, concerns about privacy and sharing data with AI systems like OpenAI are big challenges.
Policies need to be rethought from the ground up, which requires serious conversations among boards.
Policies are often created after issues arise and rarely revisited to see if they still work.
Other impediments are practical, like how to approach curriculum design and discipline policy, and how to use AI to clear roadblocks and move toward optimization.
Working with schools, I see pushback and decisions about what to bring into communities and what stakeholders are comfortable with.
I'm curious about student interactions and how to facilitate developmentally appropriate conversations about AI ethics, bias, and automation in classrooms.
With vibe coding and app creation, teachers should think about creating apps to help students work more efficiently.
We know students are using AI because teachers are using AI. The question is what is appropriate and ethical use.
Are students just using AI to get the quickest result? In my classroom, I help students add layers to the conversation.
English teachers are often the biggest resistors to AI because they want students to maintain their voice in writing.
Some traditional skills like cursive are fading, and there may be less introspection in writing.
I add layers to the conversation with students, like if you generated this with AI, do a video about your response or prompt.
I'm working on AI technology to help students better maintain their voice, walking them through waypoints like the old Oregon Trail game.
I start with social-emotional learning, asking students where they are today and how they feel, as environment affects writing.
Then I ask about research approaches, sources used, and if AI pulled sources for them.
I want students to go through the whole process, write, and look at punctuation. The AI agent constantly makes recommendations.
Even skeptical teachers can use AI to target support for students precisely, based on performance, ESL status, and academic level, and fine-tune it to district goals and standards.
This helps move students along pathways more quickly instead of a one-size-fits-all approach.
These are basic things teachers could use today, and we could drill down much more, but time is limited.
This is perfect for someone wrapping their head around what's possible and thinking about the rest of the year with their students.
Before we finish, tell us about the AI Academy, your mission, and where people can connect and learn more.
Bend the AI Curve is my ethos that AI should be for all. I've created an app store with 70 free apps to remove roadblocks.
Whether you're in school or writing a grant or resume, I want people to have access to AI in every way possible and give it away for free.
You can bend the AI curve where you are. I built this app store to help with access.
Nonprofits or schools can set up computer labs or community centers for parents to access AI if they lack internet or devices.
Many Latino families have only one smart device, often just a cell phone without a data plan, which creates complexity for students doing work online.
You can download the playbook for your demographic and bend the AI curve.
I'm launching a 10-city tour starting in Philly, going to cities with the most need like Chester, Wilmington, Baltimore, D.C., Northern Virginia, Atlanta, Chicago, Detroit, Vegas, Phoenix, and back to Southern California.
We're launching the AI Academy for students of color because these cities have particular challenges.
I'm starting pilots in every city with a week of AI conversations. This is not for nerds but to begin necessary conversations.
I also have My University, where veterans teach veterans how to prompt and get comfortable with AI technology.
Content is culturally connected and relevant, addressing needs of reservations, indigenous people, and their place in the world with technology.
You can bend the AI curve in your district, classroom, or community by implementing policies to help your specific community.
If you need help, I'll come in, look at your biggest district problems, open up the computers, and walk through solutions together.
Before leaving, you'll have a roadmap to solve problems and a framework for agentic thinking about AI tools.
My goal is to increase your utilization of AI tools from 10% to 30, 40, or 50%, rather than adding complexity and fragmentation.
You can find me online, connect with my team, and I'm happy to help your organization. There's no financial barrier because AI should be for all.
This has been fantastic. I appreciate your time, energy, strategies, and resources for educators.
We'll link out to all of that so people can find and connect with you. Thank you so much for your time today.
You are so welcome. Thank you for having me, Doc.
It was lots of fun chatting with Nasser Jones today.
Let's make this EdTech easy with a few key points from the episode.
The AI gap is widening, with some communities accelerating while others risk falling further behind.
Schools often respond reactively to AI ethics instead of taking a strategic, inclusive, proactive approach.
Biased data leads to biased outcomes, and AI can help analyze and surface inequities in curriculum, discipline, or hiring.
Too many AI tools create fragmentation, and most districts might need only a primary language model and possibly one additional tool.
Thanks so much for joining today's episode. Make sure to check out the resources to learn more about Nasser Jones' work.
A big thank you to Jotform, the presenter of today's episode. To learn more and get a 30% discount on Jotform Enterprise, head to jotform.com slash enterprise slash education.
