Unpacking Education & Tech Talk For Teachers

Prosper, Prepare, and Protect (Part IV: Brookings AI Study–Framework for Action)

AVID Open Access Season 5 Episode 83


In today’s episode, we'll review the 12 suggested action steps for effectively adopting AI into our schools as outlined in the Center for Universal Education at Brookings' study "A New Direction for Students in an AI World: Prosper, Prepare, Protect."

Paul Beckermann 0:00 Welcome to Tech Talk for Teachers. I'm your host, Paul Beckermann.

Transition Music with Rena's Children 0:05 Check it out. Check it out. Check it out. What's in the toolkit? Check it out.

Paul Beckermann 0:15 The topic of today's episode is "Prosper, Prepare, and Protect, Part Four: Brookings AI Study—A Framework for Action." Today is the final installment in our unpacking of the report titled, "A New Direction for Students in an AI World: Prosper, Prepare, Protect," by the Center for Universal Education at Brookings.

So far, we've reviewed the scope of the study and outlined both potential benefits and risks to integrating AI into our K-12 classrooms. The authors of the report have weighed both these advantages and disadvantages and write that, "Without proactive and comprehensive intervention, the risks and harms we have identified are likely to persist and intensify, potentially outweighing the benefits of AI and widening disparities in access and effective use." To this statement, they add two very important follow-ups: 1) It's not too late to shift the current trajectory; and 2) We must act with urgency in response to their call for action.

The authors of the Brookings study conclude their report by outlining 12 action steps divided into three subcategories: Prosper, Prepare, and Protect. These subcategories form the structure of the framework for action, which is ultimately intended to steer our schools in a positive direction. Regarding AI adoption, they urge readers of the report to identify and act on at least one recommendation in the next three years. In today's episode, I'm going to dig into these 12 potential action steps. As you are introduced to them, consider which one or ones you might pursue.

Transition Music with Rena's Children Let's count it. Let's count it. Let's count it down!

Paul Beckermann First, let's dive into the four recommendations under the PROSPER section. The Prosper pillar focuses on transforming teaching and learning experiences so that children and youth can thrive in an education system where AI is omnipresent.

Number one: Shift educational experiences in school. This section states explicitly: technology is not pedagogy. Rather, it's a tool that can potentially empower educators to overcome past challenges and limitations and reshape learning for the better. What we don't want to use it for is a continuation of poor learning strategies, described by one student as "memorize, recite, and forget." That would be an example of using AI to amplify poor practice.

Instead, we need to identify the core competencies students will need to thrive in an AI world and then identify when and how AI should and should not be integrated into teaching and learning experiences to achieve those goals. This means shifting some educational experiences and crafting new pedagogies that are AI-aware, AI-assisted, and, when necessary, AI-resistant. The authors also suggest an interdisciplinary approach to learning, where the sciences are integrated into the humanities and social sciences. They argue that this will be important for developing critical thinking and ethical reflection in an AI world. Learning experiences should also be designed so that they connect to students' interests while providing some degree of choice and control to help boost their motivation and engagement and allow for the development of student agency.

Number two: Co-create educational AI tools with educators, students, parents, and communities. This action step is targeted toward companies that create AI tools. It encourages them to form true collaborative relationships with stakeholders, especially students and teachers, to make sure their products transform learning based on research rather than simply repackage old practices in a flashy new interface. Currently, teachers represented in the report say they feel like AI is "being done to us, not with us."

The authors offer a few suggestions for how to do this. One is to establish "Teach-Tech" co-design hubs, where teachers are involved with designing new products from the earliest stages of ideation through production. Another idea is to include students and parents in AI decision-making and design. This might include a student AI council or a parent committee. Finally, the authors suggest involving communities so that local languages and priorities are supported through the new tools and platforms.

Number three: Use AI tools that teach, not tell. Publicly accessible generative AI tools like ChatGPT and Google Gemini were not designed with research-based learning strategies in mind. They were developed to provide the general public with fully developed answers to prompts and questions; they were not designed to teach. The recommendation here is for tech companies to create AI-powered tools that are specifically designed to be child-friendly and to facilitate learning. One suggestion is to design an AI interface that isn't so quick to agree and please; rather, design it to challenge, critique, and productively disagree with users. This approach could push students toward greater self-reflection and higher quality standards. Other suggestions include designing AI tools to provide reasoning behind the answers they provide and to require users to pause and reflect before they get AI assistance. In general, the goal is to keep students in the mode of critical thinking and self-reflection.

Number four: Conduct research on children's learning and development in an AI world. This is number four, but it probably could be number one. As the report states, there is an urgent need for high-quality research to track students' learning, well-being, and development in an AI world. AI is so new that we simply don't know enough yet about how to best deploy it for the highest quality learning experiences.

Within this research context, the authors provide a few suggestions that align closely with the structure of this report. First, we should prioritize research that identifies AI risks and solutions for how to mitigate them. Similarly, we should research ways to leverage AI to benefit student learning and development. In addition to these student-focused studies, we should also discover what support teachers need to be successful, including ways to enrich student experiences and strengthen motivation, engagement, and agency. Any research that is undertaken should include a variety of quality research approaches, including examining real school situations, conducting controlled studies, running evidence-based pilot programs, and completing rigorous longitudinal research.

So, those are the four proposed action steps under the Prosper section. Now let's take a look at recommendations five through eight under the next section: PREPARE.

The Prepare pillar acknowledges that preparation requires building the knowledge, capacity, and structures for ethical and effective AI integration, ensuring that schools develop clear AI visions with dedicated resources, organized adoption processes, and measurable evaluation criteria to track implementation success. With that context in mind, the Prepare section urges governments, funders, private sector players, education systems, families, and communities to all work together to advance the following actions.

Number five: Promote holistic AI literacy for students, teachers, parents, and education leaders. The report cites TeachAI's definition of AI literacy, which states that AI literacy includes the knowledge, skills, and attitudes associated with how AI works—including its principles, concepts, and limitations—as well as how to use AI. To promote and support this definition of AI literacy, several specific actions are outlined. One is to adopt holistic AI frameworks to guide implementation. There are many frameworks already available that can be adopted, including examples from the European Commission, ISTE, ASCD, and UNESCO. In addition to adopting a framework, schools should create or adopt guidelines for AI literacy. National guidelines are ideal and can provide consistency and clarity, but where these are absent, schools or states can adopt their own. Another specific action step is to support systemic AI literacy approaches. This means that AI literacy should be integrated in all curriculum areas and not be confined to just computer science classes. Another recommended strategy is to support peer-to-peer AI literacy, where students become leaders, mentors, and facilitators for their peers. Finally, to ensure a consistent message and reinforced approaches, schools should include families and communities in these AI literacy efforts.

Number six: Prepare teachers to teach with and through AI. If we want our students to be AI-literate, we need our teachers to be AI-literate. This can happen in a couple of different ways. Ideally, AI literacy becomes a component of pre-service teacher preparation so teachers have these skills when they enter the profession. However, the pre-service teaching curriculum often takes a while to catch up to current learning needs. It also doesn't address the many teachers already in the profession. Therefore, robust in-service professional development is also needed. As with any other quality training, these learning experiences need to be differentiated and sustained. They should also both precede and accompany AI implementation so teachers are supported throughout the process. This learning must also go beyond simple use of AI; it needs to weave together AI skills with effective pedagogy, course content, and techniques for developing high-quality student learning experiences.

Number seven: Provide a clear vision for ethical AI use that centers human agency. The report begins the section with a very strong statement from one of its panelists: "Thinking and learning are not tasks to be outsourced to AI. They are how students build their identity, agency, and dreams. While AI offers speed and fluency, it cannot experience wonder, wrestle with doubt, or choose values. Education systems can help students see that struggling to find their own words and pursue their own questions is what makes learning meaningful and life worth living." To do this, the authors recommend developing a clear vision for how AI can be used to help ethically advance human agency. This begins with solid policy that can guide this vision into practice. An example of related guidance is provided from the Washington Office of Superintendent of Public Instruction, which states that good AI use always starts with human inputs and inquiry and always concludes with human reflection and edits.

Number eight: Employ innovative financing strategies to close the AI divide. This recommendation is focused on making sure that all students and teachers have access to AI and AI literacy education through proper and equitable funding. Especially in under-financed regions, this may require innovative funding models, including public-private partnerships. The government can also provide funding structures that ensure access for all. This could include models like the current E-Rate system, as well as other mandatory school discounts or perhaps access to free, open-access AI models.

The final four recommendations, numbers nine through 12, appear under the PROTECT pillar. This pillar aims to protect students from the potential risks that have been outlined in the report. This means implementing safeguards for student privacy, safety, emotional well-being, and cognitive and social development.

Number nine: Break the engagement addiction and design platforms that are centered around positive mental health for children and youth. Through this recommendation, the authors of the report are asking AI companies to operate in ethical ways which put student safety and health concerns first. One way to do this is to require online products to meet safety standards. This includes making a distinction between general-use AI products and those specifically designed for students. This would require these products to be developmentally appropriate, safe, fair, reliable, and transparent. Another approach is to stress-test AI platforms for safety to make AI companies slow down and ensure their products are ready and appropriate for student consumption before they are released. Companies and government entities can also put together advisory boards to ensure necessary safeguards are in place. It can be beneficial to include students in developing these safeguards and processes as well.

Number 10: Establish comprehensive regulatory frameworks for educational AI. This recommendation calls on government entities to take action by providing oversight and regulation. Leadership can help ensure that new AI products are safe and appropriate for children. This may include specific policy implementation that reaches across various government branches. It might include requirements for meeting safety and ethics standards. Some of these regulations may be universal and some may be AI-specific. Others may address technical standards or require independent audits. The list goes on, but the point here is that government regulatory frameworks can guide, and even require, that the creation and release of new AI products is done in a way that is thoughtful and therefore safer for students.

Number 11: Procure technology that protects students' privacy, safety, and security. This action step leans on the influence that school districts have as purchasing agents. If schools hold tech companies to higher standards before purchasing AI products, those companies will be more likely to strive for the high standards needed in order to acquire and keep those paying customers. One way schools can ensure this is to develop and adopt child-friendly procurement policies and practices that outline high expectations. Districts can also lean on product certifications from well-respected nonprofit organizations such as Digital Promise and the National Education Association.

Number 12: Support families to manage students' AI use at home. This final point acknowledges that AI use is not confined to school; it is frequently and regularly used at home and in the community. With this in mind, schools should strive to support parents in guiding their children in the safe use of AI. The report offers a few suggestions, including encouraging students to share with their parents how they use AI in school, as well as providing families and their children with AI safety information. There are many of these resources already freely available, and the report recommends a few for review.

There may be other steps that can be taken as well, but these are the 12 outlined in the Brookings report that aim to help our students prosper, be prepared for an AI world, and remain safe while doing so. AI is still an emerging technology, and the state of its capabilities and implementation often changes on a daily basis. Still, the suggestions in this report can give us a starting point as we look for ways to successfully navigate the assimilation of AI tools into our world and schools.

Think about this list offered by the authors of the Brookings study and ask yourself: what can I do to make a positive difference in the adoption of AI?

Paul Beckermann 15:51 To learn more about today's topic and explore other free resources, visit avidopenaccess.org. Specifically, I encourage you to check out the article collection, "AI in the K-12 Classroom," and, of course, be sure to join Rena, Winston, and me every Wednesday for our full-length podcast, Unpacking Education, where we're joined by exceptional guests and explore education topics that are important to you. Thanks for listening. Take care, and thanks for all you do. You make a difference.