For the past 15 years, Jérémie Rostan has helped international schools combine academic rigor and student experience through the development of transformative programs. A prolific author, his holistic and innovative approach is regularly featured in leading publications, as well as on his website, where he creates resources for school leaders and educators. Jérémie currently serves as High School Curriculum and Instruction Coordinator at the International School of Panama.
In a recent interview with K12 Digest, Jérémie Rostan discussed his experience with curriculum and instruction and with artificial intelligence. He shared his views on shaping curriculum and instructional strategies, the impact of AI on the field of education, and the integration of new technologies in the classroom.
As a Curriculum and Instruction Coordinator, could you share your insights on how educational leadership plays a crucial role in shaping the curriculum and instructional strategies within a school or district?
A key responsibility of school leaders is to ensure curriculum and instructional practices are “optimal”, which means both the best possible in their own right and the most suited for a school’s unique context and identity. Achieving this goal requires a well-balanced mix of strategies because, as is always the case with leadership, it is a matter of finding the equilibrium point between opposite demands. For instance, while it is important to offer a clear vision for curriculum and instruction based on research and aligned with the mission of the school, it is equally crucial to involve all community members, as appropriate, in this design. Its curriculum is literally “what” a school is all about. It is what parents sign up their children for, what teachers teach, and what learners learn. It is what brings everybody together – and therefore what might leave some of them aside, or behind, if it is not appropriate. For that reason, striving for an optimal curriculum is not a one-time effort, but a continuous pursuit; especially as needs change. Leading such a complex process is, again, a question of managing tensions. On the one hand, a curriculum must be systematic, “aligned”, and offer stability; on the other hand, it must also be adapted, adaptive, and a factor of innovation.
The same is very much true of the implementation of a curriculum, notably through instruction. A school is more than a number of classrooms with shared walls. It is a holistic learning environment and therefore requires pedagogical proximity among teachers. There must be an alignment of learning objectives and teaching practices, and therefore appropriate guidance and training. Yet again, there must also be an element of freedom in the pedagogy (just like in the curriculum itself). In the end, a school is a learning community, and just like any healthy community, it requires both coordination of efforts in a shared direction and the necessary autonomy and flexibility to keep this social body alive and growing.
How do you see artificial intelligence (AI) impacting the field of education, and what are some innovative ways in which AI can be integrated into curriculum and instruction?
An optimal curriculum must, of course, remain adapted to a world where everything, notably technology and the very definition of “future-readiness”, changes constantly. How can school leaders ensure their curriculum stays relevant? First, as I mentioned earlier, they must accept the fact that curriculum development now has to be an ongoing effort. Schools have long had “review cycles” through which they would periodically audit and edit their learning offerings, but this is no longer sufficient. The curriculum now needs to be agile. “Agility” is one of those buzzwords often used without precise meaning. Etymologically, it simply means a capacity for both quick response and a varied range of motion. Rather than a delayed, retrofitting cycle, such a process requires constant monitoring and innovation. Some ways educational leaders can facilitate such curricular agility include decentralizing curriculum design, encouraging experimentation, partnering with industry experts, and investing in professional development.
The risk is obviously to sacrifice the very idea of a stable curriculum to an unstable and incoherent patchwork of trend-chasing, ad hoc units. To avoid this (another one of those tensions we mentioned above), there are a number of things school leaders can do.
First, innovations can take the form of small and short “sprints”, or trial runs, ready to be scaled if and as needed. This approach has the added advantage of generating prompt feedback, allowing one to learn from multiple early iterations. An example would be the AI-rich IB Visual Arts unit a teacher and I designed at the beginning of the year. During this unit, students were given a challenge: exploring, through AI-generated art, an AI-related question. This not only allowed them to discover some of the early text-to-image and image-to-image tools currently available, but also to develop AI-literacy understandings and skills by reflecting on questions such as: How do you prompt-engineer image-generating tools? Is it morally acceptable to use such tools trained on artworks without their authors’ consent? Who is the author of an AI-assisted artistic piece? Can AI-generated art be considered Art? To what extent are art-generating AI technologies biased? How should AI tools be referenced and leveraged as a source of inspiration? If the particular techniques used in this unit were to become rapidly obsolete, modifying this small proof-of-concept would be quite easy; and even in a worst-case scenario, most of the learning would still have been meaningful. On the flip side, if these questions were to become more and more relevant, it would give us a model to follow. Future readiness can thus be served by crafting experimental modules “ready” to be scaled into full-fledged units.
Another strategy can be to approach the inherent uncertainty of tomorrow through choice. Schools like to say that their curriculum is 21st-century skills-based, fostering critical thinking, creativity, collaboration, and communication. However, there is quite a bit of range between traditional and fully project-based or personalized classes. On this continuum, there will be a best fit for parents’ preferences and students’ needs, allowing once again for innovative experimentations. A great example, here, would be the Innovation and Entrepreneurship Diploma (iED) developed by some of our faculty members. Offered as an alternative to our IB Diploma Programme pathway, this track provides students with exciting options based on syllabi created by partner universities – to which we are now adding AI-rich dimensions, such as AI-assisted professional writing. Importantly, such a provision would not be possible without our overarching standards-based architecture, which guarantees that all of our students develop similar competencies, whichever option they select.
A third strategy mentioned earlier is personalization. A personalized curriculum means that students are actively engaged in the selection and design of what they are learning. AI technologies now make this easier than ever (a “co-design” approach, as described by UNESCO). A personalized curriculum not only makes it much easier to adapt in real time to new societal developments but also challenges students to develop the skills needed to do so.
It is crucial to keep in mind, however, that an optimal curriculum, even a personalized one, can never be a pure “mirror” of students’ backgrounds and interests. It should always also act as a window and open their minds to other realities and perspectives. Indeed, intercultural competency is a key element of future readiness, and it should always be embedded in the curriculum.
Can you discuss your experience and strategies for coaching teachers and staff to effectively implement new instructional approaches or technologies in the classroom?
Since ChatGPT made AI a mainstream topic about 15 months ago, the focus as far as education is concerned has mostly been on ways these new technologies can transform instruction. Equally important, however, are the effects they can have on school leadership, as well as on curriculum design. Looking at it from a simplified SAMR perspective, AI integration can help teachers as curriculum designers be more efficient, effective, and/or innovative. This is invaluable because teachers are often tasked with curriculum design but not provided with sufficient time, resources, or training to do so. In our case at ISP, colleagues are making extensive use of bespoke generative chatbots to support the considerable effort involved in our transition to standards-based teaching and learning. With a little bit of prompt engineering, and taking advantage of the RAG and API capabilities offered by platforms such as POE or Zapier, I was able to create a number of tools allowing teachers to map their current units to specific standards, brainstorm KUDs (what students will Know, Understand, and be able to DO), create learning scales (or high-quality rubrics), and even draft entire unit plans. This is not only a time-saver, generating ideas teachers can revise and perfect: artificial intelligence also helps us generate more meaningful “essential questions” and “enduring understandings”. Our unit planners use a UbD (Understanding by Design) template and are also aligned with the IB’s Approaches to Teaching and Learning. What we have found is that AI is even capable of coming up with original ways to address such dimensions as the “Theory of Knowledge connections” or “potential barriers to learning”.
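To make this concrete, the sketch below shows what a minimal “KUD brainstormer” of this kind could look like. It is not the actual POE/Zapier setup described above: it uses the OpenAI Python client purely as an illustration, the model name, standards excerpt, and unit summary are invented placeholders, and the standards text pasted into the prompt stands in for a proper RAG retrieval step.

```python
# Minimal sketch of a curriculum-design assistant (illustrative only, not the
# bespoke POE/Zapier chatbots described above). Assumes the OpenAI Python SDK
# and an OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

# Hypothetical "retrieved" context: the standards the unit should map to.
standards_excerpt = """
VA.11.1  Generate and conceptualize artistic ideas and work.
VA.11.7  Perceive and analyze artistic work.
"""

unit_summary = "Grade 11 Visual Arts unit exploring AI-generated art and authorship."

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model choice
    messages=[
        {
            "role": "system",
            "content": (
                "You are a curriculum-design assistant for a standards-based school. "
                "Given a unit summary and a standards excerpt, draft KUDs: what students "
                "will Know, Understand, and be able to Do, each mapped to a standard."
            ),
        },
        {
            "role": "user",
            "content": f"Standards:\n{standards_excerpt}\n\nUnit:\n{unit_summary}",
        },
    ],
)

# Draft KUDs for the teacher to revise and perfect, not a finished product.
print(response.choices[0].message.content)
```

The point of such a tool is precisely the workflow described above: the model produces a first draft aligned with the school’s own standards, and the teacher keeps the pedagogical judgment.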
This last aspect is what LUDIA, the AI thinking partner I co-created with UDL expert Beth Stark, was designed for. It is an interesting example of the way AI integration can transform classroom instruction. At this early stage, it seems like the most popular tools among teachers are the ones that automate very traditional practices such as generating classic lesson plans, slide presentations, and multiple-choice questions. This is worrisome, as there is a real risk that AI integration will crowd out best practices that cannot yet be delegated to AI, or at least not as easily, thereby privileging lower-order thinking activities and leading to even more “industrial” teaching than we saw before. LUDIA does the opposite, as it helps teachers devise ways to adapt instruction to their learners’ needs, ensuring all have access to high educational outcomes – generally without any AI tool involved at all. Of course, AI technologies can also be used to create transformative learning experiences, but this often involves a little bit of a learning curve, and can only be seen for now among the most enthusiastic and tech-savvy early adopters.
What are some challenges or barriers you have encountered in implementing AI-driven instructional practices, and how have you addressed or overcome them?
There are, of course, many challenges and barriers to the implementation of AI-driven instructional practices, and even more so to effective and responsible AI integration.
Fortunately, there are also solutions and workarounds.
The first problem, common to any new tech-enhancement opportunity, is related to costs and insufficient resources. Like other edtech options, EduAI tools often follow a “freemium” model where advanced functionalities require paid subscriptions. While this is certainly normal for a commercial product, it can generate (and often reinforce) inequities. This is true for teachers and students alike. ChatGPT-3.5 is substantially worse at math than ChatGPT-4, for instance. A workaround, here, is that similar capabilities are often accessible without cost, generally through open-source options. The issue is that they are much less user-friendly. For instance, ChatGPT-4’s greater math ability comes from the fact that it can execute code. ChatGPT-3.5 can write the code needed to perform accurate calculations or data analysis, but it cannot execute it. There are many options to overcome this barrier, from beginner to advanced: using Microsoft Copilot, which gives free GPT-4 access; running the code yourself in an editor or Python interpreter; using CodeLlama on Perplexity Labs; or even installing a specially trained LLM such as WizardMath or Dolphin-Mixtral locally. Likewise, most AI tools used by teachers and students are wrappers adding functionalities and a nice user interface to underlying foundation models. These add-ons are often quite straightforward to replicate without a paywall – sometimes very easily, with a little bit of prompt engineering, and sometimes less easily, requiring actual technical expertise. This is why it is so crucial that schools collaborate on these efforts.
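As an illustration of the “write it, then run it yourself” workaround: below is the kind of short script a free chatbot can draft on request but cannot execute. Pasting it into any Python interpreter gives exact results instead of the model’s error-prone mental arithmetic. The grade data is invented for the example.

```python
# Illustrative only: a short script a free chatbot might write for a teacher
# asking for basic statistics on a set of scores. Running it locally gives
# exact values; the numbers below are made up.
import statistics

grades = [72, 85, 91, 64, 78, 88, 95, 70, 83, 77]

mean = statistics.mean(grades)
median = statistics.median(grades)
stdev = statistics.stdev(grades)  # sample standard deviation

print(f"Mean:   {mean:.2f}")
print(f"Median: {median:.2f}")
print(f"StDev:  {stdev:.2f}")
```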
A second problem, though, is that even when the necessary resources are accessible, some teachers and students will resist. This is a unique aspect of this new wave of tech integration. When teachers were encouraged to leverage the new opportunities created by the Internet, or by iPads, this did not lead to (as much) opposition, because these technologies were not perceived as being as threatening as AI can be. There was no professional angst that they might “replace” teachers, and no philosophical concerns about their very nature and impact, as they did not imitate and automate human cognition. The solution, here, is to listen attentively to these critics, as they can provide invaluable insights into some of the risks that AI does create, while allaying their concerns through open discussion.
This should be part of the professional training that has become so necessary to provide – which is a third problem. We cannot expect effective and responsible AI integration unless we set up programs that upskill faculty and guide and support their AI initiatives.
Without guidance, many will not know where to start, or where to stop. A key issue is obviously the idea that introducing AI in the classroom might open the door to misuse by students and undermine their learning. This is a legitimate concern. Beyond “cheating”, the risk is that AI will replace rather than enhance the development of targeted understandings and skills. Training is thus needed, not only for faculty but also for students. There are solutions to this problem, such as AI Scales clarifying appropriate uses. I proposed one here at ISP, and there are many good instruments out there. UNESCO will soon finalize its own AI Competencies Framework. Once again, the real problem is that expertise in the field is still naturally quite scarce, which makes it so important for educational organizations to work together. Our Extended Essay Coordinator has done a fantastic job of integrating AI options into the self-guided course she created. Those are the kinds of resources that schools need to exchange.
This is also a good example of the kind of support that should be provided. Any given school will have faculty members who are taking the lead on AI integration. Encouraging them to share their learning makes it possible to offer a menu of options to other colleagues, which can lead to lesson studies, action research, coaching, and other PLC strategies. We have done so at ISP with our Digital Promise Microcredential “AI@School”, which is open to volunteers and has led to a pilot program and partnership with the edtech company Mizou.
Finally, a fourth problem revolves around data privacy, harmful content, biases, and “hallucinations” – risks that are all inherent in these new AI technologies. The solution, here, is to develop robust use guidelines. Many have been made available recently, so the expertise needed really lies in knowing how to find them and adapt them to each school’s context. Once again, this goes to show how important school partnerships will be to the success and safety of this ongoing revolution.