Survival Guide for the AI-pocalypse

By Jennifer Parnell

Jennifer Parnell is a Lawrenceville history teacher and the School’s Director of Innovation and AI Projects. This article is part of a collection of pieces on artificial intelligence, titled "Survival Guide for the AI-pocalypse," in the Spring 2025 issue of Independent School magazine, published by the National Association of Independent Schools.

In a mountaineering class at the University of Alaska, I learned an acronym for actions to take for survival in an emergency—RISSFWP: recognition, inventory, shelter, signal, food, water, and play. And while RISSFWP mostly conjures actions you might use if lost in the wilderness, I’ve been thinking about how it relates to the AI-pocalypse and my work as the director of innovation and AI projects at The Lawrenceville School (NJ).

Recognition. In the wilderness, the first step is to acknowledge that there is an emergency. While artificial intelligence isn’t that kind of emergency, the impact of AI is bigger than that of any new technology we’ve seen before. It’s a major paradigm shift for education. With the arrival of AI, we need to accept that how we think about teaching and learning and schools—and human society—has fundamentally changed.

There’s a consensus that we, as educators, need to explore, understand, and wrap our heads around this new technology—what it is and what it can do. Recognition, however, doesn’t mean that we need to panic.

Many of my colleagues are not yet ready or willing to embrace AI. There’s real fear and anxiety about AI eliminating jobs and upending workflows, resulting in a loss of control and changing relationships across campus. There are some controversial elements: algorithmic bias, copyright issues, accuracy/hallucinations, privacy, data use, energy use, and emergence, which describes the unanticipated skills that some of the large language models (LLMs) display. But ignoring AI or trying to control developments is not possible and is actually counterproductive. Any buffering that might have facilitated more careful or methodical preparation has vanished. We are all students now.

And we must recognize that we are capable of success. We know we can react to a rapidly changing educational environment; we may have blocked memories of our first days of teaching on Zoom during the pandemic, but resilience and creativity born from that experience are assets.

 Inventory. To assess an emergency—or AI in a school—take inventory of tools and assets available, including people. Before schools form task forces, committees, or learning groups to focus on how they will move forward with this new technology, school leaders should first identify faculty and staff and students who already have expertise or interest in AI. They should also consider how and where such groups might function: Does the current academic schedule facilitate interdepartmental meetings? Are there physical spaces on campus where students and faculty can interact collaboratively?

Many of the tools school leaders will want to take stock of are related to unauthorized use of AI in academic work. School leaders may be tempted to envision AI-detection tools as an essential component of inventory, but that approach is fraught with complications. Research has shown that AI-generated text and human-generated text are often misidentified. Once AI-generated text is manipulated, it becomes almost impossible to detect. Faculty can use LLMs, such as ChatGPT, to compare student work to AI-generated responses. When more monitoring is necessary, software extensions such as Google Draftback can review document histories.

Shelter. In a survival situation, shelter provides protection from the elements. What does shelter look like as it relates to AI in a school?

Schools need to provide shelter that allows students to flourish—a learning environment that is purposeful, relevant, and joyful. Not everyone learns at the same pace, however, and people respond to change in different ways, so differentiation must be part of the strategy. We can support our students in this endeavor by building awareness of social and ethical considerations of AI across the school as well as nurturing digital literacy and civil discourse. To think through this work, schools might turn to a constructivist pedagogy, where risk and failure are actively taught and practiced, and students take an active role in how they learn. Students need to understand the possibilities and limitations of AI.

Schools need to create detailed structures so that students have a good understanding of how they are allowed to use AI. School leaders should ask how AI rules are being clarified and evaluated within and across departments and work toward providing specific guidelines and clear policies regarding academic dishonesty.

At Lawrenceville, we use an “AI Scale of Use,” adapted from “The Artificial Intelligence Assessment Scale: A Framework for Ethical Integration of Generative AI in Educational Assessment,” published in April 2024 in the Journal of University Teaching and Learning Practice, to help set clear expectations for our students.

[AI Scale of Use graphic]

Even with this structure, questions remain. What constitutes responsible AI use? Copying AI-generated text in its entirety is not a responsible use, of course, but what about using AI to generate a research question, craft an outline, or help edit a conclusion? Even deeper questions for faculty remain when evaluating assignments and assessments: If AI can do it, why is it still worth doing? Who is the work for, and why am I creating it?

Signal. A signal is the outward message that helps one survive. More than an SOS, a successful AI signal is a flexible vision clearly communicated to all stakeholders. Consistency of messaging about AI is critical to building a culture of collaboration rather than a state of confusion for families and prospective students.

School leaders should establish procedures among teachers, course teams, and academic departments. Nurturing thought leaders on campuses and creating both formal and informal discussion opportunities is essential. AI provides an excellent opportunity for thought leaders to ask big questions: What are the essential elements of an independent school education? Is the writing process itself valuable, or is critical thinking about writing the essential learning? To what extent should we be helping our students build AI literacy?

Families must feel assured that we are carefully guiding this process in our classrooms and on our campuses. School communications should send a signal that we are positioning students well for the transition to the AI landscape, with critical thinking and problem-solving at the center.

Food and Water. Although not first on the survival checklist, sustenance obviously has an important place. In the AI space, sustenance is training and professional development for faculty, staff, and students. From orientation to the end of the school year, students must be taught how to use AI responsibly and how to choose platforms designed for educational purposes that prioritize accuracy and reliable information. Students should learn how to study with AI; use it for Socratic prompting; create study guides and vocabulary quizzes; explain complex topics; and develop questions, practice problems, and podcasts from notes. AI can point out ways students can think more critically about readings, create customized study schedules, or suggest effective techniques based on personalized study plans.

Many students share the concerns of faculty and parents that they will never learn how to learn or that AI will take the human component out of learning. They wonder who determines what content is correct. They also argue that they need to be taught how AI and machine learning work—not the complex programming or mathematics involved, but how training data works in a large language model; in essence, how “unsupervised” AI works. If we expect our students to be discerning consumers of AI-generated knowledge, they need to learn how to use AI appropriately in different situations.

Faculty need regular and specific training geared to different levels of experience so that AI use is productive. Specific upskilling can help faculty use AI as a colleague or teaching assistant. AI is very effective at generating quizzes, study guides, lesson plans, course readings, and rubrics to more easily differentiate learning. Faculty also need opportunities to share tips and tricks with colleagues and to meet in small departmental groups.

Play. The final aspect of RISSFWP is surprising: play. Maintaining a positive and playful mindset allows us to stretch our imagination and move from surviving to thriving. In the AI space, that leads to greater comfort with the tools and a more open-minded approach to technology. Because the speed of AI technology can be daunting and proficiency can seem out of reach, providing students, faculty, and administrators with unstructured time and opportunities to explore AI tools is essential for success.

For administrators, encouraging ethical AI exploration and use may be the most foundational component of a school strategy. The focus should be on creative, collaborative work rather than repetitive tasks.

AI proficiency grows from play. It might mean hosting an AI hackathon or having students work jointly with faculty on Shark Tank-style research initiatives or applications. At Lawrenceville, our students develop and test use cases that mirror AI evaluation and implementation in the real world.

From Surviving to Thriving

This moment is an unprecedented opportunity for us to reimagine education. In this new space, we must listen to and learn from our largest stakeholder group—our students. Continuous learning will be the norm, and AI will push educators into a realm of rapid reevaluation. Success may be elusive as the pace and breadth of change defy our experience and imagination. But our students can help lead our strategy; they understand that technology is inextricably linked to learning—it is unimaginable to go backward to a traditional ecosystem.

At Lawrenceville, a student-led AI Council meets regularly to discuss research, experiment with new tools and platforms, evaluate project opportunities, and formulate feedback loops linking academic and operational realms. The group helps develop orientation materials, presentations for alumni and family weekends, and guidelines for department use. The overall goal is to identify opportunities to automate and simplify operations, examine workflows, and turn timely, actionable data into insights.

It is difficult to anticipate where campus leaders should target their strategy. The key is focusing on innovations that make education transformational, not transactional. This requires a positive mindset, unprecedented levels of flexibility, and, undoubtedly, a sense of humor and humility.

AI does not threaten our survival: It provides independent schools with a unique opportunity to rethink essential elements of the education we offer and to redesign that education with new levels of collaboration, creativity, and community.

For more information, contact Lisa M. Gillard H'17, director of public relations, at lgillard@lawrenceville.org.