April 26, 1986, almost exactly three months after the Challenger disaster, is a day I will never forget.
My uncle and aunt had invited me to spend the weekend at their cottage by a lake near Kharkiv, where we lived. With May 1st, Labour Day, one of the major holidays in the Soviet Union, fast approaching, many people were eager to escape the cities for a few days, hoping for warm spring weather. Their only son, my cousin, was serving in the army near Kyiv, the capital of Ukraine. In a grim twist of fate, it felt like luck that he had been stationed there rather than sent to Afghanistan, where the USSR was fighting a brutal and unpopular war. That Saturday was unusually cold. We spent most of the day huddled in a small room, warming ourselves by a small electric heater. As he often did, my uncle pulled an old radio from the cabinet and began slowly turning the dial, searching for the Voice of America, the BBC, Deutsche Welle, or another “illegal” Western station broadcasting in Russian, virtually the only sources of information we trusted. It was never easy to find a signal that wasn’t jammed, but this time, when he did, something felt different. The voices of the radio hosts were tense, their concern unmistakable.
They were reporting that something extremely dangerous had happened at a nuclear power plant in a town called Chernobyl, not far from Kyiv. Chernobyl in Russian (and Ukrainian) means “black grass” or “black stalk.” It was an ominous name.
Because my cousin was serving nearby, our fear instantly escalated. We knew the Soviet regime had little regard for human life. Even as radioactive material was spreading, 2.6 million people in Kyiv were not warned, not evacuated, not protected. Soviet media remained silent, broadcasting nothing out of the ordinary. Yet we understood immediately that something far worse than anyone could openly admit was unfolding. By that evening, through Western broadcasts, we learned the words that would forever change history: a disaster at the Chernobyl nuclear power plant.
The next day, my grandfather, Lev I. Bolotin, a physicist working on the particle accelerator at the Kharkiv Institute of Physics and Technology (now temporarily closed because of the Russian invasion of Ukraine), came home carrying a strange device. He called it a Geiger counter: it detected radiation, clicking whenever radioactive material was nearby. The concern on his face was unmistakable, deeply unsettling. I did not fully understand the science then, but I understood his fear.
I still remember that moment vividly, forty years later, as if it were yesterday.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Nearly four decades have passed since Challenger and Chernobyl demonstrated, brutally and publicly, what happens when warning signs are ignored, expertise is sidelined, and accountability is blurred. One disaster unfolded in front of millions of schoolchildren on live television; the other was hidden, denied, and politicized until it could no longer be contained. Different contexts, different systems, yet the same underlying lesson.
Human judgment failed where scientific understanding should have prevailed.
And forty years later, this lesson remains disturbingly relevant to the state of STEM education in Canada and North America.
What We Chose to Remember and What We Didn’t
In the aftermath of Challenger, the public narrative initially centered on inspiration. A classroom teacher, Christa McAuliffe, had been selected to fly into space, and the President of the United States publicly emphasized the role of teachers in inspiring the next generation. Science suddenly felt personal and accessible. Across North America, classrooms paused to watch the launch live. It was meant to be followed by conversations about human ingenuity, scientific progress, and the role of STEM in shaping society.
What followed, however, was different—though no less significant.
After Chernobyl, the narrative shifted toward scale and fear. Invisible radiation, evacuation zones, and long-term environmental and health consequences dominated public consciousness. The disaster exposed not only technological failure but also the devastating consequences of secrecy, centralized control, and systemic corruption within the Soviet system. Unlike Challenger, where failure was publicly examined, Chernobyl unfolded in silence and denial.
The political consequences were profound. Trust in the Soviet government eroded rapidly, both domestically and internationally. While Chernobyl did not single-handedly cause the collapse of the Soviet Union, it became a powerful symbol of a system that could no longer sustain itself. In this sense, Chernobyl is often remembered as one of the final straws—an event that laid bare the cost of placing ideology above science, and control above human life.
What was harder to sustain in both cases was attention to the less dramatic but more consequential causes: misunderstood data, ignored experts, normalized rule-breaking, weak conceptual grasp at decision points, and institutional pressure overriding scientific caution. These are not cinematic failures. They are epistemic failures. And at the core of these failures lies our inability to create robust STEM education that examines STEM ideas together with their societal outcomes.
The Post-Disaster Paradox in STEM Education
One might expect that such events would strengthen science education, deepen respect for expertise, and reinforce the value of rigorous thinking. Instead, the decades that followed saw a gradual erosion of deep STEM engagement. Enrolment in advanced physics, chemistry, and mathematics declined. Students disengaged just as ideas became conceptually demanding. STEM became louder in policy documents but thinner in classrooms. We added activities, tools, slogans, and technologies, but often weakened the intellectual core. This is no coincidence. When STEM education drifts away from foundational understanding, it produces graduates who can follow procedures but struggle to evaluate evidence, question assumptions, or recognize when systems are being pushed beyond safe limits. That is precisely the failure mode exposed by both Challenger and Chernobyl.
The Human Factor Is Not Optional
Engineering disasters are often described as “human error,” but this phrase is misleading. The real issue is how humans reason within systems and how they make decisions under extreme pressure. There is no better illustration of this than the analysis of aviation disasters. The Canadian series “Mayday” offers example after example of how human errors can lead to terrifying consequences even when state-of-the-art technology is present. Humans misinterpret imperfect data, make wrong decisions under pressure, rely on assumptions and models of limited applicability, and overestimate their own expertise, among other things. Strong scientific foundations do not eliminate human judgment; they discipline it. Without conceptual understanding, numbers become decorations. Without modelling literacy, simulations become theatre. Without epistemic humility, confidence becomes dangerous.
This is why STEM education cannot be reduced to engagement alone. Curiosity without rigor is not enough. Tools without theory are not safeguards.
Forty Years On: Why Accountability Still Depends on Science
Today’s challenges, such as climate systems, pandemics, AI-driven decisions, energy infrastructure, aerospace, and environmental risk, are at least as complex as those faced in the 1980s. The margin for error is smaller, not larger.
Accountability in such contexts depends on people who understand uncertainty, recognize when data contradicts expectations, know the limits of models, are willing to slow down decisions, and have the scientific confidence to say “this is not safe.” These capacities are not innate. They are cultivated through serious, sustained STEM education. I remember hearing a phrase attributed to a Navy SEAL: “Under pressure, you do not rise to the occasion; you fall to the level of your training.” If we want our students to solve today’s problems under ever-increasing pressure, we have to train them differently. The level of STEM education most of our students receive in school will not prepare them to withstand the challenges that are yet to come.
What Needs to Change
If Challenger and Chernobyl still matter, and they do, then STEM education must reclaim its role as preparation for responsible judgment, not just technical skill. I claim that today, by and large, we fail to do either. We need to prioritize depth over coverage in STEM education; value disciplinary knowledge (this applies to teacher preparation as well) alongside interdisciplinary work; support teachers as epistemic leaders, not curriculum deliverers; treat technology as a cognitive aid, not a substitute for understanding; and design learning where evidence, uncertainty, and accountability are central. Removing grades to make students feel good about themselves is, in my view, counterproductive and will have dire consequences in the future.
A Closing Thought
We are now forty years removed from two disasters that should have permanently altered how we think about science, responsibility, and education. In both cases, the science was understood well enough to prevent catastrophe, yet political pressure, institutional inertia, and managerial authority overruled expert judgment. The real question, therefore, is not whether students are “engaged,” but whether they are being prepared to recognize danger, challenge assumptions, and speak with evidence-based authority when it matters most. STEM education, at its core, is not about producing more engineers or scientists; it is about ensuring that when critical decisions are made, someone in the room truly understands what is at stake—and is empowered to be heard. The lesson is not new. What would be truly unforgivable is choosing, once again, to ignore it.
