The science education myth

Here’s an interesting piece of research that undercuts the persistent rhetoric about the failure of schools to prepare students for the STEM (science, technology, engineering, and mathematics) fields.

The Urban Institute’s report Into the Eye of the Storm: Assessing the Evidence on Science and Engineering Education, Quality, and Workforce Demand says that the evidence does NOT support the claims in recent policy reports that the United States is falling behind other nations in science and math education and is graduating insufficient numbers of scientists and engineers. The report argues that:

U.S. student performance rankings are comparable to other leading nations and colleges graduate far more scientists and engineers than are hired each year. Instead, the evidence suggests targeted education improvements are needed for the lowest performers and demand-side factors may be insufficient to attract qualified college graduates.

The report, written by B. Lindsay Lowell and Harold Salzman, shows that U.S. student performance has steadily improved over time in math, science, and reading. It also found that enrollment in math and science courses is actually up. For example, in 1982 high school graduates earned 2.6 math credits and 2.2 science credits on average. By 1998, the averages had risen to 3.5 math credits and 3.2 science credits. The percentage of students taking chemistry increased from 45% in 1990 to 55% in 1996 and 60% in 2004. National tests such as the National Assessment of Educational Progress, the SAT, and the ACT have also shown gains in math scores over the past two decades.

Why the discrepancy between the evidence and the rhetoric about STEM achievement and jobs? Salzman told Business Week

that reports citing low U.S. international rankings often misinterpret the data. A review of the international rankings, which he says are all based on one of two tests, the Trends in International Mathematics and Science Study (TIMSS) or the Programme for International Student Assessment (PISA), shows the U.S. is in a second-ranked group, not trailing the leading economies of the world as is commonly reported. In fact, the few countries that place higher than the U.S. are generally small nations, and few of these rank consistently high across all grades, subjects, and years tested. Moreover, he says, serious methodological flaws, such as differing test populations, and other limitations preclude drawing any meaningful comparison of school systems between countries.
