There is little evidence that higher test scores are correlated with success in later life, let alone with better college or job performance.
Reference: Mary Anne Raywid quoted in David C. Berliner and Bruce J. Biddle, The Manufactured Crisis: Myths, Fraud, and the Attack on America's Public Schools, Cambridge, Mass., Perseus Books, 1995, p. 194.

The insistence on quantitative measures of school effectiveness has reduced educational outcomes to testable products and de-emphasised the role of the school in other areas, such as preparing young people for civic participation, encouraging their personal development, and helping them master higher-level intellectual skills.
Critical thinking skills are particularly important in an internet age, when so much information is available. Without such skills, students cannot discriminate between fraudulent and accurate information, trivial and useful information, or poor-quality and high-quality information. Yet an emphasis on tests trains students to accept the information they are given rather than to assess its worth and credibility.
Even the New Commission on the Skills of the American Workforce has observed that standardised testing is too limited:
…more often than not, little or nothing is done to measure many of the other qualities that we have suggested spell the difference between success and failure for the students who will grow up to be the workers of 21st century America: creativity and innovation, facility with the use of ideas and abstractions, the self-discipline and organization needed to manage one’s work and drive it through to a successful conclusion, the ability to function well as a member of a team, and so on.
Nor is there much evidence that this focus on testing and accountability has helped children attain a better education. In the US states with the severest penalties for failure in standardised tests, teachers narrow their focus to what students need to know to pass those tests, which does not help students pass other types of tests. Consequently, those same states get below-average results in the National Assessment of Educational Progress (NAEP), an exam that predates the current wave of standardised testing and covers a broader range of learning. Overall, high school students are getting worse results on NAEP reading tests than they did in 1992.
The back-to-basics curriculum may have improved the scores of some elementary students in the US, but it does not prepare them well for more advanced education. The much-heralded gains in test scores at elementary schools at the turn of the 21st century did not translate into improved academic performance at high school. According to Kozol in his book The Shame of the Nation, the students at poor urban elementary schools who made the “dramatic gains … cannot set down their ideas in sentences expected of most fourth and fifth grade students in the suburbs” when they reach secondary school, despite having longer school days, longer school years, no recess, and “cancelling or cutting back on all the so-called frills (art, music, even social sciences…)”.
The Washington Post reported that Washington students who have been schooled with constant testing find the college environment a shock because it requires a different mode of learning, one in which memory skills do not play much of a role. John Bader, the associate dean for academic programs at Johns Hopkins University, points out that students who are “tested within an inch of their lives so regularly and so intensely” at school have to adapt quickly to an environment where, for many courses, “there is no clear answer. There is no right or wrong”.
Test scores for English and mathematics in UK primary schools also seemed to rise dramatically between 1995 and 2000, and this was “widely publicised as evidence of a rapid rise in standards”. However, doubt has been cast on the validity of these results. A 2005 University of Durham study, which compared standardised test results with other studies and independent tests, found only small gains in primary school achievement between 1995 and 2000 (and none since). The gains in secondary education have also been small. The UK Statistics Commission agrees that the test scores overstate the improvement in learning during that period.
Various explanations have been given for the small gains in test results at the end of the 1990s, including students’ greater familiarity with sitting tests and the growing tendency for teachers to teach test-taking techniques and to focus on material that is likely to be in the tests. A similar initial rise in standardised test scores was observed in the US, for example, when the Texas Assessment of Academic Skills was first introduced.
Another explanation is that test standards have changed over time, with more lenient exams and lowered pass marks enabling governments to claim that their policies have led to improved student performance. The University of Durham researchers conclude: “The gains have been modest but the efforts have been massive.”