Significance of PISA math results

A new round of two international comparisons of student mathematics performance came out recently, and the reports drew wide interest because they appeared almost simultaneously: TIMSS[1] in late November 2016 and PISA[2] just a week later. They are often labeled 2015 rather than 2016 because the data collection for each took place in late 2015, which would seem to make comparison even more apt. In fact, no comparison is appropriate; they are completely different instruments, and of the two, TIMSS is the one that should be of more concern to educators. Perhaps surprisingly, and with great room for improvement, US performance is not as dire as the PISA results would imply. By contrast, Finland continues to demonstrate that its internationally recognized record of PISA-proven success in mathematics education, with its widely applauded, student-friendly approach, is thoroughly misleading.

In spite of the popular press and mathematics education folklore, Finland’s performance has been known to be overrated since PISA first appeared, as documented in an open letter[3] written by the president of the Finnish Mathematical Society and cosigned by many mathematicians and experts in other math-based disciplines:

“The PISA survey tells only a partial truth of Finnish children’s mathematical skills … in fact the mathematical knowledge of new students has declined dramatically.”

This letter links to a description[4] of the most fundamental problem, one that directly involves elementary mathematics education:

“Severe shortcomings in Finnish mathematics skills … If one does not know how to handle fractions, one is not able to know algebra.”

The previous TIMSS had Finland’s 4th-grade performance a bit above that of the US but well behind it by 8th grade. In the new report, Finland has slipped below the US at 4th grade and did not even submit itself for assessment at 8th grade, much less at the Advanced level. Similar remarks apply to another country often recognized for its student-friendly mathematics education, the Netherlands, home of PISA at the Freudenthal Institute. This decline is visible in the TIMSS summary of student performance[1], with the comparative grade-level rankings in Exhibits 1.1 and 1.2 and the Advanced results[5] in Exhibit M1.1:

By contrast, PISA[2] came out a week later with mathematics rankings that included:

Netherlands 11
Finland 13
United States 41

Note: These rankings include China* (just below Japan), which here means three provinces rather than the whole country; if it is omitted, subtract 1 from each rank.

Why the difference? The problem is that PISA was never designed to assess “school mathematics” but rather the “mathematics literacy[6]” of all 15-year-old students. That is not even mathematics at the algebra level needed for non-remedial admission to college, much less the TIMSS Advanced level, interpreted as AP or IB Calculus in the US:

“PISA is the U.S. source for internationally comparative information on the mathematical and scientific literacy of students in the upper grades at an age that, for most countries, is near the end of compulsory schooling. The objective of PISA is to measure the “yield” of education systems, or what skills and competencies students have acquired and can apply in these subjects to real-world contexts by age 15. The literacy concept emphasizes the mastery of processes, understanding of concepts, and application of knowledge and functioning in various situations within domains. By focusing on literacy, PISA draws not only from school curricula but also from learning that may occur outside of school.”

Historically relevant is the fact that the conception of PISA at the Freudenthal Institute in the Netherlands included heavy guidance from Thomas Romberg of the University of Wisconsin’s WCER, the original creator of the middle school math ed curriculum MiC, Mathematics in Context. Its underlying philosophy is exactly that of PISA: the study of mathematics through everyday applications that do not require the development of the more sophisticated mathematics that opens the door to deeper study in mathematics, i.e., to all mildly sophisticated math-based career opportunities, the so-called STEM careers. In point of fact, the arithmetic of the PISA applications is calculator-friendly, so even elementary arithmetic through ordinary fractions, so necessary for eventual algebra, need not be mastered to score well.


[2] (Table 3, page 23)
[5] [Distribution of Advanced Mathematics Achievement]

Wayne Bishop, PhD
Professor of Mathematics, Emeritus
California State University, LA

Significance of PISA math results was originally published on Nonpartisan Education Blog


Large-scale educational testing in Chile: Some thoughts

Recently, in the auditorium of Universidad Finis Terrae, I argued that Chile’s Prueba de Selección Universitaria (PSU) cannot be “fixed” and should be scrapped. I do not, however, advocate the elimination of university entrance examinations but, rather, the creation of a fairer, more informative, and more transparent examination.

Chile’s pre-2002 system (the PAA plus the PCEs) may not have been well maintained. But its basic structure, a general aptitude test strongly correlated with university-level work along with highly focused content-based tests designed by each faculty, is as close to an ideal university entrance system as one could hope for.

I have perused the decade-long history of the PSU, its funding, and the involvement of international organizations (World Bank, OECD) in shaping its character. Most striking is the pervasive involvement of economists in creating, implementing, and managing the test, and the corresponding lack of involvement of professionals trained in testing and measurement.

In the PSU, World Bank, and OECD documents, the economists advocate one year that the PSU be a high school exit examination (which should be correlated with the high school curriculum), the next year that it be a university entrance examination (which should be correlated with university work), then that it is meant to monitor the implementation of the new curriculum, or that it is designed to increase opportunities for students from low socioeconomic backgrounds (in fact, it has been decreasing those opportunities). No test can possibly do all that the PSU’s advocates have promised it will do. The PSU has been sold as a test that can do anything you might like a test to do, and now it does nothing well. It is time to bring in a team that genuinely understands how to build a test and is willing to be open and transparent in all its dealings with the public.

The greatest danger posed by the dysfunctional PSU, I fear, is the bad reputation it gives all tests. Some in Chile have advocated eliminating the SIMCE, which, by my observation, is as well managed as the PSU is poorly managed. The SIMCE gathers information to be used in improving instruction. In theory, a school could be closed for poor SIMCE scores, but none ever has been. There are no consequences for students or teachers. Much information about the SIMCE is freely available, and more becomes available every month; it is not the “black box” that the PSU is.

It would be a mistake to eliminate all testing because one test is badly managed. We need assessments. It is easy to know what you are teaching, but you can only know what students are learning if you assess.

Richard P. Phelps, US Fulbright Specialist at the Agencia de Calidad de la Educacion and Universidad Finis Terrae in Santiago, editor and co-author of Correcting Fallacies about Educational and Psychological Testing (American Psychological Association, 2008/2009)

Large-scale educational testing in Chile: Some thoughts was originally published on Nonpartisan Education Blog
