Are We Too Quick to Accept PIAAC Findings at Face Value?

Ralf St. Clair comments on last week’s PIAAC research conference:

At this meeting, lots of findings were discussed, but very little time was spent on methodology. The papers written by presenters were not available in advance (and mostly not at the meeting). One of the problems with PIAAC data is that it is not complete…

In many cases such data gaps are tackled through synthetic data, where the existing data is used to estimate what the missing data should be. One of the problems with this, of course, is that the missing data is essentially assumed to fit with what we have, and unexpected results will never arise.

Without understanding the details of how these types of issues are tackled, it is difficult to assess the implications of some of the correlations found, which are often quite weak. Would they exist at all if we had the missing data? Would they run in different directions? What sorts of assumptions are being made throughout the research process that generates the results?

Yet throughout the meeting the findings were accepted at face value and the issues of the data set never fully discussed, even though it was a room full of people who could understand and even work out how to deal with them. As in so much of the activity that surrounds international surveys, the will to believe overwhelms the skepticism we must bring to these exercises. (my emphasis)
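To make the synthetic-data point concrete, here is a minimal sketch of mean imputation, one simple form of gap filling. This is purely illustrative (the function and data here are my own invention, and PIAAC's actual imputation procedures are considerably more elaborate), but it shows the core worry: values filled in from the observed data can only ever look like the observed data.

```python
def mean_impute(values):
    """Replace None entries with the mean of the observed entries."""
    observed = [v for v in values if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in values]

# Hypothetical proficiency scores with two missing responses.
scores = [250, 310, None, 270, None]
filled = mean_impute(scores)
# The imputed values (830/3 ≈ 276.7) sit squarely inside the observed
# range -- by construction, the filled-in data can never surprise us.
```

Whatever the real method (regression imputation, plausible values, etc.), the same structural limitation applies: the missing data is modeled as consistent with what was observed.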

I would just add that critical scrutiny is particularly important with PIAAC, since it appears that the adult education field (in the U.S. at least; I can't speak for other countries) has decided to embrace PIAAC as its primary foundational data source for policy decisions going forward.

I recommend reading the entire post.