Close all USED-funded research centers: Evaluation of existing regulations: My two bits

My comments below in response to the USED request for comments on existing USED regulations. To submit your own, follow the instructions at:  https://www.regulations.gov/document?D=ED-2017-OS-0074-0001

MEMORANDUM
To:  Hilary Malawer, Assistant General Counsel, Office of the General Counsel, U.S. Department of Education
From:  Richard P. Phelps
Date:  July 8, 2017
Re:  Evaluation of Existing Regulations[1]

Greetings:

I encourage the US Education Department to eliminate education research centers from all current and future funding. Ostensibly, federally funded education research centers fill a “need” for more research to guide public policy on important topics. But the research centers are almost entirely unregulated, so they can do whatever they please. And what pleases them is too often the promotion of their own careers and the suppression or denigration of competing ideas and evidence.

Federal funding of education research centers concentrates far too much power in too few hands. And that power is nearly unassailable. One USED-funded research center, the National Center for Research on Evaluation, Standards, and Student Testing (CRESST), blatantly and repeatedly misrepresented research I had conducted while at the U.S. Government Accountability Office (GAO) in favor of its own small studies on the same topic. I was even denied attendance at public meetings where my research was misrepresented. Promises to correct the record were made but not kept.

When I appealed to the USED project manager, he replied that he had nothing to say about “editorial” matters. In other words, a federally funded education research center can write and say anything that pleases, or benefits, the individuals inside.

Capturing a federally funded research center contract tends to boost the winners’ professional standing stratospherically. In the case of CRESST, the principals assumed control of the National Research Council’s Board on Testing and Assessment, where they behaved true to form: citing themselves and those who agree with them, and ignoring, or demonizing, the majority of the research that contradicted their work and policy recommendations.

Further, CRESST principals now seem to have undue influence on the assessment research of the international agency, the Organisation for Economic Co-operation and Development (OECD), which, as if on cue, has published studies that promote the minority of the research sympathetic to CRESST doctrine while simply ignoring the existence of the majority of the research that is not. The rot, the deliberate suppression of the majority of the relevant research, has spread worldwide, and the USED funded it.

In summary, the behavior of the several USED-funded research centers I have followed over the years meets or exceeds the following thresholds identified in the President’s Executive Order 13777:

(ii) Are outdated, unnecessary, or ineffective;

(iii) Impose costs that exceed benefits;

(iv) Create a serious inconsistency or otherwise interfere with regulatory reform initiatives and policies;

(v) Are inconsistent with the requirements of section 515 of the Treasury and General Government Appropriations Act, 2001 (44 U.S.C. 3516 note), or the guidance issued pursuant to that provision, in particular those regulations that rely in whole or in part on data, information, or methods that are not publicly available or that are insufficiently transparent to meet the standard for reproducibility.

Below, I cite only relevant documents that I wrote myself, so as not to implicate anyone else. As the research center principals gain power, fewer and fewer of their professional compatriots are willing to disagree with them. The more power they amass, the more difficult it becomes for contrary evidence and points of view, no matter how compelling or true, to even get a hearing.

References:

Phelps, R. P. (2015, July). The Gauntlet: Think tanks and federally funded centers misrepresent and suppress other education research. New Educational Foundations, 4. http://www.newfoundations.com/NEFpubs/NEF4Announce.html

Phelps, R. P. (2014, October). Review of Synergies for Better Learning: An International Perspective on Evaluation and Assessment (OECD, 2013). Assessment in Education: Principles, Policy & Practice. doi:10.1080/0969594X.2014.921091 http://www.tandfonline.com/doi/full/10.1080/0969594X.2014.921091#.VTKEA2aKJz1

Phelps, R. P. (2013, February 12). What Happened at the OECD? Education News.

Phelps, R. P. (2013, January 28). OECD Encourages World to Adopt Failed US Ed Programs. Education News.

Phelps, R. P. (2013). The rot spreads worldwide: The OECD – Taken in and taking sides. New Educational Foundations, 2(1). Preview: http://www.newfoundations.com/NEFpubs/NEFv2Announce.html

Phelps, R. P. (2012, June). Dismissive reviews: Academe’s Memory Hole. Academic Questions, 25(2), pp. 228–241. doi:10.1007/s12129-012-9289-4 https://www.nas.org/articles/dismissive_reviews_academes_memory_hole

Phelps, R. P. (2012). The effect of testing on student achievement, 1910–2010. International Journal of Testing, 12(1), 21–43. http://www.tandfonline.com/doi/abs/10.1080/15305058.2011.602920

Phelps, R. P. (2010, July). The source of Lake Wobegon [updated]. Nonpartisan Education Review / Articles, 1(2). http://nonpartisaneducation.org/Review/Articles/v6n3.htm

Phelps, R. P. (2000, December). High stakes: Testing for tracking, promotion, and graduation, Book review, Educational and Psychological Measurement, 60(6), 992–999. http://richardphelps.net/HighStakesReview.pdf

Phelps, R. P. (1999, April). Education establishment bias? A look at the National Research Council’s critique of test utility studies. The Industrial-Organizational Psychologist, 36(4), 37–49. https://www.siop.org/TIP/backissues/Tipapr99/4Phelps.aspx

[1] In accordance with Executive Order 13777, “Enforcing the Regulatory Reform Agenda,” the Department of Education (Department) is seeking input on regulations that may be appropriate for repeal, replacement, or modification.

Students Last

Will Fitzhugh

The Concord Review
6 April 2017
The great social psychiatrist Harry Stack Sullivan wrote that the principal problem with communication is that we think we express meaning to others, when in fact we evoke it.
That is, what we say brings a response in the listener which involves their current thoughts at the time, their feelings, wishes, goals and other preoccupations, all of which affect and alter the meanings of our expression as they hear it.
Psychiatrists are carefully trained to be useful in that situation. They learn to listen. When they do listen, they can derive an understanding of at least some of the ways in which the thoughts of their patients have responded to what was said. They can find out how the patient’s own experiences, thoughts and concerns have interacted with what the psychiatrist said, and this can help the doctor shape what they say next in perhaps more pertinent and more useful ways.
When I was a high school History teacher I was not a bad person, but I almost never shut up in class. If the teacher talks, that can make life easier for students, because they can continue giving their attention to whatever they were thinking about at the time; and if the teacher seems to be slowing down, most students can easily ask a question to get the teacher talking again.
Most high school History teachers are not bad people, but they usually feel they have an obligation to talk, present, excite, inspire, demonstrate material and in other ways fill up the time of students in their classes. Some of the best teachers do ask questions, but even they believe they can’t spend too much time on student answers, not to mention on what students are actually thinking about what the teacher has said, or, if other students talk, about what they have said.
This is much less the case in some special secondary schools, like Phillips Exeter, which have small classes meeting around a table as a seminar, specifically designed to gather the comments and thoughts of students about academic subjects. But for public school teachers with five classes of 30 students each, that kind of dialogue is not an option.
Unless they fall silent, high school History teachers almost never have any idea what their students are thinking, and students come to understand that, at least in most classrooms, what is on their minds is of little importance to the process. This doesn’t mean that they don’t learn anything in their History classes. Some teachers really are well-educated, full of good stories, fascinating speakers, and fun to be with. That does not change the fact that even those best teachers have very little idea of what students are actually thinking about the History which is offered to them.
Some teachers do assign short papers, and if the students can choose the topics themselves, and if teachers have the time to read those papers, they can learn more about what some part of History means to their students. Sad to say, the assignment of serious History research papers is declining in this country, with some students working on slide presentations or videos, but many fewer students writing Extended Essays in History.
Education reform pundits all agree that the most important variable in student academic achievement is teacher quality, because what teachers do is the lowest level of educational activity of which the pundits are able to take notice. In fact, the most important variable in student academic achievement is student academic work. Students learn the most from the academic work that they do, but this factor escapes the notice of the majority of education professors, theorists, reporters and other thought leaders.
Since 1987, The Concord Review has published 1,241 exemplary History research papers [average 7,000 words, with endnotes and bibliography] by secondary students from 44 states and 40 other countries [tcr.org]. These papers are on a very large variety of historical topics, ancient and modern, domestic and foreign, but all of them show what students are actually thinking as they take History seriously. If more teachers of History would read a few of these strong research papers, they would become more aware, first, that some high school History students actually can think about History, and second, that such student writing, based on extensive reading of History, demonstrates a level of sophistication in their understanding of History that can never be discovered in classes where teachers do all the talking.
Great teachers of History should continue to talk the way they do in classes, and their students will learn a lot. But the actual thoughts of students of History should have a place for their expression as well. Students whose work is published in The Concord Review not only benefit from the hard work they have done, they also come to have greater respect for their own achievement and potential as scholars of History.


“Teach with Examples”
Will Fitzhugh [Founder],
The Concord Review [1987]

National Writing Board [1998]
TCR Academic Coaches [2014]

TCR Summer Program [2014]
730 Boston Post Road, Suite 24
Sudbury, Massachusetts 01776-3371 USA
978-443-0022
www.tcr.org; fitzhugh@tcr.org
Varsity Academics®
tcr.org/bookstore
www.tcr.org/blog

Significance of PISA math results

A new round of two international comparisons of student mathematics performance came out recently, and there was a lot of interest because the reports were almost simultaneous: TIMSS[1] in late November 2016 and PISA[2] just a week later. They are often reported as 2015 instead of 2016 because the data collection for each took place in late 2015, which would seem to make the comparison even more apt. In fact, no comparison is appropriate; they are completely different instruments and, of the two, TIMSS is the one that should be of more concern to educators. Though there is great room for improvement, the US performance is perhaps surprisingly not as dire as the PISA results would imply. By contrast, Finland continues to demonstrate that its internationally recognized record of PISA-proven success in mathematics education, with its widely applauded, student-friendly approach, is completely misleading.

In spite of the popular press and mathematics education folklore, Finland’s performance has been known to be overrated since PISA first came out, as documented in an open letter[3] written by the president of the Finnish Mathematical Society and cosigned by many mathematicians and experts in other math-based disciplines:

“The PISA survey tells only a partial truth of Finnish children’s mathematical skills … in fact the mathematical knowledge of new students has declined dramatically.”

This letter links to a description[4] of the most fundamental problem that directly involves elementary mathematics education:

“Severe shortcomings in Finnish mathematics skills … If one does not know how to handle fractions, one is not able to know algebra.”

The previous TIMSS had the 4th-grade performance of Finland a bit above that of the US but well behind by 8th grade. In the new report, Finland has slipped below the US at 4th grade and did not even submit itself to be assessed at 8th grade, much less at the Advanced level. Similar remarks apply to another country often recognized for its student-friendly mathematics education, the Netherlands, home of the Freudenthal Institute, where PISA was conceived. This decline was recognized in the TIMSS summary of student performance,[1] with the comparative grade-level rankings as Exhibits 1.1 and 1.2 and the Advanced[5] as Exhibit M1.1:

[TIMSS Exhibits 1.1, 1.2, and M1.1 omitted]

By contrast, PISA[2] came out a week later and…

Netherlands 11
Finland 13
United States 41

Note: These rankings include China (just below Japan), represented by only three provinces, not the whole country; if it is omitted, subtract 1 from each rank.

Why the difference? The problem is that PISA was never a test of “school mathematics” but an assessment of all 15-year-old students on their “mathematics literacy,”[6] which does not even reach the algebra level needed for non-remedial admission to college, much less the TIMSS Advanced level, interpreted in the US as AP or IB Calculus:

“PISA is the U.S. source for internationally comparative information on the mathematical and scientific literacy of students in the upper grades at an age that, for most countries, is near the end of compulsory schooling. The objective of PISA is to measure the “yield” of education systems, or what skills and competencies students have acquired and can apply in these subjects to real-world contexts by age 15. The literacy concept emphasizes the mastery of processes, understanding of concepts, and application of knowledge and functioning in various situations within domains. By focusing on literacy, PISA draws not only from school curricula but also from learning that may occur outside of school.”

Historically relevant is the fact that the conception of PISA at the Freudenthal Institute in the Netherlands included heavy guidance from Thomas Romberg of the University of Wisconsin’s WCER, the original creator of the middle school mathematics curriculum MiC, Mathematics in Context. Its underlying philosophy is exactly that of PISA: the study of mathematics through everyday applications that do not require the development of the more sophisticated mathematics that opens the doors for deeper study in mathematics, i.e., all mildly sophisticated math-based career opportunities, the so-called STEM careers. In point of fact, the arithmetic of the PISA applications is calculator-friendly, so even elementary arithmetic through ordinary fractions, so necessary for eventual algebra, need not be developed to score well.


[1] http://timss2015.org/timss-2015/mathematics/student-achievement/
[2] http://nces.ed.gov/pubs2017/2017048.pdf (Table 3, page 23)
[3] http://matematiikkalehtisolmu.fi/2005/erik/PisaEng.html
[4] http://matematiikkalehtisolmu.fi/2005/erik/KivTarEng.html
[5] http://timss2015.org/advanced/ [Distribution of Advanced Mathematics Achievement]
[6] https://nces.ed.gov/timss/pdf/naep_timss_pisa_comp.pdf

Wayne Bishop, PhD
Professor of Mathematics, Emeritus
California State University, LA

A New Core

The Concord Review
December 2, 2016

Dinosaur scholars like Mark Bauerlein argue that the decline in the humanities in our universities is caused by their retreat from their own best works—literature departments no longer celebrate great literature, history departments no longer offer great works of history to students to read, and so on.

However, an exciting new article by Nicholas Lemann in The Review from The Chronicle of Higher Education, while it shares some concerns about the decline of the humanities, proposes an ingenious new modern Core, which would…

“put methods above subject-matter knowledge in the highest place of honor, and they treat the way material is taught as subsidiary to what is taught…”

In this new design, what is taught is methods, not knowledge—of history, literature, languages, philosophy and all that…

Here is a list of the courses Professor Lemann recommends:

Information Acquisition
Cause and Effect
Interpretation
Numeracy
Perspective
The Language of Form
Thinking in Time
Argument

And he says that: “What these courses have in common is a primary commitment to teaching the rigorous (and also properly humble) pursuit of knowledge.”

At last we can understand that the purpose of higher education in the humanities should be the pursuit of knowledge, and not actually to catch up with any of it. We may thus enjoy a new generation of mentally “fleet-footed” ignoramuses who have skipped the greatness of the humanities in the chase for methods and skills of various kinds. This approach is as hollow and harmful as it was in the 1980s, when Harvard College tried to design a knowledge-free, methods-filled Core Curriculum. It seems that what goes around does indeed come around, but students are still neither learning from nor enjoying the greatness of the humanities in college much these days…

——————-

“Teach with Examples”
Will Fitzhugh [Founder]
The Concord Review [1987]
Ralph Waldo Emerson Prizes [1995]
National Writing Board [1998]
TCR Academic Coaches [2014]
730 Boston Post Road, Suite 24
Sudbury, Massachusetts 01776-3371 USA
978-443-0022
http://www.tcr.org; fitzhugh@tcr.org
Varsity Academics®
tcr.org/bookstore
http://www.tcr.org/blog