My comments below in response to the USED request for comments on existing USED regulations. To submit your own, follow the instructions at: https://www.regulations.gov/document?D=ED-2017-OS-0074-0001
To: Hilary Malawer, Assistant General Counsel, Office of the General Counsel, U.S. Department of Education
From: Richard P. Phelps
Date: July 8, 2017
Re: Evaluation of Existing Regulations
I encourage the US Education Department to eliminate education research centers from any current and future funding. Ostensibly, federally funded education research centers fill a “need” for more research to guide public policy on important topics. But the research centers are almost entirely unregulated, so they can do whatever they please. And what they please is too often the promotion of their own careers and the suppression or denigration of competing ideas and evidence.
Federal funding of education research centers concentrates far too much power in too few hands. And that power is nearly unassailable. One USED-funded research center, the National Center for Research on Evaluation, Standards, and Student Testing (CRESST), blatantly and repeatedly misrepresented research I had conducted while at the U.S. Government Accountability Office (GAO) in favor of its own small studies on the same topic. I was even denied attendance at public meetings where my research was misrepresented. Promises to correct the record were made, but not kept.
When I appealed to the USED project manager, he replied that he had nothing to say about “editorial” matters. In other words, a federally funded education research center can write and say anything that pleases, or benefits, the individuals inside.
Capturing a federally funded research center contract tends to boost the professional prominence of the winners stratospherically. In the case of CRESST, the principals assumed control of the National Research Council’s Board on Testing and Assessment, where they behaved typically: citing themselves and those who agree with them, and ignoring, or demonizing, the majority of the research that contradicted their work and policy recommendations.
Further, CRESST principals now seem to have undue influence on the assessment research of the Organisation for Economic Co-operation and Development (OECD), an international agency which, as if on cue, has published studies that promote the minority of the research sympathetic to CRESST doctrine while simply ignoring even the existence of the majority of the research that is not. The rot, the deliberate suppression of the majority of the relevant research, has spread worldwide, and the USED funded it.
In summary, the behavior of the several USED-funded research centers I have followed over the years meets or exceeds the following thresholds identified in the President’s Executive Order 13777:
(ii) Are outdated, unnecessary, or ineffective;
(iii) Impose costs that exceed benefits;
(iv) Create a serious inconsistency or otherwise interfere with regulatory reform initiatives and policies;
(v) Are inconsistent with the requirements of section 515 of the Treasury and General Government Appropriations Act, 2001 (44 U.S.C. 3516 note), or the guidance issued pursuant to that provision, in particular those regulations that rely in whole or in part on data, information, or methods that are not publicly available or that are insufficiently transparent to meet the standard for reproducibility.
Below, I cite only relevant documents that I wrote myself, so as not to implicate anyone else. As the research center principals gain power, fewer and fewer of their professional compatriots are willing to disagree with them. The more power they amass, the more difficult it becomes for contrary evidence and points of view, no matter how compelling or true, to even get a hearing.
Phelps, R. P. (2015, July). The Gauntlet: Think tanks and federally funded centers misrepresent and suppress other education research. New Educational Foundations, 4. http://www.newfoundations.com/NEFpubs/NEF4Announce.html
Phelps, R. P. (2014, October). Review of Synergies for Better Learning: An International Perspective on Evaluation and Assessment (OECD, 2013). Assessment in Education: Principles, Policy & Practice. doi:10.1080/0969594X.2014.921091 http://www.tandfonline.com/doi/full/10.1080/0969594X.2014.921091#.VTKEA2aKJz1
Phelps, R. P. (2013, February 12). What Happened at the OECD? Education News.
Phelps, R. P. (2013, January 28). OECD Encourages World to Adopt Failed US Ed Programs. Education News.
Phelps, R. P. (2013). The rot spreads worldwide: The OECD – Taken in and taking sides. New Educational Foundations, 2(1). Preview: http://www.newfoundations.com/NEFpubs/NEFv2Announce.html
Phelps, R. P. (2012, June). Dismissive reviews: Academe’s Memory Hole. Academic Questions, 25(2), pp. 228–241. doi:10.1007/s12129-012-9289-4 https://www.nas.org/articles/dismissive_reviews_academes_memory_hole
Phelps, R. P. (2012). The effect of testing on student achievement, 1910–2010. International Journal of Testing, 12(1), 21–43. http://www.tandfonline.com/doi/abs/10.1080/15305058.2011.602920
Phelps, R. P. (2010, July). The source of Lake Wobegon [updated]. Nonpartisan Education Review / Articles, 1(2). http://nonpartisaneducation.org/Review/Articles/v6n3.htm
Phelps, R. P. (2000, December). [Review of the book High stakes: Testing for tracking, promotion, and graduation]. Educational and Psychological Measurement, 60(6), 992–999. http://richardphelps.net/HighStakesReview.pdf
Phelps, R. P. (1999, April). Education establishment bias? A look at the National Research Council’s critique of test utility studies. The Industrial-Organizational Psychologist, 36(4), 37–49. https://www.siop.org/TIP/backissues/Tipapr99/4Phelps.aspx
“Teach with Examples”
Will Fitzhugh [Founder],
The Concord Review 
A new round of two international comparisons of student mathematics performance came out recently, and the near-simultaneous reports drew a lot of interest: TIMSS in late November 2016 and PISA just a week later. They are often reported as 2015 rather than 2016 because the data collection for each took place in late 2015, which would seem to make the comparison even more apt. In fact, no comparison is appropriate; they are completely different instruments, and of the two, TIMSS is the one that should be of more concern to educators. Perhaps surprisingly, and with great room for improvement, US performance is not as dire as the PISA results would imply. By contrast, Finland continues to demonstrate that its internationally recognized record of PISA-proven success in mathematics education, with its widely applauded, student-friendly approach, is completely misleading.
Despite the popular press and mathematics education folklore, Finland’s performance has been known to be overrated since PISA first came out, as documented in an open letter written by the president of the Finnish Mathematical Society and cosigned by many mathematicians and experts in other math-based disciplines:
“The PISA survey tells only a partial truth of Finnish children’s mathematical skills … in fact the mathematical knowledge of new students has declined dramatically.”
This letter links to a description of the most fundamental problem that directly involves elementary mathematics education:
“Severe shortcomings in Finnish mathematics skills … If one does not know how to handle fractions, one is not able to know algebra.”
The previous TIMSS had Finland’s 4th grade performance a bit above that of the US but well behind by 8th grade. In the new report, Finland has slipped below the US at 4th grade and did not even submit itself to be assessed at 8th grade, much less at the Advanced level. Similar remarks apply to another country often recognized for its student-friendly mathematics education, the Netherlands, home of PISA at the Freudenthal Institute. This decline is recognized in the TIMSS summary of student performance, with the comparative grade-level rankings given as Exhibits 1.1 and 1.2 and the Advanced results as Exhibit M1.1.
By contrast, PISA came out a week later and…
In the PISA mathematics rankings, the United States placed 41st. (Note: these rankings include China*, just below Japan, represented by only 3 provinces, not the country; if it is omitted, subtract 1.)
Why the difference? The problem is that PISA was never a test of “school mathematics” but of the “mathematics literacy” of all 15-year-old students: not even mathematics at the algebra level needed for non-remedial admission to college, much less the TIMSS Advanced level, interpreted in the US as AP or IB Calculus:
“PISA is the U.S. source for internationally comparative information on the mathematical and scientific literacy of students in the upper grades at an age that, for most countries, is near the end of compulsory schooling. The objective of PISA is to measure the “yield” of education systems, or what skills and competencies students have acquired and can apply in these subjects to real-world contexts by age 15. The literacy concept emphasizes the mastery of processes, understanding of concepts, and application of knowledge and functioning in various situations within domains. By focusing on literacy, PISA draws not only from school curricula but also from learning that may occur outside of school.”
Historically relevant is the fact that the conception of PISA at the Freudenthal Institute in the Netherlands included heavy guidance from Thomas Romberg of the University of Wisconsin’s WCER, the original creator of the middle school mathematics curriculum Mathematics in Context (MiC). Its underlying philosophy is exactly that of PISA: the study of mathematics through everyday applications that do not require the development of the more sophisticated mathematics that opens the doors to deeper study in mathematics, that is, to essentially all math-based career opportunities, the so-called STEM careers. In point of fact, the arithmetic of the PISA applications is calculator-friendly, so even elementary arithmetic through ordinary fractions, so necessary for eventual algebra, need not be mastered to score well.
 http://nces.ed.gov/pubs2017/2017048.pdf (Table 3, page 23)
 http://timss2015.org/advanced/ [Distribution of Advanced Mathematics Achievement]
Wayne Bishop, PhD
Professor of Mathematics, Emeritus
California State University, LA
The Concord Review
December 2, 2016
Dinosaur scholars like Mark Bauerlein argue that the decline in the humanities in our universities is caused by their retreat from their own best works—literature departments no longer celebrate great literature, history departments no longer offer great works of history to students to read, and so on.
However, an exciting new article by Nicholas Lemann in The Review from The Chronicle of Higher Education, while it shares some concerns about the decline of the humanities, proposes an ingenious modern new Core, which would…
“put methods above subject-matter knowledge in the highest place of honor, and they treat the way material is taught as subsidiary to what is taught…”
In this new design, what is taught is methods, not knowledge—of history, literature, languages, philosophy and all that…
Here is a list of the courses Professor Lemann recommends:
Cause and Effect
The Language of Form
Thinking in Time
And he says that: “What these courses have in common is a primary commitment to teaching the rigorous (and also properly humble) pursuit of knowledge.”
At last we can understand that the purpose of higher education in the humanities should be the pursuit of knowledge, and not actually to catch up with any of it. We may thus enjoy a new generation of mentally “fleet-footed” ignoramuses who have skipped the greatness of the humanities in the chase for methods and skills of various kinds. This approach is as hollow and harmful as it was in the 1980s, when Harvard College tried to design a knowledge-free, methods-filled Core Curriculum; so it seems that what goes around does indeed come around, but still students are neither learning from nor enjoying the greatness of the humanities in college much these days…
“Teach with Examples”
Will Fitzhugh [Founder]
The Concord Review 
Ralph Waldo Emerson Prizes 
National Writing Board 
TCR Academic Coaches 
730 Boston Post Road, Suite 24
Sudbury, Massachusetts 01776-3371 USA