Cognitive Science and the Common Core

New in the Nonpartisan Education Review:

Cognitive Science and the Common Core Mathematics Standards

by Eric A. Nelson

Abstract

Between 1995 and 2010, most U.S. states adopted K–12 math standards that discouraged memorization of math facts and procedures.  Since 2010, most states have revised their standards to align with the K–12 Common Core Mathematics Standards (CCMS).  For some key topics, the CCMS do not ask students to memorize facts and procedures; for others, they delay work with memorized fundamentals.

Recent research in cognitive science has found that the brain has only minimal ability to reason with knowledge that has not previously been well-memorized.  This science predicts that students taught under math standards that discouraged initial memorization for math topics will have significant difficulty solving numeric problems in mathematics, science, and engineering.  As one test of this prediction, in a recent OECD assessment of numeracy skills among 22 developed-world nations, U.S. 16–24 year olds ranked dead last.  Discussion will include steps that can be taken to align K–12 state standards with practices supported by cognitive research.

Close all USED-funded research centers: Evaluation of existing regulations: My two bits

Below are my comments in response to the USED request for comments on existing USED regulations. To submit your own, follow the instructions at:  https://www.regulations.gov/document?D=ED-2017-OS-0074-0001

MEMORANDUM
To:  Hilary Malawer, Assistant General Counsel, Office of the General Counsel, U.S. Department of Education
From:  Richard P. Phelps
Date:  July 8, 2017
Re:  Evaluation of Existing Regulations[1]

Greetings:

I encourage the U.S. Education Department to eliminate education research centers from any current and future funding. Ostensibly, federally funded education research centers fill a “need” for more research to guide public policy on important topics. But the research centers are almost entirely unregulated, so they can do whatever they please. And what they please is too often the promotion of their own careers and the suppression or denigration of competing ideas and evidence.

Federal funding of education research centers concentrates far too much power in too few hands. And that power is nearly unassailable. One USED-funded research center, the National Center for Research on Evaluation, Standards, and Student Testing (CRESST), blatantly and repeatedly misrepresented research I had conducted while at the U.S. Government Accountability Office (GAO) in favor of its own small studies on the same topic. I was even denied attendance at public meetings where my research was misrepresented. Promises to correct the record were made, but not kept.

When I appealed to the USED project manager, he replied that he had nothing to say about “editorial” matters. In other words, a federally funded education research center can write and say anything that pleases, or benefits, the individuals inside.

Capturing a federally funded research center contract tends to boost the professional prominence of the winners stratospherically. In the case of CRESST, the principals assumed control of the National Research Council’s Board on Testing and Assessment, where they behaved typically: citing themselves and those who agree with them, and ignoring, or demonizing, the majority of the research that contradicted their work and policy recommendations.

Further, CRESST principals now seem to have undue influence on the assessment research of the international agency, the Organisation for Economic Co-operation and Development (OECD), which, as if on cue, has published studies that promote the minority of the research sympathetic to CRESST doctrine while simply ignoring even the existence of the majority of the research that is not. The rot, the deliberate suppression of the majority of the relevant research, has spread worldwide, and the USED funded it.

In summary, the behavior of the several USED-funded research centers I have followed over the years meets or exceeds the following thresholds identified in the President’s Executive Order 13777:

(ii) Are outdated, unnecessary, or ineffective;

(iii) Impose costs that exceed benefits;

(iv) Create a serious inconsistency or otherwise interfere with regulatory reform initiatives and policies;

(v) Are inconsistent with the requirements of section 515 of the Treasury and General Government Appropriations Act, 2001 (44 U.S.C. 3516 note), or the guidance issued pursuant to that provision, in particular those regulations that rely in whole or in part on data, information, or methods that are not publicly available or that are insufficiently transparent to meet the standard for reproducibility.

Below, I cite only relevant documents that I wrote myself, so as not to implicate anyone else. As the research center principals gain power, fewer and fewer of their professional compatriots are willing to disagree with them. The more power they amass, the more difficult it becomes for contrary evidence and points of view, no matter how compelling or true, to even get a hearing.

References:

Phelps, R. P. (2015, July). The Gauntlet: Think tanks and federally funded centers misrepresent and suppress other education research. New Educational Foundations, 4. http://www.newfoundations.com/NEFpubs/NEF4Announce.html

Phelps, R. P. (2014, October). Review of Synergies for Better Learning: An International Perspective on Evaluation and Assessment (OECD, 2013). Assessment in Education: Principles, Policy & Practice. doi:10.1080/0969594X.2014.921091 http://www.tandfonline.com/doi/full/10.1080/0969594X.2014.921091#.VTKEA2aKJz1

Phelps, R. P. (2013, February 12). What Happened at the OECD? Education News.

Phelps, R. P. (2013, January 28). OECD Encourages World to Adopt Failed US Ed Programs. Education News.

Phelps, R. P. (2013). The rot spreads worldwide: The OECD – Taken in and taking sides. New Educational Foundations, 2(1). Preview: http://www.newfoundations.com/NEFpubs/NEFv2Announce.html

Phelps, R. P. (2012, June). Dismissive reviews: Academe’s Memory Hole. Academic Questions, 25(2), 228–241. doi:10.1007/s12129-012-9289-4 https://www.nas.org/articles/dismissive_reviews_academes_memory_hole

Phelps, R. P. (2012). The effect of testing on student achievement, 1910–2010. International Journal of Testing, 12(1), 21–43. http://www.tandfonline.com/doi/abs/10.1080/15305058.2011.602920

Phelps, R. P. (2010, July). The source of Lake Wobegon [updated]. Nonpartisan Education Review / Articles, 1(2). http://nonpartisaneducation.org/Review/Articles/v6n3.htm

Phelps, R. P. (2000, December). High stakes: Testing for tracking, promotion, and graduation [Book review]. Educational and Psychological Measurement, 60(6), 992–999. http://richardphelps.net/HighStakesReview.pdf

Phelps, R. P. (1999, April). Education establishment bias? A look at the National Research Council’s critique of test utility studies. The Industrial-Organizational Psychologist, 36(4), 37–49. https://www.siop.org/TIP/backissues/Tipapr99/4Phelps.aspx

[1] In accordance with Executive Order 13777, “Enforcing the Regulatory Reform Agenda,” the Department of Education (Department) is seeking input on regulations that may be appropriate for repeal, replacement, or modification.

Students Last

Will Fitzhugh

The Concord Review
6 April 2017
The great social psychiatrist Harry Stack Sullivan wrote that the principal problem with communication is that we think we express meaning to others, when in fact we evoke it.
That is, what we say brings a response in the listener which involves their current thoughts at the time, their feelings, wishes, goals, and other preoccupations, all of which affect and alter the meaning of our expression as they hear it.
Psychiatrists are carefully trained to be useful in that situation. They learn to listen. When they do listen, they can derive an understanding of at least some of the ways in which the thoughts of their patients have responded to what was said. They can find out how the patient’s own experiences, thoughts and concerns have interacted with what the psychiatrist said, and this can help the doctor shape what they say next in perhaps more pertinent and more useful ways.
When I was a high school History teacher I was not a bad person, but I almost never shut up in class. If the teacher talks, that can make life easier for students, because they can continue giving their attention to whatever they were thinking about at the time, and if the teacher pauses, most students can easily ask a question to get the teacher talking again if they seem to be slowing down.
Most high school History teachers are not bad people, but they usually feel they have an obligation to talk, present, excite, inspire, demonstrate material and in other ways fill up the time of students in their classes. Some of the best teachers do ask questions, but even they believe they can’t spend too much time on student answers, not to mention on what students are actually thinking about what the teacher has said, or, if other students talk, about what they have said.
This is much less the case in some special secondary schools, like Phillips Exeter, which have small classes meeting around a table as a seminar, specifically designed to gather the comments and thoughts of students about academic subjects. But for public school teachers with five classes of 30 students each, that kind of dialogue is not an option.
Unless they fall silent, high school History teachers almost never have any idea what their students are thinking, and students come to understand that, at least in most classrooms, what is on their minds is of little importance to the process. This doesn’t mean that they don’t learn anything in their History classes. Some teachers really are well-educated, full of good stories, fascinating speakers, and fun to be with. That does not change the fact that even those best teachers have very little idea of what students are actually thinking about the History which is offered to them.
Some teachers do assign short papers, and if the students can choose the topics themselves, and if teachers have the time to read those papers, they can learn more about what some part of History means to their students. Sad to say, the assignment of serious History research papers is declining in this country, with some students working on slide presentations or videos, but many fewer students writing Extended Essays in History.
Education reform pundits all agree that the most important variable in student academic achievement is teacher quality, because what teachers do is the lowest level of educational activity of which those pundits are able to take notice. In fact, the most important variable in student academic achievement is student academic work. Students learn the most from the academic work that they do, but this factor escapes the notice of most education professors, theorists, reporters, and other thought leaders.
Since 1987, The Concord Review has published 1,241 exemplary History research papers [average 7,000 words, with endnotes and bibliography] by secondary students from 44 states and 40 other countries [tcr.org]. These papers cover a very large variety of historical topics, ancient and modern, domestic and foreign, but all of them show what students are actually thinking as they take History seriously. If more teachers of History would read a few of these strong research papers, they would become more aware, first, that some high school History students actually can think about History, and second, that such student writing, based on extensive reading of History, demonstrates a level of sophistication in their understanding of History that can never be discovered in classes where teachers do all the talking.
Great teachers of History should continue to talk the way they do in classes, and their students will learn a lot. But the actual thoughts of students of History should have a place for their expression as well. Students whose work is published in The Concord Review not only benefit from the hard work they have done, they also come to have greater respect for their own achievement and potential as scholars of History.


“Teach with Examples”
Will Fitzhugh [Founder],
The Concord Review [1987]

National Writing Board [1998]
TCR Academic Coaches [2014]

TCR Summer Program [2014]
730 Boston Post Road, Suite 24
Sudbury, Massachusetts 01776-3371 USA
978-443-0022
www.tcr.org
fitzhugh@tcr.org
Varsity Academics®
tcr.org/bookstore
www.tcr.org/blog

Another post-Inaugural Change: The calendar

Here in DC, the nation’s capital, which has enjoyed Home Rule since 1974, but remains ultimately under the thumb of Congress and the President (thanks to Art. I, Section 8 of the Constitution), one never knows what surprise awaits each new day. These days, one need not be addicted to social media or even tea leaves to hypothesize what’s around the corner. Something as politically innocuous as an “Alley Restoration” could be a harbinger of things to come.

A few weeks ago, as I turned the corner into my alley, I was struck by signs announcing an “Alley Restoration” and a change in our calendar. Not since Pope Gregory XIII or even Julius Caesar … (except for the brief anticlerical calendar of the French Revolution). Don’t blame Marion Barry; he has passed to his reward.

https://nonpartisaneducation.files.wordpress.com/2017/02/make-february-great-again-poster-17-0207.pdf

 

“Organizationally orchestrated propaganda” at ETS

With the testing opt-out movement growing in popularity in 2016, Common Core’s profiteers began to worry. Lower participation enough and the entire enterprise could be threatened by meaningless aggregate scores, compromised test statistics vital to quality control, and a strong signal that many citizens no longer believe the Common Core sales pitch.

The Educational Testing Service (ETS) was established decades ago by the Carnegie Foundation to serve as an apolitical research laboratory for psychometric work. For a while, ETS played that role well, producing some of the world’s highest-quality, most objective measurement research.

In fits and starts over the past quarter century, however, ETS has commercialized. At this point, there should be no doubt in anyone’s mind that ETS is a business: a business that relies on contracts and a business that aims to please those who can pay for its services.

Some would argue, with some justification, that ETS had no choice but to change with the times. Formerly guaranteed contracts were no longer guaranteed, and the organization needed either to pay its researchers or let them go.

Instead of now presenting itself honestly to the public as a commercial enterprise seeking profits, however, ETS continues to prominently display the trappings of a neutral research laboratory seeking truths. Top employees are awarded lofty academic titles and research “chairs”. Whether the awards derive from good research work or success in courting new business is open to question.

I perceive that ETS at least attempts something like an even split between valid research and faux-research pandering. ETS’s most prestigious honor bestowed upon outsiders, the Angoff Award, for example, alternates between psychometricians conducting high-quality, non-political technical work one year and high-profile gatekeepers conducting highly suspicious research the next. Members of the latter group can be found participating in, or awarding, ETS commercial contracts.

With its “research” on the Common Core test opt-out movement, ETS blew away any credible pretense that it conducts objective research where its profits are threatened. Opt-out leaders are portrayed by ETS as simple-minded, misinformed, parents of poor students, …you name it. And, of course, they are protesting against “innovative, rigorous, high quality” tests they are too dumb to appreciate.

Common Core testing, in case you didn’t know and haven’t guessed from what is written above, represents a substantial share of ETS’s work. Pearson holds the largest share of the work for the PARCC exams, but ETS holds the second largest.

The most ethical way for ETS to have handled the issue of Common Core opt-outs would have been to say nothing. After all, it is, supposedly, a research laboratory of apolitical test developers: statistical experts at developing assessment instruments, not at citizen movements, education administration, or public behavior.

Having disregarded the most ethical option, ETS could at least have made it abundantly clear that it retains a large self-interest in the success of PARCC testing.

Instead, ETS continues to wrap itself in its old research laboratory coat and condemns opt-out movement leaders and sympathizers as ignorant and ill-motivated. Never mind that the opt-out leaders receive not a dime for their efforts, and ETS’s celebrity researchers are remunerated abundantly for communicating the company line.

Four months ago, I responded to one of these ETS anti-opt-out propaganda pieces, written by Randy E. Bennett, the “Norman O. Frederiksen Chair in Assessment Innovation at Educational Testing Service.” It took a few weeks, but ETS, in the person of Mr. Bennett, responded to my comment questioning ETS’s objectivity in the matter.

He asserted, “There’s a lot less organizationally orchestrated propaganda, and a lot more academic freedom, here than you might think!”

To which I replied, “The many psychometricians working at ETS with a starkly different vision of what constitutes “high quality” in assessment are allowed to publish purely technical pieces. But, IMHO, the PR road show predominantly panders to power and profit. ETS’s former reputation for scholarly integrity took decades to accumulate. To my observation, it seems to be taking less time to dissemble. RP”

My return comment, however, was blocked, and all comments have since been removed from the relevant ETS web page. They remain available to read at the Disqus comments-manager site, though. The vertical orange bar next to the Nonpartisan Education logo is Disqus’s indication that the comment was blocked by ETS at its web site.