Surprise! SBAC and CRESST stonewall public records request for their financial records

Say what you will about Achieve, PARCC, Fordham, CCSSO, and NGA—some of the organizations responsible for foisting the Common Core Initiative on us all. But, their financial records are publicly available.

Not so for some other organizations responsible for the same Common Core promotion. The Smarter Balanced Assessment Consortium (SBAC) and the National Center for Research on Evaluation, Standards, and Student Testing (CRESST) have absorbed many millions of taxpayer and foundation dollars over the years. But, their financial records have been hidden inside the vast, nebulous cocoon of the University of California, Los Angeles (UCLA). UCLA’s financial records, of course, are publicly available, but amounts there are aggregated at a level that subsumes thousands of separate, individual entities.

UCLA is a tax-supported state institution, however, and California has an open records law on the books. After some digging, I located the UCLA office responsible for records requests and wrote to them. Following is a summary of our correspondence to date:

 

July 5, 2017

Greetings:

I hope that you can help me. I have spent a considerable amount of time clicking around in search of financial reports for the Smarter Balanced Assessment Consortium (SBAC) and the National Center for Research on Evaluation, Standards, and Student Testing (CRESST), both “housed” at UCLA (or, until just recently in SBAC’s case). Even after many hours of web searching, I still have no clue as to where these data might be found.

Both organizations are largely publicly funded through federal grants. I would like to obtain revenue and expenditure detail on the order of what a citizen would expect to see in a nonprofit organization’s Form 990. I would be happy to search through a larger database that contains relevant financial details for all of UCLA, so long as the details for SBAC and CRESST are contained within and separately labeled.

I would like annual records spanning the lifetimes of each organization: SBAC only goes back several years, but CRESST goes back to the 1980s (in its early years, it was called the Center for the Study of Evaluation).

Please tell me what I need to do next.

Thank you for your time and attention.

Best Wishes, Richard Phelps

 

July 6, 2017

RE: Acknowledgement of Public Records Request – PRR # 17-4854

Dear Mr. Phelps:

This letter is to acknowledge your request under the California Public Records Act (CPRA) dated July 5, 2017, herein enclosed. Information Practices (IP) is notifying the appropriate UCLA offices of your request and will identify, review, and release all responsive documents in accordance with relevant law and University policy.

Under the CPRA, Cal. Gov’t Code Section 6253(b), UCLA may charge for reproduction costs and/or programming services. If the cost is anticipated to be greater than $50.00 or the amount you authorized in your original request, we will contact you to confirm your continued interest in receiving the records and your agreement to pay the charges. Payment is due prior to the release of the records.

As required under Cal. Gov’t Code Section 6253, UCLA will respond to your request no later than the close of business on July 14, 2017. Please note, though, that Section 6253 only requires a public agency to make a determination within 10 days as to whether or not a request is seeking records that are publicly disclosable and, if so, to provide the estimated date that the records will be made available. There is no requirement for a public agency to actually supply the records within 10 days of receiving a request, unless the requested records are readily available. Still, UCLA prides itself on always providing all publicly disclosable records in as timely a manner as possible.

Should you have any questions, please contact me at (310) 794-8741 or via email at pahill@finance.ucla.edu and reference the PRR number found above in the subject line.

Sincerely,

Paula Hill

Assistant Manager, Information Practices

 

July 14, 2017

RE: Public Records Request – PRR # 17-4854

Dear Mr. Phelps:

The purpose of this letter is to confirm that UCLA Information Practices (IP) continues to work on your public records request dated July 5, 2017. As allowed pursuant to Cal. Gov’t Code Section 6253(c), we require additional time to respond to your request, due to the following circumstance(s):

The need to search for and collect the requested records from field facilities or other establishments that are separate from the office processing the request.

IP will respond to your request no later than the close of business on July 28, 2017 with an estimated date that responsive documents will be made available.

Should you have any questions, please contact me at (310) 794-8741 or via email at pahill@finance.ucla.edu and reference the PRR number found above in the subject line.

Sincerely,

Paula Hill

Assistant Manager, Information Practices

 

July 28, 2017

Dear Mr. Phelps,

Please know UCLA Information Practices continues to work on your public records request, attached for your reference. I will provide a further response regarding your request no later than August 18, 2017.

Should you have any questions, please contact me at (310) 794-8741 or via email and reference the PRR number found above in the subject line.

Kind regards,

Paula Hill

Assistant Manager

UCLA Information Practices

 

July 29, 2017

Thank you. RP

 

August 18, 2017

Re: Public Records Request – PRR # 17-4854

Dear Mr. Richard Phelps:

UCLA Information Practices (IP) continues to work on your public records request dated July 5, 2017. As required under Cal. Gov’t Code Section 6253, and as noted in our email communication with you on July 28, 2017, we are now able to provide you with the estimated date that responsive documents will be made available to you, which is September 29, 2017.

As the records are still being compiled and/or reviewed, we are not able at this time to provide you with any potential costs, so that information will be furnished in a subsequent communication as soon as it is known.

Should you have any questions, please contact me at (310) 794-8741 or via email at pahill@finance.ucla.edu and reference the PRR number found above in the subject line.

Sincerely,

Paula Hill

Assistant Manager, Information Practices

 

September 29, 2017

Dear Mr. Richard Phelps,

Unfortunately, we must revise the estimated availability date regarding your attached request as the requisite review has not yet been completed. We expect to provide a complete response by November 30, 2017. We apologize for the delay.

Should you have any questions, please contact our office at (310) 794-8741 or via email, and reference the PRR number found above in the subject line.

Best regards,

UCLA Information Practices

 

September 29, 2017

I believe that if you are leaving it up to CRESST and SBAC to voluntarily provide the information, they will not be ready Nov. 30 either. RP

Close all USED-funded research centers: Evaluation of existing regulations: My two bits

Below are my comments in response to the USED request for comments on existing USED regulations. To submit your own, follow the instructions at: https://www.regulations.gov/document?D=ED-2017-OS-0074-0001

MEMORANDUM
To:  Hilary Malawer, Assistant General Counsel, Office of the General Counsel, U.S. Department of Education
From:  Richard P. Phelps
Date:  July 8, 2017
Re:  Evaluation of Existing Regulations[1]

Greetings:

I encourage the US Education Department to eliminate education research centers from any current and future funding. Ostensibly, federally funded education research centers fill a “need” for more research to guide public policy on important topics. But, the research centers are almost entirely unregulated, so they can do whatever they please. And, what they please is too often the promotion of their own careers and the suppression or denigration of competing ideas and evidence.

Federal funding of education research centers concentrates far too much power in too few hands. And, that power is nearly unassailable. One USED-funded research center, the National Center for Research on Evaluation, Standards, and Student Testing (CRESST), blatantly and repeatedly misrepresented research I had conducted while at the U.S. Government Accountability Office (GAO), in favor of its own small studies on the same topic. I was even denied attendance at public meetings where my research was misrepresented. Promises to correct the record were made, but not kept.

When I appealed to the USED project manager, he replied that he had nothing to say about “editorial” matters. In other words, a federally funded education research center can write and say anything that pleases, or benefits, the individuals inside.

Capturing a federally funded research center contract tends to boost the professional prominence of the winners stratospherically. In the case of CRESST, the principals assumed control of the National Research Council’s Board on Testing and Assessment, where they behaved typically—citing themselves and those who agreed with them, and ignoring, or demonizing, the majority of the research that contradicted their work and policy recommendations.

Further, CRESST principals now seem to have undue influence on the assessment research of an international agency, the Organisation for Economic Co-operation and Development (OECD), which, as if on cue, has published studies that promote the minority of the research sympathetic to CRESST doctrine while simply ignoring even the existence of the majority of the research that is not. The rot—the deliberate suppression of the majority of the relevant research—has spread worldwide, and the USED funded it.

In summary, the behavior of the several USED-funded research centers I have followed over the years meets or exceeds the following thresholds identified in the President’s Executive Order 13777:

(ii) Are outdated, unnecessary, or ineffective;

(iii) Impose costs that exceed benefits;

(iv) Create a serious inconsistency or otherwise interfere with regulatory reform initiatives and policies;

(v) Are inconsistent with the requirements of section 515 of the Treasury and General Government Appropriations Act, 2001 (44 U.S.C. 3516 note), or the guidance issued pursuant to that provision, in particular those regulations that rely in whole or in part on data, information, or methods that are not publicly available or that are insufficiently transparent to meet the standard for reproducibility.

Below, I cite only relevant documents that I wrote myself, so as not to implicate anyone else. As the research center principals gain power, fewer and fewer of their professional compatriots are willing to disagree with them. The more power they amass, the more difficult it becomes for contrary evidence and points of view, no matter how compelling or true, to even get a hearing.

References:

Phelps, R. P. (2015, July). The Gauntlet: Think tanks and federally funded centers misrepresent and suppress other education research. New Educational Foundations, 4. http://www.newfoundations.com/NEFpubs/NEF4Announce.html

Phelps, R. P. (2014, October). Review of Synergies for Better Learning: An International Perspective on Evaluation and Assessment (OECD, 2013), Assessment in Education: Principles, Policies, & Practices. doi:10.1080/0969594X.2014.921091 http://www.tandfonline.com/doi/full/10.1080/0969594X.2014.921091#.VTKEA2aKJz1

Phelps, R. P. (2013, February 12). What Happened at the OECD? Education News.

Phelps, R. P. (2013, January 28). OECD Encourages World to Adopt Failed US Ed Programs. Education News.

Phelps, R. P. (2013). The rot spreads worldwide: The OECD – Taken in and taking sides. New Educational Foundations, 2(1). Preview: http://www.newfoundations.com/NEFpubs/NEFv2Announce.html

Phelps, R. P. (2012, June). Dismissive reviews: Academe’s Memory Hole. Academic Questions, 25(2), pp. 228–241. doi:10.1007/s12129-012-9289-4 https://www.nas.org/articles/dismissive_reviews_academes_memory_hole

Phelps, R. P. (2012). The effect of testing on student achievement, 1910–2010. International Journal of Testing, 12(1), 21–43. http://www.tandfonline.com/doi/abs/10.1080/15305058.2011.602920

Phelps, R. P. (2010, July). The source of Lake Wobegon [updated]. Nonpartisan Education Review / Articles, 1(2). http://nonpartisaneducation.org/Review/Articles/v6n3.htm

Phelps, R. P. (2000, December). High stakes: Testing for tracking, promotion, and graduation, Book review, Educational and Psychological Measurement, 60(6), 992–999. http://richardphelps.net/HighStakesReview.pdf

Phelps, R. P. (1999, April). Education establishment bias? A look at the National Research Council’s critique of test utility studies. The Industrial-Organizational Psychologist, 36(4) 37–49. https://www.siop.org/TIP/backissues/Tipapr99/4Phelps.aspx

[1] In accordance with Executive Order 13777, “Enforcing the Regulatory Reform Agenda,” the Department of Education (Department) is seeking input on regulations that may be appropriate for repeal, replacement, or modification.

“Organizationally orchestrated propaganda” at ETS

With the testing opt-out movement growing in popularity in 2016, Common Core’s profiteers began to worry. Lower participation enough and the entire enterprise could be threatened: meaningless aggregate scores; compromised test statistics vital to quality control; and a strong signal that many citizens no longer believe the Common Core sales pitch.

The Educational Testing Service (ETS) was established decades ago by the Carnegie Foundation to serve as an apolitical research laboratory for psychometric work. For a while, ETS played that role well, producing some of the world’s highest-quality, most objective measurement research.

In fits and starts over the past quarter century, however, ETS has commercialized. At this point, there should be no doubt in anyone’s mind that ETS is a business—a business that relies on contracts and a business that aims to please those who can pay for its services.

Some would argue, with some justification, that ETS had no choice but to change with the times. Formerly guaranteed contracts were no longer guaranteed, and the organization needed either to pay its researchers or let them go.

Instead of now presenting itself honestly to the public as a commercial enterprise seeking profits, however, ETS continues to prominently display the trappings of a neutral research laboratory seeking truths. Top employees are awarded lofty academic titles and research “chairs”. Whether the awards derive from good research work or success in courting new business is open to question.

I perceive that ETS at least attempts something like an even split between valid research and faux-research pandering. The awarding of ETS’s most prestigious honor bestowed upon outsiders—the Angoff Award—for example, alternates between psychometricians conducting high-quality, non-political technical work one year and high-profile gatekeepers conducting highly suspicious research the next. Members of the latter group can be found participating in, or awarding, ETS commercial contracts.

With its “research” on the Common Core test opt-out movement, ETS blew away any credible pretense that it conducts objective research where its profits are threatened. Opt-out leaders are portrayed by ETS as simple-minded, misinformed, parents of poor students… you name it. And, of course, they are protesting against “innovative, rigorous, high quality” tests they are too dumb to appreciate.

Common Core testing, in case you didn’t know and haven’t guessed from what is written above, represents a substantial share of ETS’s work. Pearson holds the largest share of the work on the PARCC exams, but ETS holds the second largest.

The most ethical way for ETS to have handled the issue of Common Core opt-outs would have been to say nothing. After all, it is, supposedly, a research laboratory of apolitical test developers. Its staff are experts at developing assessment instruments, not at citizen movements, education administration, or public behavior.

Having disregarded the most ethical choice, ETS could at least have made it abundantly clear that it retains a large self-interest in the success of PARCC testing.

Instead, ETS continues to wrap itself in its old research laboratory coat and condemns opt-out movement leaders and sympathizers as ignorant and ill-motivated. Never mind that the opt-out leaders receive not a dime for their efforts, and ETS’s celebrity researchers are remunerated abundantly for communicating the company line.

Four months ago, I responded to one of these ETS anti-opt-out propaganda pieces, written by Randy E. Bennett, the “Norman O. Frederiksen Chair in Assessment Innovation at Educational Testing Service.” It took a few weeks, but ETS, in the person of Mr. Bennett, responded to my comment questioning ETS’s objectivity in the matter.

He asserted, “There’s a lot less organizationally orchestrated propaganda, and a lot more academic freedom, here than you might think!”

To which I replied, “The many psychometricians working at ETS with a starkly different vision of what constitutes ‘high quality’ in assessment are allowed to publish purely technical pieces. But, IMHO, the PR road show predominantly panders to power and profit. ETS’s former reputation for scholarly integrity took decades to accumulate. To my observation, it seems to be taking less time to dissemble. RP”

My return comment, however, was blocked. All comments have now been removed from the relevant ETS web page. All comments remain available to read at the Disqus comments manager site, though. The vertical orange bar next to the Nonpartisan Education logo is Disqus’ indication that the comment was blocked by ETS at its web site.

 

101 Terms for Denigrating Others’ Research

In scholarly terms, a review of the literature or literature review is a summation of the previous research conducted on a particular topic. With a dismissive literature review, a researcher assures the public that no one has yet studied a topic or that very little has been done on it. Dismissive reviews can be accurate, for example with genuinely new scientific discoveries or technical inventions. But, often, and perhaps usually, they are not.

A recent article in the Nonpartisan Education Review includes hundreds of statements—dismissive reviews—by some prominent education policy researchers.* Most of their statements are inaccurate; perhaps all of them are misleading.

“Dismissive review”, however, is the general term. In the “type” column of the files linked to the article, a finer distinction is made among three varieties:

  • “dismissive”—a claim that there is no or little previous research;
  • “denigrating”—a claim that previous research exists but is so inferior it is not worth even citing; and
  • “firstness”—a claim to be the first in the history of the world ever to conduct such a study.

Of course, not citing previous work has profound advantages, not least of which is freeing up the substantial amount of time that a proper literature review requires.

By way of illustrating the alacrity with which some researchers dismiss others’ research as not worth looking for, I list the many terms marshaled for the “denigration” effort in the table below. I suspect that in many cases, the dismissive researcher has not even bothered to look for previous research on the topic at hand, outside his or her small circle of colleagues.

Regardless, the effect of the dismissal, particularly when coming from a highly influential researcher, is to discourage searches for others’ work, and thus draw more attention to the dismisser. One might say that “the beauty” of a dismissive review is that rival researchers are not cited, referenced, or even identified, thus precluding the possibility of a time-consuming and potentially embarrassing debate.

Just among the bunch of high-profile researchers featured in the Nonpartisan Education Review article, one finds hundreds of denigrating terms employed to discourage the public, press, and policymakers from searching for the work done by others. Some in-context examples:

  • “The shortcomings of [earlier] studies make it difficult to determine…”
  • “What we don’t know: what is the net effect on student achievement?
    -Weak research designs, weaker data
    -Some evidence of inconsistent, modest effects
    Reason: grossly inadequate research and evaluation”
  • “Nearly 20 years later, the debate … remains much the same, consisting primarily of opinion and speculation…. A lack of solid empirical research has allowed the controversy to continue unchecked by evidence or experience…”

To consolidate the mass of verbiage somewhat, I group similar terms in the table below.

(Frequency)   Denigrating terms used for other research
(43)   [not] ‘systematic’; ‘aligned’; ‘detailed’; ‘comprehensive’; ‘large-scale’; ‘cross-state’; ‘sustained’; ‘thorough’
(31)    [not] ‘empirical’; ‘research-based’; ‘scholarly’
(29)   ‘limited’; ‘selective’; ‘oblique’; ‘mixed’; ‘unexplored’
(19)   ‘small’; ‘scant’; ‘sparse’; ‘narrow’; ‘scarce’; ‘thin’; ‘lack of’; ‘handful’; ‘little’; ‘meager’; ‘small set’; ‘narrow focus’
(15)   [not] ‘hard’; ‘solid’; ‘strong’; ‘serious’; ‘definitive’; ‘explicit’; ‘precise’
(14)   ‘weak’; ‘weaker’; ‘challenged’; ‘crude’; ‘flawed’; ‘futile’
(9)    ‘anecdotal’; ‘theoretical’; ‘journalistic’; ‘assumptions’; ‘guesswork’; ‘opinion’; ‘speculation’; ‘biased’; ‘exaggerated’
(8)    [not] ‘rigorous’
(8)    [not] ‘credible’; ‘compelling’; ‘adequate’; ‘reliable’; ‘convincing’; ‘consensus’; ‘verified’
(7)    ‘inadequate’; ‘poor’; ‘shortcomings’; ‘naïve’; ‘major deficiencies’; ‘futile’; ‘minimal standards of evidence’
(5)    [not] ‘careful’; ‘consistent’; ‘reliable’; ‘relevant’; ‘actual’
(4)    [not] ‘clear’; ‘direct’
(4)    [not] ‘high quality’; ‘acceptable quality’; ‘state of the art’
(4)    [not] ‘current’; ‘recent’; ‘up to date’; ‘kept pace’
(4)    ‘statistical shortcomings’; ‘methodological deficiencies’; ‘individual student data, followed school to school’; ‘distorted’
(2)    [not] ‘independent’; ‘diverse’

As well as illustrating the facility with which some researchers denigrate the work of rivals, the table also shows how easy it is to do: hundreds of terms stand ready for dismissing entire research literatures. Moreover, if others’ research must satisfy the hundreds of sometimes-contradictory characteristics listed above simply to merit acknowledgement, it is not surprising that so many of the studies undertaken by these influential researchers are touted as the first of their kind.

* Phelps, R. P. (2016). Dismissive reviews in education policy research: A list. Nonpartisan Education Review, Resources. http://nonpartisaneducation.org/Review/Resources/DismissiveList.htm

Censorship at Education Next

In response to Education Next’s recent misleading articles about a fall 2015 Mathematica report that claims to (but does not) find predictive validity for the PARCC test with Massachusetts college students, I wrote the text below and submitted it as a comment on the article. The publication neither published my comment nor provided any explanation. Indeed, the comments section appears to have vanished entirely. (A brief numerical sketch of the “incremental predictive validity” point made in the first paragraph appears after the quoted comment.)

http://educationnext.org/testing-college-readiness-massachusetts-parcc-mcas-standardized-tests/

“First, the report attempts to calculate only general predictive validity. The type of predictive validity that matters is “incremental predictive validity”—the amount of predictive power left over when other predictive factors are controlled. If a readiness test is highly correlated with high school grades or class rank, it provides the college admission counselor no additional information. It adds no value. The real value of the SAT or ACT is in the information it provides admission counselors above and beyond what they already know from other measures available to them.

“Second, the study administered grade 10 MCAS and PARCC tests to college students at the end of their freshmen years in college, and compared those scores to their first-year grades in college. Thus, the study measures what students learned in one year of college and in their last two years of high school more than it measures what they knew as of grade 10. The study does not actually compute predictive validity; it computes “concurrent” validity.

“Third, student test-takers were not representative of Massachusetts tenth graders. All were volunteers; and we do not know how they learned about the study or why they chose to participate. Students not going to college, not going to college in Massachusetts, or not going to these colleges in Massachusetts could not have participated. The top colleges—where the SAT would have been most predictive—were not included in the study (e.g., U. Mass-Amherst, any private college, or elite colleges outside the state). Students not going to college, or attending occupational certificate training programs or apprenticeships—for whom one would suspect the MCAS would be most predictive—were not included in the study.”
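To make the “incremental predictive validity” point concrete, here is a minimal sketch of the calculation. It is mine, not the Mathematica study’s, and it uses simulated, hypothetical data and variable names. It compares the R² of a readiness test alone, of high school GPA alone, and of both together; the test’s incremental validity is the extra R² it contributes beyond GPA.

```python
# Minimal sketch (mine, not from the Mathematica report): the difference between
# general and incremental predictive validity, using simulated, hypothetical data.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

# Simulated high school GPA and a readiness-test score highly correlated with it.
hs_gpa = rng.normal(3.0, 0.5, n)
test_score = 0.9 * (hs_gpa - 3.0) / 0.5 + rng.normal(0.0, 0.45, n)

# Simulated first-year college GPA, driven mostly by high school GPA.
college_gpa = 0.6 * hs_gpa + 0.1 * test_score + rng.normal(0.0, 0.4, n)

def r_squared(predictors, y):
    """R-squared from an ordinary least-squares fit of y on the predictors (plus intercept)."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    residuals = y - X @ beta
    return 1.0 - residuals.var() / y.var()

r2_test_alone = r_squared([test_score], college_gpa)   # "general" predictive validity
r2_gpa_alone = r_squared([hs_gpa], college_gpa)
r2_both = r_squared([hs_gpa, test_score], college_gpa)

print(f"R^2, test alone:         {r2_test_alone:.3f}")
print(f"R^2, HS GPA alone:       {r2_gpa_alone:.3f}")
print(f"Incremental R^2 of test: {r2_both - r2_gpa_alone:.3f}")
```

In a run like this, the test alone shows respectable predictive power, but because it is so highly correlated with GPA its incremental R² is small; that is the sense in which the comment argues a readiness test must add information beyond the measures admission counselors already have.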