Surprise! SBAC and CRESST stonewall public records request for their financial records

Say what you will about Achieve, PARCC, Fordham, CCSSO, and NGA—some of the organizations responsible for foisting the Common Core Initiative on us all. But, their financial records are publicly available.

Not so for some other organizations responsible for the same Common Core promotion. The Smarter Balanced Assessment Consortium (SBAC) and the National Center for Research on Evaluation, Standards, and Student Testing (CRESST) have absorbed many millions of taxpayer and foundation dollars over the years. But, their financial records have been hidden inside the vast, nebulous cocoon of the University of California, Los Angeles (UCLA). UCLA’s financial records, of course, are publicly available, but the amounts there are aggregated at a level that subsumes thousands of separate, individual entities.

UCLA is a tax-supported state institution, however, and California has an open records law on the books. After some digging, I located the UCLA office responsible for records requests and wrote to them. Following is a summary of our correspondence to date:

 

July 5, 2017

Greetings:

I hope that you can help me. I have spent a considerable amount of time clicking around in search of financial reports for the Smarter Balanced Assessment Consortium (SBAC) and the National Center for Research on Evaluation, Standards, and Student Testing (CRESST), both “housed” at UCLA (or, until just recently in SBAC’s case). Even after many hours of web searching, I still have no clue as to where these data might be found.

Both organizations are largely publicly funded through federal grants. I would like to obtain revenue and expenditure detail on the order of what a citizen would expect to see in a nonprofit organization’s Form 990. I would be happy to search through a larger data base that contains relevant financial details for all of UCLA, so long as the details for SBAC and CRESST are contained within and separately labeled.

I would like annual records spanning the lifetimes of each organization: SBAC only goes back several years, but CRESST goes back to the 1980s (in its early years, it was called the Center for the Study of Evaluation).

Please tell me what I need to do next.

Thank you for your time and attention.

Best Wishes, Richard Phelps

 

July 6, 2017

RE: Acknowledgement of Public Records Request – PRR # 17-4854

Dear Mr. Phelps:

This letter is to acknowledge your request under the California Public Records Act (CPRA) dated July 5, 2017, herein enclosed. Information Practices (IP) is notifying the appropriate UCLA offices of your request and will identify, review, and release all responsive documents in accordance with relevant law and University policy.

Under the CPRA, Cal. Gov’t Code Section 6253(b), UCLA may charge for reproduction costs and/or programming services. If the cost is anticipated to be greater than $50.00 or the amount you authorized in your original request, we will contact you to confirm your continued interest in receiving the records and your agreement to pay the charges. Payment is due prior to the release of the records.

As required under Cal. Gov’t Code Section 6253, UCLA will respond to your request no later than the close of business on July 14, 2017. Please note, though, that Section 6253 only requires a public agency to make a determination within 10 days as to whether or not a request is seeking records that are publicly disclosable and, if so, to provide the estimated date that the records will be made available. There is no requirement for a public agency to actually supply the records within 10 days of receiving a request, unless the requested records are readily available. Still, UCLA prides itself on always providing all publicly disclosable records in as timely a manner as possible.

Should you have any questions, please contact me at (310) 794-8741 or via email at pahill@finance.ucla.edu and reference the PRR number found above in the subject line.

Sincerely,

Paula Hill

Assistant Manager, Information Practices

 

July 14, 2017

RE: Public Records Request – PRR # 17-4854

Dear Mr. Phelps:

The purpose of this letter is to confirm that UCLA Information Practices (IP) continues to work on your public records request dated July 5, 2017. As allowed pursuant to Cal. Gov’t Code Section 6253(c), we require additional time to respond to your request, due to the following circumstance(s):

The need to search for and collect the requested records from field facilities or other establishments that are separate from the office processing the request.

IP will respond to your request no later than the close of business on July 28, 2017 with an estimated date that responsive documents will be made available.

Should you have any questions, please contact me at (310) 794-8741 or via email at pahill@finance.ucla.edu and reference the PRR number found above in the subject line.

Sincerely,

Paula Hill

Assistant Manager, Information Practices

 

July 28, 2017

Dear Mr. Phelps,

Please know UCLA Information Practices continues to work on your public records request, attached for your reference. I will provide a further response regarding your request no later than August 18, 2017.

Should you have any questions, please contact me at (310) 794-8741 or via email and reference the PRR number found above in the subject line.

Kind regards,

Paula Hill

Assistant Manager

UCLA Information Practices

 

July 29, 2017

Thank you. RP

 

August 18, 2017

Re: Public Records Request – PRR # 17-4854

Dear Mr. Richard Phelps:

UCLA Information Practices (IP) continues to work on your public records request dated July 5, 2017. As required under Cal. Gov’t Code Section 6253, and as noted in our email communication with you on July 28, 2017, we are now able to provide you with the estimated date that responsive documents will be made available to you, which is September 29, 2017.

As the records are still being compiled and/or reviewed, we are not able at this time to provide you with any potential costs, so that information will be furnished in a subsequent communication as soon as it is known.

Should you have any questions, please contact me at (310) 794-8741 or via email at pahill@finance.ucla.edu and reference the PRR number found above in the subject line.

Sincerely,

Paula Hill

Assistant Manager, Information Practices

 

September 29, 2017

Dear Mr. Richard Phelps,

Unfortunately, we must revise the estimated availability date regarding your attached request as the requisite review has not yet been completed. We expect to provide a complete response by November 30, 2017. We apologize for the delay.

Should you have any questions, please contact our office at (310) 794-8741 or via email, and reference the PRR number found above in the subject line.

Best regards,

UCLA Information Practices

 

September 29, 2017

I believe that if you are leaving it up to CRESST and SBAC to voluntarily provide the information, they will not be ready Nov. 30 either. RP

Close all USED-funded research centers: Evaluation of existing regulations: My two bits

Below are my comments in response to the USED request for comments on existing USED regulations. To submit your own, follow the instructions at: https://www.regulations.gov/document?D=ED-2017-OS-0074-0001

MEMORANDUM
To:  Hilary Malawer, Assistant General Counsel, Office of the General Counsel, U.S. Department of Education
From:  Richard P. Phelps
Date:  July 8, 2017
Re:  Evaluation of Existing Regulations[1]

Greetings:

I encourage the US Education Department to eliminate education research centers from any current and future funding. Ostensibly, federally funded education research centers fill a “need” for more research to guide public policy on important topics. But, the research centers are almost entirely unregulated, so they can do whatever they please. And, what they please is too often the promotion of their own careers and the suppression or denigration of competing ideas and evidence.

Federal funding of education research centers concentrates far too much power in too few hands. And, that power is nearly unassailable. One USED-funded research center, the National Center for Research on Evaluation, Standards, and Student Testing (CRESST), blatantly and repeatedly misrepresented research I had conducted while at the U.S. Government Accountability Office (GAO) in favor of its own small studies on the same topic. I was even denied attendance at public meetings where my research was misrepresented. Promises to correct the record were made, but not kept.

When I appealed to the USED project manager, he replied that he had nothing to say about “editorial” matters. In other words, a federally funded education research center can write and say anything that pleases, or benefits, the individuals inside.

Capturing a federally funded research center contract tends to boost the professional prominence of the winners stratospherically. In the case of CRESST, the principals assumed control of the National Research Council’s Board on Testing and Assessment, where they behaved typically—citing themselves and those who agree with them, and ignoring, or demonizing, the majority of the research that contradicted their work and policy recommendations.

Further, CRESST principals now seem to have undue influence on the assessment research of the international agency, the Organisation for Economic Co-operation and Development (OECD), which, as if on cue, has published studies that promote the minority of the research sympathetic to CRESST doctrine while simply ignoring even the existence of the majority of the research that is not. The rot—the deliberate suppression of the majority of the relevant research—has spread worldwide, and the USED funded it.

In summary, the behavior of the several USED-funded research centers I have followed over the years meets or exceeds the following thresholds identified in the President’s Executive Order 13777:

(ii) Are outdated, unnecessary, or ineffective;

(iii) Impose costs that exceed benefits;

(iv) Create a serious inconsistency or otherwise interfere with regulatory reform initiatives and policies;

(v) Are inconsistent with the requirements of section 515 of the Treasury and General Government Appropriations Act, 2001 (44 U.S.C. 3516 note), or the guidance issued pursuant to that provision, in particular those regulations that rely in whole or in part on data, information, or methods that are not publicly available or that are insufficiently transparent to meet the standard for reproducibility.

Below, I cite only relevant documents that I wrote myself, so as not to implicate anyone else. As the research center principals gain power, fewer and fewer of their professional compatriots are willing to disagree with them. The more power they amass, the more difficult it becomes for contrary evidence and points of view, no matter how compelling or true, to even get a hearing.

References:

Phelps, R. P. (2015, July). The Gauntlet: Think tanks and federally funded centers misrepresent and suppress other education research. New Educational Foundations, 4. http://www.newfoundations.com/NEFpubs/NEF4Announce.html

Phelps, R. P. (2014, October). Review of Synergies for Better Learning: An International Perspective on Evaluation and Assessment (OECD, 2013), Assessment in Education: Principles, Policies, & Practices. doi:10.1080/0969594X.2014.921091 http://www.tandfonline.com/doi/full/10.1080/0969594X.2014.921091#.VTKEA2aKJz1

Phelps, R. P. (2013, February 12). What Happened at the OECD? Education News.

Phelps, R. P. (2013, January 28). OECD Encourages World to Adopt Failed US Ed Programs. Education News.

Phelps, R. P. (2013). The rot spreads worldwide: The OECD – Taken in and taking sides. New Educational Foundations, 2(1). Preview: http://www.newfoundations.com/NEFpubs/NEFv2Announce.html

Phelps, R. P. (2012, June). Dismissive reviews: Academe’s Memory Hole. Academic Questions, 25(2), pp. 228–241. doi:10.1007/s12129-012-9289-4 https://www.nas.org/articles/dismissive_reviews_academes_memory_hole

Phelps, R. P. (2012). The effect of testing on student achievement, 1910–2010. International Journal of Testing, 12(1), 21–43. http://www.tandfonline.com/doi/abs/10.1080/15305058.2011.602920

Phelps, R. P. (2010, July). The source of Lake Wobegon [updated]. Nonpartisan Education Review / Articles, 1(2). http://nonpartisaneducation.org/Review/Articles/v6n3.htm

Phelps, R. P. (2000, December). [Review of the book High stakes: Testing for tracking, promotion, and graduation]. Educational and Psychological Measurement, 60(6), 992–999. http://richardphelps.net/HighStakesReview.pdf

Phelps, R. P. (1999, April). Education establishment bias? A look at the National Research Council’s critique of test utility studies. The Industrial-Organizational Psychologist, 36(4) 37–49. https://www.siop.org/TIP/backissues/Tipapr99/4Phelps.aspx

[1] In accordance with Executive Order 13777, “Enforcing the Regulatory Reform Agenda,” the Department of Education (Department) is seeking input on regulations that may be appropriate for repeal, replacement, or modification.

“Organizationally orchestrated propaganda” at ETS

With the testing opt-out movement growing in popularity in 2016, Common Core’s profiteers began to worry. Lower participation enough and the entire enterprise could be threatened by meaningless aggregate scores, compromised test statistics vital to quality control, and a strong signal that many citizens no longer believe the Common Core sales pitch.

The Educational Testing Service (ETS) was established decades ago by the Carnegie Foundation to serve as an apolitical research laboratory for psychometric work. For a while, ETS played that role well, producing some of the world’s highest-quality, most objective measurement research.

In fits and starts over the past quarter century, however, ETS has commercialized. At this point, there should be no doubt in anyone’s mind that ETS is a business—a business that relies on contracts and a business that aims to please those who can pay for its services.

Some would argue, with some justification, that ETS had no choice but to change with the times. Formerly guaranteed contracts were no longer guaranteed, and the organization needed either to pay its researchers or let them go.

Instead of presenting itself honestly to the public as a commercial enterprise seeking profits, however, ETS continues to prominently display the trappings of a neutral research laboratory seeking truths. Top employees are awarded lofty academic titles and research “chairs.” Whether the awards derive from good research work or from success in courting new business is open to question.

I perceive that ETS at least attempts something like an even split between valid research and faux-research pandering. The awarding of ETS’s most prestigious honor bestowed upon outsiders—the Angoff Award—for example, alternates between psychometricians conducting high-quality, non-political technical work one year and high-profile gatekeepers conducting highly suspicious research the next. Members of the latter group can be found participating in, or awarding, ETS commercial contracts.

With its “research” on the Common Core test opt-out movement, ETS blew away any credible pretense that it conducts objective research where its profits are threatened. ETS portrays opt-out leaders as simple-minded, misinformed, parents of poor students…you name it. And, of course, they are protesting against “innovative, rigorous, high quality” tests they are too dumb to appreciate.

Common Core testing, in case you didn’t know and haven’t guessed from what is written above, represents a substantial share of ETS’s work. Pearson holds the largest share of work for the PARCC exams, but ETS holds the second largest.

The most ethical way for ETS to have handled the issue of Common Core opt-outs would have been to say nothing. After all, it is, supposedly, a research laboratory of apolitical test developers. Its staff are statistical experts at developing assessment instruments, not at citizen movements, education administration, or public behavior.

Choosing to disregard that option, ETS could at least have made it abundantly clear that it retains a large self-interest in the success of PARCC testing.

Instead, ETS continues to wrap itself in its old research laboratory coat and condemns opt-out movement leaders and sympathizers as ignorant and ill-motivated. Never mind that the opt-out leaders receive not a dime for their efforts, and ETS’s celebrity researchers are remunerated abundantly for communicating the company line.

Four months ago, I responded to one of these ETS anti-opt-out propaganda pieces, written by Randy E. Bennett, the “Norman O. Frederiksen Chair in Assessment Innovation at Educational Testing Service.” It took a few weeks, but ETS, in the person of Mr. Bennett, responded to my comment questioning ETS’s objectivity in the matter.

He asserted, “There’s a lot less organizationally orchestrated propaganda, and a lot more academic freedom, here than you might think!”

To which I replied, “The many psychometricians working at ETS with a starkly different vision of what constitutes ‘high quality’ in assessment are allowed to publish purely technical pieces. But, IMHO, the PR road show predominantly panders to power and profit. ETS’s former reputation for scholarly integrity took decades to accumulate. To my observation, it seems to be taking less time to dissemble. RP”

My return comment, however, was blocked. All comments have now been removed from the relevant ETS web page. All comments remain available to read at the Disqus comments manager site, though. The vertical orange bar next to the Nonpartisan Education logo is Disqus’ indication that the comment was blocked by ETS at its web site.

 

Significance of PISA math results

A new round of two international comparisons of student mathematics performance came out recently, and there was a lot of interest because the reports were almost simultaneous: TIMSS[1] in late November 2016 and PISA[2] just a week later. They are often reported as 2015 rather than 2016 because the data collection for each took place in late 2015, which would seem to make the comparison even more apt. In fact, no comparison is appropriate; they are completely different instruments and, between them, TIMSS is the one that should be of more concern to educators. Perhaps surprisingly, and although there is great room for improvement, US performance is not as dire as the PISA results would imply. By contrast, Finland continues to demonstrate that its internationally recognized record of PISA-proven success in mathematics education – with its widely applauded, student-friendly approach – is deeply misleading.

In spite of the popular press and mathematics education folklore, Finland’s performance has been known to be overrated since PISA first came out, as documented in an open letter[3] written by the president of the Finnish Mathematical Society and cosigned by many mathematicians and experts in other math-based disciplines:

“The PISA survey tells only a partial truth of Finnish children’s mathematical skills … in fact the mathematical knowledge of new students has declined dramatically.”

This letter links to a description[4] of the most fundamental problem that directly involves elementary mathematics education:

“Severe shortcomings in Finnish mathematics skills … If one does not know how to handle fractions, one is not able to know algebra.”

The previous TIMSS had the 4th-grade performance of Finland a bit above that of the US but well behind by 8th grade. In the new report, Finland has slipped below the US at 4th grade and did not even submit itself to be assessed at 8th grade, much less at the Advanced level. Similar remarks apply to another country often recognized for its student-friendly mathematics education, the Netherlands, home of PISA’s origins at the Freudenthal Institute. This decline was recognized in the TIMSS summary of student performance,[1] with the comparative grade-level rankings given as Exhibits 1.1 and 1.2 and the Advanced results[5] as Exhibit M1.1:

[Image omitted: TIMSS comparative rankings, Exhibits 1.1, 1.2, and M1.1]

By contrast, PISA[2] came out a week later and…

Netherlands 11
Finland 13
United States 41

Note: These rankings include China* (just below Japan), represented by 3 provinces rather than the whole country; if it is omitted, subtract 1.

Why the difference? The problem is that PISA was never a test of “school mathematics” but of all 15-year-old students’ “mathematics literacy,”[6] not even mathematics at the algebra level needed for non-remedial admission to college, much less the TIMSS Advanced level, interpreted as AP or IB Calculus in the US:

“PISA is the U.S. source for internationally comparative information on the mathematical and scientific literacy of students in the upper grades at an age that, for most countries, is near the end of compulsory schooling. The objective of PISA is to measure the “yield” of education systems, or what skills and competencies students have acquired and can apply in these subjects to real-world contexts by age 15. The literacy concept emphasizes the mastery of processes, understanding of concepts, and application of knowledge and functioning in various situations within domains. By focusing on literacy, PISA draws not only from school curricula but also from learning that may occur outside of school.”

Historically relevant is the fact that the conception of PISA at the Freudenthal Institute in the Netherlands included heavy guidance from Thomas Romberg of the University of Wisconsin’s WCER, the original creator of the middle school math ed curriculum Mathematics in Context (MiC). Its underlying philosophy is exactly that of PISA: the study of mathematics through everyday applications that do not require development of the more sophisticated mathematics that opens the door to deeper study in mathematics; i.e., to all even mildly sophisticated math-based career opportunities, the so-called STEM careers. In point of fact, the arithmetic of the PISA applications is calculator-friendly, so even elementary arithmetic through ordinary fractions, so necessary for eventual algebra, need not be developed to score well.

 

[1] http://timss2015.org/timss-2015/mathematics/student-achievement/
[2] http://nces.ed.gov/pubs2017/2017048.pdf (Table 3, page 23)
[3] http://matematiikkalehtisolmu.fi/2005/erik/PisaEng.html
[4] http://matematiikkalehtisolmu.fi/2005/erik/KivTarEng.html
[5] http://timss2015.org/advanced/ [Distribution of Advanced Mathematics Achievement]
[6] https://nces.ed.gov/timss/pdf/naep_timss_pisa_comp.pdf

Wayne Bishop, PhD
Professor of Mathematics, Emeritus
California State University, LA