Because such thinking has become part of the national rhetoric, nobody bats an eye anymore when words like “unity” and “collective” are shorthand for “conformity” and “the herd.”

Recently, Mayor Segarra was thrown under the bus for the way he requested a more just search process for the replacement superintendent. Actually, the request itself was demonized. The whole affair could easily have been turned into an episode of The Simpsons.

It felt that disingenuous to listen to all of this talk about divisiveness, as if divisiveness were the instant and only possible consequence of differing viewpoints. Diverse opinions could not possibly, so it seems, lead to richer discussions and actions. All this makes one wonder if anyone actually does give a damn about the youth.

This week, Robert Cotto, Jr. was given a scolding by Achieve Hartford!. He kicked off March with an op-ed in the Hartford Courant that received far more attention than most of the paper’s articles do, as evidenced by its social media counts: as of publication, his op-ed had been shared 154 times on Facebook, tweeted 14 times, and added to Digg five times. On March 10th, Achieve Hartford! sent an email essentially telling Cotto he was wrong for letting the public know about an issue that concerns the public — taxpayers, parents, students, teachers, and concerned community members alike. Their email follows:

Contrary to views recently expressed by Hartford Board of Education member Robert Cotto, Jr., in his op-ed in the Hartford Courant March 1st, entitled “Hartford’s Claim of School Success Flawed,” the quality of education for Hartford students has indeed improved over the past three years and students’ lives are being positively impacted by Hartford’s reform strategies.

Here is some background.  Over the past two years, the State has introduced enhanced assessment methods to better evaluate the progress of special education students.  In 2009, the state introduced and piloted the Modified Assessment System (MAS) in order to more accurately assess those special education students whose severe disabilities prohibit them from achieving on the standard assessments.  Following a protocol established by the State Department of Education, these students  are deemed by their school-level Planning and Placement Teams (PPT) as unable to reach proficiency on the “standard” CMT or CAPT specifically due to their disability and despite any accommodations.  They then become exempt from taking the standard CMT and CAPT in Math and Reading and are directed to take a different test, administered by the State and referred to as the MAS.  In Hartford in 2010, approximately 9% of reading and math test takers were transitioned to the MAS.  It is the job of teachers and staff on school level PPTs to direct all eligible students to take the MAS test – a much more useful instrument for these children.

As a result of the transition to MAS, the comparison of the data between 2008 and the present is not entirely consistent.  To be sure, the transition did impact test scores in Hartford and the absolute gains that were reported – as was the case in all districts throughout the state.  As a result of the statewide impact, comparable data still exists among all districts.  We noted this back in our November 18, 2010 Education Matters!, where, for the 2010 school year, we highlighted one way to interpret the data and suggested the need to continuously improve upon available methods.

Mr. Cotto maintains that the largest explanation for a rise in student performance in Hartford is the exclusion of certain special education test takers from the data reported by the State.  However, based upon analysis we have done, that assertion is not accurate.

As part of our work, we looked at the reported data and made assumptions to account for the impacts of the MAS transition.  We used two different methods of analysis.  The first examined the performance with all special education excluded for the past three years, and the second normalized the 2008 special education student performance over 2009 and 2010.  While not perfect, both of these methodologies are sound in their approach.

The results were consistent.   For approximately 86% of Hartford students – those unaffected by the state’s change in testing policy – performance still increased meaningfully (> 6 pts) over the past three years.  The result was similar using the additional normalization look, which took into account 100% of students.   Importantly, while the absolute gains reported with these scenarios are somewhat lower than those based on the fully aggregated data that the state reported, relativity to state performance is similar to what was previously reported – with gains approximately 4 times that of the State in 2010, over 2007.

It is essential that all school districts in Connecticut maintain consistent use of data along with a defined means of how that data will be used.  Hartford is no exception, and the District needs to work to ensure the highest standards of reporting.  With the statewide transition to MAS, consistency slipped somewhat in 2009 and 2010, but the direction of Hartford’s reform remains unchanged as the District continues to improve at a rate that outpaces the State.  The achievement gap is still closing.  The reform strategies in Hartford are not flawed and are having an encouraging impact, with the commitment to shutting down chronically low-performing schools and redesigning them making a positive difference for Hartford children and families.

Hartford’s progress is real – but fragile.  Views expressed by Board of Education members on issues such as data integrity – and analogies to models of fraud – should not be taken lightly.  Given the students and families Board members pledge to serve, the Board as a whole needs to have a collaborative voice as to direction.  While this does not always mean having unanimous votes on all issues, each board member should be expected to approach every issue with positive intent.

Mr. Cotto has brought forth these MAS data issues at previous Board meetings, and he is entitled to raise such concerns and get any questions he has answered within the protocols of the Board, and with appropriate staff assistance and support.  However, the advancement of individual viewpoints outside of those protocols should not work to undermine a Board’s agreed upon collective direction.  That is not in the best interest of students.

They went there. Questioning authority and requesting a fair evaluation is apparently “not in the best interest of students.” That claim is nothing more than a red herring and a bold attempt at silencing a member of the Board of Education who has not fallen into step with the rest.

What gets buried in this ranting email is that Achieve Hartford!’s methodology has included “[making] assumptions” after studying data in order “to account for the impacts of the MAS transition,” which they even admit is “not perfect,” though they still have the audacity to claim “both of these methodologies are sound in their approach.”

Another tidbit that gets lost is the admission of inconsistency:

It is essential that all school districts in Connecticut maintain consistent use of data along with a defined means of how that data will be used.  Hartford is no exception, and the District needs to work to ensure the highest standards of reporting.  With the statewide transition to MAS, consistency slipped somewhat in 2009 and 2010

So, there are concessions that not all of the data is as black and white as they had previously claimed.

One section of this email could even be thrown right back at them:

Views expressed by Board of Education members on issues such as data integrity – and analogies to models of fraud – should not be taken lightly.

People are not taking this lightly. The data may be correct yet still mislead the public about what conclusions to draw regarding how these gains were made. It’s not a stretch to believe that this organization and many other stakeholders, so to speak, are counting on everyone to conclude that the test score increases are due to the school reforms that involve closing down neighborhood schools and reopening them as “Choice” schools.

Maybe it’s worth considering that Cotto is correct. There are other conclusions we can draw about what this data represents. While Achieve Hartford!’s data includes the number of students taking the CAPT and CMT as well as the number taking the MAS, the organization is less forthcoming about what gains, if any, were made by those taking the Modified Assessment System, claiming that data from 2009 was not available. This is where “reporting consistency slipped.” The 2010 data showed a large gap between special education students taking the standardized tests and students taking the MAS: several hundred more students took the MAS than were classified as special education and took the standardized tests. Looking more closely, one school had all of its special education students take the MAS.
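The composition effect at the heart of Cotto’s argument is simple arithmetic: remove low-scoring test takers from the pool, and the reported proficiency rate rises even if no individual student improves. A minimal sketch with invented numbers (the roughly 9% transition figure comes from the email above; the scores and cutoff are hypothetical):

```python
def proficiency_rate(scores, cutoff=240):
    """Share of test takers at or above a proficiency cutoff."""
    return sum(s >= cutoff for s in scores) / len(scores)

# Hypothetical cohort of 100 students. In the "before" year, everyone
# takes the standard CMT; in the "after" year, 9 of the lowest scorers
# (about 9% of test takers) are moved to the MAS and drop out of the
# standard-test pool.
standard = [260] * 55 + [230] * 45          # all 100 on the standard test
after_transition = [260] * 55 + [230] * 36  # 9 low scorers shifted to MAS

before = proficiency_rate(standard)          # 0.55
after = proficiency_rate(after_transition)   # ~0.604

# The reported rate rises even though no student's score changed.
print(f"reported gain: {after - before:+.1%}")  # prints "reported gain: +5.4%"
```

This does not prove that Hartford’s gains are an artifact, only that the mechanism Cotto points to is mathematically real, which is why how the MAS takers are counted matters so much.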

It would be grossly unfair to expect some students with special needs to meet the same standards as their peers, yet it does raise questions about how schools as a whole are rising to the challenge of meeting the standards set by society regarding how special needs students are taught. Gains are made, yes, but at what cost?

It seems convenient to exclude special education students, just like it seems convenient to shut down schools that are deemed to be losing propositions. Convenience, however, is not the same as true achievement. One hopes that the superintendent-in-waiting takes note of this.

It’s not just the potentially misleading data that people are taking seriously. I have been hearing from a number of teachers in various Hartford schools who report that students who fail the CAPT are given chance after chance to pass an equivalent. Because the CAPT is now linked to graduation requirements, students who fail these tests as sophomores are retested as juniors. Those who still fail are given alternative tests as seniors. These alternative tests do not need to be approved by the State Department of Education, and students sometimes have days (instead of hours) to complete them. Not just that, but some are heavily coached in order to “demonstrate proficiency.” If students are learning the material in the end, there is not much to complain about, but it does make one think about the purpose of making the CAPT itself so high-stakes.

What exactly do we learn from any of these tests?