MoCoEdBlog Final Post

We stopped actively blogging over a year ago, but now the Montgomery County Education Blog (MoCoEdBlog) is going into a deeper sleep.  The MoCoEdBlog was an experiment in combining professional work with support for a local school system that our children attend.  Our goal was simple: to use our professional experience to support MCPS and to provide a counter voice to those who criticize every move of the school system, often without any knowledge of the complexities of the issues MCPS is facing.  Our hope was to be inclusive of different points of view and to engage in a thoughtful discussion centered not on our own children but on some of the big decisions in education today, including teacher evaluation and new standards for student success.

We learned a lot in this process.  We learned first about each other and our different perspectives on MCPS.  The internal conversations we had as we debated what to write about were fascinating in how they drew on different perspectives on equity, on school improvement, and on the realities of systems like MCPS, and of MCPS itself.  We learned that engaging MCPS as a system can be difficult because, over the years, the central office has developed a public relations approach that is not designed for dialogue.  We also discovered that the amount of time this kind of project takes from our busy professional lives is significant.

The times have changed since we began this effort.  MCPS has a new leader who has instilled confidence that he understands the critical issues facing the system and has assembled a solid team to address these challenges.  So we are going to put this project aside and see how things develop under this promising new leadership.  As individuals we remain ready to support MCPS in any way we can, but we no longer find the blog the best vehicle for doing so.

MCPS Review of Assessments: Shifts, Recommendations, and Questions

On June 22, the MCPS Board of Education received a briefing from staff, led by Chief Academic Officer Maria Navarro, about the kinds of assessments students are given in different grades and different schools.  This review was undertaken at the direction of the Maryland State Superintendent of Education, Lillian Lowery, who asked Maryland school districts to look for options to reduce the assessment burden on students and on the system.  This is important work that is worth attention and follow-through.

While presumably other Maryland districts are going through this exercise, the MCPS staff presented a comprehensive and polished briefing with various options that are being considered at different levels of the system.  This is an important area for individual schools and also for the ways that MCPS is managed.  The only people who think educational assessments are easy are those who have little experience with them.  They are hard, and it was clear that the team that briefed the Board on June 22 was aware of how hard this area is to consider comprehensively.  In some cases, what MCPS is considering could be very important for the system and should be encouraged.  Some areas raised questions for me, and in a few areas MCPS may want to think about how its ideas might be implemented given some of the changes going on in the field.  This is not to say what was presented was wrong, but rather that the directions articulated by the MCPS staff relate to other movements in the field.

The observations, recommendations, and questions that follow are offered with the recognition that this review seems very well done and well ahead of the majority of school districts in the United States, large and small.

Three Notable Shifts

In this work I see MCPS tapping into three important shifts in the field of educational data.  The staff did not always highlight these shifts as special, but MCPS’ process of evaluating different options for reducing the testing burden is surfacing ideas that align with new movements in educational research and policy.

Shift to more continuous and embedded assessments

Some of the recommendations for restructuring middle school assessments proposed distributing the exams across multiple class sessions rather than administering them in a single summative block.  This parallels new research on using the data feeds from digital technologies, which can provide many data points (some from assessments and many from other activity) that can be used quickly and on a more immediate basis.  If MCPS does experiment with parsing out assessments over multiple periods, it will be preparing for this future.  At the same time, this shift brings complications.  Much of the statistical practice used in education is based on rigid control of testing conditions.  Once data are decoupled from end-of-course measures and made more continuous, how the information can be used for important consequences (its reliability) changes as well.  These intermediate, continuous measures are much harder to use for high-stakes purposes because they are usually less statistically reliable.
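To make the reliability point concrete, here is a minimal illustration using the Spearman-Brown prophecy formula, a standard psychometric result (not anything presented in the MCPS briefing); the starting reliability of 0.90 is a hypothetical value chosen only for the example.

```python
# Illustration only: how reliability falls when a single summative exam is
# split into shorter segments. Uses the Spearman-Brown prophecy formula;
# the 0.90 figure is hypothetical, not an MCPS statistic.

def spearman_brown(reliability: float, length_factor: float) -> float:
    """Predicted reliability when a test is shortened/lengthened by length_factor."""
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

full_exam = 0.90                                    # hypothetical reliability of one summative exam
one_third_segment = spearman_brown(full_exam, 1/3)  # same exam split into three sittings
print(f"One-third-length segment: {one_third_segment:.2f}")  # ~0.75
```

Each shorter segment, taken alone, supports weaker inferences than the full exam, which is one reason distributed measures are better suited to ongoing feedback than to high-stakes decisions.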

Focus on range of student experiences

Maria Navarro, the Chief Academic Officer, discussed the use of different kinds of data, including surveys, to understand the “range of student experiences” in response to comments by Dr. Docca, who asked how these changes would affect children of color.  Many educators have long recognized that learning performance is only one important factor to attend to.  What is newer is that people in the data movement are developing different kinds of measures of student experience, and the kinds of surveys that MCPS and other districts are using today, including Harvard’s Tripod project (http://tripoded.com/), are getting at these kinds of data.  The coming years should see more and more of these kinds of data, and Dr. Navarro’s response may be repeated in future meetings and conversations about different kinds of students.  The Board should continue to ask for more data about the range of student experiences beyond test scores.  I believe that it is in this kind of information that solutions to these challenges will likely be found.

Alignment and Coherence

Dr. Lang, the Associate Superintendent for Curriculum and Instruction, discussed the issue of alignment of assessments.  It was presented as alignment with policies and with other assessments like PARCC, but the issue is much bigger, and the field is just now beginning to wake up to the challenges of connecting different kinds of assessments and other forms of information (see the first recommendation below).  This is an area where educational data differ from data in other fields, and where implementing data-driven solutions is more challenging than in other fields.  While it is easy to bring two different scores for the same subject (ex: math) together, ensuring that those scores are measuring the same things requires a lot of technical work.  The challenge is growing as more and more easily accessible technologies, such as Khan Academy or DreamBox Learning, provide data on student work and display those data in tables, charts, and dashboards.  The visual displays don’t tell educators whether two data points are comparable or how they relate.  This has contributed to some educators distrusting educational data, because the lack of alignment translates into incoherence in terms of what to do with students.

Three Recommendations

I have three recommendations for MCPS in this process and in going forward with their work on assessments and data.

Broaden concept of data

In a number of places in the briefing, the terms data and assessments were used interchangeably.  This is consistent with the way most policies use these terms, but it needs to change.  Assessments have been seen in both education and policy circles as the most valid data (the hard data) needed to evaluate educational performance.  The problem is that assessment data vary widely in quality and in how clearly they represent the things that matter most, like instruction.  In many parts of the curriculum (ex: social studies, life science, advanced literacy) there are major gaps in quality assessments.

The field has begun to broaden its conception of data: assessments still play a central role, but other kinds of evidence are acknowledged as well, from surveys, from digital tools, and especially from the artifacts that teachers use in their daily work.  These teacher tools are evolving quickly, and so the kinds of information they can produce, and the ways that information can be analyzed, are developing rapidly.

In broadening the kinds of evidence considered core data beyond assessments, the ways the information is used can also be thought of more broadly.  Several years ago, the use of data and other information in educator conversations was less valued than it is today.  At the same time, the idea of data-driven decision making and of using test results to directly inform instruction was more popular a few years ago than it is now, largely because using assessments as a primary lever of reform has not worked.  Assessment data are rarely at a granularity and specificity that can be used in the practice of teaching.  Rather, these data can help teachers see their work at the level of a school and the long-term trajectories of students.  Professional learning communities looking at a range of information, including the assignments themselves and examples of student work alongside assessments, are considered structures that can actually enable meaningful change.

The idea of the central office collecting assessments from the schools, as was suggested in the June 22 meeting, may seem to some educators a case of the central office controlling local processes.  However, if the view of assessments is broad (ex: beyond test scores) and if the uses are also broad (ex: to understand variation in practice as well as to measure student learning), then those on the front line may have fewer concerns and may see this sharing of information about assessments as part of organizational learning and professional development.

Guard Against Unreasonable Expectations

Many people looking at assessments and data as levers of change in education hold very logical, simple, and often unworkable impressions of what these tools can do.  Assessments are critical and important, but even state-of-the-art assessments are often weak when it comes to making change.  They must be used in concert with coherent commitments and organizational approaches to ensuring that all students learn.  As MCPS moves forward with reviewing its assessment strategies, policymakers should be realistic about the limits of this part of the system.  There are no silver bullets in education, and assessments are a problematic area.

 

Network with other districts

MCPS is often a few steps ahead of the rest of the nation, but the work it is doing to come up with better ways to use assessments is not unique to this county.  Almost all districts should be looking into this, and many large districts in Maryland are as well.  I recommend that MCPS connect with these other districts and perhaps develop a common conversation group to share ideas, concerns, and best practices.

Three Questions

This effort by the Board and administration has been one of the more important initiatives in my view, because assessments are deeply connected to just about everything at the core of school system functions.  It doesn’t involve buses and building services, but almost anything involving teachers and school leaders is increasingly related to assessment.  I offer three questions.

How will this be sustained?

Once the school year begins and the task set by the state superintendent is completed, how will this reflection be sustained?  How will this avoid becoming another historical exercise and instead become part of a continuous improvement process?  As groundbreaking as this kind of review is (well begun with its broad involvement of stakeholders), the next steps may be more complicated and require leadership because they will involve making changes.  As Mr. Barclay noted on June 22, MCPS is without a full-time permanent superintendent.  How will MCPS continue while searching for a new leader, and how will the next superintendent carry this initiative into the next school year and beyond?  It would seem the Board will need to continue to focus on this topic and maintain an engaged dialogue, because the topics will remain complex and connected to all the important work MCPS does.  It is possible that the Board itself will need to take on some of the work that might otherwise be part of the job of a superintendent.

Where Are the Assessment Professionals?

The review that was presented on June 22 showed broad stakeholder communication, including the union and various formal and informal structures.  This is important not only because the success of any effort this large will benefit from buy-in, but also because these stakeholders have important perspectives that can really inform the system’s leadership.  At the same time, there are others who specialize in assessments, including assessments like the PARCC and MSA, who could help MCPS understand the implications of possible decisions.  The University of Maryland and George Washington University are both close by and include experts in measurement and assessment who might be able to inform the planning for next steps.  How is MCPS taking advantage of these local experts in its review process?

Where are Technology and Shared Accountability Offices?

One of the features of most large school systems (like many companies of old) is that they tend to be walled off into different groups, or silos, with stronger internal communication than external.  This can lead to inefficiencies and a lack of coherence as each unit pursues its own agenda.  Strong top-down leadership can get different groups marching to the same beat, but this often comes with downsides, especially in socially oriented organizations like school systems.

The June 22 briefing was led by the Chief Academic Officer and the Associate Superintendent for Curriculum and Instruction.  This makes sense because this part of the system is responsible for instructional tools and planning.  Treating these matters as primarily technical might not put the initiative on as strong an instructionally relevant footing, which is needed.  At the same time, the Office of Shared Accountability, which manages the testing process and has many of the technical experts with the statistical knowledge important for interpreting assessments, should also be involved in this process.  Similarly, the Chief Technology Officer could have an important role in shaping new solutions, as many of the newer kinds of assessments that can provide more relevant and actionable information run on platforms the technology office will manage.  At some point, these two areas of MCPS will need to be principally engaged in any future discussions of MCPS assessments.

An Important Opportunity to Learn From

The MCPS BOE is right to follow through on this discussion and to ask staff to continue to make it a priority.  It is important for MCPS’ future.  Like just about every other US district today, MCPS has a patchwork of assessment systems, which the senior staff who presented this briefing appropriately classify as (A) externally required and (B) those the district has some control over.  They also correctly describe the important interrelationships between these data and related areas of practice such as communications, grading, scheduling, and aligning with state and federal policy initiatives.  MCPS is unlike many districts in the way it went about this review.  While the trigger for this review seems to have been the Maryland State Department of Education’s (MSDE) interest in lowering the testing burden on students and teachers, this review is an important opportunity to look at one of the more challenging and important aspects of the district’s operation.  School systems are run by a combination of procedure, intuition/professional judgment, and data.  This combination forms a kind of “nervous system” for a district, and assessments are often the most visible part of it.  Assessments are more than an end measure of student learning; they touch just about every aspect of the work that gets done in MCPS, from policy evaluation to teaching.

This kind of review has little precedent in school reform.  Very few districts like MCPS have undertaken this kind of evaluation, so there are few examples of how to do it.  MCPS’s approach has included a combination of information gathering from inside and outside the system as staff have begun to frame several different approaches to reducing testing to free up instructional time.  It is comparatively easy to look professional in a routine and practiced task, but much more challenging when the task is new.  My view is that the MCPS staff have taken strong steps into this new area and, in the process, shown a mature process of developing a set of alternatives by iterating through input and analysis.  Most school systems do not have the capacity for this kind of exploratory work.  Other districts embarking on a similar review would have a lot to learn by watching how MCPS has approached this task.

Next Step for the MCPS Board of Ed

by Stephanie Halloran and Mark Simon

While we are surprised and perplexed by the Board’s decision not to renew Superintendent Josh Starr’s contract, we can understand the benefits of not discussing personnel matters in public, and of focusing on next steps, versus dwelling on the process, no matter how clumsy.

At the same time, moving forward in a constructive manner requires better understanding of the key concerns that drove the decision. Washington Post reporter Donna St. George wrote regarding Dr. Starr’s departure, “Superintendent Joshua P. Starr will step down in two weeks, abruptly ending his tenure after failing to convince a majority of the school board that he was leading Maryland’s largest school system in the right direction.” Really?

Yesterday, as teachers gathered for MCEA’s monthly Rep Assembly, one after another spoke of their shock, surprise, and disappointment. There was a long back-and-forth between teachers and MCEA president Doug Prouty in which the recurring sentiment was “we don’t want to hear about a ‘different direction.’ We felt pretty good about the direction we were going. Dr. Starr was selected based on an extensive process including focus groups of teachers and parents. We felt we had been heard.” Today’s follow-up Washington Post story quotes the PTA and NAACP expressing similar surprise and displeasure at the loss of a leader whose vision they strongly support.

To ensure credibility going forward, BOE members must not only respond to these concerns, but ensure that their differences, if any, with Dr. Starr’s direction are clearly laid out.

If the four BOE members didn’t like the direction Starr was leading MCPS, it’s a bit confounding that three members of the BOE campaigned for election without mentioning differences over direction or any intention to fire the superintendent. The one who seemed most surprised when the new Board took office and voted for his departure in January was Dr. Starr. None of the four who voted not to renew his contract ever put forward any differences with his agenda or his vision, or objected to his bold battle with the US Department of Education and MSDE over standardized testing and teacher evaluation.

The four members of the Board owe the public answers to some questions and “personnel matter” must not be an excuse for those BOE members to duck answering those questions. Before the BOE goes on with its business, the four members need to clarify their substantive concerns. First, because as elected officials, they are accountable to the public, most of whom seem taken aback by this seemingly sudden move. Second, we deserve to know their vision for the school system and whether it is different than Dr. Starr’s, in order to support the selection process for the next superintendent.

Did they let Dr. Starr go because he was not leading the school system in the right direction? If so, with which aspects of his direction did they disagree? Was it his focus on the social-emotional life of students and teachers in schools? Was it his commitment to put standardized testing in perspective and not let it drive teacher or principal evaluation? Was it that the achievement gap based on race was narrowing but not meeting the research-based policy goals that the BOE had set? Or was it simply something about his style?

Here’s an interview with Starr on the day of his departure:   http://n.pr/1yrpu7g  What about this man’s vision is not exactly what Montgomery County wants?

It looks doubtful that the BOE is going to schedule that clarification conversation as part of their regular business meetings. But perhaps they will re-think and see the value of doing that. If not, we would advise Council Education Committee Chair Craig Rice and Committee Member Marc Elrich to hold a meeting of the education committee and respectfully request that members of the BOE attend, to answer questions that need to be clarified. It is an appropriate oversight role for the Council to fix a process that has left all of us confused. And it is critical to setting us on the right path as we embark on a complicated search for the right superintendent.

Sudden Departure of Dr. Starr

The sudden departure of Dr. Joshua Starr from MCPS has been a difficult process to watch for many in the MoCoEdBlog community for several reasons.  One of the most significant of them is that as a collection of education professionals we considered Josh Starr a colleague and have appreciated his openness and his ideas, which many of us found to be well reasoned and good for the educational system.  We are a collection of individuals with different backgrounds and views and we do not speak with one voice, but rather have come together to create a forum for discussion about serious issues affecting Montgomery County’s educational challenges and opportunities.  We have refrained from commenting during the Board’s deliberations.  We are not elected officials with any official role, but rather professionals who come together to share our knowledge of MCPS and the national educational world.  We were not close to the discussions between Dr. Starr and the Board and we don’t know what they know.  We do, however, have a lot of questions.

 

In the coming weeks, we plan to discuss some issues we believe are important for the Montgomery County education community, including the process by which Dr. Starr was let go and what kind of superintendent MCPS should have in the future.  Having exercised this authority, the Board now can and should take responsibility for setting a clear direction for the system.  The days of having the superintendent dictate policy to the Board (Weast would say openly, “The Board works for me”) are over.  Now our Board of Education needs to articulate a vision and explain its thinking.  We at MoCoEdBlog are ready to support whoever is responsible for leading our schools.  It is with great sadness that we note it will not be Josh Starr, whose open style, big ideas, and vision for change we admired.

Burnie Bond
Stephanie Halloran
Joe Hawkins
Rick Kahlenberg
Phil Piety
Elena Silva
Mark Simon
Elaine Weiss

 

Jerry and Jody’s Kids: Where are They Now?

This past October, the Montgomery County Public Schools (MCPS) renamed Broad Acres Elementary School after its former principal Jody Leleck, who died of cancer in 2012.  The school is now named the Jody Leleck Elementary School at Broad Acres.[i]  Leleck is credited with raising the academic achievement levels of Broad Acres students over a five-year period between 1999 and 2004.[ii]

Leleck’s friends agree that renaming Broad Acres after her was a noble way to honor her legacy.  I did not know Leleck, and yet on the surface, I also agree with the renaming.  But recently, when I posted a comment about the renaming on a Leleck friend’s Facebook page, asking, “Where are the Leleck kids now?,” you would have thought I had called for a Spanish Inquisition into the Leleck legacy.

Perhaps the friend thought I was questioning the Leleck Broad Acres legacy.  Honestly, all I was asking was whether there was any real interest in knowing what became of the Leleck kids.  After all, isn’t knowing how these students turned out later in middle school or high school, or later as adults, the point of having a legacy?

 

The origins of Jerry’s Kids

As Leleck was settling into her new chores at Broad Acres Elementary School, Jerry Weast was settling into his first school year as the new MCPS superintendent.  During that first year, Weast kicked off a major early childhood initiative that poured additional resources into elementary schools located mostly in low-income Montgomery County neighborhoods, including the Broad Acres community.  The Weast initiative, officially referred to as the Early Success Performance Plan (the Plan), included “ … a series of interwoven early education initiatives, including reduced class size, full-day kindergarten, revised curriculum, assessments aligned with curriculum, professional development, and increased family/school communication.”[iii]  For the first year alone, experts documented the initial price tag of the Plan at $100 million.[iv]  MCPS was indeed investing a lot of resources into bettering the lives of its poor students.

 

MCPS students exposed to the Plan soon became known as Jerry’s kids.  One might catch a Board of Education meeting on cable TV, and when discussing the Plan and how it was moving along, it was common to hear a Board member ask, “How are Jerry’s kids doing?”  Researchers focused on documenting the Plan, and its impacts, also referred to the various cohorts of students exposed to the Plan as Jerry’s kids.[v]

 

Over the years, I’ve raised the same question about the Weast legacy as I did earlier about the Leleck legacy.  Frequently, I have asked friends, experts, and researchers familiar with MCPS and the Plan, “Where are Jerry’s kids?”

 

That first cohort of Jerry’s kids should be college sophomores now

 

One can easily do the math.  The first cohort of Jerry’s kids entered MCPS as kindergarteners during the 1999-2000 school year.  And so, 15 years later, assuming college was a goal (and that is a typical goal for most MCPS graduates), the vast majority of Jerry’s kids (Class of 2013) should be college sophomores.  The second cohort should be college freshmen (Class of 2014), the third should be MCPS high school seniors (Class of 2015), and so on.  A typical MCPS class is roughly 10,000 students, and a typical cohort of Jerry’s kids is roughly 1,000 students.[vi]

 

Keep in mind that Jerry’s kids (various cohorts) are narrowly defined here as those MCPS students, especially the students of color, exposed to the Plan starting in 1999, although there have been attempts to broaden the group beyond these specific cohorts.  For example, in 2010, a Pew report[vii] made the claim that the Plan produced impacts across all grade levels, and that the Plan also resulted in more MCPS seniors enrolling in college.  But in 2010, that very first cohort of Jerry’s kids sat in the 9th grade.  The notion that the Plan had already impacted college enrollments for MCPS students exposed to it was logically not possible.  If college enrollments were increasing, and they were (this is well documented by MCPS), that outcome was not caused by the Plan.

 

Unfortunately Leading for Equity does not tell us where Jerry’s kids are

 

To date, there have been a few organized attempts to document the Plan and its impacts.[viii]  The best, and perhaps the most exhaustive attempt at documenting the Plan, is the 2009 book Leading for Equity: The Pursuit of Excellence in Montgomery County Public Schools.  For free, one can find a great deal of the book’s content at: http://www.montgomeryschoolsmd.org/leadingforequity/ .

 

Leading for Equity documents the various moving parts under the Plan, as well as how they fit together.  There is an entire chapter devoted to how MCPS unions partnered with MCPS management and the Board of Education to support and implement the Plan.  There is another chapter devoted solely to discussing the data systems MCPS put in place to monitor the Plan.  The book concludes with a chapter that prescribes a step-by-step blueprint for replicating the Plan outside of MCPS.  But what’s missing from Leading for Equity is any specific data that allow readers to figure out what happened to Jerry’s kids.

 

There are a few sweeping bulleted data points in an early chapter of Leading for Equity; however, it is clear that what’s presented does not at all represent data tied to specific cohorts of Jerry’s kids.  For example, one data point is: “In 1999, 36 percent of all students enrolled in algebra by eighth grade, including 17 percent of African American students and 14 percent of Hispanic students. In 2008, 60 percent of all students, including 38 percent of African American and 39 percent of Hispanic students, enrolled in algebra by eighth grade.”

 

This change in a single data point is impressive.  But unfortunately, the first cohort of Jerry’s kids were 7th graders during the 2007-08 school year (2008), and so the change reported says little about the impact of the Plan.  Even if the first cohort of Jerry’s kids had been 8th graders, this Algebra data point is not how one would show and track change if the goal was to demonstrate the Plan’s impact.  What we would want to see are comparisons of Algebra enrollment rates between cohorts of non-Jerry’s kids and Jerry’s kids.  A table such as the one below might be sufficient.  Such a table would trend Algebra enrollment rates across various senior classes from 2006 (Class of 2010) through 2011 (Class of 2015).  If the Plan was impacting outcomes in a positive way, one would expect to see steady enrollment rate increases across the classes.  And further, we really must see tables that isolate and focus just on Jerry’s kids.  Simply throwing out a data point that says, “Black kids increased AP enrollments,” says practically nothing specific about Jerry’s kids.

 

Note to readers:  The table below is a suggested template.  The table is intentionally blank.

 

8th grade Algebra enrollment rate (% enrolled)

| Cohort             | Jerry’s kids | Total | Whites | Asians | Blacks | Hispanics |
|--------------------|--------------|-------|--------|--------|--------|-----------|
| Class of 2010 (06) | no           |       |        |        |        |           |
| Class of 2011 (07) | no           |       |        |        |        |           |
| Class of 2012 (08) | no           |       |        |        |        |           |
| Class of 2013 (09) | yes          |       |        |        |        |           |
| Class of 2014 (10) | yes          |       |        |        |        |           |
| Class of 2015 (11) | yes          |       |        |        |        |           |

 

At the middle school level, one also could compare Jerry’s kids to kids who are demographically similar.  As a typical MCPS class snakes its way through the grades, we know that kids come and go.  By middle school, especially by the time 8th grade Algebra rolls around, we could probably muster up a fairly large group of kids who look like Jerry’s kids demographically but missed the benefits of the Plan because they entered MCPS at later grades.  These groups of Jerry-like kids make for ideal comparison groups.

 

Did Jerry’s kids really close the gap?

 

Jerry Weast retired from MCPS in 2011.  When he retired, most MCPS-watchers[ix] concluded that Jerry’s kids were wonder students who had not only narrowed significant long-standing achievement gaps (test scores, graduation rates) but had closed them.  In fact, in early 2010, Weast, testifying before a U.S. congressional subcommittee, stated with clarity that MCPS had closed its achievement gap.  He told the subcommittee: “The district (MCPS) is proud of its accomplishments during the last decade in improving the level of student achievement and closing the gap between white and Asian American students and African American and Hispanic students.”[x]

 

In fact, MCPS discovered shortly after Weast’s departure that gaps really had not closed at all.  MCPS’s new superintendent, Joshua Starr, found himself embarking on a new journey to accomplish what Weast had not accomplished—narrow and close significant long-standing achievement gaps.[xi]

 

And so the work of raising the achievement levels of non-whites, mostly black and Latino students, remains an MCPS priority.  And it should remain a priority, because the work is not done.  But what amazingly has never been a priority is answering the simple question: Where are Jerry’s kids?  Perhaps Jerry’s kids did close some gaps.  Jerry’s kids are generally compared to MCPS whites (and Asians), with those comparisons revealing gaps never closed.  But what if we compared Jerry’s kids only to their peers, those who never benefited from the Plan’s investments but are demographically similar?  Shouldn’t we want to know what such comparisons reveal?  Shouldn’t we know if Jerry’s kids outperformed their peers?

 

A long list of questions without answers

 

Let me be more specific here about the things we might want to know.  Fifteen to sixteen years after implementing the Plan, and after spending more money than most public school districts can ever hope to spend, the public has no answers to a long list of critically important questions about Jerry’s kids.  Questions such as:

 

  • How many of Jerry’s kids avoided special education?
  • How many of Jerry’s kids exited ESOL language services earlier than expected?
  • How many of Jerry’s kids avoided being suspended?
  • How many of Jerry’s kids exited the 5th grade reading on grade level?
  • How many of Jerry’s kids repeated a grade; never repeated a grade?
  • How many of Jerry’s kids departed MCPS as they snaked their way through MCPS?
  • How many of Jerry’s kids entered high school with Algebra 1 completed in a prior grade?
  • How many of Jerry’s kids were still enrolled in MCPS by the time they hit high school?
  • How many of Jerry’s kids dropped out of high school?
  • How many of Jerry’s kids became involved and active high school students; participated in extracurricular activities?
  • How many of Jerry’s kids enrolled in an Advanced Placement course?
  • How many of Jerry’s kids enrolled in college?

 

Take, for example, the special education question.  The Plan and its investments might have resulted in Jerry’s kids being better prepared academically as they advanced through the MCPS grades.  One could hypothesize, for example, that the Plan resulted in more of Jerry’s kids reading better and performing math at higher skill levels.  These academic gains would then have had a serious impact on reducing the need for Jerry’s kids to receive special education services.  In turn, these reductions in special education placements would have a positive impact on MCPS budget expenditures, perhaps demonstrating that the Plan eventually provides special education cost savings—a win-win, right?  And yet, 15 years later, we find ourselves not knowing much at all about what the Plan impacted.

 

Is it too late to track Jerry’s kids?

 

In a perfect world, MCPS planners would have put resources on the table upfront to track Jerry’s kids from their first day of kindergarten through high school and into adulthood.  MCPS has done some tracking of Jerry’s kids through the primary grades, but on the MCPS Office of Shared Accountability’s website one cannot find any research reports tracking the various cohorts of Jerry’s kids through the middle school grades, into high school, and beyond.  As noted above, the first cohort of Jerry’s kids are college sophomores this year, assuming college was a goal.

 

The suggestion to track Jerry’s kids from their first day of kindergarten, through the grades, and then beyond high school graduation into adulthood is what research circles typically call longitudinal research.  A well-executed longitudinal study of Jerry’s kids would document and track all of Jerry’s kids, even those exiting MCPS across the grades (e.g., those who might depart MCPS because their family moves out of state).  At the high school level, tracking also would include following those who dropped out or those who exited high school successfully but decided not to attend college (e.g., even those who entered the military).  To gauge the true impacts of the Plan requires that we know what the outcomes were for all of Jerry’s kids.

 

Longitudinal research studies are fairly common and often are funded by the federal government.  Some studies are extremely ambitious in scope and nature.  Take, for example, the National Children’s Study, a federally funded longitudinal study managed by the National Institutes of Health (NIH).  Eventually, it will end up following 100,000 U.S. children from before birth through age 21.  The goal of the research is to determine what impacts children’s health and well-being.[xii]

 

The federal government has a fairly solid history of tracking the development of young children. For example, the U.S. Department of Education’s Early Childhood Longitudinal Study program includes three longitudinal studies that examine child development, school readiness, and early school experiences.[xiii]  Over time, these studies have generated a wealth of knowledge about what impacts school outcomes in the elementary grades.  We know, for example, that mothers who consistently read to their young children end up with better and more developed readers.

 

One also finds universities engaged in longitudinal studies.  Princeton and Columbia Universities are partnering to conduct the Fragile Families and Child Wellbeing Study.[xiv]  This longitudinal study is following a cohort of almost 5,000 children, most of whom are born to unmarried parents.

 

Perhaps one of the best-known longitudinal studies is the HighScope Perry Preschool Study.[xv]  This study has been tracking a cohort of youngsters who had Head Start-like preschool experiences in the early 1960s.  The study successfully has tracked 97% of the original cohort members through adulthood.  There is a fascinating video available that takes a brief look at outcomes for the cohort members who reached age 40.  Click here to view the video:  http://www.highscope.org/Content.asp?ContentId=611 .  The outcomes are extremely impressive, and they underscore the long-term and lasting impacts of exposing poor youngsters to high-quality preschool experiences.  Shouldn’t we want the same for Jerry’s kids?  And how about Jody’s kids?

 

And so, perhaps back to the future is in order

 

In 1999, a wise MCPS would have pre-planned a longitudinal study to track Jerry’s kids.  Going back in time now and recreating complete student records for a cohort of Jerry’s kids might prove to be cost prohibitive, especially tracking down the records of Jerry’s kids who left MCPS.  Tracking is fairly inexpensive when it is planned, but finding hundreds of Jerry’s kids who are now “to the winds” would be an expensive endeavor.  It always is less expensive to gather data moving forward than it is to gather data moving backward.

 

Going back, however, is possible, and it has been successfully executed by MCPS in the past.  In 1985, an MCPS researcher conducted a study of Head Start graduates and uncovered positive, long-lasting impacts.[xvi]  This 1985 study was a historical review of MCPS academic records.  So, perhaps back to the future is once again in order for MCPS.  MCPS could pick the Class of 2015, the third official Jerry’s kids cohort, and backtrack to uncover everything we need to determine where Jerry’s kids are, including those who departed MCPS early.  And while MCPS researchers are at it, they also could figure out what happened to Jody’s kids.  Not knowing these legacies is not just a shame but also an embarrassing misstep that prevents us from knowing if the Plan really worked.

 

Postscript comment on Jody’s kids versus Jerry’s kids

 

I’m sure some will argue that Jody’s kids and Jerry’s kids are not part of the same conversation.  I completely disagree, and believe they are critically linked.  Jody’s kids were exposed to additional resources beyond Weast’s original Plan.[xvii]  Nonetheless, much of what took place at Broad Acres Elementary School was the Plan.  Regardless, I think it would be a fascinating undertaking (study) to figure out the answers to both questions: where are Jody’s kids, and where are Jerry’s kids?  And I would hypothesize that Jody’s kids ought to be achieving at levels slightly above Jerry’s kids.  In fact, one way to view Jody’s kids is to simply see them as kids exposed to the Plan on steroids.

[i]http://www.washingtonpost.com/local/education/the-legacy-of-a-school-transformed-name-changed-to-honor-educator-jody-leleck/2014/10/11/526bdaf2-42b5-11e4-b47c-f5889e061e5f_story.html .

 

[ii]It is important to point out that in some documents, the time period attributed to the Leleck years is 1999 to 2004. In other documents, the time period is shorter and documents successes, for example, over a two-year period, 2003-04. See, for example, http://www.mooneyinstitute.org/sites/default/files/ITUL%20BAES%20CaseStudy(15)%203-1-10.pdf , written by Mark Simon.

 

[iii]http://www.montgomeryschoolsmd.org/info/CTBS2003/PDF/2003CTBSLongitudinalStudy.pdf,  p.1.

 

[iv]The authors of the 2009 book Leading for Equity frequently cite the Plan’s first year price tag as $100 million.

 

[v]http://www.montgomeryschoolsmd.org/leadingforequity/pdf/HarvardCase-DifferientiatedTreatment.pdf

 

[vi]http://www.montgomeryschoolsmd.org/info/CTBS2003/PDF/2003CTBSLongitudinalStudy.pdf,  p.13.

 

[vii]http://www.pewtrusts.org/en/research-and-analysis/reports/0001/01/01/lessons-in-early-learning

 

[viii]One can find MCPS reports at this website: http://sharedaccountability.mcpsmd.org/reports/list.php .

 

[ix]http://www.bethesdamagazine.com/Bethesda-Magazine/March-April-2011/The-Last-Lessons-of-Jerry-Weast/.

 

[x]Testimony of Dr. Jerry D. Weast, Superintendent of Schools, Montgomery County Public Schools.

Hearing of the United States Senate Appropriations Committee, Subcommittee on Labor, Health and Human Services, and Education, and Related Agencies. January 21, 2010.

 

[xi]http://www.montgomeryschoolsmd.org/uploadedFiles/departments/superintendent/transitionalplan/SuperintendentsTransitionTeamReport.pdf

 

[xii]https://www.nationalchildrensstudy.gov/Pages/default.aspx

 

[xiii]http://nces.ed.gov/ecls/index.asp

 

[xiv]http://www.fragilefamilies.princeton.edu/index.asp

 

[xv]http://www.highscope.org/content.asp?contentid=219

 

[xvi]http://files.eric.ed.gov/fulltext/ED263977.pdf

 

[xvii]http://www.mooneyinstitute.org/sites/default/files/ITUL%20BAES%20CaseStudy(15)%203-1-10.pdf

Diversifying the MCPS Teaching Staff

The Montgomery County Council’s Office of Legislative Oversight (OLO) recently released a report that showed a “demographic mismatch” between the Montgomery County Public Schools (MCPS) students and teachers. According to the report, which compared the race and ethnicity of students to faculty, the county’s teaching staff is disproportionately white relative to its student body. The MCPS student body is 33 percent white, 27 percent Latino, 21 percent Black, and 14 percent Asian, while the MCPS teaching staff is 76 percent white, 5 percent Latino, 13 percent Black and 5 percent Asian.

This “mismatch” is not unique to MCPS. A recent state-by-state analysis of student-teacher demographics by the Center for American Progress (CAP) found that there is a significant “diversity gap” between teachers and students in every state in the nation. Using a “parity index” similar to the one used in the OLO report, the study calculated how close each state was to a demographic match (zero equaling a perfect teacher-student match and 100 equaling a perfect mismatch). California had the biggest mismatch, with an index of 44, but Maryland wasn’t far behind with an index of 40.

Using the CAP index, which subtracts the percentage of teachers of color from the percentage of students of color, MCPS would be a 43. Notably, the states that are furthest from parity are also among the most racially diverse (e.g., California), compared to racially homogeneous Vermont, which had the smallest mismatch with an index of 4. States like California and Maryland, and districts like MCPS, have Latino, African-American, Asian-American, and other student populations that are fast outpacing their relatively static and disproportionately white teacher populations; they have a lot of work to do to balance out the teacher-student demographics. The parity index used by OLO differs in its calculation and level of detail (by subgroup, for example), but it offers the same conclusion: the teaching staff does not reflect the great diversity of the students, and the county should do more to diversify its teaching ranks.
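As a quick sanity check of that figure, here is the arithmetic behind a CAP-style index applied to the MCPS numbers quoted above (the percentages come from the OLO report as cited; the calculation itself is just my back-of-the-envelope sketch):

```python
# Back-of-the-envelope CAP-style diversity index for MCPS, using the
# percentages quoted earlier in this post. Treats "of color" as simply
# "not white," which is an approximation.

students_white = 33   # percent of MCPS students who are white
teachers_white = 76   # percent of MCPS teachers who are white

students_of_color = 100 - students_white   # 67 percent
teachers_of_color = 100 - teachers_white   # 24 percent

index = students_of_color - teachers_of_color
print(index)   # 43, matching the figure cited above
```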

Waiting for the teaching staff to “catch up” to the students is not the answer. While racial/ethnic diversity in teaching has increased over the last couple of decades, according to a recent analysis of federal Schools and Staffing Survey data, this increase pales in comparison to the explosive growth of student diversity. In fact, the CAP study found that the diversity gap had grown worse since 2011. Unfortunately, the recent OLO study doesn’t show any trend data that would reveal how the MCPS teaching population has changed over time. Curious, I decided to look at some rough numbers from my own alma mater, Springbrook High School, to see how things might have changed in twenty-five years. Back in 1989, roughly 55 percent of Blue Devils were white, 30 percent African American, 9 percent Asian, and 6 percent Latino. Today, the demographics have shifted, but it remains an incredibly diverse mix of students, with 42 percent African American, 34 percent Latino, 12 percent Asian, and 9 percent white. In terms of race and ethnicity, the student body is more of a melting pot than it was twenty-five years ago. This is heartening to know, since I believe that the experience of going to a school like Springbrook forced me and my peers to confront, understand, and value racial, ethnic, and cultural diversity more than most.

Interestingly, I attribute this experience to the diversity of the students, not the faculty, which is far more diverse now than it was in 1989. Then, Springbrook’s faculty was only about 5 percent “non-white” (a handful of African American, Hispanic and Asian-American teachers). Today, the number of non-white teachers has jumped to 40 percent. Still, in terms of parity, Springbrook doesn’t look good, given that 90+ percent of its students are not white.

Superintendent Starr has said he plans to aggressively tackle the issue of staff diversity but it’s not yet clear how that will happen. Here are a few ideas:

  1. Diversify for need’s sake, not diversity’s sake. We tend to include everything under the tent of diversity, which makes defining and executing a clear recruitment and development strategy nearly impossible. The goal is not merely to balance the racial and ethnic make-up of the student and staff populations (step into an MCPS school and you’ll see how impossible this would even be). It’s to diversify the staff so we can better address the needs of MCPS students. So, do we want more African American teachers to benefit the large number of African American students who are disproportionately struggling academically? Should this recruitment goal target certain areas (the MCPS study included a helpful breakdown by MCPS clusters)? Specifically, do we want more African American male teachers, since African American boys in particular are over-represented in discipline counts and in placement for special education? (Gender, notably, was not part of the OLO study but is a huge consideration given that teaching is becoming more, not less, female-dominated). Relatedly, Dr. Starr has said he wants to recruit for “cultural competency” and not necessarily race and ethnicity. “Cultural competence” is more difficult to measure but, unlike race or ethnicity, it is also a skill set that can be gained over time (e.g., knowledge of different cultures and customs, the ability to effectively teach students from a variety of cultures). While we may want it all—a well-prepared teaching staff that is culturally competent, balanced by race/ethnicity and gender and whatever else—it is careless and presumptuous to tackle this as one issue. Some of the best bilingual teachers are white and native English speakers, and there are plenty of Latino and Asian and African-Americans who only speak English and are not “culturally competent” just because of their race/ethnicity.

 

  2. Prioritize language skills. One important but unmet goal of the OLO study was to explore gaps in linguistic diversity between staff and students. Lacking data on staff language skills, the study wasn’t able to conclude much on this. But improving the language skills of MCPS teaching staff should be among the most important goals of the system, especially given the rapidly increasing non-English speaking immigrant communities in MCPS. Collecting and tracking data on staff language skills and recruiting for bilingual and multi-lingual teachers are goals that would be widely supported not only by non-English speaking communities but also by the English-speaking population, which continues to actively lobby for more language immersion school options in the county. Partnerships like those that MCPS has recently formed with institutions like the dual-language Ana G. Mendez University System would ostensibly bring many more bilingual teachers into the system. The linguistic diversity of Montgomery County is a strength, and building on this should be a priority.

 

  3. Grow our own. Another long-term strategy for MCPS, and perhaps its best, is to develop our own student-to-teacher pipeline. If MCPS can clarify what skills and characteristics it really needs, its graduates should be able to return to its classrooms as teachers. This would require the availability of more high-quality teacher education programs in the state, which is not within the district’s control (but is certainly something MCPS and the broader community can and should push for). But there’s no doubt that a stronger pipeline can be built, and beyond a few partnerships or a scholarship here and there. Why not build a teacher residency program like its neighbors PG and DC, and others across the nation? Or consider the model of Educators Rising, a spin-off of Future Teachers of America that is starting with high school classes (Intro to Teaching) aimed at engaging and training young students for teaching careers. Given that more than 60 percent of teachers in the nation teach within twenty miles of where they went to high school, this seems like a good bet for MCPS. We have the population for a strong pipeline of high-quality teachers that reflect the demographics of the county. We just need to cultivate it.

 

 

Which Washington-area system does best at funding its neediest schools?

(re-posted from The Fordham Institute)

In the era of No Child Left Behind—and at a time of growing concern about income inequality—virtually every school system in the country claims to be working to narrow its student achievement gaps. But are they putting their money where their mouth is?

The data in our brand new D.C. Metro Area School Spending Explorer website allow us to answer this question for school districts inside the Beltway. Specifically, we can determine whether and to what degree they are spending additional dollars on their neediest schools.

To be sure, ever since the Coleman Report, it’s been hard to find a direct relationship between school spending and educational outcomes. Still, basic fairness requires that systems spend at least as much on educating poor students as affluent ones, and investments that might make a difference in narrowing achievement gaps (such as hiring more effective, experienced teachers and providing intensive tutoring to struggling students) do require big bucks.

There are lots of wonky ways to compute the fairness of education spending, but we’re going to use a measure that makes sense to us. Namely: How much extra does a district spend on each low-income student a school serves? Compared to what districts spend on behalf of non-poor students? Ten percent? Twenty percent? Fifty percent?

Read the methodology section below for details on how we got to these numbers (they are estimates, and apply only to elementary schools), but here are our conclusions.

| School System | Extra spending for low-income students | Over a floor of… |
|---|---|---|
| Arlington County Public Schools | 80.5% | $11,817 |
| Fairfax County Public Schools | 34.1% | $10,669 |
| Montgomery County Public Schools | 31.7% | $11,464 |
| District of Columbia Public Schools | 21.2% | $13,514 |
| Alexandria City Public Schools | 14.4% | $13,120 |
| D.C. Public Charter Schools | 5.9% | $15,243 |
| Prince George’s County Public Schools | 1.9% | $10,385 |

For example, in Arlington County, the district spends close to $12,000 per student at its low-poverty schools (those with very few poor children). But it spends north of $21,000, or 81 percent more, for each student who is eligible for a free or reduced price lunch—significantly boosting the resources of its highest-poverty schools.
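The arithmetic behind that figure, using the Arlington row of the table above (my calculation, rounded):

```python
# Arlington's numbers from the table above: the floor plus the extra share.
floor = 11_817          # per-pupil spending at a (near) zero-poverty school
extra_share = 0.805     # 80.5% extra per FRPL-eligible student

per_frpl_student = floor * (1 + extra_share)
print(f"${per_frpl_student:,.0f}")   # ~$21,330, i.e. "north of $21,000"
```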

Let us be clear that school systems aren’t necessarily achieving these spending outcomes by design. As we explain in the “Drivers of School Spending” section of our D.C. Metro Area School Spending Explorer website, they may not even have been aware of these differences. That’s because individual schools in a given district don’t actually have “budgets” of their own; they are generally given a certain number of staff positions (driven by the number of students they serve) and might be eligible for extra programs or resources depending on need.

Nor is it likely that poverty rates are the only things driving these differences. Larger schools, for example, tend to spend less per-pupil than smaller schools (costs for staff like nurses can be spread over more students); districts might also be providing extra resources to schools with large numbers of special education students or English language learners. So we know that our analysis is oversimplifying what’s causing these patterns.

With those caveats in mind, what to make of these results? The outliers are fascinating. Arlington—with its sky-high tax base and gentrifying population—definitely goes the distance for its high-poverty schools. On the other hand, poverty-stricken Prince George’s County appears to be doing practically nothing to spend what little money it has on its toughest schools. (It makes us wonder how it meets federal “supplement, not supplant” requirements.)

And these findings are more than a little embarrassing for Montgomery County, which prides itself on its commitment to “social justice,” and has an explicit policy of sending extra resources to its highest poverty schools. Yet it is bested by Fairfax County (by a little) and Arlington (by a lot).

Per-pupil spending on high-poverty schools

Let’s look at this question through another lens: Specifically, the perspective of low-income students and parents in the Washington area. What they experience in school is not relative spending but real dollars: How much money does a particular school have to devote to teacher salaries, extra programs, etc.?

So: How much do high-poverty schools in the Washington area spend per pupil, and how does that vary by school system? (Again, we only used data for elementary schools.)

Here’s what we found:


| School System | Average spending for high-poverty schools* | Range of spending for high-poverty schools* | Number of high-poverty schools* |
|---|---|---|---|
| Arlington County | $18,216 | $17,604 – 18,827 | 2 |
| D.C. Charter Schools | $16,136 | $13,145 – 19,847 | 18 |
| Alexandria City | $14,501 | $12,734 – 17,272 | 3 |
| D.C. Public Schools | $14,497 | $13,095 – 16,391 | 10 |
| Fairfax County | $13,821 | $12,225 – 17,548 | 7 |
| Montgomery County | $13,613 | $11,862 – 15,698 | 10 |
| Prince George’s County | $10,607 | $7,981 – 16,493 | 50 |

* 75% or more Free or Reduced Price Lunch enrollment, primary schools only (i.e., no K-8 schools included)

Arlington again earns plaudits for its generosity towards its high-poverty schools, though by our count there are only two of them. High-poverty charter schools in Washington are well funded too, though it’s important to note that they tend to be extremely high-poverty; more than two-thirds of the eighteen charter schools in our analysis top the 85 percent poverty mark. To the extent that low-income students bring extra resources along with them (including federal Title I dollars), the results for Washington’s charter schools make sense. (And note: These numbers are for operational costs only; they don’t include facilities funding, which is where DC’s charters are at a huge funding disadvantage compared to DCPS.)

Note the numbers (again) for Fairfax and Montgomery County. If Superintendent Josh Starr is an “equity warrior,” what does that make the folks across the river?

The big story here, though, is Prince George’s County and its shockingly low spending for its fifty (!) high-poverty elementary schools. The averages are bad enough: spending roughly 27 percent lower than in DCPS high-poverty schools and about 22 percent less than Montgomery County spends on similar schools. But looking at specific schools makes the picture even more devastating.

Consider District Heights Elementary, which spends just $7,981 per student, although 77 percent of its pupils qualify for subsidized lunches. Compare that to Moten Elementary in the District, which spends $14,723 for each of its students (76 percent eligible for a free or reduced price lunch)—or almost twice as much. The schools are less than seven miles apart.

Therefore, if a low-income mom moves from the District of Columbia to Prince George’s County, and her child attends high-poverty public schools in both locales, her child’s new school will have dramatically lower-paid (and/or less experienced) teachers, fewer special programs, fewer specialists, larger class sizes, or all of the above.

It’s hard not to conclude that Washington’s rapid gentrification—which is pushing many needy families from the District to Prince George’s County—is leading to a very inequitable outcome, at least in terms of school spending.

As Marguerite Roza has argued for years, school systems ought to live their values. If doubling down on the education of poor children is something these systems (and their residents) support, they need at least to know whether their dollars are reaching the neediest children. Now we know that some of the Washington-area school districts could be doing a whole lot more for their low-income students. And the state of Maryland almost certainly could and should be doing more for Prince George’s County. Who will act to fix these problems?

Methodology

For details on how we estimated the per-pupil spending of each school in the Washington, D.C. area, see the methodology section of our D.C. Metro Area School Spending Explorer website. Once we had those numbers, the next challenge was to understand the relationship between schools’ poverty rates and their spending. The first step was to estimate the “floor” of per-pupil expenditures (PPE) for each district, and then figure out how much extra each district spends on low-income students. Elementary, middle, and high schools tend to have dissimilar spending patterns, so we included only elementary schools when calculating estimates. (There are many more elementary schools than middle or high schools.)

To make our estimates for each district, we regressed school-level PPE against the percentage of students eligible for free or reduced price lunch (FRPL). The spending floor was derived from the result’s constant coefficient. The extra dollars allocated to low-income students were set equal to the FRPL coefficient. (More simply, we scatter-plotted FRPL (x-axis) and PPE (y-axis) for each district. We then calculated the lines of best fit: the y-intercept is the spending floor and the slope is extra spending.)

From there, it was a simple matter of dividing the extra spending by the spending floor to find the extra spent on low-income students. It’s a rough estimate, of course, since we didn’t include any controls and we assume a linear relationship. But setting aside Prince George’s County, Alexandria, and the D.C. Charters, FRPL confidence levels were greater than 99 percent. R-squared values were also large, with Montgomery County at the low end (.24) and Arlington at the high end (.85). Given the descriptive nature of this analysis, the lack of significance and the low R-squared values for the other three districts are not a problem: the numbers are low because none of the three has a strong pattern of progressive expenditures, school to school. With a coefficient of 192.6 and an r-squared of -.008, Prince George’s County’s pattern isn’t just weak; it’s nearly non-existent.
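
To make the mechanics concrete, here is a minimal sketch of the per-district regression described above. It is illustrative only: the data points are made up, and the use of scipy.stats.linregress is our choice for the sketch rather than the actual analysis code. It shows how the spending floor (intercept), the extra spending (slope), and the ratio of the two would be computed for a single district.

```python
# Minimal sketch (illustrative, not the actual analysis code) of the
# per-district regression described above: regress school-level
# per-pupil expenditure (PPE) on the share of FRPL-eligible students,
# then read the "spending floor" off the intercept and the
# "extra spending" off the slope.
from scipy import stats

# Hypothetical data for one district: each pair is
# (FRPL share, per-pupil spending) for one elementary school.
schools = [
    (0.30, 12_000),
    (0.55, 13_100),
    (0.75, 14_400),
    (0.90, 15_200),
]

frpl = [f for f, _ in schools]   # x-axis: FRPL share (0 to 1)
ppe = [p for _, p in schools]    # y-axis: per-pupil expenditure

result = stats.linregress(frpl, ppe)

spending_floor = result.intercept            # estimated PPE at 0% FRPL
extra_spending = result.slope                # extra PPE going from 0% to 100% FRPL
pct_extra = extra_spending / spending_floor  # extra spent on low-income students,
                                             # relative to the floor

print(f"floor = ${spending_floor:,.0f}")
print(f"extra at 100% FRPL = ${extra_spending:,.0f}")
print(f"extra relative to floor = {pct_extra:.0%}")
print(f"R-squared = {result.rvalue**2:.2f}, p-value = {result.pvalue:.3f}")
```

Run once per district, the intercept and slope from each fit correspond to the “spending floor” and “extra spending” estimates discussed above.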

For Coherence’s Sake: Defer Using the New PARCC Tests for HS Graduation

On October 7, MCPS sent a letter from Phil Kauffman, the president of its Board of Education, to the Maryland State Superintendent of Schools requesting that the state reconsider plans to use the new annual Partnership for Assessment of Readiness for College and Careers (PARCC) assessments as end-of-course exams for purposes of fulfilling high school graduation requirements.  For several years the Maryland State Department of Education (MSDE) has been preparing to use these new tests, developed by a consortium of states, as a replacement for the Maryland State Assessment (MSA) and High School Assessment (HSA) tests that have been used for the No Child Left Behind (NCLB) law’s accountability requirements.

These new PARCC tests hold much promise to improve the information available to schools.  They were developed to align with the new Common Core State Standards (CCSS) for literacy and math, and they use state-of-the-art technology for a computer-based testing experience.  They are also unproven, and given that they are likely more stringent than the MSA/HSA tests they replace, many students may not pass them, which would require many students to take a transitional class prior to graduating.  The state is considering a two-tier approach in which one cut score marks a student as “college and career ready” and a lower score allows a student to graduate.  This tiered approach has been advocated by MoCoEdBlog editorial member Rick Kahlenberg in a piece titled “Hold Students Accountable and Support Them.”

MCPS is requesting that MSDE delay the implementation of these requirements and enter into a discussion about how to move forward.  Kauffman’s letter posed several important questions:

“If a college-ready cut score differs from the graduation cut score, what is the most meaningful indicator for institutions of higher education or employers? What messages do tiered cut scores send to students? Maryland now requires all students to be assessed for college and career readiness, and those found not ready, must be enrolled in transitional courses. Given this new paradigm, is there benefit added to continuing the requirement to pass end-of-course exit exams to receive a diploma? Moreover, if, during this period of transition from HSA to PARCC, it is appropriate to prohibit use of PARCC for purposes of personnel evaluations, why is it not equally appropriate to delay the use as a high stakes test for students?”

These are important questions.  I will focus on the last one, about using the tests for one purpose only: graduation.  I believe that within this question lies an important systemic consideration that Kauffman’s letter only hinted at: if the tests are not ready to be used to evaluate teachers and principals, then why should they be used for students?  I believe this is a strong argument in support of MCPS’ request.  While the change from one testing system to another may seem a matter of upgrading the measurement approach, the reality on the ground is that these kinds of changes are not “plug and play.”  There are many interdependent and moving parts in a school system, and high-stakes tests affect many of them, teachers and students alike.  Students are dependent upon the instruction they receive, and the instruction is shaped by both the ability of the teachers and the rewards, incentives, and constraints they work within.

To use the tests for one part of the system (student graduation) but not for another part (educator evaluation) would, in my view, create a tension in the system: it would pull part of the organization’s focus in one direction and another part in a different direction.  High school achievement, like the achievement gap and many other big problems in education, is systemic.  These problems have multiple interrelated causes, and one of the reasons they are so hard to address for MCPS and the rest of the country is that independent solutions rarely address the combination of factors that underlie problems in complex systems.  Using PARCC for student accountability only is a partial approach.  And, as Kauffman’s letter says, it puts students in the unfair position of getting the short end of the stick as these new tests are tried out on their futures first.  For this reason alone, I believe the MCPS request deserves support from both MSDE and Montgomery County’s elected officials.

There are more questions that can be asked of MCPS about its readiness to support PARCC across the system.  Below, I sketch out some other important factors and end with some questions that could be included in the conversation that MCPS requests in Kauffman’s letter.

Some Observations about PARCC

PARCC Will Initially Be Disruptive

The implementation of the new tests will be disruptive.  How much they will disrupt the work that goes on in schools is not clear.  But history has shown, including with NCLB, that most large-scale changes in schools can “shock the system” and take some time to become assimilated into the routine.  The day-to-day work of schools is so labor intensive, and what teachers do is so often based on what they have done in the past, that any change, such as a new curriculum, will take some time to assimilate.  The fact that PARCC will be aligned to the CCSS will help, as Maryland schools will have had a couple of years of experience with these standards. Still, even under the best of circumstances the new tests will require some adjustment, at least in the first year.

The impact has the potential to be even bigger and more difficult for schools if the PARCC results are also tied to high-stakes consequences such as teacher evaluations or school performance.  One of the lessons of NCLB (and there is a lot of research on this) was that schools with greater challenges suffered much more collateral damage than schools with better circumstances.  So if the PARCC tests are high-stakes, the stakes will be highest in the schools with the greatest needs.  The tests can still provide much valuable information, and that information should be used, but the consequences of tying these results to high stakes for educators would likely be disproportionately absorbed by high-needs schools.  MCPS needs a robust plan that addresses the impacts of PARCC on the system.

Implementation Questions with Technology-Rich Assessments

PARCC Assessments are designed to be delivered on computers rather than paper and pencil. However, not all school systems or school buildings have the same technology and so there are alternative testing approaches that have raised some questions about how PARCC will work when the rubber meets the road. For example, national education expert Rick Hess raised three big issues earlier this year:

  1. Testing under different conditions (some in classrooms, some in media centers, some offsite at different locations)
  2. Testing on different devices (e.g., computer vs. paper and pencil)
  3. Testing windows that can vary from school to school so that the tests may be taken at different times by different students

None of these issues fundamentally compromises the value of the tests, both as well-designed instruments and, even more, as measures aligned with the CCSS.  But they all can affect the scores in ways that will be very hard to know until after the tests have been administered.  Will the impact on scores vary based on the kind of school, in the same way that high stakes hit high-needs schools harder than others?  Quite possibly it will.  MCPS should look at its implementation options and try as best as possible to standardize testing approaches across schools.

Will the Tests Perform as Designed Initially?

When we read that the testing of the tests has gone smoothly, it is important to remember that these are reports from the people who are administering the tests, and that “smoothly” may mean different things to them than to educators.  For example, if the field trial occurs where and when it was planned and the results can be tabulated by PARCC, the field trial is smooth from a technical perspective.  This doesn’t mean, however, that the tests measured the same things that were taught or that they did as good a job with different populations as the designers hoped.  Larger amounts of real data and more time are required to know this.  Again, it probably will not be until after the first year of full administration that these issues become clearer.  Also, scuttlebutt from behind the scenes at PARCC has for a few years been that the consortium began with less money than it needed, so don’t be surprised if the quality of the tests is uneven, with some parts of the curriculum testing more reliably than others.  MCPS should be careful about making inferences based on the results for any part of the curriculum until the broad strengths and weaknesses of the test quality are known.

Some Important Questions for MCPS’ Implementation of PARCC

While it is important to support MCPS’ request, some questions could be asked of them about their plan going forward.

  1. Professional Development and Support. With the recent adoption of CCSS curricula, MCPS, along with just about every school district, has found the need for professional development more urgent than expected.  How are plans coming along to train MCPS educators in how to use PARCC?  What lessons have been learned from the pilots thus far about the technology needs, as well as about the performance of the tests beyond the fairly positive accounts MSDE and PARCC have provided?
  2. PARCC Impact on Technology Budgets. How will PARCC affect spending decisions throughout the school system?  One of the biggest criticisms of the CCSS has been that it is an opportunity for companies to make even more money from education.  School principals, teachers, and even some families are being inundated with offers of products that will help prepare students to do well on CCSS and PARCC tests.  Most of these claims are unverified; there is no body that will certify that a product is 100% or 50% CCSS compliant.  In the future, the people who use these products will probably develop ways of rating them, but not today.  MCPS would be wise not to spend too much public money on materials to prep for the first round of tests if it can be avoided.  Much of what is on the market now has been rushed out and is full of errors. Reviewing materials centrally and making recommendations to schools about how to purchase makes a lot of sense, as does working with partner districts to assess the quality of materials and technology.  While MCPS tends to defer a lot to individual schools (site-based management) rather than centrally manage and direct, in this case it may be useful for MCPS to take stock of the products that are out there and provide good technical support to schools.
  3. Accountability Options. One of the driving reasons for high-stakes tests is that not all schools perform as they should, and not all schools perform equally well with all groups of students.  Even with all of the problems with implementation, policies like NCLB have been important ways to see educational differences and to shift the conversation for many in education toward hard outcomes.  As the sanctions of accountability are even temporarily lifted, what will MCPS be doing to ensure all students are getting the kind of education they deserve?  Will the PARCC test results be combined with other forms of evidence to identify areas that need improvement and additional attention? While delays in using PARCC for HS graduation make sense, what other external accountability options will MCPS use to ensure all children receive an appropriate education?

Summary

As Kauffman’s letter spells out, the issues surrounding the use of PARCC tests for high school graduation are complex and consequential.  The MCPS request to delay implementation of the state’s plan is reasonable.  Whether the state will listen is unclear.  It is also unclear whether more information about how MCPS is getting ready for PARCC, and for the new testing and standards paradigms it is part of, would help MSDE in its decision.  For those closer to MCPS (parents, teachers, local elected officials), this kind of information is probably important to have.  For the sake of MCPS’ management thinking and its capacity to deal with the difficult and complex problems of student achievement, it is probably important to develop.  MCPS, like pretty much all districts, has traditionally been dependent on state policy, and so there may be a tendency when in this role to wait and respond rather than take the lead and drive the discussion.  MCPS is no ordinary district.  It has not only broad needs but also many financial and intellectual resources, so it is in a better position than most to lead rather than respond.  The tone and message of Kauffman’s letter suggest this is what MCPS is trying to do.  Let’s hope the state is ready to meet them in a discussion about this difficult issue.

Laurie Halverson: Supports Idea of Standards, Concerns about Common Core

  1. Common Core Standards. The MCPS web site does not say much about “Common Core” standards but instead focuses on its own “Curriculum 2.0” and has teachers and students learning new standards through the county’s developing curriculum and teacher training.  Do you support the Common Core?  Is MCPS doing a good job of navigating the new standards?  And how would you direct them to do it differently?

I support the general idea of standards for school districts. Standards help parents know whether their child is meeting or exceeding expectations and parents value test scores per school for comparison purposes. Standardized test scores are one tool to help teachers identify students who need support or acceleration. They also provide a gauge for school district leaders in determining whether schools need more support or intervention.

Former state superintendent Nancy Grasmick said in the May 25–26, 2010, state board minutes that the purpose of the Common Core State Standards (CCSS) was to close the achievement gap. But how can we expect students who already aren’t performing at the lower state standard to close the gap with a new, higher standard without significant professional development and additional resources?  Standards and curriculum by themselves will not close the achievement gap.

It concerns me that 500 early educators signed a statement indicating that they have “grave concerns” about CCSS:  http://www.edweek.org/media/joint_statement_on_core_standards.pdf.  It also concerns me that some top educators, such as Sandra Stotsky and James Milgram, who were on the CCSS validation panel, refused to sign a document approving CCSS. It concerns me that MCPS has a policy (IFA) on curriculum that states that teachers should have ongoing professional development and parents should be partners in the development of the curriculum. Yet even the minimum amount of teacher training has been optional, and parents have had little role in curriculum development and don’t even have access to the learning materials at home. It concerns me that there is no plan in place for how anyone in Maryland can give input if we want to change or improve any of the standards. It concerns me that it is costing our state taxpayers a tremendous amount without legislation or a democratic process: while the federal government gave $4 billion in Race to the Top grants to certain states, it will cost our nation at least $16 billion to implement the standards. Many states have passed legislation and are taking action to regain control over the content of curriculum, but Maryland has not.

What I would do differently to direct MCPS as a Board of Education member:

  • I want our schools to move away from teaching to the test and emphasize teaching and learning for all subjects. A resident at Leisure World told me she was taught so well at her school years ago that there was no need to “teach to the test” because she could take any test and perform well. That is what I want for MCPS students.
  • I would want to see measures on how MCPS will evaluate the success of Curriculum 2.0.
  • If MCPS continues to adhere to CCSS next year, I would pursue mandatory professional development for CCSS. I have spoken to teachers who say that students with teachers who skipped the optional training will be at a disadvantage.
  • I would push for more accurate and consistent ways of identifying students for acceleration.
  • I would be involved at the national level, seeking ways for the public to provide feedback on the current standards and how they can be improved in the future.
  • I would also seek changes to the MCPS grading and reporting policy to make sure the report card accurately measures student performance and is easy for parents and students to understand.
  • I would ask for more data such as final exam results per school in comparison with the corresponding course grades the student achieved.
  • Before approving new technology, I would ask financially relevant questions such as, “Will the Chromebooks be compatible with PARCC and MAP testing?” (I heard from an administrator that MAP tests are not compatible with the newly purchased Chromebooks.)

——————————————-

As MCCPTA VP of Educational Issues, I attended, at my own expense, a White House Community Partnership Summit at the University of Pennsylvania on March 2, 2012.  I wanted to give feedback to the White House about how Race to the Top policies were affecting us at the local level. Here is a link to my report:

http://mccpta.com/curriculum_dir/2011-2012/WhiteHouseSummit.pdf

 

Here are two excellent links on the CCSS:

Building the Machine (The Common Core Documentary): http://www.commoncoremovie.com/

Diane Ravitch: Everything You’ve Wanted to Know about Common Core:

http://www.washingtonpost.com/blogs/answer-sheet/wp/2014/01/18/everything-you-need-to-know-about-common-core-ravitch/

 

Maryland State BOE meeting minutes from May 25–26, 2010, when the board discussed Race to the Top and the Common Core State Standards:

http://marylandpublicschools.org/NR/rdonlyres/5D922A58-42B9-420F-997F-11CF4B13DEB4/24493/May25262010.pdf

 

Kristin Trible: Supporting Common Core

Responding to a Question from MoCoEdBlog on Common Core Standards.  The MCPS web site does not say much about “Common Core” standards but instead focuses on its own “Curriculum 2.0” and has teachers and students learning new standards through the county’s developing curriculum and teacher training.  Do you support the Common Core?  Is MCPS doing a good job of navigating the new standards?  And, how would you direct them to do it differently?

Response:  I support the Common Core, as it is a unique opportunity to ensure that educational standards are part of the national conversation and that we continue to prioritize the education of our children.  The other day, I saw someone share a poorly written test and erroneously blame the Common Core standards.  We continue to see individuals confuse standards, curriculum, and testing.  It is unfortunate that inaccuracies such as this one are taking away from a truly important conversation on educational standards and, just as importantly, on what methods we will put in place to gather feedback and determine how to improve the standards in the future.   The current standards may not yet be perfect, but it is up to each school system to work with the standards and implement a strong, comprehensive curriculum that meets or exceeds them.  MCPS has had its share of missteps in the implementation, but change was going to be difficult no matter what.  It’s time to move forward and ensure our teachers are well supported and enthusiastic about teaching in a new manner.  If slowing down is appropriate, we can slow down.  I would not support turning back.