Category Archives: Data and Information

Which Washington-area system does best at funding its neediest schools?

(re-posted from The Fordham Institute)

In the era of No Child Left Behind—and at a time of growing concern about income inequality—virtually every school system in the country claims to be working to narrow its student achievement gaps. But are they putting their money where their mouth is?

The data in our brand new D.C. Metro Area School Spending Explorer website allow us to answer this question for school districts inside the Beltway. Specifically, we can determine whether and to what degree they are spending additional dollars on their neediest schools.

To be sure, ever since the Coleman Report, it’s been hard to find a direct relationship between school spending and educational outcomes. Still, basic fairness requires that systems spend at least as much on educating poor students as affluent ones, and investments that might make a difference in narrowing achievement gaps (such as hiring more effective, experienced teachers and providing intensive tutoring to struggling students) do require big bucks.

There are lots of wonky ways to compute the fairness of education spending, but we’re going to use a measure that makes sense to us. Namely: how much extra does a district spend on each low-income student a school serves, compared with what it spends on behalf of non-poor students? Ten percent? Twenty percent? Fifty percent?

Read the methodology section below for details on how we got to these numbers (they are estimates, and apply only to elementary schools), but here are our conclusions.

School System                            Extra spending for low-income students   Over a floor of…
Arlington County Public Schools          80.5%                                    $11,817
Fairfax County Public Schools            34.1%                                    $10,669
Montgomery County Public Schools         31.7%                                    $11,464
District of Columbia Public Schools      21.2%                                    $13,514
Alexandria City Public Schools           14.4%                                    $13,120
D.C. Public Charter Schools               5.9%                                    $15,243
Prince George’s County Public Schools     1.9%                                    $10,385

For example, in Arlington County, the district spends close to $12,000 per student at its low-poverty schools (those with very few poor children). But it spends north of $21,000, or 81 percent more, for each student who is eligible for a free or reduced price lunch—significantly boosting the resources of its highest-poverty schools.

Let us be clear that school systems aren’t necessarily achieving these spending outcomes by design. As we explain in the “Drivers of School Spending” section of our D.C. Metro Area School Spending Explorer website, they may not even be aware of these differences. That’s because individual schools in a given district don’t actually have “budgets” of their own; they are generally given a certain number of staff positions (driven by the number of students they serve) and might be eligible for extra programs or resources depending on need.

Nor is it likely that poverty rates are the only things driving these differences. Larger schools, for example, tend to spend less per pupil than smaller schools (costs for staff like nurses can be spread over more students); districts might also be providing extra resources to schools with large numbers of special education students or English language learners. So we know that our analysis is oversimplifying what’s causing these patterns.

With those caveats in mind, what to make of these results? The outliers are fascinating. Arlington—with its sky-high tax base and gentrifying population—definitely goes the distance for its high-poverty schools. On the other hand, poverty-stricken Prince George’s County appears to be doing practically nothing to spend what little money it has on its toughest schools. (It makes us wonder how it meets federal “supplement, not supplant” requirements.)

And these findings are more than a little embarrassing for Montgomery County, which prides itself on its commitment to “social justice,” and has an explicit policy of sending extra resources to its highest poverty schools. Yet it is bested by Fairfax County (by a little) and Arlington (by a lot).

Per-pupil spending on high-poverty schools

Let’s look at this question through another lens: Specifically, the perspective of low-income students and parents in the Washington area. What they experience in school is not relative spending but real dollars: How much money does a particular school have to devote to teacher salaries, extra programs, etc.?

So: How much do high-poverty schools in the Washington area spend per pupil, and how does that vary by school system? (Again, we only used data for elementary schools.)

Here’s what we found:


School System               Average spending for       Range of spending for      Number of
                            high-poverty schools*      high-poverty schools*      high-poverty schools*
Arlington County            $18,216                    $17,604 – $18,827          2
D.C. Charter Schools        $16,136                    $13,145 – $19,847          18
Alexandria City             $14,501                    $12,734 – $17,272          3
D.C. Public Schools         $14,497                    $13,095 – $16,391          10
Fairfax County              $13,821                    $12,225 – $17,548          7
Montgomery County           $13,613                    $11,862 – $15,698          10
Prince George’s County      $10,607                    $7,981 – $16,493           50

* 75% or more Free or Reduced Price Lunch enrollment, primary schools only (i.e., no K-8 schools included)

Arlington again earns plaudits for its generosity towards its high-poverty schools, though by our count there are only two of them. High-poverty charter schools in Washington are well funded too, though it’s important to note that they tend to be extremely high-poverty; more than two-thirds of the eighteen charter schools in our analysis top the 85 percent poverty mark. To the extent that low-income students bring extra resources along with them (including federal Title I dollars), the results for Washington’s charter schools make sense. (And note: These numbers are for operational costs only; they don’t include facilities funding, which is where DC’s charters are at a huge funding disadvantage compared to DCPS.)

Note the numbers (again) for Fairfax and Montgomery County. If Superintendent Josh Starr is an “equity warrior,” what does that make the folks across the river?

The big story here, though, is Prince George’s County and its shockingly low spending for its fifty (!) high-poverty elementary schools. The averages are bad enough—spending that is almost 30 percent lower than for DCPS high-poverty schools and almost a quarter less than Montgomery County spends on similar schools. But looking at specific schools makes the picture even more devastating.

Consider District Heights Elementary, which spends just $7,981 per student, although 77 percent of its pupils qualify for subsidized lunches. Compare that to Moten Elementary in the District, which spends $14,723 for each of its students (76 percent eligible for a free or reduced price lunch)—or almost twice as much. The schools are less than seven miles apart.

Therefore, if a low-income mom moves from the District of Columbia to Prince George’s County, and her child attends high-poverty public schools in both locales, her child’s new school will have dramatically lower-paid (and/or less experienced) teachers, fewer special programs, fewer specialists, larger class sizes, or all of the above.

It’s hard not to conclude that Washington’s rapid gentrification—which is pushing many needy families from the District to Prince George’s County—is leading to a very inequitable outcome, at least in terms of school spending.

As Marguerite Roza has argued for years, school systems ought to live their values. If doubling-down on the education of poor children is something these systems (and their residents) support, they need at least to know whether their dollars are reaching the neediest children. Now we know that some of the Washington-area school districts could be doing a whole lot more for their low-income students. And the state of Maryland almost certainly could and should be doing more for Prince George’s County. Who will act to fix these problems?

Methodology

To find out how we estimated the per-pupil spending of each school in the Washington, D.C. area, see the methodology section of our D.C. Metro Area School Spending Explorer website. Once we had those numbers, the next challenge was to understand the relationship between schools’ poverty rates and their spending. The first step was to estimate the “floor” of per-pupil expenditures (PPE) for each district, and then to figure out how much extra each district spends on low-income students. Elementary, middle, and high schools tend to have dissimilar spending patterns, so we only included elementary schools when calculating estimates. (There are lots more elementary schools than middle or high schools.)

To make our estimates for each district, we regressed school-level PPE against the percentage of students eligible for free or reduced price lunch (FRPL). The spending floor is the regression’s constant term, and the extra dollars allocated to low-income students are the FRPL coefficient. (More simply: we scatter-plotted FRPL on the x-axis and PPE on the y-axis for each district and calculated the line of best fit. The y-intercept is the spending floor, and the slope is the extra spending.)
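For readers who want to reproduce this kind of estimate with their own data, here is a minimal sketch of the calculation in Python. It is not the Explorer’s actual code: the file name (school_spending.csv) and column names (district, ppe, frpl) are hypothetical placeholders, and it simply fits an ordinary least-squares line per district, as described above, treating frpl as a share between 0 and 1.

```python
# Minimal sketch of the per-district regression described above.
# Assumes a hypothetical CSV with one row per elementary school and columns:
#   district, ppe (per-pupil expenditure in dollars), frpl (FRPL share, 0 to 1).
import pandas as pd
import statsmodels.api as sm

schools = pd.read_csv("school_spending.csv")  # hypothetical file name

for district, grp in schools.groupby("district"):
    X = sm.add_constant(grp["frpl"])     # adds the intercept term
    model = sm.OLS(grp["ppe"], X).fit()  # PPE = floor + slope * FRPL

    floor = model.params["const"]        # spending "floor" (y-intercept)
    extra = model.params["frpl"]         # extra dollars for a 100% FRPL student body
    pct_extra = 100 * extra / floor      # extra spending as a percent of the floor

    print(f"{district}: floor ${floor:,.0f}, "
          f"extra {pct_extra:.1f}%, R-squared {model.rsquared:.2f}")
```

The printed ratio corresponds to the “extra spending for low-income students” column in the first table, and the intercept to the “floor” column.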

From there, it was a simple matter of dividing the extra spending by the spending floor to find the percentage extra spent on low-income students. It’s a rough estimate, of course, since we didn’t include any controls and we assumed a linear relationship. But except for Prince George’s County, Alexandria, and D.C. Charters, FRPL confidence levels were greater than 99 percent. R-squared values were also large, with Montgomery County at the low end (.24) and Arlington at the high end (.85). Because of the descriptive nature of this analysis, the lack of significance and the low R-squared values for those three districts are not a problem: the numbers are low because none of the three has a strong pattern of progressive expenditures, school to school. With a coefficient of 192.6 and an R-squared of −.008 (effectively zero), Prince George’s County’s pattern isn’t just weak—it’s nearly non-existent.

For Coherence’s Sake: Defer Using the New PARCC Tests for HS Graduation

On October 7, MCPS sent a letter from Phil Kauffman, the President of its Board of Education, to the Maryland State Superintendent of Schools requesting that the state reconsider plans to use the new Partnership for Assessment of Readiness for College and Careers (PARCC) assessments as end-of-course exams required for high school graduation.  For several years the Maryland State Department of Education (MSDE) has been preparing to use these new tests, developed by a consortium of states, as a replacement for the Maryland State Assessment (MSA) and High School Assessment (HSA) tests that have been used to meet the No Child Left Behind (NCLB) law’s accountability requirements.

These new PARCC tests hold much promise to improve the information available to schools.  They are designed to be aligned with the new Common Core State Standards (CCSS) for literacy and math, and they use state-of-the-art technology for an adaptive testing experience.  They are also unproven, and given that they are likely more stringent than the MSA/HSA tests they replace, many students may not pass them, which would require those students to take a transitional class prior to graduating.  The state is considering a two-tier approach, with one cut-off score for being considered “college and career ready” and a lower score that allows students to graduate.  This tiered approach has been advocated by MoCoEdBlog editorial member Rick Kahlenberg in a piece titled “Hold Students Accountable and Support Them.”

MCPS is requesting that MSDE delay the implementation of these requirements and enter into a discussion about how to move forward.  Kauffman’s letter posed several important questions:

“If a college-ready cut score differs from the graduation cut score, what is the most meaningful indicator for institutions of higher education or employers? What messages do tiered cut scores send to students? Maryland now requires all students to be assessed for college and career readiness, and those found not ready, must be enrolled in transitional courses. Given this new paradigm, is there benefit added to continuing the requirement to pass end-of-course exit exams to receive a diploma? Moreover, if, during this period of transition from HSA to PARCC, it is appropriate to prohibit use of PARCC for purposes of personnel evaluations, why is it not equally appropriate to delay the use as a high stakes test for students?”

These are important questions.  I will focus on the last one, about using the tests for one purpose only: graduation.  I believe that within this question lies an important systemic consideration that Kauffman’s letter only hinted at: if the tests are not ready to be used to evaluate teachers and principals, then why should they be used for students?  I believe this is a strong argument in support of MCPS’ request.  While the change from one testing system to another may seem a matter of upgrading the measurement approach, the reality on the ground is that these kinds of changes are not “plug and play.”  There are many interdependent and moving parts in a school system, and high-stakes tests affect many of them, teachers and students alike.  Students are dependent upon the instruction they receive, and that instruction is shaped by both the ability of the teachers and the rewards, incentives, and constraints they work within.

To use the tests for one part of the system (student graduation) but not for another part (educator evaluation) would, in my view, create a tension in the system.  It would have part of the organization focused in one direction and another part in another.  High school achievement, like the achievement gap and many other big problems in education, is systemic.  These problems have multiple interrelated causes, and one of the reasons they are so hard to address for MCPS and the rest of the country is that isolated solutions rarely address the combination of factors that underlie problems in complex systems.  Using PARCC for only student accountability is a partial approach.  And, as Kauffman’s letter says, it puts students in the unfair position of getting the shorter end of the stick as these new tests are tried out on their futures first.  For this reason alone, I believe the MCPS request deserves support from both MSDE and Montgomery County’s elected officials.

There are more questions that can be asked of MCPS about its readiness to support PARCC across the system.  Below, I will sketch out some other important factors and end with some questions that could be included in the conversation that MCPS requests in Kauffman’s letter.

Some Observations about PARCC

PARCC Will Initially Be Disruptive

The implementation of the new tests will be disruptive.  How much they will disrupt the work that goes on in schools is not clear.  But history has shown, including with NCLB, that most large-scale changes in schools can “shock the system” and take some time to become assimilated into the routine.  The day-to-day work of schools is so labor intensive, and what teachers do is so often based on what they have done in the past, that any change such as a new curriculum will take some time to assimilate.  The fact that PARCC will be aligned to the CCSS will help, as MD schools will have a couple of years of experience with these standards. Still, the new tests under the best of circumstances will require some adjustment, at least in the first year.

The impact has the potential to be even bigger and more difficult on schools if the PARCC results are also tied to high-stakes consequences such as teacher evaluations or school performance.  One of the lessons of NCLB (and there is a lot of research on this) was that schools with greater challenges suffered much more collateral damage than schools with better circumstances.  So, if the PARCC tests are high-stakes, they will be higher stakes in the schools with the most needs.  The tests can still provide much valuable information, and that information should be used, but the costs of tying these results to high stakes for educators would likely be disproportionately absorbed by high-needs schools.  MCPS needs a robust plan that addresses the impacts of PARCC on the system.

Implementation Questions with Technology-Rich Assessments

PARCC Assessments are designed to be delivered on computers rather than paper and pencil. However, not all school systems or school buildings have the same technology and so there are alternative testing approaches that have raised some questions about how PARCC will work when the rubber meets the road. For example, national education expert Rick Hess raised three big issues earlier this year:

  1. Testing under different conditions (some in classrooms, some in media centers, some offsite at different locations)
  2. Testing using different devices (ex: computer vs. paper and pencil)
  3. Testing windows that can vary from school to school so that the tests may be taken at different times by different students

None of these issues fundamentally compromises the value of the tests, both as well-designed instruments and, even more, as assessments aligned with the CCSS.  But they all can affect the scores in ways that will be hard to know until after the tests have been administered.  Will the impact on the scores vary based on the kind of school, in the same way that high stakes hit high-needs schools harder than others?  Quite possibly.  MCPS should look at its implementation options and try as best as possible to standardize testing approaches across schools.

Will the Tests Perform as Designed Initially?

When we read that the testing of the tests has gone smoothly, it is important to remember that these are reports from the people who are administering the tests, and that “smoothly” may mean different things to them than to educators.  For example, if the field trial occurs where and when it was planned and the results can be tabulated by PARCC, the field trial is smooth from a technical perspective.  This doesn’t mean, however, that the tests measured the same things that were taught or that they did as good a job with different populations as the designers hoped.  Larger amounts of real data and more time are required to know this.  Again, it probably will not be until after the first year of full administration that these issues will be clearer.  Also, scuttlebutt from behind the scenes at PARCC has for a few years now been that the amount of money the consortium began with was less than it needed, so don’t be surprised if the quality of the tests is uneven, with some parts of the curriculum testing more reliably than others.  MCPS should be careful about making inferences based on the results for any part of the curriculum until the broad strengths and weaknesses of the test quality are known.

Some Important Questions for MCPS’ Implementation of PARCC

While it is important to support MCPS’ request, some questions could be asked of them about their plan going forward.

  1. Professional Development and Support. With the recent adoption of CCSS curricula, MCPS, along with just about every school district, has found the need for professional development to be more urgent than expected.  How are the plans coming to train MCPS educators in how to use PARCC?  What lessons have been learned from the pilots thus far about the technology needs as well as the performance of the tests, beyond the fairly positive accounts MSDE and PARCC have provided?
  2. PARCC Impact on Technology Budgets. How will PARCC affect spending decisions throughout the school system?  One of the biggest criticisms of the CCSS has been that it is an opportunity for companies to make even more money from education.  School principals, teachers, and even some families are being inundated with offers of products that will supposedly help prepare students to do well on CCSS and PARCC tests.  Most of these claims are unverified: there is no body that will certify that a product is 100% or 50% CCSS-compliant.  There will probably be ways for the people who use these products to rate them in the future, but not today.  MCPS would be wise not to spend too much public money on materials to help prep for the first round of tests if it can be avoided; much of what is on the market now has been rushed out and is full of errors. Reviewing materials centrally and making recommendations to schools about purchasing makes a lot of sense, as does working with partner districts to assess the quality of materials and technology.  While MCPS tends to defer a lot to individual schools (site-based management) rather than centrally manage and direct, in this case it may be useful for MCPS to take stock of the products that are out there and provide good technical support to schools.
  3. Accountability Options. One of the driving reasons for high-stakes tests is that not all schools perform as they should, and not all schools perform equally well with all groups of students.  Even with their many implementation problems, policies like NCLB have been important ways to see educational differences and to shift the conversation for many in education toward hard outcomes.  As the sanctions of accountability are even temporarily lifted, what will MCPS be doing to ensure all students are getting the kind of education they deserve?  Will the PARCC test results be combined with other forms of evidence to identify areas that need improvement and additional attention? While delays in using PARCC for HS graduation make sense, what other external accountability options will MCPS use to ensure all children receive an appropriate education?

Summary

As Kauffman’s letter spells out, the issues surrounding the use of PARCC tests for high school graduation are complex and consequential.  The MCPS request to delay implementation of the state’s plan is reasonable.  Whether the state will listen is unclear.  Whether more information about how MCPS is getting ready for PARCC, and for the new testing and standards paradigms it is part of, would help MSDE in its decision is also unclear.  For those closer to MCPS—parents, teachers, local elected officials—this kind of information is probably important to have.  For the sake of MCPS’ management thinking and its capacity to deal with the difficult and complex problems of student achievement, it is probably important to develop.  MCPS, like pretty much all districts, has traditionally been dependent on state policy, and so there may be a tendency in this role to wait and respond rather than take the lead and drive the discussion.  But MCPS is no ordinary district.  It has not only broad needs but also many financial and intellectual resources, so it is in a better position than most to lead rather than respond.  The tone and message of Kauffman’s letter suggest this is what MCPS is trying to do.  Let’s hope the state is ready to meet them in a discussion about this difficult issue.

What Happens to the Good News for MCPS?

Recently, I looked for a way to share some observations of good things happening at MCPS.  These were not big research-driven observations, but things I saw in very local and personal encounters, as I will describe below.  Searching MCPS websites, I found no place or mechanism to communicate this information.  Many who have worked with MCPS for a long time have said it can be an insular kind of organization, at times ignoring criticism as it pursues its plans, which may also be related to the harsh and at times unfair tone of its critics.  This dynamic was one of the reasons we started the MoCoEdBlog: to provide input—balanced input—about important topics that we had some substantive knowledge about.

Summer Worker

This past summer a reading specialist at my son’s school organized a summer reading program where she provided books and games/puzzles for kids to work on during the weeks school was out.  My kids are in a language immersion program, and one of the biggest challenges for families who do not speak the language at home is early reading.  The specialist, on her own time, organized playdates at playgrounds where the kids could run around while parents exchanged books and information about student progress.  Summer progress, or in some cases what is called the “summer melt,” when kids regress, is very important for preparing kids for the coming school year.  This was individual initiative taken by a teacher, beyond what was required, in order to help students.  It is no small thing, especially for the families of those students.

My Good Friend’s School

I have a very good friend whose son and mine have been pals since they were 4 years old in Montessori school.  While our son went into MCPS in kindergarten, hers stayed a few more years in Montessori, where he did well in some subjects but was also delayed in reading, in part because of undiagnosed perceptual issues.  When she brought her son into MCPS, she initially encountered different views about the best grade placement and the best approach.  Over the last two years, while our kids played, I have heard her describe the collaborative approach her MCPS school (Flower Valley) has taken, how the principal listens and they have worked together, and the universal and complete commitment to her child by that school.  It has been a heartwarming and very encouraging story to hear how he is understood and valued and has now steadily climbed in school performance.  How big a deal is this?  If it is your child or a child you know, it is a big deal, naturally.  It is also highly likely that in this school this is not an isolated success story, but part of a culture of doing the right things.  This is an example of the kind of school-level autonomy that MCPS is practicing working very well: those in the school are empowered to do the right things, and in the case of this school they do.

My Kids Ate Salad

It may not seem like a big deal that both of my kids ate salads at school not too long ago.  They are kids, after all.  However, they ate these salads at lunch and came home to explain what they ate and how good it was during the Farm-to-School week MCPS recently had.  I know people who have been advocating for MCPS to have more healthy food and snack options and who have expressed frustration over what they see as a preference for institutional food over more locally grown options.  Seeing that Farm-to-School week not only happened but also worked educationally was very interesting.  This is a state program implemented by MCPS, and it shows for me the important role MCPS can take in helping to promote healthy lifestyles.

Who Gets This Good News?

So, feeling motivated to share these observations with someone in MCPS, I went to the new website and found no place to provide this kind of feedback.  There is no central suggestion box where anyone who has a comment can send it and expect it will land at the right office and also become part of a systemic process of sensing how well different parts of the system are performing.  Would it be meaningful to know about individual employees going above and beyond to serve kids, or to know about schools that seem able and willing to organize for student success?  Would it also be helpful to collect other kinds of comments that may be indications of uncertainty or of parts of the system in need of support?  I think it would.  While there is an Office of Public Information, there is no function I can see for public feedback.  This might be as simple as a small addition to a website (that digital suggestion box), or it might involve staff aggregating and disseminating this information.  One of the new trends in performance evaluation involves surveys that provide information on teacher and school activity.  With these kinds of instruments coming in the future, and with the kinds of specific and useful comments that test scores can never provide, perhaps MCPS might want to develop some capacity for handling this information.  My suggestion is to begin small, with an easy and accessible place for feedback.

New MCPS Report Cards – Good Ideas, Sign of the Times, Room for Improvement, by Phil Piety

Sometimes education seems like the place where good ideas come not to die, but to find out how complex the problems they are trying to solve really are.  Once they run into the difficult realities of educational practice, they rarely die quickly, even if their authors hope they will.  The new Montgomery County Public Schools (MCPS) report cards featuring the letters I for “in progress,” P for “proficient,” and ES for “exceeds standards” are another great idea that is based on sound logic but has run into difficult implementation. I actually don’t think the new MCPS report cards should die, but neither do I think they should live on in their current form.

The design of these new report cards is based on what is called standards-based grading (SBG), where the information delivered to parents is about what students know or don’t know relative to the standards they are to be taught. Historically, letter grades have been considered too broad to be useful in diagnosing problems with learning.  A student receives a B, but what is it that he or she needs to learn to make an A?  Traditional grades are also known to be subjective.  A student who mastered the content but didn’t look like they were learning, or didn’t have the attitude the teacher wanted, could get a B, while another student who didn’t show as much achievement but made tremendous effort could get an A.  Research has shown that traditional grading practices vary from teacher to teacher.  Many teachers communicate much more than achievement through traditional grades, and SBG is an effort to standardize grading around what students should be learning.

I think SBG is a good idea and MCPS’s use of it admirable.  As a parent of kids in MCPS, however, I have found the results less than useful.  I find what Washington Post education writer Donna St. George called the “plethora of Ps” difficult to use.  I have found it difficult to reconcile the proficiency mark in one area with other parts of the report card and with what we see coming home.  After two years of experience with this new SBG report card, and after hearing many stories of frustration from other parents, I can see four issues that are worth considering.

  1. Producing proficiency (Ps) is now considered a primary responsibility of a teacher.  This focus on standards proficiency is the result of the standards-based accountability movement that has been building since before No Child Left Behind (NCLB).  While many educators think this is over-emphasized in comparison to other areas of social and emotional development, the reality today is that proficiency counts.  In the past, many teachers felt that grades should be distributed across the different letters, with a certain percentage getting As and another percentage Bs, etc.  Many teachers then distributed grades accordingly, and sometimes the reason a student got one grade versus another seemed arbitrary, chosen to fit that pattern.  In today’s climate there is a belief that a teacher’s job is to get all kids to academic proficiency, and so for many a report card with many Ps in it shows the teacher did their job. This means that in the world of educators there is a built-in, implicit incentive to produce Ps.
  2. There are disconnected data points.  One of the biggest problems with the MCPS SBG approach is that standards in the report card are presented without some important context.  In elementary math, for example, there are many, many standards shown, and each is given an individual evaluation.  However, the new CCSS math standards are designed around what are called “learning trajectories” that string different standards together into sequences of proficiency.  These areas, such as “Number and Operations in Base Ten,” have many individual standards across multiple grades.  The MCPS report cards read like an inventory of the standards that are intended to be taught at a given grade (ex: grade 3) and don’t show the larger trajectories with related standards at lower and higher grades.  In reality, many students can perform in areas outside of their assigned grades.  Including this multi-grade context (with a meaningful graphic representation) would probably help everyone make sense of the data points.  Without better information organization, parents see an ocean of Ps, Is, and (rarely) ESs, but the overall picture of learning is lost. In addition, most of the standards in the MCPS report cards are summary standards that are actually composed of many more detailed sub-standards.  For example, a fourth grade standard to “understand place value” is defined as three different kinds of understandings, and to know whether a student has actually mastered one of these sub-standards would usually require multiple tasks.  So each mark on a report card really stands in for a set of Is, Ps, and/or ESs on the sub-standards (see the illustrative sketch after this list).
  3. The new report cards dropped important information that was valuable for parents and students.  The shift from traditional grading to SBG seems to have been abrupt, and important teacher comments and other information that would help parents understand the classroom environment and how their child is doing were omitted.  Even if that “legacy” information can be replaced in the future, phasing it out slowly and giving parents a chance to learn how to use the new report cards is probably a good idea.
  4. SBG reports can still be subjective.  While the shift from traditional letter grades to SBG is intended to put the focus on what students have actually learned as opposed to teacher perceptions, the reality is that much of this assessment of achievement is still largely subjective.  The tools MCPS uses, such as MAP-R and MAP-M, do not produce a detailed analysis, standard by standard, of what students know or don’t know.  Until every score from I to P to ES (or whatever other coding system is used) can be backed up by examples of student work, and until students can achieve a P or ES by multiple means, these report cards are not much better than traditional letter grades, although they can give the impression that they are.  Because there is no mark for “uncertain,” there may be cases where teachers in doubt assign a P or an I rather than indicate that they are not sure, which adds to the confusion parents are experiencing.
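To make the roll-up issue in item 2 concrete, here is a small illustrative sketch in Python. It is not MCPS’s actual data model: the class names, sub-standard descriptions, marks, and roll-up rule are hypothetical, and it simply shows how a single report-card mark can summarize several sub-standards that sit on a multi-grade learning trajectory.

```python
# Illustrative only: a toy model of how one report-card mark can summarize
# several sub-standards on a multi-grade learning trajectory.
# The names, descriptions, marks, and roll-up rule are hypothetical.
from dataclasses import dataclass, field

@dataclass
class SubStandard:
    description: str
    mark: str  # "I" (in progress), "P" (proficient), or "ES" (exceeds standards)

@dataclass
class Standard:
    trajectory: str     # e.g., "Number and Operations in Base Ten"
    grade: int          # grade at which this summary standard is reported
    description: str
    sub_standards: list = field(default_factory=list)

    def summary_mark(self) -> str:
        """One plausible roll-up rule: a single 'I' anywhere pulls the summary down."""
        if all(s.mark in ("P", "ES") for s in self.sub_standards):
            return "P"
        return "I"

place_value = Standard(
    trajectory="Number and Operations in Base Ten",
    grade=4,
    description="Understand place value",
    sub_standards=[
        SubStandard("Recognize a digit represents 10x the place to its right", "P"),
        SubStandard("Read and write multi-digit numbers in several forms", "ES"),
        SubStandard("Round multi-digit whole numbers", "I"),
    ],
)

# The detail (one ES, one P, one I) disappears into a single summary mark:
print(place_value.summary_mark())  # -> "I"
```

A report card that shows only the summary mark loses exactly the kind of detail, and the multi-grade trajectory context, described in item 2 above.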

In summary, the new MCPS report cards are based on the good idea of standards-based grading, which is intended to shift the focus to what students have learned rather than a subjective evaluation by the teacher. In reality, they don’t quite achieve that goal yet. They are like many ideas to improve education—including evaluating teachers based on how much students learn (value-added modeling) or holding schools accountable for making sure that no child is left behind—that seem quite reasonable at first but are much more complex and difficult to do well in practice.  MCPS should be commended for being out in front of this effort and trying SBG well before the rest of the country.  At the same time, being on the bleeding edge, when there is so little research and a big learning curve for parents and educators alike, is risky.  Focusing on how to improve this area should be a priority for both the school system and parents, who should make their information needs known. Without some advocacy for better information, the next several years may see parents continuing to struggle to make good use of the reports that come home.