Author Archives: Philip Piety

MCPS Review of Assessments: Shifts, Recommendations, and Questions

On June 22, the MCPS Board of Education received a briefing from staff, led by Chief Academic Officer Maria Navarro, about the kinds of assessments students are given in different grades and different schools.  This review was undertaken at the direction of the Maryland State Superintendent of Education, Lillian Lowery, who asked Maryland school districts to look for options to reduce the assessment burden on students and the system.  This is important work that is worth attention and follow-through.

While presumably other Maryland districts are going through this exercise, the MCPS staff presented a comprehensive and polished briefing with various options that are being considered at different levels of the system.  This is an important area for individual schools and also for the ways that MCPS is managed.  The only people who think that educational assessments are easy are those who have little experience with them.  They are hard, and it was clear that the team that briefed the Board on June 22 was aware of how hard this area is to consider comprehensively.  In some cases, what MCPS is considering could be very important for the system and should be encouraged.  Some areas raised questions for me, and in a few areas MCPS may want to think about how their ideas might be implemented given some of the changes going on in the field.  This is not to say what was presented was wrong, but rather that the directions articulated by the MCPS staff relate to other movements in the field.

The observations, recommendations, and questions that follow are given with the recognition that this review seems very well done and well ahead of the majority of school districts in the United States, large and small.

Three Notable Shifts

In this work I see MCPS tapping into three important shifts in the field of educational data.  These shifts have not always been highlighted by the staff as special, but MCPS’ process of evaluating different options for reducing the testing burden is revealing ideas that align with some new movements in educational research and policy.

Shift toward more continuous and embedded assessments

In some of the recommendations for restructuring middle school assessments, the idea of distributing exams across multiple class sessions rather than in a single summative block was proposed.  This resembles new research into using the data feeds from digital technologies, which can provide multiple data points (some from assessments and many from other activity) that can be used quickly and on a more immediate basis.  If MCPS does experiment with parsing out assessments over multiple periods, then it will be preparing for this future.  At the same time, this shift brings complications.  Much of the statistical practice used in education is based on rigid control of testing conditions.  Once data is decoupled from end-measures and made more continuous, how the information can be used (its reliability) for important consequences changes as well.  These intermediate/continuous measures are much harder to use for high-stakes purposes because they are usually less statistically reliable.

Focus on range of student experiences

Maria Navarro, the Chief Academic Officer, discussed the use of different kinds of data, including surveys, to understand the “range of student experiences” in response to comments by Dr. Docca, who asked how these changes would impact children of color.  Many educators have long recognized that learning performance is only one important factor to attend to.  What is newer is that people in the data movement are developing different kinds of measures of student experience, and the kinds of surveys that MCPS and other districts are utilizing today, including Harvard’s Tripod project (http://tripoded.com/), are getting at these kinds of data.  The coming years should see more and more of these kinds of data, and the response of Dr. Navarro may be repeated in future meetings and conversations about different kinds of students.  The Board should continue to ask for more data about the range of student experiences beyond test scores.  I believe that it is in this kind of information that the solutions to these challenges will likely be found.

Alignment and Coherence

Dr. Lang, the Associate Superintendent for Curriculum and Instruction, discussed the issue of alignment of assessments.  It was presented as alignment in terms of policies and other assessments like PARCC, but the issue is much bigger, and the field is just now beginning to wake up to the challenges of connecting different kinds of assessments and other forms of information (see the first recommendation below).  This is an area where educational data differs from data in other fields, and where education is a more challenging field in which to implement data-driven solutions.  While it is easy to bring two different scores for the same subject together (ex: math), ensuring that those scores are measuring the same things requires a lot of technical work.  This is an increasing challenge for education as more and more easy-to-access technologies, such as Khan Academy or DreamBox Learning, can provide data on student work and show that data in tables, charts, and dashboards.  The visual displays don’t tell educators whether two data points are comparable or how they relate.  This has contributed to some educators distrusting educational data, because this lack of alignment translates into incoherence in terms of what to do with students.

Three Recommendations

I have three recommendations for MCPS in this process and in going forward with their work on assessments and data.

Broaden concept of data

In a number of places in the briefing, the terms data and assessments were used interchangeably.  This is consistent with the way most policies use these terms, but it needs to change.  Assessments have been seen in both education school and policy circles as the most valid data—the hard data—needed to evaluate educational performance.  The problem is that assessment data varies widely in its quality and in how clearly it represents the things that matter most, like instruction.  In many parts of the curriculum (ex: social studies, life science, advanced literacy) there are major gaps in quality assessments.

The field has begun to broaden its conception of data to still feature assessments in a central role, but also to acknowledge other kinds of evidence: from surveys, from digital tools, and especially from the artifacts that teachers use in their daily work.  These teacher tools are developing very quickly, and so the kinds of information they can produce, and how that information can be analyzed, are developing rapidly as well.

In broadening the kinds of evidence considered as core data beyond assessments, the ways the information is used can also be thought of broadly.  Several years ago, the use of data and other information in educator conversations was less valued than it is today.  At the same time, the idea of data-driven decision making and using test results to directly inform instruction was more popular a few years ago than it is today, largely because these ideas of using assessments as a primary lever of reform have not worked.  Assessment data is rarely at a granularity and specificity that can be used in the practice of teaching.  Rather, these data can help teachers see their work at the level of a school and the long-term trajectories of students.  Professional learning communities looking at a range of information, including the assignments themselves and examples of student work alongside assessments, are considered structures that can actually enable meaningful change.  The idea of the central office collecting assessments from the schools, as was suggested in the June 22 meeting, may seem to some educators a case of the central office controlling local processes.  However, if the view of assessments is broad (ex: beyond test scores) and if the uses are also broad (ex: to understand practice variation as well as to measure student learning), then those on the front line may have fewer concerns and see this sharing of information about assessments as part of organizational learning and professional development.

Guard Against Unreasonable Expectations

Many people looking at the use of assessments and data as levers of change in education hold very logical, simple, and often unworkable impressions of what these tools can do.  Assessments are critical and important, but even state-of-the-art assessments are often weak when it comes to making change.  They must be used in concert with coherent commitments and organizational approaches to ensuring all students learn.  As MCPS moves forward with reviewing its assessment strategies, policymakers should be realistic about the limits of this part of the system.  There are no silver bullets in education, and assessments are a problematic area.


Network with other districts

MCPS is often a few steps ahead of the rest of the nation, but the work it is doing to come up with better ways to use assessments is not unique to this county.  Almost all districts should be looking into this, and many large ones in Maryland are as well.  I recommend MCPS connect with these others and perhaps develop a common conversation group to share ideas, concerns, and best practices.

Three Questions

This effort by the Board and administration has been one of the more important initiatives in my view, because assessments are deeply connected to just about everything at the core of school system functions.  It doesn’t involve buses and building services, but almost anything involving teachers and school leaders is increasingly related to assessment.  I offer three questions.

How will this be sustained?

Once the school year begins and after the task from the state superintendent is completed, how will this reflection be sustained?  How will this not become another historical exercise but instead become part of a continuous improvement process?  As groundbreaking as this kind of review is (well begun with its broad involvement of stakeholders), the next steps may be more complicated and require leadership because they will involve making changes.  As Mr. Barclay noted on June 22, MCPS is without a full-time permanent superintendent.  How will MCPS continue while searching for a new leader, and how will the next superintendent carry this initiative into the next school year and beyond?  It would seem the Board will need to continue to focus on this topic and maintain an engaged dialog, because the topics will remain complex and connected to all the important work MCPS does.  It is possible that the Board itself will need to take on some of the work that might otherwise be part of the job of a superintendent.

Where Are the Assessment Professionals?

The review that was presented on June 22 showed broad stakeholder communications, with the inclusion of the union and various formal and informal structures.  This is important not only because the success of any effort this large will benefit from buy-in, but also because these stakeholders have important perspectives that can really inform the system’s leadership.  At the same time, there are others who specialize in assessments, including assessments like the PARCC and MSA, who could help MCPS understand the implications of possible decisions.  The University of Maryland and George Washington University are both close by and include experts in measurement and assessment who might be able to inform the planning for next steps.  How is MCPS taking advantage of these local experts in its process of review?

Where are Technology and Shared Accountability Offices?

One of the features of most large school systems (like many companies of old) is that they tend to be walled off into different groups, or silos, with stronger internal communication than external.  This can lead to inefficiencies and a lack of coherence as each unit pursues its own agenda.  Strong top-down leadership can get different groups marching to the same beat, but this often comes with downsides, especially in socially oriented organizations like school systems.

The June 22 briefing was led by the Chief Academic Officer and the Associate Superintendent for Curriculum and Instruction.  This makes sense because this part of the system is responsible for instructional tools and planning.  Treating these matters first as technical matters would not put the initiative on as strong an instructionally relevant footing, which is needed.  At the same time, the Office of Shared Accountability, which manages the testing process and has many of the technical experts with the statistical knowledge important for interpreting assessments, should also be involved in this process.  Similarly, the Chief Technology Officer could have an important role in shaping new solutions, as many of the newer kinds of assessments that can provide more relevant and actionable information run on new platforms that the technology office will manage.  At some point, these two areas of MCPS will need to be principally engaged in any future discussions of MCPS assessments.

An Important Opportunity to Learn From

The MCPS BOE is right to follow through on this discussion and to ask staff to continue to make it a priority.  It is important for MCPS’ future.  MCPS is like just about every other US district today in that it has a patchwork of assessment systems, which the senior staff who presented this briefing appropriately classify as those that are A) externally required and B) those the district has some control over.  They also correctly describe the important interrelationships between these data and related areas of practice such as communications, grading, scheduling, and aligning with state and federal policy initiatives.  MCPS is unlike many districts in the way it went about this review.  While the trigger for this review seems to have been the Maryland State Department of Education’s (MSDE) interest in lowering the testing burden on students and teachers, this review is an important opportunity to look at one of the more challenging and important aspects of the district’s operation.  School systems are run by a combination of procedure, intuition/professional judgment, and data.  This combination forms a kind of “nervous system” for districts, and assessments are often the most important and visible part of this nervous system.  Assessments are more than an end measure of student learning; they touch just about every aspect of the work that gets done in MCPS, from policy evaluation to teaching.

This kind of review has little precedent in school reform.  Very few districts like MCPS have undertaken this kind of evaluation, and so there are few examples of how to do it.  MCPS’s approach has included a combination of information gathering from inside and outside the system as it has begun to frame several different approaches to reducing testing to free up instructional time.  It is comparatively easy to look professional in a routine and practiced task, but much more challenging when the challenge is new.  My view is that the MCPS staff have taken strong steps in this new area and in the process shown a mature process of developing a set of alternatives by iterating through input and analysis.  Most school systems do not have the capacity for this kind of exploratory work.  Other districts embarking on a similar review would have a lot to learn by watching how MCPS has approached this task.

For Coherence’s Sake: Defer Using the New PARCC Tests for HS Graduation

On October 7, MCPS sent a letter from Phil Kauffman, the President of its Board of Education, to the Maryland State Superintendent of Schools requesting that the state reconsider plans to use the new annual test, the Partnership for Assessment of Readiness for College and Careers (PARCC) assessment, as end-of-course exams for purposes of fulfilling high school graduation requirements.  For several years the Maryland State Department of Education (MSDE) has been preparing to use these new tests, developed by a consortium of states, as a replacement for the Maryland State Assessment (MSA) and High School Assessment (HSA) tests that have been used for the No Child Left Behind (NCLB) law’s accountability requirements.

These new PARCC tests hold much promise to improve the information available to schools.  They are developed to be aligned with the new Common Core State Standards (CCSS) for literacy and math.  They use state-of-the-art technology for an adaptive testing experience.  They are also unproven, and given that they are likely more stringent than the MSA/HSA tests they replace, many students may not pass them, which would require many students to take a transitional class prior to graduating.  The state is considering a two-tier approach where there is a criterion (cut-off score) for being considered “college and career ready” and a lower score to allow students to graduate.  This tiered approach has been advocated by MoCoEdBlog editorial member Rick Kahlenberg in a piece titled “Hold Students Accountable and Support Them.”

MCPS is requesting that MSDE delay the implementation of these requirements and enter into a discussion about how to move forward.  Kauffman’s letter posed several important questions:

“If a college-ready cut score differs from the graduation cut score, what is the most meaningful indicator for institutions of higher education or employers? What messages do tiered cut scores send to students? Maryland now requires all students to be assessed for college and career readiness, and those found not ready, must be enrolled in transitional courses. Given this new paradigm, is there benefit added to continuing the requirement to pass end-of-course exit exams to receive a diploma? Moreover, if, during this period of transition from HSA to PARCC, it is appropriate to prohibit use of PARCC for purposes of personnel evaluations, why is it not equally appropriate to delay the use as a high stakes test for students?”

These are important questions.  I will focus on the last one, about using the tests for one purpose only: graduation.  I believe that within this question lies an important systemic consideration that Kauffman’s letter only hinted at: if the tests are not ready to be used to evaluate teachers and principals, then why should they be used for students?  I believe this is a strong argument in support of MCPS’ request.  While the change from one testing system to another may seem a matter of upgrading the measurement approach, the reality on the ground is that these kinds of changes are not “plug and play.”  There are many interdependent and moving parts in a school system, and high-stakes tests impact many of them, including teachers and students alike.  Students are dependent upon the instruction they receive, and the instruction is shaped by both the ability of the teachers and the rewards, incentives, and constraints they work within.

To use the tests for one part of the system (student graduation) but not for another part (educator evaluation) would, in my view, create a tension in the system.  It would have part of the organization focused in one direction and another part in another.  High school achievement, like the achievement gap and many other big problems in education, is systemic.  These problems have multiple interrelated causes, and one of the reasons they are so hard to address for MCPS and the rest of the country is that independent solutions rarely address the combination of factors that underlie problems in complex systems.  Using PARCC for only student accountability is a partial approach.  And, as Kauffman’s letter says, it puts students in the unfair position of getting the short end of the stick as these new tests are tried out on their futures first.  For this reason alone, I believe the MCPS request deserves support from both MSDE and Montgomery County’s elected officials.

There are more questions that can be asked of MCPS about its readiness to support PARCC across the system.  Below, I will sketch out some other important factors and end with some questions that could be included in the conversation that MCPS requests in Kauffman’s letter.

Some Observations about PARCC

PARCC Will Initially Be Disruptive

The implementation of the new tests will be disruptive.  How much they will disrupt the work that goes on in schools is not clear.  But history has shown, including with NCLB, that most large-scale changes in schools can “shock the system” and take some time to become assimilated into the routine.  The day-to-day work of schools is so labor intensive, and what teachers do is so often based on what they have done in the past, that any change, such as a new curriculum, will take some time to assimilate.  The fact that PARCC will be aligned to the CCSS will help, as MD schools will have a couple of years of experience with these standards.  Still, the new tests under the best of circumstances will require some adjustment, at least in the first year.

The impact on schools has the potential to be even bigger and more difficult if the PARCC results are also tied to high-stakes consequences such as teacher evaluations or school performance.  One of the lessons of NCLB (and there is a lot of research on this) was that schools with greater challenges suffered much more collateral damage than schools with better circumstances.  So, if the PARCC tests are high-stakes, then they will be higher stakes in the schools with the most needs.  The tests can still provide much valuable information, and the information should be used, but tying these results to high stakes for educators would likely be disproportionately absorbed by high-needs schools.  MCPS needs a robust plan that addresses the impacts of PARCC on the system.

Implementation Questions with Technology-Rich Assessments

PARCC assessments are designed to be delivered on computers rather than with paper and pencil.  However, not all school systems or school buildings have the same technology, and so there are alternative testing approaches that have raised some questions about how PARCC will work when the rubber meets the road.  For example, national education expert Rick Hess raised three big issues earlier this year:

  1. Testing under different testing conditions (some in classrooms, some in media centers, some offsite at different locations)
  2. Testing using different devices (ex: computer vs. paper and pencil)
  3. Testing windows that can vary from school to school so that the tests may be taken at different times by different students

None of these issues fundamentally compromises the value of the tests, both as well-designed instruments and, even more, as instruments aligned with the CCSS.  But they all can impact the scores in ways that will be really hard to know until after the tests have been administered.  Will the impact on the scores vary based on the kind of school, in the same way high stakes impact high-needs schools more than others?  Quite possibly.  MCPS should look at its implementation options and try as best as possible to standardize testing approaches across schools.

Will the Tests Perform as Designed Initially?

When we read that the testing of the tests has gone smoothly, it is important to remember that these are reports from the people who are administering the tests, and that “smoothly” may mean different things to them than to educators.  For example, if the field trial occurs where and when it was planned and the results are able to be tabulated by PARCC, the field trial is smooth from a technical perspective.  This doesn’t mean, however, that the tests were measuring the same things that were taught, or that they did as good a job with different populations as the designers hoped.  Larger amounts of real data and more time are required to know this.  Again, it probably will not be until after the first year of full administration that these issues will be clearer.  Also, the scuttlebutt from behind the scenes at PARCC has for a few years now been that the amount of money it began with was less than it needed, so don’t be surprised if the quality of the tests is uneven, with some parts of the curriculum testing more reliably than others.  MCPS should be careful about making inferences based on the results of any part of the curriculum until the broad strengths and weaknesses of the test quality are known.

Some Important Questions for MCPS’ Implementation of PARCC

While it is important to support MCPS’ request, some questions could be asked of them about their plan going forward.

  1. Professional Development and Support. With the recent adoption of CCSS curricula, MCPS, along with just about every school district, has found the need for professional development more urgent than expected.  How are the plans coming to train MCPS educators in how to use PARCC?  What lessons have been learned from the pilots thus far about the technology needs as well as the performance of the tests, beyond the fairly positive accounts MSDE and PARCC have provided?
  2. PARCC Impact on Technology Budgets. How will PARCC impact spending decisions throughout the school system?  One of the biggest criticisms of the CCSS has been that it is an opportunity for companies to make even more money from education.  School principals, teachers, and even some families are getting inundated with offers of products that claim to help prepare students to do well on CCSS and PARCC tests.  Most of these claims are unverified.  There is no body that will certify that a product is 100% or 50% CCSS compliant.  There will probably in the future be ways for the people who use these products to rate them this way, but not today.  MCPS would be wise not to spend too much public money on materials to help prep for the first round of tests if it can be avoided.  Much of what is on the market now has been rushed to market and is full of errors.  Reviewing materials centrally and making recommendations to schools for how to purchase makes a lot of sense, as does working with partner districts to assess the quality of materials and technology.  While MCPS tends to defer a lot to individual schools (site-based management) rather than centrally manage and direct, in this case it may be useful for MCPS to take stock of the products that are out there and provide good technical support to schools.
  3. Accountability Options. One of the driving reasons for high-stakes tests is that not all schools perform as they should, and not all schools perform equally well with all groups of students.  Even with all of the many problems with implementation, policies like NCLB have been important ways to see educational differences and also to shift the conversations for many in education toward hard outcomes.  As the sanctions of accountability are even temporarily lifted, what will MCPS be doing to ensure all students are getting the kind of education they deserve?  Will the PARCC test results be combined with other forms of evidence to ask where there are areas that need improvement and additional attention?  While delays in using PARCC for HS graduation make sense, what other external accountability options will MCPS use to ensure all children receive an appropriate education?

Summary

As Kauffman’s letter spells out, the issues surrounding the use of PARCC tests for high school graduation are complex and consequential.  The MCPS request to delay implementation of the state’s plan is reasonable.  Whether the state will listen is unclear.  It is also unclear whether more information about how MCPS is getting ready for PARCC, and the new testing and standards paradigms it is part of, would help MSDE in its decision.  For those closer to MCPS—parents, teachers, local elected officials—this kind of information is probably important to have.  For the sake of MCPS’ management thinking and capacity to deal with the difficult and complex problems of student achievement, it is probably important to develop.  MCPS, like pretty much all districts, has traditionally been dependent on state policy, and so there may be a tendency when in this role to wait and respond rather than take the lead and drive the discussion.  MCPS is no ordinary district.  It has not only broad needs but many financial and intellectual resources, so it is in a better position than most to lead rather than respond.  The tone and message of Kauffman’s letter suggest this is what MCPS is trying to do.  Let’s hope the state is ready to meet them in a discussion about this difficult issue.

What Happens to the Good News for MCPS?

Recently, I looked for a way to share some observations of good things happening at MCPS.  These were not big research-driven observations, but things I saw in very local and personal encounters, as I will describe below.  In searching MCPS websites, I found no place or mechanism to communicate this information.  Many who have worked with MCPS for a long time have said it can be an insular kind of organization, at times ignoring criticism as it pursues its plans, which may also be related to the harsh and at times unfair tone of its critics.  This dynamic was one of the reasons we started the MoCoEdBlog: to provide input—balanced input—about important topics that we had some substantive knowledge about.

Summer Worker

This past summer, a reading specialist at my son’s school organized a summer reading program where she provided books and games/puzzles for kids to work on during the weeks school was out.  My kids are in a language immersion program, and one of the biggest challenges for families who do not speak the language at home is early reading.  The specialist, on her own time, organized playdates at playgrounds where the kids could run around while parents exchanged books and information about student progress.  Summer progress, or in some cases what is called the “summer melt” when kids regress, is very important for preparing kids for the coming school year.  This was individual initiative taken by a teacher, beyond what was required, in order to help the students.  It is no small thing, especially for the families of those students.

My Good Friend’s School

I have a very good friend whose son and mine have been pals since they were 4 years old in Montessori school.  While our son went into MCPS in kindergarten, hers stayed a few more years in Montessori, where he did well in some subjects but was also delayed in reading, in part because of undiagnosed perceptual issues.  When she brought her son into MCPS, she initially found different views of what was the best grade and best approach.  Over the last two years, while our kids played, I have heard her describe the collaborative approach her MCPS school (Flower Valley) has taken: how the principal listens and they have worked together, and the universal and complete commitment of that school to her child.  It has been a heartwarming and very encouraging story to hear how he is understood and valued and has now steadily climbed in school performance.  How big a deal is this?  If it is your child or a child you know, it is a big deal, naturally.  It is also highly likely that in this school this is not an isolated success story, but part of a culture of doing the right things.  This is an example of the kind of school-level autonomy that MCPS is practicing working very well, where those in the school are empowered to do the right things, and in the case of this school they do.

My Kids Ate Salad

It may not seem like a big deal that both of my kids ate salads at school not too long ago.  They are kids, after all.  However, they ate these salads at lunch and came home to explain what they ate and how good it was during the Farm-to-School week MCPS recently had.  I know people who have been advocating for MCPS to offer more healthy food and snack options and who have expressed frustration over what they see as a preference for institutional food over more locally grown options.  Seeing that the Farm-to-School week not only happened but also worked educationally was very interesting.  This is a state program implemented by MCPS, and it shows me the important role MCPS can take in helping to promote healthy lifestyles.

Who Gets This Good News?

So, feeling motivated to share these observations with someone in MCPS, I went to the new website and found no place to provide this kind of feedback.  There is no central suggestion box where anyone with a comment can send it and expect it to land at the right office and also become part of a systemic process of sensing how well different parts of the system are performing.  Would it be meaningful to know about individual employees going above and beyond to serve kids, or about schools that seem able and willing to organize for student success?  Would it also be helpful to collect other kinds of comments that may be indications of uncertainty or of parts of the system in need of support?  I think it would.  While there is an Office of Public Information, there is no function I can see for public feedback.  This might be as simple as a small addition to the website, that digital suggestion box, or it might involve staff aggregating and disseminating this information.  One of the new trends in performance evaluation involves surveys that provide information on teacher and school activity.  With these kinds of instruments coming in the future, and with the kinds of specific and useful comments that test scores can never provide, MCPS might want to develop some capacity for handling this information.  My suggestion is to begin small, with an easy and accessible place for feedback.

New MCPS Report Cards – Good Ideas, Sign of the Times, Room for Improvement by Phil Piety

Sometimes education seems like the place where good ideas come not to die, but to find out how complex the problems they are trying to solve really are.  Once they run into the difficult realities of educational practice, they rarely die quickly, even when many hope they would.  The new Montgomery County Public Schools (MCPS) report cards, featuring the letters I for "in progress," P for "proficient," and ES for "exceeds standards," are another great idea based on sound logic that has run into difficult implementation.  I don't actually think the new MCPS report cards should die, but neither do I think they should live on in their current form.

The design of these new report cards is based on what is called standards-based grading (SBG), in which the information delivered to parents is about what students know or don't know relative to the standards they are expected to be taught.  Historically, letter grades have been considered too broad to be useful in diagnosing problems with learning.  A student receives a B, but what does he or she need to learn to earn an A?  Traditional grades are also known to be subjective.  A student who mastered the content but didn't appear to be learning or didn't have the attitude the teacher wanted could get a B, while another student who showed less achievement but made tremendous effort could get an A.  Research has shown that traditional grading practices vary from teacher to teacher.  Many teachers communicate much more than achievement through traditional grades, and SBG is an effort to standardize grading around what students should be learning.

I think SBG is a good idea and MCPS's use of it admirable.  As a parent of kids in MCPS, however, I have found the results less than useful.  I find what Washington Post education writer Donna St. George called the "plethora of Ps" difficult to use.  I have found the proficiency mark in one area difficult to reconcile with other parts of the report card and with what we see coming home.  After two years of experience with this new SBG report card, and after hearing many stories of frustration from other parents, I can see four issues worth considering.

  1. Producing proficiency (Ps) is now considered a primary responsibility of a teacher.  This focus on standards proficiency is the result of the standards-based accountability movement that has been going on since before No Child Left Behind (NCLB).  While many educators think proficiency is overemphasized in comparison to other areas of social and emotional development, the reality today is that proficiency counts.  In the past, many teachers felt that grades should be distributed across the different letters, with a certain percentage getting As, another percentage Bs, and so on.  Many teachers distributed grades accordingly, and sometimes the reason a student got one grade versus another seemed arbitrary, chosen to fit that pattern.  In today's climate there is a belief that a teacher's job is to get all kids to academic proficiency, and so for many a report card with many Ps in it shows the teacher did his or her job.  This means that in the world of educators and teachers there is a built-in, implicit incentive to produce Ps.
  2. There are disconnected data points.  One of the biggest problems with the MCPS SBG approach is that standards in the report card are presented without important context.  In elementary math, for example, many standards are shown and each is given an individual evaluation.  However, the new CCSS math standards are designed around what are called "learning trajectories" that string different standards together into sequences of proficiency.  These areas, such as "Number and Operations in Base Ten," have many individual standards across multiple grades.  The MCPS report cards read like an inventory of the standards intended to be taught at a given grade (e.g., grade 3) and don't show the larger trajectories with related standards at lower and higher grades.  In reality, many students can perform in areas outside of their assigned grades.  Including this multi-grade context (with a meaningful graphic representation) would probably help everyone make sense of the data points.  Without better information organization, parents see an ocean of Ps, Is, and rarely ESs, but the overall picture of learning is lost.  In addition, most of the standards in the MCPS report cards are summary standards that are actually composed of many more detailed sub-standards.  For example, a fourth-grade standard to "understand place value" is defined as three different kinds of understandings, and knowing whether a student has actually mastered one of these sub-standards would usually require multiple tasks.  So each I on a report card should really be backed by Is, Ps, and/or ESs on the sub-standards.
  3. The new report cards dropped important information that was valuable for parents and students.  The shift from traditional grading to SBG seems to have been abrupt, and important teacher comments and other information that would help parents understand the classroom environment and how their child is doing were omitted.  Even if that "legacy" information can be replaced in the future, phasing it out slowly and giving parents a chance to learn how to use the new report cards would probably have been a good idea.
  4. SBG reports can still be subjective.  While the shift from traditional letter grades to SBG is intended to put the focus on what students have actually learned, as opposed to teacher perceptions, the reality is that much of this assessment of achievement remains largely subjective.  The tools MCPS uses, such as MAP-R and MAP-M, do not produce a detailed analysis, standard by standard, of what students know or don't know.  Until we reach a point where every score from I to P to ES (or whatever other coding system is used) can be backed up by examples of student work, and where students can achieve a P or ES by multiple means, these report cards are not much better than traditional letter grades, although they can give the impression that they are.  Because there is no mark for "uncertain," there may be cases where teachers in doubt assign a P or an I rather than indicate that they are not sure, which adds to the confusion parents are experiencing.

In summary, the new MCPS report cards are based on the good idea of standards-based grading, which is intended to shift the focus to what students have learned rather than a teacher's subjective evaluation.  In reality, they don't quite achieve that goal yet.  They are like many ideas to improve education, including evaluating teachers based on how much students learn (value-added modeling) or holding schools accountable for making sure that no child is left behind, that seem quite reasonable at first but are much more complex and difficult to do well in practice.  MCPS should be commended for being out in front of this effort and trying SBG well before the rest of the country.  At the same time, being on the bleeding edge, when there is so little research and a big learning curve for parents and educators alike, is risky.  Focusing on how to improve this area should be a priority for both the school system and parents, who should make their information needs known.  Without some advocacy for better information, the next several years may see parents continuing to struggle to make good use of the reports that come home.