Top 10 Principles of LDC Module Peer Review

Literacy Design Collaborative | LDC Modules | November 28, 2016

Hot on the heels of the IU13 LDC National Peer Review Studio, the team at LDC wanted to share with the community ten things you should know to help finesse your module-reviewing skills. They are*:

1. Know the rubric.

It is your Constitution. Granted, that means it is sometimes hard to interpret, but every score must be an attempt to apply the rubric's specific language and meaning.

2. Trust evidence, not intuition.

Intuition is a powerful force, but it is also highly subjective and specific to the individual. Calibration with other scorers requires us to base our judgments on the evidence that everyone can see, not on what a particular person feels.

3. Match evidence to language in the rubric.

A safe rule of thumb: If you select an indicator on the rubric, be sure you can identify evidence in the module itself that justifies that selection.

4. Begin with the end in mind.

Since a Good-to-Go score is the goal for any dimension, start with the Good-to-Go criteria when matching evidence to the rubric. Approach each dimension with these criteria in mind, then consider whether the module design meets them.

5. When scoring each dimension, isolate your judgment.

A dimensional rubric like the peer review rubric is not designed to capture an overall impression of a module. Rather, it isolates variables, distinguishing between relative strengths and weaknesses. Take care that a module's low score in one dimension does not cloud your judgment when you score other, unrelated dimensions.

6. When scoring holistically, base judgments on the preponderance of evidence.

After scoring each dimension, we ask peer reviewers to make two holistic judgments: a holistic score for the Teaching Task and a holistic score for the Instructional Ladder. The holistic score should be based on the preponderance, or overall strength, of the body of evidence. There are times, however, when a single problem is big enough that it must be weighted more heavily in the holistic score. We sometimes call this a "fatal flaw" or a "deal breaker."

7. Score only what you can see.

Scorers who attend to exactly what is presented on the page, rather than making inferences about the module author’s intent, tend to score more accurately. That shouldn't surprise us: it is easier to agree on what is than on what could be. A score is always based on what is. Avoid “filling in the gaps” of a task or instructional ladder that shows strong potential; instead, score what is present and give formative feedback to help the module author fully achieve that potential.

8. Resist seduction: one good element does not equal a good module.

You read a particularly well-designed task prompt, and after that the module designer can do no wrong. (This is known as the "halo effect.") One exceptional strength does not cancel out the weaknesses.

9. Recognize pre-loaded template elements.

Some LDC module templates provide a standardized set of Academic Standards, Skills List, and Instructional Ladder Elements that are meant to be selected and adapted for a particular module. Focus on how well those standards, skills, and instructional elements align with the demands of the teaching task, and on whether the teacher has sufficiently customized them for the specific purposes of the module.

10. Know your biases; minimize their impact.

The trick is not to rid yourself of bias; that's impossible. But you do need to recognize what your biases are, and be mindful of how they can trigger first impressions that color all judgments that follow.


To learn more, please contact marketing@ldc.org.

*Adapted from TeaMSS "Top Ten Scoring Principles" (Measured Progress and the Stanford Center for Assessment, Learning, & Equity for the Literacy Design Collaborative, 2013).

Eleven Key Writing Products for College and Career

Susan Weston | LDC Basics | December 17, 2015

Which student products can take LDC classrooms closest to the work called for by college assignments and common career demands?

Here’s a short list, developed in collaboration with Dr. Barrie Olson, of key products that LDC tasks could ask students to develop, along with a description of the kind of work students will do.

[Table: the eleven key writing products, shown in approximate disciplinary clusters]

Naturally, some of the products can overlap as well as be assigned separately. For example, a literature review can be a distinct college-syllabus expectation, and it can also become part of a proposal or the introduction section of an IMRaD report. Similarly, a response paper may be a briefer form of critique/evaluation/review; it’s listed separately because it appears so often as a college assignment and also because many careers include regular requests of the form, “Can you check this report out and tell me if it’s helpful?” Overall, each product was selected based on confidence in its value for students as they pursue higher education, employment, and civic participation as adults.

In the list above, the products are shown in approximate disciplinary clusters. Literary analysis and rhetorical analysis are classic assignments for English classes, just as lab reports are staples of scientific work and IMRaD reports are a standard for upper-level work in the sciences. The list intentionally puts argument essays and explanations at the end because they’re closest to the “plain vanilla” option in task design and least likely to pull students towards thinking distinctive to a specific field.


The LDC draft “Product Typology for LDC Teaching Tasks” describes these products and identifies current Good to Go and Exemplary tasks that fit each type, including links to each listed module.

Those links show:

  • 37 tasks with argument essays as the writing product
  • 28 explanation tasks
  • 16 literary analysis tasks
  • 7 rhetorical analysis tasks 

However, there are just four modules that fit any of the other seven types.

That means LDC is wide open for great new work on those products, and on proposals, IMRaD reports, response papers, and critique/evaluation/review assignments. For younger students, that work may involve just a few features of each type of work, rather than a full and formal attempt to create a complete version of the product.

The “Product Typology” is also still in draft form, precisely to invite partner thoughts and comments on how well it captures the most important products for career and college work. Holding the list to 10 or 11 entries is an important priority, but some of the entries may still need another tweak based on your feedback.

If you have questions or suggestions, please do share them with Danielle, LDC’s Manager of Professional Learning (danielle@ldc.org).

  

Formative Feedback for Teachers: An Essential Support for LDC Implementation

Kelly Galbraith | LDC Modules | December 3, 2015

This summer at the College and Career Readiness Networking Conference, a participant in my breakout session entitled “Building LDC Capacity: One Regional Service Agency’s Approach” inquired, “How can you send teachers off to implement full LDC modules after only two days of LDC professional development? They don’t know enough yet. They aren’t ready.” This was a question I had been grappling with for quite some time. You see, in IU13’s LDC training model, teachers design and implement two modules over the course of a single school year with only four days of regional professional development. Shocking, I know. 

A little background: Lancaster-Lebanon Intermediate Unit 13 (IU13) is an educational service agency in Pennsylvania providing professional development to twenty-two local school districts as well as charter and nonpublic schools. Schools sign on to send teacher teams to regional LDC professional development. Teacher teams participate in two full days of training to design their first modules, go back to their classrooms to teach them, and then return to IU13 halfway through the year to learn how to score student work and to design their second modules. You can read more about IU13’s work with LDC here.

The LDC instructional design process prompts teachers, administrators, and school district leaders to think about systematizing distributed literacy instruction across K–12 in English language arts, science and technical subjects, and social studies. This involves a deep examination of existing curriculum, instructional practice, and assessment methods. The participant in my session was a seasoned LDC facilitator, and she was pointing out that deep implementation doesn’t just happen in four days with two modules. She’s right.

This whole idea of front-loading professional development reminds me of what Doug Fisher and Nancy Frey (2013) talk about with close reading. “During close reading, the teacher does not provide much in the way of pre-teaching or front-loading of content. The structure of the lesson itself is the scaffolding that was once delivered through frontloading” (p. 48). Too much front-loading of a text can eliminate the need for students to read it at all because they’ve done so much pre-reading work beforehand.

However, having students learn by doing, by grappling with complex text and making meaning from it with just enough support (repeated readings and discussion), can help to increase students’ independence, one of the goals of the Common Core State Standards. Just like close reading, educators learn LDC instructional design through implementation. Teachers use their expertise and their new learning about LDC to iterate in the classroom and collaborate with one another on problems of practice.

It’s no secret that effective LDC professional development (or any professional development, for that matter) should be sustained over time, be clearly aligned with state standards and local curricula, provide job-embedded support, and encourage collaboration (Crafton & Kaiser, 2011; Desimone, Porter, Garet, Yoon, & Birman, 2002; Johnson, Fargo, & Kahle, 2010; Penuel, Fishman, Yamaguchi, & Gallagher, 2007; Tschannen-Moran & McMaster, 2009). Of course this involves more formal professional development, as we know it, but much of the professional learning occurs when smart teachers plan content-specific literacy mini-tasks, field-test them, and collaboratively examine student work, all with the right supports (Penuel et al., 2007; Research for Action, 2014; Tschannen-Moran & McMaster, 2009).

Our challenge at IU13 has been to figure out what those “right” supports are, and I can say with certainty that formative feedback is one of them. The peer review feature in LDC CoreTools has been an amazing addition to our LDC professional development model. It has given us the ability to provide specific, formative feedback on module design. We all know that great writing instruction for students happens in the margins with feedback from teachers or peers. The same is true for LDC module design. Providing specific, formative feedback on modules through peer review can accelerate LDC professional development because the feedback is individualized to the content and context of each module.

There are no shortcuts to providing feedback for learning, just as there are no shortcuts to providing specific feedback on student writing. However, LDC CoreTools has made it significantly easier. In the 2014–15 school year, I formatively peer reviewed each and every first module (over 60 of them) as part of our professional development in order to accelerate teacher learning. Reviewing each module informed my facilitation of LDC professional development, and it prompted teachers to think about and revise their work.

Now, in 2015–16, as part of an effort to expand formative peer review and increase our regional capacity, IU13 will pilot a winter regional team composed of local teachers and administrators who have also participated in national LDC peer review training. The goal is to increase opportunities for educators in our region to give feedback to one another on task and instructional ladder design. The conversations that will occur around peer review will, no doubt, deepen LDC knowledge and expertise in our region.

As I mentioned earlier, formative peer review is only one of the supports we have established at IU13, but it has become an essential component of our model. Just like many of you, we are dealing with a reality of dwindling budgets and substitute shortages. Right now, I’m not counting on more professional development time with teachers outside of their classrooms in the initial LDC training year. I’m focusing on providing more effective supports, and I’m always looking for better solutions—perhaps additional site-based coaching, or LDC online courses.

We have four days, two modules, one school year, and amazing teachers and administrators who see the value in collegial feedback and how it can deepen professional learning. In our case, we find that the best way to learn about LDC is to dive right into the work.

 

REFERENCES:

  • Crafton, L., & Kaiser, E. (2011). The language of collaboration: Dialogue and identity in teacher professional development. Improving Schools, 14(2), 104-116. doi: 10.1177/1365480211410437
  • Desimone, L. M., Porter, A. C., Garet, M. S., Yoon, K. S., & Birman, B. F. (2002). Effects of professional development on teachers' instruction: Results from a three-year longitudinal study. Educational Evaluation and Policy Analysis, 24(2), 81-112. doi: 10.3102/01623737024002081
  • Frey, N. & Fisher, D. (2013). Rigorous reading: 5 access points for comprehending complex texts. Thousand Oaks, CA: Corwin Literacy.
  • Johnson, C., Fargo, J., & Kahle, J. B. (2010). The cumulative and residual impact of a systemic reform program on teacher change and student learning of science. School Science and Mathematics, 110(3), 144-159.
  • Penuel, W. R., Fishman, B.J., Yamaguchi, R., & Gallagher, L. P. (2007). What makes professional development effective? Strategies that foster curriculum implementation. American Educational Research Journal, 44(4), 921-958. doi: 10.3102/0002831207308221
  • Research for Action (2014). Enacting common core instruction: How Intermediate Unit 13 leveraged its position as an educational service agency to implement and scale the LDC initiative. Retrieved from http://eric.ed.gov/?id=ED553290
