
art with the science of using formal assessments of individual students. Reflect on your use of each strategy by filling out the “Strategy Reflection Log” on page 331.

       Common Assessments Designed Using Proficiency Scales

      Teachers who teach the same content at the same level work together in teams to create common formative and summative assessments that provide students with feedback. Consider the following four steps.

      1. Create a proficiency scale for the topic that will be the focus of the common assessment.

      2. Design an assessment that includes items and tasks for score 2.0, 3.0, and 4.0 content.

      3. Score the assessment individually or in cooperation with the other teachers and discuss the results for all students who have taken the common assessment.

      4. Identify those students with common needs based on the assessment results, and group students for instruction according to their needs.

       Assessments Involving Selected-Response or Short Constructed-Response Items

      The teacher administers assessments that employ selected-response and short constructed-response items. Constructed-response items require students to generate a correct answer as opposed to merely recognizing one. Short-answer assessments and oral responses are examples of constructed-response assessments. Following are six types of selected-response items (Marzano, 2006).

      1. Traditional multiple choice: Provides a stem and alternatives, some of which are distractors and one of which is the correct choice

       2. Matching: Provides multiple stems and multiple options

      3. Alternative choice: Provides a stem and two choices that are quite similar

      4. True or false: Provides statements that must be judged as true or false

      5. Fill in the blank: Provides a stem for which only one correct answer is reasonable

      6. Multiple response: Allows for two or more correct responses

       Student Demonstrations

      The students generate presentations that demonstrate their understanding of a topic. Different content areas lend themselves more readily to certain types of demonstrations. For example, subject areas that focus on physical skills (such as physical education, art, and music) frequently use student demonstrations. For content areas where demonstrations are primarily mental in nature, the teacher might ask a student to think aloud while he or she is using the skill, strategy, or process.

      Ask students the following questions during or after a demonstration (Marzano Research, 2016).

      • “What specific skills were you demonstrating?”

      • “What parts do you think you did well?”

      • “On which parts did you struggle?”

      • “What would you do differently if you were to do it again?”

       Student Interviews

      During student interviews, the teacher holds a conversation with individual students about a specific topic and then assigns a score to each student that depicts his or her knowledge of the topic. Following are tips for interviewing students.

      • Have a clear progression to the interview: Instead of asking the student to tell you everything he or she knows about a particular topic, start with the score 2.0 content in the proficiency scale for that topic and then move up through score 3.0 and 4.0 content.

      • Prompt for further information: If a student can’t think of anything else to say, gently prod him or her for further information using the proficiency scale as a prompt.

      • Revisit previous statements: Ask a student to recall topics from earlier in the conversation. Help him or her make connections with the current topic by asking, “How is what you said earlier affected by what we’re talking about now?” Or, you might simply ask the student to explain a previous topic again. Revisiting a topic can help the student recall information he or she missed the first time around.

      • Ask the student to defend conclusions: When a student draws an inference, makes a prediction, or otherwise states a conclusion, ask him or her to defend or justify the statement. Explore gaps in reasoning by asking, “How did you form that conclusion from this particular information?” Encourage him or her to think about alternative possibilities by asking, “How might these same events have resulted in a different outcome?” or “What sort of information might disprove your conclusion?”

       Observations of Students

      The teacher observes students interacting with the content and assigns a score that represents their level of knowledge or skill regarding the specific topic observed. Students may display their proficiency through demonstration or verbally in response to the teacher’s questions. For example, a teacher may observe a student incorrectly executing the order of operations when working a mathematics problem. The teacher might point out the mistake and then observe the student rework the problem with the correct method.

      A template like the one in figure 2.3 can help you record your observations.

[Figure 2.3: Template for recording observations of students]

      Source: Marzano Research, 2016.

      Visit go.SolutionTree.com/instruction for a free reproducible version of this figure.

       Student-Generated Assessments

      The teacher invites students to devise ways they will demonstrate competence on a particular topic at a particular level of the proficiency scale. This strategy allows for a wide variety of evidence, since students choose how they will demonstrate their competence. Use the template in figure 2.4 to help guide students in planning their assessments.

[Figure 2.4: Template for planning a student-generated assessment]

      Source: Marzano Research, 2016.

       Response Patterns

      Rather than adding up points to create an overall score on an assessment, the teacher identifies response patterns at the score 2.0, 3.0, and 4.0 levels. The resulting score indicates the content on which students are doing well and the content on which they must improve to move to the next level. There are three approaches a teacher might use to examine response patterns.

      1. Percentage scores: In this method, the teacher computes a percentage score for each score level. For example, say a student earns 88 percent of the possible points for the score 2.0 content, 50 percent of the points for the score 3.0 content, and 15 percent of the points for the score 4.0 content. Examining this overall pattern, the teacher determines how well the student performed in reference to the scale by making decisions about the student’s proficiency while moving from score 2.0 up through score 4.0. The score 2.0 percentage is 88 percent, so the teacher concludes that the student earned at least a score of 2.0 on the assessment. Next, the student’s percentage for the score 3.0 content is 50 percent; the teacher concludes that this is not enough to warrant an overall score of 3.0 but is enough to warrant a score of 2.5. The teacher stops at this point: if a student has not provided enough evidence to warrant a score at one level, he or she is not scored at the next level up. (A brief sketch of this decision logic appears below.)

      2. Response codes: With this approach, each student’s response on each item is coded as correct,
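      The decision logic of the percentage-scores approach can be sketched in a few lines of code. The following Python snippet is a minimal illustration only: the cutoffs (80 percent for full credit at a level, 50 percent for a half step) and the floor score of 1.0 are assumptions made for this example, not values the text prescribes; in practice, the teacher judges what counts as sufficient evidence at each level.

```python
def scale_score(percent_by_level, full_cut=0.80, partial_cut=0.50):
    """Assign an overall proficiency-scale score from percentage scores
    at the 2.0, 3.0, and 4.0 levels.

    percent_by_level: dict mapping a scale level (2.0, 3.0, 4.0) to the
    fraction of possible points the student earned at that level.
    full_cut and partial_cut are illustrative cutoffs only; the teacher
    decides what counts as sufficient evidence at each level.
    """
    overall = 1.0  # hypothetical floor when even score 2.0 evidence is weak
    for level in (2.0, 3.0, 4.0):
        pct = percent_by_level.get(level, 0.0)
        if pct >= full_cut:
            overall = level          # enough evidence: credit the full level
        elif pct >= partial_cut:
            overall = level - 0.5    # partial evidence: credit a half step
            break                    # do not consider higher levels
        else:
            break                    # insufficient evidence: stop here
    return overall

# Worked example from the text: 88% at 2.0, 50% at 3.0, 15% at 4.0.
print(scale_score({2.0: 0.88, 3.0: 0.50, 4.0: 0.15}))  # prints 2.5
```

      With the example percentages from the text, the sketch reproduces the teacher’s conclusion of an overall score of 2.5.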
