Alignment: Overall Summary

The instructional materials reviewed for Third Grade do not meet expectations for Alignment to NGSS, Gateways 1 and 2. Gateway 1: Designed for NGSS; Criterion 1: Three-Dimensional Learning does not meet expectations. The materials do not consistently integrate the three dimensions into learning opportunities for students. No opportunities for student sensemaking incorporate all three dimensions, and few opportunities for student sensemaking are two-dimensional with SEPs and DCIs. The summative assessments are not consistently three-dimensional and do not consistently measure the three dimensions for the topic-level objectives (PEs). The lesson-level objectives are also not three-dimensional. Gateway 1: Designed for NGSS; Criterion 2: Phenomena and Problems Drive Learning does not meet expectations. Phenomena are not present and therefore cannot connect to DCIs, be presented to students as directly as possible, or drive learning and use of the three dimensions within or across lessons. Two problems are present at the topic level that connect to grade-level DCIs. One problem is presented as directly as possible. Both problems elicit student prior knowledge, but only one leverages it. Neither problem drives learning and use of the three dimensions within or across lessons.

Alignment: Does Not Meet Expectations

Gateway 1: Designed for NGSS
Score: 1 out of 28
Scoring bands: 24-28 Meets Expectations; 15-23 Partially Meets Expectations; 0-14 Does Not Meet Expectations

Gateway 2: Coherence and Scope
Score: N/A
Scoring bands: 30-34 Meets Expectations; 17-29 Partially Meets Expectations; 0-16 Does Not Meet Expectations

Usability: Not Rated

Gateway 3: Usability
Score: N/A (Not Rated)
Scoring bands: 50-59 Meets Expectations; 31-49 Partially Meets Expectations; 0-30 Does Not Meet Expectations

Gateway One

Designed for NGSS

Does Not Meet Expectations

Gateway One Details

The instructional materials reviewed for Grade 3 do not meet expectations for Gateway 1: Designed for NGSS. Criterion 1: Three-Dimensional Learning does not meet expectations. Criterion 2: Phenomena and Problems Drive Learning does not meet expectations.

Criterion 1a - 1c

Materials are designed for three-dimensional learning and assessment.
0/16
Criterion Rating Details

The instructional materials reviewed for Grade 3 do not meet expectations for Criterion 1a-1c: Three-Dimensional Learning. The materials do not consistently integrate the three dimensions into at least one learning opportunity per learning sequence. Few learning sequences are meaningfully designed for students to engage in sensemaking with the SEPs and DCIs, and no learning sequences provide opportunities for three-dimensional sensemaking. The materials do not provide three-dimensional learning objectives at the lesson level, and the respective assessments are not consistently three-dimensional. The materials provide three-dimensional objectives at the topic level, but summative tasks do not measure student achievement of all learning objectives (PEs) or their associated elements, and few summative assessment tasks are three-dimensional in design.

Indicator 1a

Materials are designed to integrate the Science and Engineering Practices (SEP), Disciplinary Core Ideas (DCI), and Crosscutting Concepts (CCC) into student learning.

Indicator 1a.i

Materials consistently integrate the three dimensions in student learning opportunities.
0/4
Indicator Rating Details

The instructional materials reviewed for Grade 3 do not meet expectations that they are designed to integrate the Science and Engineering Practices (SEPs), Disciplinary Core Ideas (DCIs), and Crosscutting Concepts (CCCs) into student learning opportunities. The instructional materials are organized into segments (four per grade level), each containing one to two topics; within each topic are the Quest PBLs, Lessons, and Lab Activities (uConnect, uDemonstrate). Each topic includes two to four 5E lessons. Each 5E lesson consists of four sections (Engage; Explore; Explain and Elaborate; and Evaluate) and includes the Quest, texts, and Lab Activities. In two of the 18 lessons, the materials integrate the three dimensions within at least one learning opportunity or activity. In both lessons, the three-dimensional integration occurs within the Activity Labs in the Explore section. Across the grade, learning sequences or lessons consistently provide students with an opportunity to engage with and/or develop understanding of the SEPs and DCIs, but miss the opportunity for students to develop understanding of the CCCs.

Examples where the materials do not incorporate all three dimensions into a learning sequence:

  • In Grade 3, Segment 2, Topic 3, Lesson 1: Life Cycles, students learn about life cycles in living organisms. In the Explore section, students compare life cycles of different plants and animals to identify stages that all life cycles have in common (DCI-LS1.B-E1). In the Explain and Elaborate section, students watch a video about different life cycles then draw the life cycle of a butterfly. In the Evaluate section, students match pictures of the young animals to the adults and look at pictures of an environment and identify animals that could live in the environment (DCI-LS4.C-E1). Students observe a photo to describe why an animal could not survive there. There is a missed opportunity for students to use crosscutting concepts as they develop their understanding of life cycles.
  • In Grade 3, Segment 3, Topic 4, Lesson 1: Survival of Individuals, students learn how living things adapt to survive in their environments. In the uConnect Lab, students create two bird beak models based on pictures and use them to try and pick up different types of food (SEP-MOD-E6). Students record their observations and use them to state what “the shape of a bird’s beak tell[s] you about a bird” (SEP-CEDS-P1). In the Explore section, students are asked to “predict whether a layer of fat will help a sea lion survive in a cold environment.” They are given petroleum jelly, water, and ice and make a plan and record observations (SEP-DATA-P1) and then use evidence from their investigation to answer which sea lion—one with less fat or more fat—would be more likely to survive (DCI-LS4.B-E1). In the Explain and Elaborate section, students compare pictures of different animals and their habitats to learn that animals in different environments have different traits (DCI-LS3.B-E2). In the Quest Check-Ins, students make firsthand observations of plant and animal species and identify the similarities and differences between living things (SEP-INV-P4). They use their observations to explain which traits might be the most important for survival in that environment (DCI-LS4.B-E1). There is a missed opportunity for students to use crosscutting concepts as they develop their understanding of plant and animal adaptations.
  • In Grade 3, Segment 3, Topic 4, Lesson 2: Survival of Groups, students develop an understanding of how some animals form groups to increase chances of survival. In the Explore section, students fold paper in a V-shape and hold it in front of a fan, record observations (SEP-DATA-P1), and describe the difficulty of holding the papers straight. Students are asked to explain if their evidence supports the claim that traveling in flocks helps geese survive (DCI-LS2.D-E1). In the Quest Check-In, students answer questions about group survival in a pond ecosystem (DCI-LS2.D-E1). There is a missed opportunity for students to use crosscutting concepts as they develop their understanding of group survival.
  • In Grade 3, Segment 3, Topic 4, Lesson 3: Survival When Environments Change, students develop an understanding of how plants and animals respond to changes in their environment. In the Explore section, students determine how rising sea level would affect tigers that live by the sea by modeling what happens to land when the ice melts in nearby water. They measure the amount of land before and after the melt (SEP-DATA-P1) and then explain how a rising sea level would affect a tiger living by the sea (DCI-LS4.D-E1). In the Explain and Elaborate section, students collaboratively design two containment methods for an oil spill (DCI-ESS3.C-E1). They build and test their designs (SEP-MOD-E5) and record observations. Students improve their designs if needed and compare them to see which contains the most oil. In the Quest Check-Ins, students answer the question, “How do you think pond plants and animals will respond to changes caused by the expansion of human communities?” Students choose one of two given solutions to reduce the impact of construction on the pond (i.e., redirect another stream to flow to the pond or fill the pond with water every month) and explain why it will be the better solution. They also have the option to think of their own solution. Students present their ideas (SEP-INFO-P4). There is a missed opportunity for students to use crosscutting concepts as they develop their understanding of survival in response to environmental changes.

Examples where the materials integrate all three dimensions into a learning opportunity within a learning sequence:

  • In Grade 3, Segment 1, Topic 1, Lesson 1: Motion, Explore, students observe and measure the speed of a wind-up toy and a golf ball, then design a procedure for measuring how fast a toy car and a golf ball move when pushed. Students explain how the speed of each object is affected by the push and why they think this is the case. Students discuss the patterns they observe in the objects as they move and eventually slow down and stop (DCI-PS2.A-E2, SEP-INV-P4, CCC-PAT-E2).
  • In Grade 3, Segment 1, Topic 1, Lesson 2: Patterns in Motion, Explore, students record the observed motion of a ball (SEP-INV-P4) and use it to predict if a larger or smaller ball will move in the same way (DCI-PS2.A-E2, CCC-PAT-E2). They plan and conduct an investigation to test their prediction.

Indicator 1a.ii

Materials consistently support meaningful student sensemaking with the three dimensions.
0/4
Indicator Rating Details

The instructional materials reviewed for Grade 3 do not meet expectations that they consistently support meaningful student sensemaking with the three dimensions. The materials contain no instances of three-dimensional sensemaking, where SEPs and CCCs meaningfully support student sensemaking with a DCI. The materials contain two instances of two-dimensional sensemaking, where SEPs or CCCs meaningfully support student sensemaking with a DCI. While most of the 18 lessons engage students with an SEP, the learning focuses on the DCI; these lessons are not consistently designed for the SEPs or CCCs to support student sensemaking in the context of or with that DCI.

Examples of learning sequences where students do not engage in meaningful sensemaking with multiple dimensions:

  • In Grade 3, Segment 2, Topic 3, Lesson 2: Inherited Traits, students engage in a learning sequence to learn how some traits are inherited from parents. Students look at a drawing of adult and young raccoons, record observations, and say how they are similar and different (SEP-INV-P4, SEP-DATA-P1). Students compare and contrast the footprints and notice that the size and shape of footprints vary according to size, age, weight, and movement. This comparison does not require students to understand DCIs related to inherited traits. Later in the learning sequence, students read that offspring look similar but not identical to their parents and share the same types, number, and shapes of parts (DCI-LS3.A-P1). In the Quest Check-In, students read about camouflage and draw a line to match an animal to an environment that might camouflage it. In the uEngineer It activity, students select a plant to improve and answer questions about the plant’s traits and why they want to improve it. Students define the problem this genetic engineering will solve (SEP-AQDP-P3), but do not need to use an understanding of any DCI.
  • In Grade 3, Segment 3, Topic 4, Lesson 2: Survival of Groups, students engage in a learning sequence to develop an understanding of how some animals form groups to increase chances of survival. In the uInvestigate Lab activity, students model a flying V by holding four folded papers in front of a fan (SEP-MOD-E3) to determine how wind affects birds in various positions within a V formation (DCI-LS2.D-E1). They record their observations (SEP-DATA-P1) and explain how the data they collected supports the claim (SEP-CEDS-E2) that “traveling long distances in flocks helps geese survive” (DCI-LS2.D-E1). However, students do not gather enough information from their models to address how being in a group helps the birds survive. They can determine the birds in the back do not blow in the wind as much, but they do not have enough information to make sense of the fact that birds in the back use less energy, as stated in the sample student answer, or how this supports survival.
  • In Grade 3, Segment 3, Topic 5, Lesson 1: Fossils, students engage in a learning sequence to develop an understanding of what fossils are and how they form. Students use a sponge, salt, and water to investigate how minerals can fossilize tissues without changing their shape. Students determine how the sponge, salt, and water represent what happens to fossils (SEP-MOD-E6) and determine reasons why the process may not happen to all fossils. Students do not make meaning about the types of organisms that are no longer found on Earth, nor do they analyze what evidence fossils provide about organisms or their environments, which are ideas that are central to the grade-level DCIs.
  • In Grade 3, Segment 3, Topic 5, Lesson 2: Fossils as a Record, students engage in a learning sequence intended to develop an understanding of how fossil evidence can provide information about organisms and environments that existed long ago. In this learning sequence, students view a set of footprint drawings, record what they observe (SEP-DATA-P1), and make inferences about the footprints of ancient animals. Students do not make sense of the grade-level DCI. They state what they see in the footprint drawings, but do not address that fossils provide evidence about the types of organisms that lived long ago. In the Quest Check-In Lab, students write information they have gathered about each fossil and read a letter from a paleontologist that describes three dig sites (DCI-LS4.A-E2). Students organize the information in a table and say which fossil came from which dig site, but students do not engage in sensemaking as the resource provides students with the answers.

Examples of learning sequences where SEPs or CCCs meaningfully support student sensemaking with the other dimensions:

  • In Grade 3, Segment 1, Topic 1, Lesson 1: Motion, students engage in a learning sequence to develop an understanding of motion and comparative speed. In the uConnect Lab, students make sense of what factors affect the speed of a falling object. Students make three observations of dropped paper and record their observations as well as the time it took for the sheet to hit the ground each time (SEP-INV-P4). They design and implement a plan to make the paper fall faster or slower than in their initial observations. Students explain how their design changes affected how quickly the paper fell and why they think this was the case. They use their observation data to predict whether a larger sheet of paper will take more or less time to reach the ground (DCI-PS2.A-E2), but do not have the opportunity to explain why they believe the larger paper will behave in the predicted way.
  • In Grade 3, Segment 3, Topic 4, Lesson 1: Survival of Individuals, students engage in a learning sequence to develop an understanding of how living things adapt to survive in their environments. In this learning sequence, students plan and conduct an investigation to determine whether a layer of fat helps sea lions survive the cold by comparing reactions to cold between a finger that is covered in petroleum jelly and one that is not (SEP-INV-P2). Students observe and record information (SEP-DATA-P1). During the Quest Check-In, students make firsthand observations of habitats in order to compare living things (SEP-INV-P4). They use their observations to determine traits that are important for survival in that environment (DCI-LS4.B-E1). Students use this evidence to support a claim (SEP-ARG-E4) for whether organisms will survive if the pond dries up (DCI-LS4.C-E1; DCI-LS4.D-E1).

Indicator 1b

Materials are designed to elicit direct, observable evidence for the three-dimensional learning in the instructional materials.
0/4
Indicator Rating Details

The instructional materials reviewed for Grade 3 do not meet expectations that they are designed to elicit direct, observable evidence for three-dimensional learning in the instructional materials. One out of 18 lesson learning objectives is three-dimensional. Seven lesson learning objectives are two-dimensional, and the remainder incorporate either one or zero dimensions.

Formative assessments are frequent and are spread across each lesson. The formative assessments target individual learning or group understanding. The uInvestigate Labs are at the beginning of each lesson, before any reading or investigations, and do not measure any lesson learning. While information gained from the labs could provide formative data to inform instructional next steps, the teacher materials do not include support for using this data or adjusting instruction.

The majority of the Teacher Edition formative assessment questions are discussion-based and no directions are provided to support the teacher in eliciting ideas from each student or adjusting instruction based on student responses.

The Teacher Edition questions that address student understanding directly, in the reading section, allow students to scan the text for the answer. All Student Edition Lesson Checks, Interactivities, and Online Quizzes are completed by individual students, as each provides a response area to fill in. Interactivities frequently assess only the DCIs. The online quizzes are all multiple choice. While the materials rarely address the CCCs during instruction, the CCCs are regularly used in combination with the DCIs during assessments.

Examples of lessons where the objective is not three-dimensional, the formative assessment tasks partially assess or do not assess student knowledge of all three dimensions, and the materials do not provide guidance to support the instructional process:

  • In Grade 3, Segment 2, Topic 3, Lesson 1: Life Cycles, the lesson objective is “Describe how all life cycles follow the same pattern.” The materials include seven formative assessment tasks, three of which assess individual understanding of the learning objective and are one-dimensional, covering only the DCI. These three formative assessment tasks include the uInvestigate Lab, questions throughout the Explore reading labeled as formative assessment, and a Lesson Check. The lesson starts with the uInvestigate Lab, in which students use diagrams of three life cycles to compare and contrast life cycles. The student directions state to develop a model, but the Guiding Inquiry section of the Teacher Edition reveals that students are expected to just recreate the diagrams using string and paper. Students are asked to state how their models show that life cycles are alike and different (DCI-LS1.B-E1). Throughout the Explore reading, students answer questions that help them understand life cycles, which are labeled in the Teacher Edition as a formative assessment. However, the majority of these questions do not explicitly assess the DCI, with the exception of two questions that ask students to find commonalities across different life cycles (DCI-LS1.B-E1). In the Lesson Check and Online Quiz, students answer questions about life cycles and why reproduction is important. For example, the final question of the quiz provides an image and asks students to complete a sentence describing the difference between an alligator life cycle and a mosquito life cycle (DCI-LS1.B-E1). The materials do not provide guidance to teachers for using data from any of the formative assessments to support the instructional process, except in the Lesson Check, when remediation activities are suggested if students have trouble with the questions.
  • In Grade 3, Segment 3, Topic 5, Lesson 1: Fossils, the lesson objectives are “Describe what a fossil is” and “Describe some ways that fossils form.” These objectives do not address any grade-level DCIs, CCCs, or SEPs. Out of seven formative tasks in the lesson, four assess individual understanding of the lesson objective and none are connected to any of the three dimensions. The three additional tasks assess group understanding of the lesson objective and the DCI level only. In the uInvestigate Lab, students make a plan to investigate how minerals can fossilize tissues without changing their shape, using only a sponge, salt, and water. Students must determine how the sponge, salt, and water represent what happens to fossils. Throughout the Explore reading, students answer questions that are labeled in the Teacher Edition as formative assessment. These questions focus on concepts of fossils but do not assess understanding of any of the dimensions. One of these noted formative group assessment opportunities includes answering three questions about fossil formation: what makes a fish dissolve, what is the difference between a fossil model and a cast fossil, and what can a fossil tell scientists about the environment. The materials do not provide guidance to teachers for using formative assessment data to support the instructional process, except in the Lesson Check, when remediation activities are suggested if students have trouble with the questions.

Examples of lessons where the objective is three-dimensional, but the formative assessment tasks do not assess student knowledge of all three dimensions in the learning objective, and the materials do not provide guidance to support the instructional process:

  • In Grade 3, Segment 1, Topic 1, Lesson 4: Balanced and Unbalanced Forces, the learning objective is, “Use evidence to explain how balanced and unbalanced forces affect an object’s motion.” The materials include six formative assessment tasks, of which two are one-dimensional and four are not connected to any dimensions. In the uInvestigate Lab, students use balanced forces to build a structure that can safely hold a steel ball. Students state which forces they are balancing (DCI-PS2.A-E1). Then students choose materials to build a structure. They draw a picture of their structure with labels showing how they will use the materials. They build the design, state which forces they tried to balance, and determine how their design could be improved. In the Visual Literacy Connection, “How can you move an object?,” students answer three questions about what balanced forces are and what happens when they become unbalanced (DCI-PS2.A-E1), which addresses the part of the learning objective that balanced and unbalanced forces affect an object’s motion. Towards the end of the lesson, students use Interactivity: Motion and Friction to get a ball to a certain spot to think about designing a race car. Students are asked to state how friction affected the ball. The assessments do not provide information about student understanding of the learning objective, with the exception of a few questions that partially assess the learning objective. The materials do not provide guidance to teachers for using data from any of the formative assessments to support the instructional process, except in the Lesson Check, when remediation activities are suggested if students have trouble with the questions.
  • In Grade 3, Segment 3, Topic 4, Lesson 3: Survival When Environments Change, the learning objective is “Explain how plants and animals respond to changes in the environment.” Out of eight formative assessment tasks of different formats in the lesson, four assess individual understanding of the lesson objective and two assess the DCI only. In the uInvestigate Lab, students predict how rising sea levels could affect tigers that live by the sea. They use a cake pan to model what happens to land when the ice melts in nearby water. Students measure the “land” and record data before and after the melt. They use observations to say how a rising sea level would affect tigers living near the sea (DCI-LS4.D-E1). Throughout the Explore reading, students answer questions that are labeled in the Teacher Edition as formative assessment. However, these questions do not explicitly assess the targeted DCI. During the Lesson Check, students answer three questions in a CER format: whether animals would survive a forest fire; using evidence, how a plant or animal could respond to changes from a fire (DCI-LS4.D-E1); and how their evidence supports their claim (SEP-CEDS-E2). The materials do not provide guidance to teachers for using formative assessment data to support the instructional process, except in the Lesson Check, when remediation activities are suggested if students have trouble with the questions.

Indicator 1c

Materials are designed to elicit direct, observable evidence of the three-dimensional learning in the instructional materials.
0/4
Indicator Rating Details

The instructional materials reviewed for Grade 3 do not meet expectations that they are designed to elicit direct, observable evidence of the three-dimensional learning in the instructional materials. The materials provide three-dimensional learning objectives for the topic level in the form of performance expectations (PEs), but summative tasks measure student achievement of only some learning objectives (PEs) or their associated elements, and few summative assessment tasks are three-dimensional in design.

There are three assessments at each topic level: Evidence-Based Assessment, uDemonstrate Lab Assessment, and the Online Topic Test. The Evidence-Based Assessment is typically four to six questions, the uDemonstrate Lab is a performance-based assessment, and the Online Topic Test is mainly presented as a multiple-choice exam. The DCIs were most often assessed in at least one question on the assessments. The SEPs were occasionally assessed independently or in combination with the DCI. The CCCs were generally not assessed on any of the three assessments.

There are two assessments at the segment level: the California Performance-Based Assessment and the Summative Benchmark Assessment which consists of multiple-choice and free-response questions. The end of year assessment is a 28-question test, with three fill-in-the-blank, three free-response, 14 multiple-choice, and eight selected-response items.

Examples where objectives are three-dimensional, but summative assessment tasks do not fully assess the three-dimensional learning objectives and are not three-dimensional in design:

  • In Grade 3, Segment 1, Topic 2: Electric and Magnetic Forces, the objectives include two PEs: 3-PS2-3 and 3-PS2-4. Not all of the dimensions addressed in these PEs are assessed. There are three summative assessments in the topic. The Evidence-Based Assessment has five questions: three short answers, one multiple-choice, and one fill-in-the-blank. The uDemonstrate Lab Assessment has five short answers. The Online Topic Test consists of 12 multiple-choice questions. In the Evidence-Based Assessment, one of the five questions partially assesses DCI-PS2.B-E2 for both 3-PS2-3 and 3-PS2-4. In the uDemonstrate Lab Assessment, none of the dimensions of 3-PS2-3 or 3-PS2-4 are assessed. Three of the twelve questions for the Online Topic Test connect to the DCIs for both PEs. Overall, the DCIs are assessed in two of the three assessments, and the CCCs and SEPs are not assessed.
  • In Grade 3, Segment 3, Topic 4: Adaptation and Survival, the objectives include four PEs: 3-LS2-1, 3-LS4-2, 3-LS4-3, and 3-LS4-4. Not all of the dimensions addressed in these PEs are assessed. In the Evidence-Based Assessment, students answer four questions: one free-response, one fill-in-the-blank, and two multiple-choice. Students read a paragraph about anoles. Students look at a table that shows anole type, favorite branch size, and diet. The first question asks students to explain, based on their predictions, why there will be more twig anoles higher in the tree. Question three asks students for evidence. In question four, students choose an answer that explains why anoles’ dewlap color changed over time. This item assesses whether students can identify a cause and effect relationship, but the change is already given to them. This answer is also provided for students in the reading. For the uDemonstrate Lab Assessment, students read a scenario about a black rabbit that gets loose on a summer day in the California desert. They answer six questions about the rabbit’s expected survival (or lack thereof) compared to a native desert cottontail. The six questions ask students to write a hypothesis about how well the black rabbit could survive, make a plan to test the hypothesis, conduct the test, record data, explain whether the evidence supports the hypothesis, construct an explanation about expected survival of the black rabbit (SEP-CEDS-E2, DCI-LS4.C-E1), and explain which rabbit (i.e., the black rabbit or native desert cottontail) is likely to be caught by a predator (DCI-LS4.B-E1, DCI-LS4.C-E1). None of the CCCs connected to the PEs were assessed. In the Online Topic Test, there are 20 questions; the DCIs are assessed in four of the questions, and the CCCs and the SEPs are not assessed. Each of the four questions that assessed the DCIs connected with one of the four PEs identified for this topic. Overall, while the DCIs and some of the SEPs were assessed, the CCCs were not assessed in any of the summative assessments.
  • In Grade 3, Segment 3, Topic 5: Fossil Evidence, the objectives include two PEs: 3-LS4-1 and 3-LS4-3. Not all of the dimensions addressed in these PEs are assessed. There are three summative assessments in the topic. For the Evidence-Based Assessment, students observe a diagram of rock layers with fossils in them. Out of five questions, zero questions assess the DCI as students do not answer questions about how fossils provide evidence of organisms that once lived on the Earth and their environments, zero assess the SEP as students do not use data to make sense of phenomena nor construct an argument with evidence, and zero assess the CCC as students do not use a scale to explain phenomena nor use cause and effect to explain the change. The uDemonstrate Lab Assessment has eight free-response questions. Three of the eight questions assess a DCI and none assess the SEP or CCC. Students examine a fossil photo to determine how the animal survived in its environment. For the Online Topic Test, students take a 14-question test made up of multiple-choice, free-response, and selected-response questions. Out of 14 questions, zero assess the DCI as the questions about fossils are about identifying index fossils or how fossils (and models of fossils) are formed. Zero assess the SEP or CCC as students do not make a claim with evidence or construct an argument or explain change using cause and effect or scale. Out of the three summative tasks, some part of the DCIs for 3-LS4-1 and 3-LS4-3 are assessed in the topic. The SEPs and CCCs are not assessed.
  • In Grade 3, Segment 1: Forces, the objectives include four PEs: 3-PS2-1, 3-PS2-2, 3-PS2-3, and 3-PS2-4. Not all of the dimensions addressed in these PEs are assessed. There are two segment-level assessments. The California Performance-Based Assessment consists of five short answer questions including a Claim, Evidence, Reasoning (CER) format question. None of the dimensions listed by the publisher as the objectives are addressed in this assessment. Students are presented with a photo of bowling pins being knocked down by a bowling ball, and the question of how to make more pins fall is posed in the initial scenario. First, students are asked what happens when a ball is pushed and what force makes a bowling ball move. Then students follow a prescribed set of directions to do an experiment on how ramp height affects how many tubes will be knocked over by a ball. Finally, students are asked to construct a CER to answer the question, “What variable would you change in the investigation to knock down more tubes?” Given that students only collected evidence on ramp height in the prescribed investigation, the possible claim and evidence for the CER are very limited, meaning that a student may be able to answer correctly based on the limited options and without any understanding of DCIs, SEPs, or CCCs. In the Summative Benchmark Assessment, students respond to eight multiple-choice questions about a magnetic dartboard, pencils rolling down a book ramp, magnetic force of balloons rubbed on hair, and maglev trains. The fifth question asks students to identify when forces are unbalanced and balanced (DCI-PS2.A-E1) in an experiment, which assesses the DCI of 3-PS2-1. The DCI of 3-PS2-4 is assessed in two out of eight questions. For example, answering question eight requires students to identify that maglev trains use the property of magnetic force that the magnets do not need to be in contact (DCI-PS2.B-E2). Neither 3-PS2-2 nor 3-PS2-3 is assessed in the eight questions, and no SEPs or CCCs are addressed.
  • In Grade 3, Segment 4: Weather Impacts, the objectives include three PEs: 3-ESS3-1, 3-ESS2-2, and 3-ESS2-1. Not all of the dimensions addressed in these PEs are assessed. In the segment, there are two summative assessments. The California Performance-Based Assessment consists of five questions: four short-answer questions and one question where students put data from two tables into bar graphs. Students look at average winter temperatures over four years for two different California cities, identify the pattern for each location, and predict the temperature for the next winter. Three of the five questions assess 3-ESS2-1; one question is three-dimensional and the other two are two-dimensional. None of the questions address 3-ESS3-1 or 3-ESS2-2. For the Summative Benchmark Assessment, students complete seven questions online, consisting of multiple-choice, fill-in-the-blank, drag-and-drop, and drop-down-menu questions. Students list criteria and constraints for two flooding solutions but do not make a claim about the merit of a solution. They read data in tables but do not represent it or explain that it can be used to make predictions. None of the dimensions included in the objectives are addressed.

Criterion 1d - 1i

Materials leverage science phenomena and engineering problems in the context of driving learning and student performance.
1/12
+
-
Criterion Rating Details

The instructional materials reviewed for Grade 3 do not meet expectations for Criterion 1d-1i: Phenomena and Problems Drive Learning. The materials include phenomena in 0% of topics and problems in 33% of topics. Since phenomena are not present and only two problems are present, there is a missed opportunity for them to connect to DCIs. Only one of the two problems is presented as directly as possible. Both problems elicit, and one also leverages, student prior knowledge related to the problem. The materials do not include phenomena and problems that drive student learning and use of the three dimensions within and across individual lessons. Across the grade, a concept or a question is used to frame learning across multiple lessons in the topic, rather than a driving phenomenon or problem.

Indicator 1d

Phenomena and/or problems are connected to grade-level Disciplinary Core Ideas.
0/2
+
-
Indicator Rating Details

The instructional materials reviewed for Grade 3 do not meet expectations that phenomena and problems are connected to grade-level Disciplinary Core Ideas (DCIs). Across the grade, the materials consist of four segments, with each segment containing one to two topics; each topic includes two to four lessons and activities. The Quest PBL is part of the launch of the topic and then revisited in each lesson and at the end of the topic. The two problems identified in the Grade 3 materials are located in the Quest PBLs; both problems are connected to DCIs associated with the grade-level performance expectations.

While the materials include sections labeled Anchoring Phenomenon and Investigative Phenomenon, students do not figure out or explain a phenomenon. Rather, these sections contain questions that center around a DCI or content learning and that help build an understanding of the section's framing question. Because students do not figure out phenomena, the materials present no opportunities to connect a DCI to a phenomenon.

Examples of problems connected to DCIs associated with the grade-level performance expectations:

  • In Grade 3, Segment 1, Topic 1, Quest PBL: Pinball Wizard, the challenge is to design a new pinball machine. Students draw a picture of what flippers and bumpers will look like in their pinball game and use arrows to predict where they think the flippers and bumpers will make the ball move. Through this, students build an understanding that objects in contact (e.g., ball, bumper) exert forces on each other (DCI-PS2.B-E1). Students draw a model of their proposed pinball game and show the changes of direction of the ball (DCI-PS2.A-E1). Students build and test their designs and reflect on how flippers can change the motion of the pinball, and how much force is needed for the change (DCI-PS2.A-E1). Using their knowledge of unbalanced forces, they make a final model of flippers that will meet the criteria of being able to change the motion of the ball (DCI-PS2.A-E1, DCI-PS2.B-E1).
  • In Grade 3, Segment 4, Topic 6, Quest PBL: Hold on to Your Roll, the challenge is to design a wind-resistant roof. Students develop an understanding of the different weather hazards that may affect a house during each of the four seasons. Students then build prototype roofs and test them to see how they react to strong winds. To complete the challenge, students apply their findings to create a final roof design to demonstrate that humans can reduce the impacts of natural hazards, such as strong winds, on a roof (DCI-ESS3.B-E1).

Indicator 1e

Phenomena and/or problems are presented to students as directly as possible.
0/2
+
-
Indicator Rating Details

The instructional materials reviewed for Grade 3 do not meet expectations that phenomena and/or problems are presented to students as directly as possible.

There are four instructional segments in Grade 3, each comprising one to two topics, for a total of six topics. The Quest PBL is part of the launch of the topic and then revisited in each lesson and at the end of the topic. Within the Quest PBLs for this grade level, students solve two problems or design challenges. One of these is presented as directly as possible.

Example of a problem presented as directly as possible:

  • In Grade 3, Segment 1, Topic 1, Quest PBL: Pinball Wizard, the challenge is to design a new pinball machine. The challenge is presented to students with a letter from a game designer and with a video of a pinball game designer discussing his work. Students see pinballs in motion in the video and hear about pinball design in general. Additionally, students interact with a drawing and photos that show the elements of a pinball machine. Because each of these elements is in the context of the challenge, the challenge is introduced in a direct way.

Example of a problem that is not presented as directly as possible:

  • In Grade 3, Segment 4, Topic 6, Quest PBL: Hold on to Your Roll, the challenge is to design a wind-resistant roof. The challenge is presented as a letter from an architect. Students watch a video of an architect explaining her job, but the video is not directly connected to the scenario and design challenge in the Quest PBL. Students then view photographs of four kinds of real-world roofs (thatched, metal, clay, shingle) and read the pros and cons of each type. Although students read the challenge, listen to an architect talk about the job, and see photographs of real roofs, all in the context of the design problem, the problem is not introduced in the most direct way possible; students are not presented with a direct way to understand how wind can damage a roof.

Indicator 1f

Phenomena and/or problems drive individual lessons or activities using key elements of all three dimensions.
0/2
+
-
Indicator Rating Details

The instructional materials reviewed for Grade 3 do not meet expectations that phenomena and/or problems drive individual lessons or activities using key elements of all three dimensions. Across the grade, the materials do not use phenomena or problems to drive student learning within individual lessons. Frequently, the learning objective focuses on the learning of a DCI or associated element, resulting in a missed opportunity for students to use the three dimensions as they work towards explaining phenomena or solving problems.

There are two identified problems in the Quest PBL in Topic 1 and Topic 6. These problems do not drive the learning throughout the lessons. When these problems do drive the learning of individual Quest Check-Ins, key elements of all three dimensions are not incorporated; most often the CCC is excluded within a lesson. This represents a missed opportunity for students to engage with all three dimensions to make sense of a problem.

Examples where individual lessons or activities are not driven by phenomena and/or problems, and do not engage students with all three dimensions:

  • In Grade 3, Segment 1, Topic 1, Lesson 3: Forces and Motion, a phenomenon or problem does not drive student learning. Rather, the lesson is driven by the objective, “Students will identify the forces acting on an object.” Students plan an investigation to understand how objects move. They read about force, friction, magnetism, and electricity, and how forces are equal and opposite. Students complete an online lab where they use forces from a tugboat to keep a cargo ship safe. Students draw a picture of what their pinball launcher will look like (SEP-MOD-E5). They are asked to state whether their design will be able to apply enough force and what they can do to make sure it will work. Students record their observations (SEP-DATA-P1) of moving a steel ball with a provided material.
  • In Grade 3, Segment 1, Topic 2, Lesson 2: Magnetic Forces, a phenomenon or problem does not drive student learning. Rather, students learn about the topic of magnetic forces. Students learn how they can turn a paperclip into a magnet. Students read about magnets and their poles. Within the instructional sequence, students draw their proposed design (SEP-MOD-E5) to move steel objects from one place to another using a magnet (DCI-PS2.B-E2).
  • In Grade 3, Segment 2, Topic 3, Lesson 2: Inherited Traits, a phenomenon or problem does not drive student learning. Rather, students learn about the topic of magnetic forces. Students learn about how organisms inherit characteristics (DCI-LS3.A-E1). Students compare the differences in footprints of a family of raccoons (SEP-INV-P4), read text about traits from parents and variations in traits, and complete an online lab where they use knowledge of plant “parents” to make a new plant. Students reflect about how they are similar to and different from their own parents. Students match pictures of animals to the environments where they will be best camouflaged.
  • In Grade 3, Segment 3, Topic 5, Lesson 1: Fossils, a phenomenon or problem does not drive student learning. Rather, students engage with the lesson objective, “Students will describe what a fossil is and ways a fossil is formed.” Students investigate how minerals stick to bone. They read about fossils and how they are formed in sap, ice, and tar before completing an online interactive to learn more about how fossils are formed in tar at the La Brea Tar Pits. Students look at three photos, state whether each shows a plant, animal, or trace fossil, and write observations about the fossil. Students use a sponge, salt, and water to represent what happens to fossils (SEP-MOD-E6), describing what each component represents as they write observations about the fossil (SEP-DATA-P1).
  • In Grade 3, Segment 3, Topic 5, Lesson 2: Fossils as a Record, a phenomenon or problem does not drive student learning. Rather, the lesson is driven by the concept that fossils provide evidence about organisms from long ago (DCI-LS4.A-E2). Students consider what they can learn about an animal from a fossil footprint. Students look at a set of footprint drawings and record observations and information about each fossil (SEP-DATA-P1). Students read about clues from fossils, the fossil record, and index fossils. They organize the information in a table to indicate which fossil came from which dig site. Students read a letter from a paleontologist that describes three dig sites (DCI-LS4.A-E2).

Indicator 1g

Materials are designed to include both phenomena and problems.
Narrative Evidence Only
+
-
Indicator Rating Details

The instructional materials reviewed for Grade 3 are designed for students to solve problems in 33% of the topics (2/6 topics). Throughout the materials, 0% of the topics (0/6 topics) are designed for students to explain phenomena. There are four Instructional Segments in Grade 3, each comprising one to two topics, for a total of six topics. Each topic consists of two to four lessons, uConnect labs, uInvestigate labs, and uDemonstrate labs, a Career Connection page, and Quest Problem Based Learning (PBL). The Quest PBL is part of the launch of the topic and then revisited in each lesson and at the end of the topic. Within the grade level, 33% (2/6) of topics present problems or design challenges that students solve; these are found in the Quest PBL sections of the materials.

Each Instructional Segment begins with a section labeled as an Anchoring Phenomenon, which provides a focus question for the segment. For example, Instructional Segment 2 provides the question, “What influences the characteristics of organisms?” as the Anchoring Phenomenon. Each topic within a segment provides a question labeled as an Investigative Phenomenon; these questions help build an understanding of the segment-level question. The topic within Segment 2 labels the question, “Why are organisms of the same kind alike but different?” as the Investigative Phenomenon. Each of the three lessons within this topic focuses on smaller questions to help students answer the topic-level question. The learning at each of these levels focuses on answering a lesson-, topic-, or segment-level question centered around a DCI or content learning, resulting in missed opportunities for students to explain phenomena that they observe. As a result, students do not figure out phenomena in this grade level.

Examples of problems in the series:

  • In Grade 3, Segment 1, Topic 1, Quest PBL: Pinball Wizard, the challenge is to design a new pinball machine. Students predict the motion of a ball in a given pinball machine by drawing a line on the pictured machine to represent the path of the ball. Students draw a picture of what flippers and bumpers will look like in their pinball game and draw a picture of their launcher. Students build and test flippers and then compare their designs to other students' designs to determine what features were useful and may help them improve their own design. Students draw a final design, including arrows that show the predicted directions the ball will move.
  • In Grade 3, Segment 4, Topic 6, Quest PBL: Hold on to Your Roll, the challenge is to design a wind-resistant roof. Students use information in a table to learn about different materials (wood and tin) and how those materials are affected by wind and water and then choose the material to use for their roof. Students record how the weather may affect the roof of the chosen material in different seasons and determine how to protect the roof in each season. Students then identify the most important criteria for designing a roof and draw a plan for two different designs, before building and testing roof models. Students compare their results to their peers and present their findings and final roof design.

Indicator 1h

Materials intentionally leverage students’ prior knowledge and experiences related to phenomena or problems.
1/2
+
-
Indicator Rating Details

The instructional materials reviewed for Grade 3 partially meet expectations that they intentionally leverage students’ prior knowledge and experiences related to phenomena or problems. The materials elicit students’ prior knowledge in the lessons or activities with the two problems located in the Quest PBLs. Prompts provide opportunities for students to discuss and share prior knowledge and experience. In one instance, the materials provide an opportunity to leverage that prior knowledge and experience. In the other instance, the materials do not provide guidance or support for students to leverage this knowledge and experience as they design solutions.

Example where materials elicit but do not leverage students’ prior knowledge and experiences related to phenomena and problems:

  • In Grade 3, Segment 1, Topic 1, Quest PBL: Pinball Wizard, the challenge is to design a new pinball machine. In the launch, students discuss whether they have played a good pinball game and describe what made it good. This discussion provides an opportunity to elicit students’ prior knowledge and experiences with playing pinball and how pinball games are designed. However, the materials do not provide teacher guidance for leveraging students’ ideas as a means of solving the challenge.

Example where materials elicit and leverage students’ prior knowledge and experiences related to phenomena and problems:

  • In Grade 3, Segment 4, Topic 6, Quest PBL: Hold on to Your Roll, the problem is to design a wind-resistant roof. In the launch, students’ prior knowledge or experiences with the problem are not elicited. Instead, content knowledge through the DCI is elicited as students discuss weather they are familiar with and during a teacher-guided discussion of ways homes can be damaged by weather. In the first Quest Check-In, the discussion prompt, “If you were building a home in your area, what kinds of materials might be best suited for the roof? Explain why,” elicits students’ prior knowledge and experience with the building materials used locally. This prompt provides an opportunity for student knowledge and experiences to be leveraged as students make decisions about the properties of materials for engineering a roof.

Indicator 1i

Materials embed phenomena or problems across multiple lessons for students to use and build knowledge of all three dimensions.
0/4
+
-
Indicator Rating Details

The instructional materials reviewed for Grade 3 do not meet expectations that they embed phenomena or problems across multiple lessons for students to use and build knowledge of all three dimensions. Across the grade, a concept or a question is used to frame learning across multiple lessons in the topic, rather than a driving phenomenon or problem.

Within the six topics in the grade, there are two problems in the Quest PBLs. The two Quest PBLs provide multimodal opportunities for students to engage in developing, evaluating, and revising their thinking as they solve the design challenge. There are few opportunities for students to develop, evaluate, and revise their thinking outside of the Quest PBLs.

While the design challenges in the Quest PBL provide opportunities for students to apply the learning from the lesson and connect across multiple lessons in the topic, they do not drive the learning of the lesson or the topic. Students do not consistently engage in all three dimensions to solve these problems or design challenges.

Examples of topics that do not use phenomena or problems to drive student learning across multiple lessons:

  • In Grade 3, Segment 1, Topic 1: Motion and Forces, a phenomenon or problem does not drive student learning across multiple lessons. Rather, student learning is driven by the concept of motion and forces. Students compare the speed of a wind-up toy to a golf ball before reading text about position, motion, and speed. Students build the longest bridge they can using only three materials. Students learn to describe the motion of an object by comparing three balls and how fast and long they moved before reading text about patterns of motion. Students investigate contact and non-contact forces that act on objects. They use three different materials to see how to make a steel ball move (such as blowing on it through a straw, dropping it, or placing a magnet nearby) before reading about forces that touch and those that do not touch, including gravity, magnetism, and electricity. Students build a structure to hold a steel ball before reading text about balanced and unbalanced forces. Throughout the Quest PBL, students design a new pinball machine. Students apply their understanding of force and motion to design the pinball machine to change the ball’s speed and direction (DCI-PS2.A-E1, CCC-CE-P2) when the handle and ball make contact (DCI-PS2.B-E1). They predict how the motion of the ball changes (DCI-PS2.A-E2, CCC-PAT-E2) when it contacts bumpers and flippers, drawing a model (SEP-MOD-E5) to show changes of direction in their own pinball game. While there is a challenge—designing a new pinball machine—that relates to student learning about motion and forces, the challenge does not drive the learning across multiple lessons. Instead, students apply what they learned about motion and forces as they design a new pinball machine. The Quest PBL introduces the challenge and is revisited after each lesson and then again toward the end of the topic.
  • In Grade 3, Segment 2, Topic 3: Life Cycles and Traits, a phenomenon or problem does not drive student learning across multiple lessons. Rather, student learning is focused on the topics of life cycles and traits. Students observe photographs of organisms to answer questions and conduct short investigations to answer questions about organisms’ traits and life cycles. Students observe photographs of a newly sprouted plant and a grizzly bear mother and her offspring and compare the life cycles of each organism. Students also develop a model of the life cycle of a California newt. Students observe a picture of dogs and construct an argument about why the puppies look the way they do (SEP-ARG-E4, DCI-LS3.A-E1). Students later ask questions to determine whether or not an animal’s unusual fur color could be inherited.
  • In Grade 3, Segment 3, Topic 4: Adaptations and Survival, a phenomenon or problem does not drive student learning across multiple lessons. Rather, student learning is focused on the topic of survival of individuals and groups. Students learn about different ways plants and animals survive in their habitats and how changes in their environment might affect them. Students learn these ideas through reading informational text, investigation about animal traits and behaviors, and the Quest PBL activities, where they examine how changes to a pond environment might affect the living organisms. Within the instructional sequence, students observe plant and animal species, identify the similarities and differences between living things (SEP-INV-P4), and use their observations to explain which traits might be the most important for survival in that environment (DCI-LS4.B-E1).
  • In Grade 3, Segment 4, Topic 6: Weather and Climate, a phenomenon or problem does not drive student learning across multiple lessons. Rather, learning is driven by the concepts of weather and climate. Students learn about the water cycle and that the water on Earth, including in the atmosphere, affects weather. Then, in the Quest PBL Check-In, students fill in a chart to compare how wood and tin roofs perform in wind and water. Students learn about seasonal weather, and then in the Quest PBL, they fill in a chart about how the weather of each season might affect the roof and how the roof could be protected. Students learn about weather hazards such as storms, tornadoes, droughts, and floods and that humans can reduce the impacts of such hazards (DCI-ESS3.B-E1). In the Quest PBL Check-In at the end of the lesson, students design, build, and test two model roofs and then compare their findings with their classmates to determine what factors make a roof more wind-resistant (SEP-CEDS-P3). Students learn about climates (DCI-ESS2.D-E2). Then students engage in the final Quest PBL, during which they draw a final design for their roof and present it, including an evidence-based explanation of how the design reduces the impact of wind on the roof (CCC-CE-E1). While there is a challenge—designing a wind-resistant roof—that relates to student learning about weather, the challenge does not drive the learning across multiple lessons. Instead, students apply what they learned about weather and climate as they design a wind-resistant roof. The Quest PBL introduces the challenge and is revisited after each lesson and then again toward the end of the topic.

Gateway Two

Coherence and Scope

Not Rated

+
-
Gateway Two Details
Materials were not reviewed for Gateway Two because the materials did not meet or partially meet expectations for Gateway One.

Criterion 2a - 2g

Materials are coherent in design, scientifically accurate, and support grade-level and grade-band endpoints of all three dimensions.

Indicator 2a

Materials are designed for students to build and connect their knowledge and use of the three dimensions across the series.
N/A

Indicator 2a.i

Students understand how the materials connect the dimensions from unit to unit.
N/A

Indicator 2a.ii

Materials have an intentional sequence where student tasks increase in sophistication.
N/A

Indicator 2b

Materials present Disciplinary Core Ideas (DCI), Science and Engineering Practices (SEP), and Crosscutting Concepts (CCC) in a way that is scientifically accurate.*
N/A

Indicator 2c

Materials do not inappropriately include scientific content and ideas outside of the grade-level Disciplinary Core Ideas.*
N/A

Indicator 2d

Materials incorporate all grade-level Disciplinary Core Ideas.
N/A

Indicator 2d.i

Physical Sciences
N/A

Indicator 2d.ii

Life Sciences
N/A

Indicator 2d.iii

Earth and Space Sciences
N/A

Indicator 2d.iv

Engineering, Technology, and Applications of Science
N/A

Indicator 2e

Materials incorporate all grade-band Science and Engineering Practices.
N/A

Indicator 2e.i

Materials incorporate grade-level appropriate SEPs within each grade.
N/A

Indicator 2e.ii

Materials incorporate all SEPs across the grade band.
N/A

Indicator 2f

Materials incorporate all grade-band Crosscutting Concepts.
N/A

Indicator 2g

Materials incorporate NGSS Connections to Nature of Science and Engineering
N/A

Gateway Three

Usability

Not Rated

+
-
Gateway Three Details
This material was not reviewed for Gateway Three because it did not meet expectations for Gateways One and Two.

Criterion 3a - 3d

Materials are designed to support teachers not only in using the materials, but also in understanding the expectations of the standards.

Indicator 3a

Materials include background information to help teachers support students in using the three dimensions to explain phenomena and solve problems (also see indicators 3b and 3l).
N/A

Indicator 3b

Materials provide guidance that supports teachers in planning and providing effective learning experiences to engage students in figuring out phenomena and solving problems.
N/A

Indicator 3c

Materials contain teacher guidance with sufficient and useful annotations and suggestions for how to enact the student materials and ancillary materials. Where applicable, materials include teacher guidance for the use of embedded technology to support and enhance student learning.
N/A

Indicator 3d

Materials contain explanations of the instructional approaches of the program and identification of the research-based strategies.
N/A

Criterion 3e - 3k

Materials are designed to support all students in learning.

Indicator 3e

Materials are designed to leverage diverse cultural and social backgrounds of students.
N/A

Indicator 3f

Materials provide appropriate support, accommodations, and/or modifications for numerous special populations that will support their regular and active participation in learning science and engineering.
N/A

Indicator 3g

Materials provide multiple access points for students at varying ability levels and backgrounds to make sense of phenomena and design solutions to problems.
N/A

Indicator 3h

Materials include opportunities for students to share their thinking and apply their understanding in a variety of ways.
N/A

Indicator 3i

Materials include a balance of images or information about people, representing various demographic and physical characteristics.
N/A

Indicator 3j

Materials provide opportunities for teachers to use a variety of grouping strategies.
N/A

Indicator 3k

Materials are made accessible to students by providing appropriate supports for different reading levels.
N/A

Criterion 3l - 3s

Materials are designed to be usable and also to support teachers in using the materials and understanding how the materials are designed.

Indicator 3l

The teacher materials provide a rationale for how units across the series are intentionally sequenced to build coherence and student understanding.
N/A

Indicator 3m

Materials document how each lesson and unit align to NGSS.
N/A

Indicator 3n

Materials document how each lesson and unit align to English/Language Arts and Math Common Core State Standards, including the standards for mathematical practice.
N/A

Indicator 3n.i

Materials document how each lesson and unit align to English/Language Arts Common Core State Standards.
N/A

Indicator 3n.ii

Materials document how each lesson and unit align to Math Common Core State Standards, including the standards for mathematical practice.
N/A

Indicator 3o

Resources (whether in print or digital) are clear and free of errors.
N/A

Indicator 3p

Materials include a comprehensive list of materials needed.
N/A

Indicator 3q

Materials embed clear science safety guidelines for teacher and students across the instructional materials.
N/A

Indicator 3r

Materials designated for each grade level are feasible and flexible for one school year.
N/A

Indicator 3s

Materials contain strategies for informing students, parents, or caregivers about the science program and suggestions for how they can help support student progress and achievement.
N/A

Criterion 3t - 3y

Materials are designed to assess students and support the interpretation of the assessment results.

Indicator 3t

Assessments include a variety of modalities and measures.
N/A

Indicator 3u

Assessments offer ways for individual student progress to be measured over time.
N/A

Indicator 3v

Materials provide opportunities and guidance for oral and/or written peer and teacher feedback and self reflection, allowing students to monitor and move their own learning.
N/A

Indicator 3w

Tools are provided for scoring assessment items (e.g., sample student responses, rubrics, scoring guidelines, and open-ended feedback).
N/A

Indicator 3x

Guidance is provided for interpreting the range of student understanding (e.g., determining what high and low scores mean for students) for relevant Science and Engineering Practices, Crosscutting Concepts, and Disciplinary Core Ideas.
N/A

Indicator 3y

Assessments are accessible to diverse learners regardless of gender identification, language, learning exceptionality, race/ethnicity, or socioeconomic status.
N/A

Criterion 3z - 3ad

Materials are designed to include and support the use of digital technologies.

Indicator 3z

Materials integrate digital technology and interactive tools (data collection tools, simulations, modeling), when appropriate, in ways that support student engagement in the three dimensions of science.
N/A

Indicator 3aa

Digital materials are web based and compatible with multiple internet browsers. In addition, materials are “platform neutral,” are compatible with multiple operating systems and allow the use of tablets and mobile devices.
N/A

Indicator 3ab

Materials include opportunities to assess three-dimensional learning using digital technology.
N/A

Indicator 3ac

Materials can be customized for individual learners, using adaptive or other technological innovations.
N/A

Indicator 3ad

Materials include or reference digital technology that provides opportunities for teachers and/or students to collaborate with each other (e.g., websites, discussion groups, webinars, etc.).
N/A

Report Published Date: 2020/12/15

Report Edition: 2020

Title                              ISBN        Year
ELVSCI20 CA NEW INST SEG 1 SE G3   0134980174  2020
ELVSCI20 CA NEW INST SEG 2 SE G3   0134980182  2020
ELVSCI20 CA NEW INST SEG 3 SE G3   0134980190  2020
ELVSCI20 CA NEW INST SEG 4 SE G3   0134980204  2020
ELVSCI20 CA NEW TE GR. 3           0134980344  2020

Please note: Reports published beginning in 2021 use version 1.5 of our review tools.

Science K-5 Review Tool

The science review criteria identify the indicators of high-quality instructional materials. The criteria support a sequential review process that reflects the importance of alignment to the standards and then considers other high-quality attributes of curriculum as recommended by educators.

For science, our review criteria evaluate materials based on:

  • Three-Dimensional Learning

  • Phenomena and Problems Drive Learning

  • Coherence and Full Scope of the Three Dimensions

  • Design to Facilitate Teacher Learning

  • Instructional Supports and Usability

The Evidence Guides complement the review criteria by elaborating on each indicator, including its purpose, information on how to collect evidence, guiding questions and discussion prompts, and scoring criteria.

To best read our reports, we recommend using the Codes for NGSS Elements document, which provides the code and description of each element cited as evidence in a report.

The EdReports rubric supports a sequential review process through three gateways. These gateways reflect the importance of alignment to college and career ready standards and consider other attributes of high-quality curriculum, such as usability and design, as recommended by educators.

Materials must meet or partially meet expectations for the first set of indicators (Gateway 1) to move on to the other gateways.

Gateways 1 and 2 focus on questions of alignment to the standards. Are the instructional materials aligned to the standards? Are all standards present and treated with appropriate depth and quality required to support student learning?

Gateway 3 focuses on the question of usability. Are the instructional materials user-friendly for students and educators? Materials must be well designed to facilitate student learning and enhance a teacher’s ability to differentiate and build knowledge within the classroom. 

In order to be reviewed and attain a rating for usability (Gateway 3), the instructional materials must first meet expectations for alignment (Gateways 1 and 2).

Alignment and usability ratings are assigned based on how materials score on a series of criteria and indicators with reviewers providing supporting evidence to determine and substantiate each point awarded.


For ELA and math, alignment ratings represent the degree to which materials meet expectations, partially meet expectations, or do not meet expectations for alignment to college- and career-ready standards, including that all standards are present and treated with the appropriate depth to support students in learning the skills and knowledge that they need to be ready for college and career.

For science, alignment ratings represent the degree to which materials meet expectations, partially meet expectations, or do not meet expectations for alignment to the Next Generation Science Standards, including that all standards are present and treated with the appropriate depth to support students in learning the skills and knowledge that they need to be ready for college and career.

For all content areas, usability ratings represent the degree to which materials meet expectations, partially meet expectations, or do not meet expectations for effective practices (as outlined in the evaluation tool) for use and design, teacher planning and learning, assessment, differentiated instruction, and effective technology use.

Math K-8

  • Focus and Coherence - 14 possible points

    • 12-14 points: Meets Expectations

    • 8-11 points: Partially Meets Expectations

    • Below 8 points: Does Not Meet Expectations

  • Rigor and Mathematical Practices - 18 possible points

    • 16-18 points: Meets Expectations

    • 11-15 points: Partially Meets Expectations

    • Below 11 points: Does Not Meet Expectations

  • Instructional Supports and Usability - 38 possible points

    • 31-38 points: Meets Expectations

    • 23-30 points: Partially Meets Expectations

    • Below 23 points: Does Not Meet Expectations

Math High School

  • Focus and Coherence - 18 possible points

    • 14-18 points: Meets Expectations

    • 10-13 points: Partially Meets Expectations

    • Below 10 points: Does Not Meet Expectations

  • Rigor and Mathematical Practices - 16 possible points

    • 14-16 points: Meets Expectations

    • 10-13 points: Partially Meets Expectations

    • Below 10 points: Does Not Meet Expectations

  • Instructional Supports and Usability - 36 possible points

    • 30-36 points: Meets Expectations

    • 22-29 points: Partially Meets Expectations

    • Below 22 points: Does Not Meet Expectations

ELA K-2

  • Text Complexity and Quality - 58 possible points

    • 52-58 points: Meets Expectations

    • 28-51 points: Partially Meets Expectations

    • Below 28 points: Does Not Meet Expectations

  • Building Knowledge with Texts, Vocabulary, and Tasks - 32 possible points

    • 28-32 points: Meets Expectations

    • 16-27 points: Partially Meets Expectations

    • Below 16 points: Does Not Meet Expectations

  • Instructional Supports and Usability - 34 possible points

    • 30-34 points: Meets Expectations

    • 24-29 points: Partially Meets Expectations

    • Below 24 points: Does Not Meet Expectations

ELA 3-5

  • Text Complexity and Quality - 42 possible points

    • 37-42 points: Meets Expectations

    • 21-36 points: Partially Meets Expectations

    • Below 21 points: Does Not Meet Expectations

  • Building Knowledge with Texts, Vocabulary, and Tasks - 32 possible points

    • 28-32 points: Meets Expectations

    • 16-27 points: Partially Meets Expectations

    • Below 16 points: Does Not Meet Expectations

  • Instructional Supports and Usability - 34 possible points

    • 30-34 points: Meets Expectations

    • 24-29 points: Partially Meets Expectations

    • Below 24 points: Does Not Meet Expectations

ELA 6-8

  • Text Complexity and Quality - 36 possible points

    • 32-36 points: Meets Expectations

    • 18-31 points: Partially Meets Expectations

    • Below 18 points: Does Not Meet Expectations

  • Building Knowledge with Texts, Vocabulary, and Tasks - 32 possible points

    • 28-32 points: Meets Expectations

    • 16-27 points: Partially Meets Expectations

    • Below 16 points: Does Not Meet Expectations

  • Instructional Supports and Usability - 34 possible points

    • 30-34 points: Meets Expectations

    • 24-29 points: Partially Meets Expectations

    • Below 24 points: Does Not Meet Expectations


ELA High School

  • Text Complexity and Quality - 32 possible points

    • 28-32 points: Meets Expectations

    • 16-27 points: Partially Meets Expectations

    • Below 16 points: Does Not Meet Expectations

  • Building Knowledge with Texts, Vocabulary, and Tasks - 32 possible points

    • 28-32 points: Meets Expectations

    • 16-27 points: Partially Meets Expectations

    • Below 16 points: Does Not Meet Expectations

  • Instructional Supports and Usability - 34 possible points

    • 30-34 points: Meets Expectations

    • 24-29 points: Partially Meets Expectations

    • Below 24 points: Does Not Meet Expectations

Science Middle School

  • Designed for NGSS - 26 possible points

    • 22-26 points: Meets Expectations

    • 13-21 points: Partially Meets Expectations

    • Below 13 points: Does Not Meet Expectations


  • Coherence and Scope - 56 possible points

    • 48-56 points: Meets Expectations

    • 30-47 points: Partially Meets Expectations

    • Below 30 points: Does Not Meet Expectations


  • Instructional Supports and Usability - 54 possible points

    • 46-54 points: Meets Expectations

    • 29-45 points: Partially Meets Expectations

    • Below 29 points: Does Not Meet Expectations
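The score bands above all follow the same pattern: a point total falls into one of three rating labels depending on two thresholds. As an illustration only (the function and its names are hypothetical, not part of any EdReports tool), the mapping can be sketched like this, using the Science Middle School "Designed for NGSS" bands as an example:

```python
# Hypothetical sketch of how a gateway point total maps to a rating label.
# The band boundaries used in the example are the "Designed for NGSS"
# thresholds listed above (26 possible points); everything else is illustrative.

def rating(points: int, partial_min: int, meets_min: int) -> str:
    """Return the rating label for a gateway score.

    partial_min -- lowest score that earns "Partially Meets Expectations"
    meets_min   -- lowest score that earns "Meets Expectations"
    """
    if points >= meets_min:
        return "Meets Expectations"
    if points >= partial_min:
        return "Partially Meets Expectations"
    return "Does Not Meet Expectations"

# Science Middle School, Designed for NGSS: 22-26 meets, 13-21 partially meets
print(rating(24, partial_min=13, meets_min=22))  # Meets Expectations
print(rating(15, partial_min=13, meets_min=22))  # Partially Meets Expectations
print(rating(10, partial_min=13, meets_min=22))  # Does Not Meet Expectations
```

The same function covers every rubric listed above by swapping in that gateway's two thresholds.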