Data Rich, Information Poor? Making Sense of Progress Monitoring Data to Guide Decisions
Summary
In this webinar, presenters from the National Center on Intensive Intervention outline the significance of progress monitoring in educational settings, especially concerning academics and behavior. They discuss common measures for progress monitoring, challenges in accurate data collection, interpretation of data, and how these insights inform instructional changes. Key focus areas include the need for reliable measurement tools, the importance of graphing data for decision-making, and the evaluation process within tiered support frameworks.
Highlights
Understanding how to effectively use progress monitoring data is crucial for guiding educational decisions. 🤓
Graphing allows educators to visually track progress and adjust interventions accordingly. 📉
The effectiveness of interventions can be gauged by how data trends over time, identifying needs for program adjustments. 🔄
Utilizing validated tools ensures accurate and meaningful information for making instructional changes. 🔍
Integrating feedback from multiple educators can refine strategies to better support student growth. 👩‍🏫👨‍🏫
Key Takeaways
Progress monitoring helps in assessing both academic and behavioral growth using standardized measures. 📈
Graphing data is key to uncovering patterns that are not obvious with raw data. 🗂️
Reliability and validity of monitoring tools are crucial in making informed decisions. 🎯
Selecting the appropriate monitoring tool can cater to individual student needs, aiding effective interventions. 🧩
Collaboration among educators fosters better adaptation and intervention strategies. 🤝
Overview
Progress monitoring is a standardized approach for evaluating a student's academic and behavioral response to interventions. Through consistent data collection via reliable tools, educators can set and adjust realistic goals, enhancing personalized educational strategies.
Challenges in data collection and interpretation include aligning tools with instructional goals, ensuring data sensitivity, and maintaining fidelity in interventions. Overcoming these challenges requires selecting the right tools and understanding their limitations.
Ultimately, using progress monitoring in a tiered framework like RTI (Response to Intervention) allows for dynamic decision-making. Data-driven instruction can lead to improved outcomes and more tailored educational experiences, fostering better student engagement and success.
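The webinar repeatedly describes using ongoing scores to estimate a student's rate of improvement over time. As a rough illustration of that idea, the rate can be computed as a least-squares slope over weekly progress monitoring scores. This is a hypothetical sketch, not a tool from the webinar; the function name `rate_of_improvement` and the score values are invented.

```python
# Hypothetical sketch: estimate a student's rate of improvement as the
# least-squares slope of weekly progress monitoring scores.
# The function name and the score values are invented for illustration.

def rate_of_improvement(scores):
    """Least-squares slope: score units gained per week."""
    n = len(scores)
    weeks = range(n)
    mean_x = sum(weeks) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(weeks, scores))
    den = sum((x - mean_x) ** 2 for x in weeks)
    return num / den

# six weekly oral reading fluency scores (words correct per minute)
weekly_wcpm = [22, 24, 27, 27, 30, 33]
print(round(rate_of_improvement(weekly_wcpm), 2))  # 2.09 words per week
```

A slope like this, compared against the slope of a goal line, is the kind of quantitative summary that supports the goal-setting and decision-making described in the webinar.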
Chapters
00:00 - 00:30: Introduction Today's discussion covers progress monitoring measures in academics and behavior. It includes the challenges of collecting accurate data and strategies to optimize this process. Additionally, structured questions to aid data collection will be explored.
00:30 - 05:00: Progress Monitoring Measures Throughout the progress monitoring process, there are key interpretations of data that guide decisions in the intervention process. The chapter explores how different data patterns can reveal insights about student needs and discusses potential changes or adaptations that could enhance student outcomes.
05:00 - 08:00: Purpose of Progress Monitoring Data The chapter discusses the use of standardized methods for tracking student progress in academic and behavioral interventions. It emphasizes the importance of quantitatively evaluating a student's response to instruction or intervention and their progress toward achieving educational goals.
08:00 - 11:00: Selecting Progress Monitoring Tools This chapter discusses the importance of ongoing progress monitoring data in achieving academic and behavior goals, emphasizing that such data can help improve the instruction and interventions students receive. For detailed evaluations of specific progress monitoring tools, viewers are directed to the link provided on the slide.
11:00 - 14:30: Commonly Used Measures in Reading and Math The chapter discusses commonly used measures in reading and math that are often part of progress monitoring tools. It focuses on general tools and approaches rather than specific commercial products or training modules. The chapter covers various measures sometimes found in these commercial products.
14:30 - 18:30: Behavioral Progress Monitoring Progress monitoring data is utilized to estimate the rate of improvement over time and to set realistic yet ambitious goals for individual students. It is also used to compare the efficacy of different interventions for specific students.
18:30 - 25:30: Graphing and Analyzing Data This chapter discusses the importance of using data to monitor students' progress. It emphasizes how data can alert educators when students are not making sufficient progress, allowing them to consider necessary instructional changes. The chapter also touches upon the analytical process behind deciding the nature of these changes and clarifies the role of progress monitoring systems in assessments.
25:30 - 32:00: Intervention and Content Changes The chapter discusses the use of data for individual student progress monitoring. It emphasizes the importance of selecting brief assessments to minimize disruption to instructional time. The chapter is titled 'Intervention and Content Changes', indicating a focus on adapting educational strategies to better suit individual learning needs.
32:00 - 41:00: Addressing Flat or Slow Data Patterns This chapter discusses the importance of having multiple alternate forms of tests to track student progress over time. It emphasizes the need for measures that are appropriate for the student's grade, age, or instructional level, in accordance with expected standards.
41:00 - 47:00: Understanding Variability in Data This chapter explores how educational material may sometimes be deliberately set below a student's current academic grade level to facilitate learning. By teaching at the student's current skill level, educators aim to build the necessary skills that will empower the student to progress towards achieving grade-level competencies.
47:00 - 53:00: Key Issues and Considerations This chapter addresses key issues and considerations in selecting tools. It emphasizes the importance of choosing tools that provide accurate and meaningful information. The chapter references a chart that offers information on the reliability and validity of various tools, which have been submitted for evaluation. This resource is vital for making informed decisions on tool selection.
53:00 - 58:00: Behavioral Systems in RTI Framework The chapter discusses the various progress monitoring measures used within the Response to Intervention (RTI) framework, particularly in the context of reading. It highlights that many commercially available products incorporate these measures, although not all do. Specific measures mentioned include letter sound fluency, which is frequently used at the kindergarten level, and word identification fluency, used at the first-grade level. These measures are crucial for tracking and enhancing students' reading abilities in early education.
58:00 - 70:00: Challenges and Suggestions in Progress Monitoring This chapter discusses the tools and methodologies used for progress monitoring in different grade levels. At grades two and three, oral reading fluency, also known as passage reading fluency, is commonly used, where students read aloud from passages. For grades four through six, maze fluency is the preferred method. In kindergarten mathematics, progress is often monitored through number identification or quantity discrimination, such as identifying missing numbers.
70:00 - 71:30: Q&A Session The 'Q&A Session' chapter focuses on computation curriculum-based measures used in mathematics at grades one through six. These measures sample the various objectives aligned with grade-level standards, and the chapter touches on how they are integrated into teaching and their benefits and applications across grade levels.
Data Rich, Information Poor? Making Sense of Progress Monitoring Data to Guide Decisions Transcription
00:00 - 00:30 great thanks okay so today we'll be discussing common progress monitoring measures that are used in academics and behavior we'll also talk about some of the challenges with collecting accurate and efficient data and how you can plan ahead to optimize your data collection and finally we'll be talking about a series of structured questions you can ask
00:30 - 01:00 throughout the progress monitoring process that can help you interpret the data and make decisions about what your next step will be in the intervention process for example we'll talk about what patterns of data might reveal what they say about student needs and what changes or adaptations could be made to help students do better advance the slide please and again so uh when we do progress monitoring
01:00 - 01:30 what we're referring to at the intensive intervention center is a standardized method for tracking the progress of students in academics or behavior to evaluate in a quantitative way the student's response to instruction or intervention and their progress toward meeting
01:30 - 02:00 their academic and behavior goals and these kinds of ongoing progress monitoring data can also help us improve the nature of the instruction interventions that students receive and for information about specific progress monitoring tools we encourage you to go to the link that's on the slide where you can find various tools evaluation charts
02:00 - 02:30 of those tools and even training modules on the various tools but what we're talking about today is not specific commercial products but rather general progress monitoring tools and approaches the kinds of measures that are sometimes incorporated into those commercial products advance so there are several different
02:30 - 03:00 purposes that we can use progress monitoring data for as the slide shows if we start over on the left hand side we can estimate the rate of improvement over time and help us set appropriately ambitious but realistic goals for individual students we can compare the efficacy how well different forms of intervention work for a specific student we can
03:00 - 03:30 be alerted to when students are not making adequate progress and we can use the data to help us determine when an instructional change is needed and help us think analytically about what the nature of that instructional or intervention change might be and i should clarify that in this presentation the kinds of assessments we're talking about progress monitoring systems can be used for
03:30 - 04:00 groups or for an individual student and what we're talking about today is specifically using the data for individual students advance please so when we're selecting a progress monitoring tool for a given student we're always looking for assessments progress monitoring tools where the test is brief because we don't want to eat up too much of our instructional time
04:00 - 04:30 in administering tests we want a measure that has multiple alternate forms so that we can track students' progress over time but without giving the exact same test every time we index the student's performance we're also looking for measures that are grade age or instructionally appropriate for the standards that they're expected to
04:30 - 05:00 learn i'll note that sometimes in the area of academics we're looking at material that may not be at the student's grade level but lower than the student's academic grade level but we are teaching the skills at that academic level in the service of helping the student make progress toward the grade level goal
05:00 - 05:30 and then we also want to make sure that the tools that we select give us accurate and meaningful information and the tools chart the link that i talked about a few minutes ago gives you information um about the reliability and validity of the various tools that have been submitted to the center for evaluation advance please
05:30 - 06:00 so some of the commonly used progress monitoring measures and again these are not commercial products but many of the commercial products incorporate similar measures not all of them but many of them the frequently used progress monitoring measures in the area of reading at kindergarten letter sound fluency is frequently used at first grade word identification fluency
06:00 - 06:30 is frequently used at grades two and three oral reading fluency sometimes called passage reading fluency in both cases students are reading aloud from passages and then at grades four through six maze fluency and when we're looking at mathematics at kindergarten a frequent progress monitoring tool is number identification or quantity discrimination or identifying missing numbers from a
06:30 - 07:00 series of numbers and then at grades one through six probably the most commonly used math measures are the computation curriculum-based measures that sample at grade level all the different objectives that are incorporated in the standards at that grade level and that can be used in conjunction with
07:00 - 07:30 concepts and applications cbm measures and sometimes computation and concepts and applications are used in conjunction together and sometimes schools will use one or the other advance please for behavior there are two types of progress monitoring measures commonly used or that we would recommend the first is systematic direct observation and the second is direct behavior rating next slide
07:30 - 08:00 please so first we'll talk a little bit about systematic direct observation systematic direct observation is a process of watching a person or environment for a period of time and then systematically recording behavior there are several different methods for actually observing behavior and collecting data which can be broadly divided into two classes one are those that measure a specific aspect of an event such as how often
08:00 - 08:30 an event occurs or how long the event lasts and there are also those that are based on recording what occurs within a particular time frame so for example was a student engaged during each five minute interval some examples of systematic direct observation in the classroom might be the total number of times a student raises his or her hand the amount of time spent out of seat or the percentage of appropriate peer
08:30 - 09:00 interactions the advantages of direct observation data are that they're a direct representation of the behavior so by directly observing behavior rather than relying on recollection they tend to be more accurate direct observation also is applicable to a wide range of observable behaviors just about any kind of externalizing behavior can be measured using direct observation and they're also adaptable so you can
09:00 - 09:30 measure various dimensions of behavior there are some limitations one is that they can be difficult to implement it's difficult to find blocks of time to sit down and directly watch a student particularly on an ongoing basis and as educators we know that we have many competing responsibilities so if it's difficult to do it might not be used and of course if it's not used then we won't be able to implement
09:30 - 10:00 data-based individualization next one please but um fortunately there's an alternative method for collecting data that has recently emerged in the literature and is not as labor-intensive as direct observation and this approach is the direct behavior rating or dbr the direct behavior rating can be adapted to focus on a range of target behaviors and also provide an opportunity to
10:00 - 10:30 measure broader and more general outcomes one of the premises of the direct behavior rating is that it's based on the notion that teachers can reliably and accurately rate student behavior on a continuum following some specified period of time and there is research to support that that is the case and the ratings are then used as data to monitor student progress the method with
10:30 - 11:00 the most research to date is the single item scale that allows the rater to rate behavior on a single continuum from 0 to 10. so the numbers on the scale are anchored by terms such as never for example corresponding to zero sometimes corresponding to fifty percent and always corresponding to a hundred percent so those would correspond with ratings of 5 and 10.
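The anchor points just described (0 for never, 5 for sometimes at fifty percent, 10 for always at one hundred percent) imply a simple linear mapping from rating to percent of time. A minimal sketch of that reading of the scale; the helper name is invented and not part of any DBR instrument.

```python
# Minimal sketch of the single-item DBR scale: ratings 0-10 map linearly
# to percent of time, anchored at 0 = never (0%), 5 = sometimes (50%),
# and 10 = always (100%). The helper name is invented for illustration.

def dbr_to_percent(rating):
    """Convert a 0-10 direct behavior rating to a percent-of-time estimate."""
    if not 0 <= rating <= 10:
        raise ValueError("single-item DBR ratings run from 0 to 10")
    return rating * 10

print(dbr_to_percent(7))  # 70 -- e.g. engaged about 70% of the period
```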
11:00 - 11:30 a major advantage of this form of data collection is that it doesn't require constant recording or constant attention instead it allows teachers to instruct and manage their classrooms while it also provides a research-based method for tracking behavior so like i said there's a growing literature base to support this approach to ongoing monitoring and it appears that it may parallel some of the well-established methods that have been
11:30 - 12:00 used in academic progress monitoring with respect to accuracy and sensitivity okay next slide please graphing data is really crucial for decision making because it allows you to see patterns that might not be evident by just looking at raw data so as you can see in this graph in the slide the graphed data provides a clear picture
12:00 - 12:30 of behavior change and we can see progress over time the graph shows the direct behavior rating data and clearly reflects lower ratings of disruptive behavior after intervention was implemented next slide please sure and lee i have a couple requests the audio is a little quiet on your end if you wouldn't mind either turning up your phone volume or just trying to speak up a little bit
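The graph just described shows lower ratings of disruptive behavior after the intervention began; numerically, that kind of picture amounts to comparing phase means. A rough sketch with invented ratings (none of these values come from the webinar's graph):

```python
# Rough sketch: compare mean DBR ratings of disruptive behavior across
# baseline and intervention phases. All ratings are invented examples.

def phase_mean(ratings):
    """Average rating within one phase of the graph."""
    return sum(ratings) / len(ratings)

baseline     = [8, 7, 9, 8, 7]   # daily 0-10 DBR, before intervention
intervention = [5, 4, 4, 3, 2]   # daily 0-10 DBR, after intervention began

drop = phase_mean(baseline) - phase_mean(intervention)
print(round(drop, 1))  # 4.2 -- average reduction in the disruptive rating
```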
12:30 - 13:00 okay sure thank you okay so we're going to talk about the academic side of selecting measures advance the slide please so there are challenges as we all know in identifying measures that are going to be suitable for the school districts states and the individual students that we work with and here are some of them aligning the measure to the content of
13:00 - 13:30 instruction i see that there have been a few questions about the common core that people have written into the chat box and i think that's a very big issue right now because as these questions are indicating the progress monitoring tools that are out there right now are not aligned with common core i think on the one hand it's important to note that in order to do well on the
13:30 - 14:00 common core students need to be competent with the kinds of foundational skills that are reflected in progress monitoring tools so i don't think that the common core makes those existing progress monitoring tools not useful i think they still are useful and necessary for the kinds of students
14:00 - 14:30 who are requiring intensive intervention what i think needs to happen is that those tools need to be extended and people are working on accomplishing that and i think in the meantime there are things that special educators and other teachers can do to keep the existing progress monitoring tools useful for example for math problems selecting different problem types on different days
14:30 - 15:00 and asking students for a given problem to write a substantive explanation for how they solved the problem and why they solved it that way i think that's a way of using existing materials in an efficient way to extend to the kind of format that we're going to see on the common core test i think something similar can be done in reading
15:00 - 15:30 besides aligning we need to have measures that are going to be sensitive to change and when you go to the link the tools charts that AIR has created there are data for the various tools on whether and to what extent they are sensitive to student change and what we mean by that is if we're going to be using ongoing progress monitoring data to make decisions about whether students are responding to intervention
15:30 - 16:00 we want to make sure that there are data on the assessment system to indicate that the scores will in fact go up when students learn new material so that's what we mean by sensitivity to change in terms of data collection we want the tools to be providing enough alternate forms so that we can
16:00 - 16:30 collect data over time and when we're selecting a level of an assessment in reading or in math we want to make sure that the material isn't so hard that students will not register increases in scores even though they are learning material and on the other hand we don't want the material to be
16:30 - 17:00 so easy that within a semester or a year we have to keep changing the assessment system to keep pace with the student's learning so we want to select a level of material where the student has some competence but not so much competence that the progress monitoring tool will soon become irrelevant
17:00 - 17:30 okay would you move the slide please next slide please okay is this the one common challenges behavior data
17:30 - 18:00 yes okay thank you so with respect to collecting behavior data there are a few challenges one is defining the target behavior accurately so it's important to use objective language that refers to observable characteristics of behavior for example rather than saying jimmy is hyperactive we would specify the behaviors he exhibits such as tapping his pencil getting out of his seat wandering around the classroom and so forth we
18:00 - 18:30 also want to make sure that the instrument allows for the behavior to be readily measured so if for example we're trying to reduce self-stimulatory behavior a frequency count wouldn't reflect an accurate measure of that behavior it would be better to measure duration or maybe use the direct behavior rating that operationalizes how scores align with the amount or duration of self-stimulatory behavior and we should also describe what the behavior includes and what it
18:30 - 19:00 does not include so for example self-stimulatory behavior might include hand flapping but not foot tapping and finally as far as consistency we want to identify a regular schedule of data collection and consistently adhere to that schedule okay next slide please thanks okay now we're going to switch gears a little bit and talk about questions that are important to ask in order to facilitate
19:00 - 19:30 data analysis and interpretation so we're going to talk about different dimensions of graphs and what the data on the graphs can give you insight into so let's advance the slide deck so when we look at graphs we may find
19:30 - 20:00 patterns that reveal areas in the students profile of strengths or weaknesses that give us insight into how to revise the program to make it stronger for that student so we're going to talk about three dimensions of questions to ask yourself when you're trying to come up with a hypothesis about what direction we need to
20:00 - 20:30 move in to make the program more effective for the student and we're going to talk about data and assessment that is making sure that the data that you're collecting is giving you the right information or whether the problem is not with the student but with the assessment system we're going to talk about dosage and fidelity if progress is unsatisfactory
20:30 - 21:00 then one of the questions you probably want to ask yourself is whether implementation of the program rather than the nature of the program is the barrier to the student making progress and then finally and usually the data and the dosage and fidelity we hope that those are both appropriate and often what we're looking at is the content the methods of instruction
21:00 - 21:30 the intensity of instruction in the program to determine how to make the program more effective for the child advance please so sometimes though when we're collecting data and trying to use the information to improve student learning sometimes the
21:30 - 22:00 data are not being collected often enough which means that we don't have enough data on a frequent basis to even formulate a decision about the effectiveness of the program for the student so different data systems progress monitoring tools provide you with information about how often that system is designed to be
22:00 - 22:30 collected so often with curriculum-based measurement tools data are collected on a weekly basis you have to though look at the assessment system that you're implementing to understand what the appropriate schedule for collecting data is the second problem when we're questioning whether the data and
22:30 - 23:00 assessment system we're using is appropriate for a given student we ask ourselves whether the progress monitoring tool is sensitive to the progress that the student is actually making and we can get flat scores which we'll talk about in a few minutes we can get low flat scores when students are really not learning and that is the problem but we can also get low flat scores when the progress monitoring tool that we're
23:00 - 23:30 using is too hard for the child so for example at first grade or for a student whose instructional level is at first grade if we're using oral reading fluency that is students reading aloud from passages often the student can be making progress and we don't see that in the passage reading fluency measure if we were to implement with that child a word identification
23:30 - 24:00 fluency measure by contrast for that same student we might see nice progress so we have to make sure when we see low flat scores that the system is not the problem that it's not that we're collecting the wrong data but that really the student is not learning adequately and the same problem exists when we see high flat scores that look
24:00 - 24:30 like no progress and it may be that the student has ceilinged out on that measure for example maybe some first grade students need to be moved to passage reading fluency because they can read 60 or more words in a minute on the word identification fluency measure they need to be advanced to a harder level of the cbm system the third issue is does the measure
24:30 - 25:00 align to the content of intervention we talked a little bit about that before we want to make sure that the assessment systems that we're using for progress monitoring are connected to the objectives and goals that are relevant for the given student and connected in a clear way to the standards of the district and the state and then i already addressed collecting
25:00 - 25:30 data at the right level with respect to dosage and fidelity if a student is supposed to be receiving intensive intervention three times a week for an hour each time let's say and we see that the student's progress monitoring scores are not improving
25:30 - 26:00 one of the things we want to do is make sure that the intervention is being implemented at the right dosage that could be group size that could be number of sessions per week number of minutes per session we want to make sure that the intervention is being delivered at the right frequency and duration a second question is did the student receive all aspects of the intervention as it was originally conceived
26:00 - 26:30 and are there other factors that are creating difficulty for the student to receive the intervention as planned for example being absent or behavior issues that need to be integrated into the academic instructional plan and so forth and a third area questions that we might want to consider if we're not seeing responsiveness to intervention is the content and intensity of the
26:30 - 27:00 intervention so we might want to ask whether we're sure the intervention is a good match for the student's skill deficits or problem behavior what specific academic skill deficits have we identified or what function does the problem behavior serve and does the intervention address those deficits or functions also is the intensity appropriate so for example are we pre-prompting social skills frequently enough
27:00 - 27:30 do we need to provide more prompts maybe every 15 minutes rather than just at the beginning of the class period and then to what extent are academic and behavioral issues related there's a lot of research indicating that academic skill deficits are associated with behavior problems so we want to make sure to address both if we're only intervening to reduce behavior problems but not to improve academic skill deficits then it's not
27:30 - 28:00 likely that we're going to see reductions in behavior problems so this graph shows a student's rating of engagement using the direct behavior rating scale and you can see that following intervention responding improved and that we see an ascending trend so after reviewing these data we would continue with the intervention and i would typically review data after
28:00 - 28:30 probably around five data points to make a decision about whether to continue or to make changes in the intervention but that's just sort of a rule of thumb it depends on a lot of variables including how quickly the intervention is expected to work so some interventions such as cognitive behavioral therapy may take longer to work or to show progress or observe responsiveness than interventions such as
28:30 - 29:00 self-management or some of the antecedent interventions and also it also may depend on how often we're measuring behavior so if we're only measuring behavior once a week or so we're likely to see more change than if we're measuring it frequently every class period or every day and we also want to look at variables such as students past responsiveness to interventions so students who engaged in problem behaviors or have
29:00 - 29:30 significant academic skill deficits we may not see as quick progress as we would with other students so again back to the graph on the slide in this case intervention appears to be working and i would continue with the intervention until the student met criteria which might be set at something like 80 percent engagement across five consecutive sessions
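The exit criterion mentioned here (roughly 80 percent engagement across five consecutive sessions) can be expressed as a simple check over session data. This is an illustrative sketch; the function name, the parameterized threshold, and the session values are all hypothetical, not part of any NCII tool.

```python
# Illustrative sketch of an exit criterion: the intervention continues
# until engagement is at or above 80% for five consecutive sessions.
# The function name and session percentages are invented examples.

def met_criterion(engagement_pcts, threshold=80, run=5):
    """True once any `run` consecutive sessions all reach `threshold`."""
    streak = 0
    for pct in engagement_pcts:
        streak = streak + 1 if pct >= threshold else 0
        if streak >= run:
            return True
    return False

sessions = [60, 70, 85, 90, 82, 88, 95]
print(met_criterion(sessions))  # True -- the last five sessions are all >= 80
```

As the speaker notes, the right number of data points and the expected speed of change vary by intervention, so a fixed rule like this is only a starting point for the judgment calls described above.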
29:30 - 30:00 so when we look at graphs um especially for academic graphs we're often looking at trend of improvement and what happens with the data that are graphed in many progress monitoring systems is that each score is graphed and after baseline data the first three
30:00 - 30:30 data points are taken you see that vertical dashed line a goal is set which is reflected in that blue line and the star at the end of the blue line and that shows the approximate rate of improvement that we expect to see and when a student's rate of progress in the last instructional phase here we see that second red line where the trend
30:30 - 31:00 line is actually exceeding the goal line then that's a favorable data picture and it's nice to know that the student is on track for achieving the goal at the same time there's long-standing research to show that what we need to do for children who have serious academic deficits those are the
31:00 - 31:30 students requiring intensive intervention is that when we see a data path like this we want to consider increasing that goal so that might mean moving to a higher level of the progress monitoring system or it might mean staying at the same level of the progress monitoring system but moving that star higher up on the graph to require more
31:30 - 32:00 and make sure that the goal is appropriately ambitious advance the slide please so i think that covers that topic okay so in this graph we see that the direct behavior rating of engagement is similar across baseline and intervention phases and we see
32:00 - 32:30 a flat or stable trend so again it would be important to determine whether the function of behavior was accurately identified and whether the intervention matches that function another thing to look for is what we refer to as ceiling or floor effects and dr fuchs mentioned this earlier so we may see that a behavior occurs at a very low rate or a very high rate so for example
32:30 - 33:00 behavior like cutting or fighting may occur at a very low frequency and it's going to be difficult to see changes in that behavior as a result of intervention so in cases like this we might want to identify an alternative target behavior maybe a precursor to fighting such as arguing with peers and then collect data on that behavior or we could collect data over extended periods of time and then maybe see
33:00 - 33:30 changes next slide so in terms of academics when we see a flat data pattern like the one on the presentation slide there we have four consecutive data points all below the goal line and from research we know that in the most commonly used progress monitoring systems
33:30 - 34:00 when four consecutive data points all fall below the goal line we do not have sufficient progress can you advance the slide please so the student is not responding to the intervention now as i mentioned previously we want to consider the possibility that with the progress monitoring tool
34:00 - 34:30 we're experiencing a floor effect so that maybe the teacher feels quite certain that the student is making progress but our tool is not indexing that progress in which case we need to move to a lower level of monitoring so that we can see the progress that is occurring now i know there are a couple of questions that people have been asking about
34:30 - 35:00 i saw one comment that in idaho you have to measure at the instructional level i think that's not in federal law but it might be in the state of idaho guidelines so when that's the case i think it may make sense to monitor at the grade level more periodically for example maybe
35:00 - 35:30 once a month to track how the student is doing on the grade level content but to measure more frequently on a weekly basis at the level where the student's instruction needs to be delivered so you know i think this tension between remediating foundational skills and addressing the common core is a definite tension and it's going to get
35:30 - 36:00 worse but if we're to serve the needs of students who require intensive intervention we can't ignore their foundational deficits because they need to be competent on those precursor skills in order to demonstrate competence on the common core standards so we have to walk a tricky line and i think that we have to communicate what we're doing
36:00 - 36:30 to our supervisors and our classroom teachers in a way that helps connect instruction on foundational skills to the common core i think that is an argument that can be made articulately and i think we also have to periodically index criterion behaviors that are close to the common core so
36:30 - 37:00 we can talk articulately to parents and classroom teachers and our administrators about how the students are making progress toward the grade level program so when we have a flat data path the student may not be responding or the progress monitoring tool may not be sensitive to the student's learning we also have to consider dosage problems has the intervention been implemented with
37:00 - 37:30 fidelity as it was supposed to be as it was planned and then the final thing which is connected to the first bullet up here is that the student's not responding so we have to interpret that graph as communicating to the teacher that it's time to make a change in the intervention program so again when we see this flat data pattern or
37:30 - 38:00 non-responsiveness to intervention we first want to consider whether the measurement is too difficult which dr fuchs just talked about we also want to evaluate whether the measurement is sensitive to change so for example in the area of behavior office disciplinary referrals may give us a gross measure of behavior change but they're not going to be sensitive enough to detect changes on an everyday basis we also want to ask
38:00 - 38:30 whether the progress monitoring measure aligns with the intervention content or target behavior are we accurately measuring what we want to measure if data and assessment aren't the issue then we next want to consider dosage and fidelity and as i mentioned before if a student doesn't receive the appropriate dosage of the intervention that's going to be a problem for example if intervention is supposed to occur
38:30 - 39:00 four times a week for 30 minutes we want to make sure that the student received each session for the full length on the other hand if the student received the appropriate dosage of the intervention and it was delivered with fidelity we might want to take a look at the content and intensity of the intervention so for example we might ask does the intervention appropriately target the function of the student's behavior or that student's specific skill deficits
39:00 - 39:30 does the student have the necessary prerequisite skills for the intervention or do we need to back up our instruction for example we may have taught and role-played a response for a student to use to peer provocation but that student may not be able to engage in that response when she's angry so we might then need to teach an intermediate step like self-talk or self-instructions or relaxation or something of that nature
39:30 - 40:00 and then finally as i also mentioned earlier we want to determine whether the behavior and academic challenges are interrelated so if the behavior is impacted by academic deficits then the plan should involve academic supports in addition to behavior supports next slide so this graph shows direct behavior
40:00 - 40:30 rating for disruptive behavior and we see that the data are highly variable with baseline and intervention phases pretty similar so we don't see that the student has adequately responded to our intervention and there's quite a bit of variability next slide so when we have high variability as shown in this graph on the academic
40:30 - 41:00 side advance the slide please we want to consider several possibilities the first is that the progress monitoring tool is not reliable now we can actually check that for most progress monitoring tools you can go to the national center on intensive interventions tools chart and many of the most commonly used tools for progress
41:00 - 41:30 monitoring are on the tools chart and you can look to see what the reliability and validity of the system is and i think that when you see adequate reliability shown for a given tool then you need to reject the hypothesis that the problem is with the assessment system and instead look to see what's going on in the
41:30 - 42:00 instructional environment and in the child so sometimes we can see a lot of bounce when the test is not being administered in a consistent way one of the things that we want with an academic progress monitoring tool is that the assessment is administered in exactly the same way from day to day that minimizes variability it also helps us understand that when we see
42:00 - 42:30 changes in the scores they reflect student learning and not a different administration procedure but a third hypothesis about a highly variable graph is that the student has attentional issues and is not engaging either in instruction or in the assessment of his learning and
42:30 - 43:00 often we need to consider motivational systems for the kinds of kids that are in intensive intervention the kinds of things that lee could speak articulately about to ensure that students are working hard attending to what's going on and producing the work that they are capable of both during instruction and during the
43:00 - 43:30 assessment and then the fourth bullet really is harder for us to address and that's when other situational or external factors are affecting performance things about the child's home experience classroom experience and oftentimes we have to work in conjunction with the classroom teacher and the social worker to address those kinds of issues
43:30 - 44:00 okay so i understand that i'm still hard to hear so i'm going to try to scream into my phone so with variable data first we want to consider data and assessment we want to ask is the progress monitoring tool a valid and reliable measure and has the assessment been administered and scored with consistency we also want to consider dosage and fidelity is the intervention being delivered consistently and with fidelity or the
44:00 - 44:30 way it was designed and then we want to look at content and intensity so what other factors might impact variability maybe medication changes a lot of times we see that parents are inconsistent with giving medications so we see this kind of variability in students' behavior or sometimes a student's schedule changes and they don't regularly get the intervention we also want to ask are there certain
44:30 - 45:00 days of the week or times of the day when changes in behavior or academic performance are occurring and then we want to take a better look at why that's going on on those days or at those times we also want to ask whether external factors such as home life or interactions with others might be impacting performance maybe a fight on the bus or the student lacks a quiet place to do homework and also is the student engaged in the
45:00 - 45:30 intervention so for example is the student actually self-monitoring his or her behavior is the student aware of the goal and motivated to work toward it so in other words is the value of that academic or behavioral outcome meaningful to the student next slide okay this graph shows that engagement is improving but at a very slow rate
45:30 - 46:00 so the trend is ascending but it's very gradual and assuming our criterion is engagement at 80 percent or above for five consecutive sessions then we might want to consider an intervention change or provide additional supports to the existing intervention maybe increase the schedule of
46:00 - 46:30 reinforcement or make some other type of change in order to see more rapid progress so on the academic side this is a graph we don't like to see with the kinds of students who are in intensive intervention we see a lot of graphs like this and here we have the trend line the red line
46:30 - 47:00 is clearly flat compared to the goal line the student's scores are improving somewhat but not enough to meet the long-term goal advance the slide please so if we continue with the program as it's been designed assuming that it's been implemented as planned then the student is not going
47:00 - 47:30 to make the amount of progress that we had hoped for so this is where on the positive side we as teachers get the opportunity to exercise our instructional expertise sometimes it's helpful to interact with colleagues to
47:30 - 48:00 have sharing sessions where we brainstorm about the nature of productive instructional revisions to prompt better student progress sometimes we need to administer diagnostic assessment because lots of times the kinds of outcome measures that are monitored in progress monitoring tools don't provide rich diagnostic information so an example of that would be
48:00 - 48:30 the oral reading fluency measure we don't get a lot of information or direction about whether we should be working on syntax versus semantics whether we should be working on decoding whether we should be working on word level skill generally including decoding and word recognition as opposed to fluency as opposed to
48:30 - 49:00 comprehension so we need to both reach out to our colleagues and be introspective and analytical about what the student's strengths and weaknesses are and where we can make a change in the instructional program that's adequately big to potentially have an impact on the rate at which the progress monitoring scores are going up
49:00 - 49:30 but not so big as to derail the entire program so we want to identify an appropriately ambitious change in the instructional program and this is a good place to be cognizant of the kinds of skills that are valued in the common core system so for example close reading is there an opportunity to modify the
49:30 - 50:00 instructional program in a way that helps the student monitor the gist of the series of points that are being made in the passage or to draw inferences within the passage that are required for a deep understanding if we're working at the word level what kinds of word recognition
50:00 - 50:30 modifications can we make to the program to really try to boost the progress monitoring scores or what's the profile of decoding skills for this child is there a diagnostic assessment we can administer to identify a kind of word decoding pattern that the student clearly hasn't mastered and that is highly teachable so
50:30 - 51:00 we need to in a way not be discouraged when we see that pattern of an inadequate rate of progress and we need to make an instructional change it's easy to feel dispirited about that but i think we need to see it as an opportunity to like i said exercise our instructional expertise and be analytical and clear about the nature of the
51:00 - 51:30 instructional change and when we make that change and draw that solid vertical line on the graph we expect to see improvement and if we don't that helps us think about the next instructional change we're going to be required to make and as we interact with the child implementing the present instructional program we should always be mindful of generating hypotheses about what the
51:30 - 52:00 student requires to boost the efficacy of the program we're delivering advance the slide please so back to our key issues if you see this slow change in behavior or progress we might want to consider first the data and assessment we want to ask whether the goal is appropriate again is it too high or is it unrealistic we also want to think about
52:00 - 52:30 what constitutes typical growth for a student in a particular grade level and also look at the student's prior growth rates in the area of behavior we might want to look at how long the student has engaged in problem behaviors and whether we would expect them to necessarily disappear quickly if they've been ongoing for a long time with respect to dosage and fidelity we would ask whether the student requires
52:30 - 53:00 an intervention with greater frequency or for a longer duration and then if we look at content and intensity we would ask does the content of the intervention adequately address the student's needs for example does the student require more explicit instruction in particular areas of deficit or would the student benefit from more frequent opportunities for feedback in the case of behavior so again this will guide our
53:00 - 53:30 instructional changes okay next slide okay so to summarize we want to remember that an appropriate monitoring tool is valid reliable brief sensitive to change and measures the skill or behavior targeted for intervention and second graphing your data allows you
53:30 - 54:00 to see patterns that you might not otherwise be able to detect with raw data and third once those data are graphed we can ask questions about the data to determine whether the student was responsive to the intervention and then finally let those hypotheses or conclusions about the data guide your decision making about whether a change to the intervention or assessment is needed
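[editor's note: the academic decision rules discussed in this webinar — four consecutive points above or below the goal line, with a fall-back comparison of a least-squares trend line against the goal line once eight weekly points are in — can be sketched roughly in code. this is an illustrative sketch only; the function names, rule structure, and score values are hypothetical and do not come from any published progress monitoring system.]

```python
def goal_line(week, baseline, goal, total_weeks):
    """Expected score at `week` on a straight goal line drawn from the
    baseline score to the end-of-year goal."""
    return baseline + (goal - baseline) * week / total_weeks

def slope(scores):
    """Least-squares slope of weekly scores collected in weeks 1..n."""
    n = len(scores)
    weeks = range(1, n + 1)
    mean_w = sum(weeks) / n
    mean_s = sum(scores) / n
    num = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
    den = sum((w - mean_w) ** 2 for w in weeks)
    return num / den

def decide(scores, baseline, goal, total_weeks):
    """Four-consecutive-point rule, falling back to a trend-line
    comparison once eight weekly data points have been collected."""
    expected = [goal_line(w, baseline, goal, total_weeks)
                for w in range(1, len(scores) + 1)]
    last4 = list(zip(scores, expected))[-4:]
    if len(scores) >= 4 and all(s > e for s, e in last4):
        return "raise the goal"
    if len(scores) >= 4 and all(s < e for s, e in last4):
        return "change the program"
    if len(scores) >= 8:
        goal_slope = (goal - baseline) / total_weeks
        return ("raise the goal" if slope(scores) > goal_slope
                else "change the program")
    return "keep collecting data"

# hypothetical example: baseline 30 words correct, goal 60 over 20 weeks
print(decide([33, 35, 37, 40], 30, 60, 20))  # raise the goal
```

with four weekly scores all above the goal line the sketch recommends raising the goal; with four consecutive points below, or a mixed pattern whose eight-point trend slope falls short of the goal slope, it recommends a program change.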
54:00 - 54:30 so we've mentioned these resources a couple of times and this slide lists three resources the national center on response to intervention is really focused on implementing and scaling up rti and also has applications with diverse populations it provides a lot of information about the components of rti and screening and also provides some
54:30 - 55:00 information about progress monitoring and data-based decision making but it is primarily focused on academics and then the second website the national center on intensive intervention this link directs you to our training modules for data-based individualization and there are a number of modules on different topics related to providing intensive intervention the website is really a great resource
55:00 - 55:30 in a number of different areas it has the tools charts for evaluating progress monitoring tools in both academics and behavior and also behavioral and academic interventions and so i recommend you take a look at that website and then the national center on student progress monitoring has a ton of information about progress monitoring in academics it also has a number of webinars available and has resources for families and
55:30 - 56:00 professional development as well so it's also an excellent resource i don't know lynn if you want to mention anything else about these websites i actually think the websites have a lot of really good information and they even have completely designed professional development sessions for people to implement if they want to do local training
56:00 - 56:30 there is something that in response to some of the questions that people have posted it's important to see the graphs that we have incorporated into this powerpoint as meant to be heuristic so an actual graph that has for example baseline data on the left-hand side and the long-term goal on the
56:30 - 57:00 right-hand side would have to be a lot wider than the graphs that we have here we're just showing patterns of data to give you the gestalt of what we're talking about and some of the questions that people have asked have to do with how long you want to implement an intervention before
57:00 - 57:30 making a decision about whether the student is making progress or not and on the one hand the individual progress monitoring tool that you're using should provide guidelines about that i will say that for the tools that we have developed at vanderbilt and i think that a lot of the other progress monitoring systems rely on our decision-making frameworks
57:30 - 58:00 what we recommend is that at least four to six weeks of instruction have occurred so that the instructional program is being implemented with quality as planned but we're giving enough time for that program to take some effect so we recommend four to six weeks by the way the whole framework is different for behavior so we probably
58:00 - 58:30 need to comment on this separately but in academics four to six weeks of instruction and we know that we can reliably predict that the goal is going to be reached if four consecutive data points are above the line or that the goal will not be reached
58:30 - 59:00 if four consecutive data points are below the line but often we don't have four consecutive points above or below by the time we get eight weekly data points and when we have eight weekly data points then we draw a trend line and we use whether the trend line is steeper or less steep than the goal line for making the decision about whether to
59:00 - 59:30 raise the goal or whether to make a program change and those are good points i think in the area of behavior we tend to discontinue intervention if it's not effective immediately and we forget that problem behaviors may have been ongoing for a long time and that they may have been reinforced over time and that they're not necessarily going to readily disappear so we want to look for progress in the direction
59:30 - 60:00 we would like and also as i mentioned there are a number of different variables to consider when we're evaluating progress including the history of problem behavior and expectations for how long it takes that particular intervention to result in some noticeable behavior change well thank you so much dr fuchs and dr kern we have a few other questions that we received ahead of time
60:00 - 60:30 many people submitted questions when they registered for the webinar so i've noted a few but just a general response to all our attendees today many questions were submitted asking for suggestions about specific progress monitoring tools and while the center doesn't endorse any specific progress monitoring tools or programs the tools charts that are available on the ncii website
60:30 - 61:00 are extremely helpful in reviewing different progress monitoring tools and comparing the reliability validity and sensitivity to change of these tools so those tools charts are available at the link on the previous page and also on the closing slide that i'll show in just a moment but we have a couple questions that i'll pose to our presenters and if you have any others please continue to type those in the q a box but we received many questions about
61:00 - 61:30 using progress monitoring within an rti framework so how should progress monitoring be used within an rti framework to document interventions that have been tried and to determine if the student is making sufficient progress maybe at that tier or at that level so can i pose that question to lynn first to answer from the academic realm well the progress
61:30 - 62:00 monitoring tool that you're using should specify what the benchmarks for adequate progress are and those can be framed in terms of weekly rate of improvement or the final set of three scores at the end of the intervention run and in one of the powerpoints that's
62:00 - 62:30 available on the ncii website there is a day-long session on using progress monitoring tools within an rti framework and in that powerpoint we provide criteria for judging responsiveness and the decision making
62:30 - 63:00 frameworks that we provide rely on a combination of the amount of progress the child has made while the intervention was in effect and his final score and that dual discrepancy criterion is used to decide whether the child moves to a less intensive level of the rti system or a more intensive level
63:00 - 63:30 of the rti system so i encourage you to look at those decision making guides that are in the powerpoint if you're using an assessment system that's quite different from the one that's used as an exemplar in the powerpoint then you should be looking at your own progress monitoring system it should be providing benchmarks for
63:30 - 64:00 adequate progress and adequate final status and on the ncii tools chart one of the criteria for judging the adequacy of a progress monitoring tool is the extent to which the system provides this kind of information and by the way there was a question about how to set goals and in that same powerpoint
64:00 - 64:30 there's a whole segment about strategies for setting ambitious goals thank you dr fuchs and dr kern could you speak to using progress monitoring within a similar behavior tiered system of support framework sure so unfortunately we're not as advanced in the behavioral arena with
64:30 - 65:00 measuring progress on an ongoing basis as in the academic area but there are measurements in each tier and differences across tiers at tier one typically office disciplinary referrals are used and i think that's fine most of the students at tier one don't have a lot of behavioral problems so it's kind of a gross way of monitoring their progress and then at tier two we've tended to use check and
65:00 - 65:30 connect as the major tier two intervention and that program is usually accompanied by a three-point monitoring scale and i would recommend at tier two to begin using the direct behavior rating it appears to be much more accurate and sensitive to change than a three-point scale and then at tier three i would definitely use the direct behavior rating or even direct observation if that's feasible
65:30 - 66:00 so depending on the severity of student behavior i think it's fine to use some of the ongoing measurements such as odrs office disciplinary referrals or other types of systems but i would try to identify something that is going to get you more accurate data at tiers 2 and 3. thank you and i'll pose another question a couple different folks have mentioned
66:00 - 66:30 this issue today that within certain states or maybe certain districts there is a requirement that progress monitoring data be collected on a student's grade level so that kind of poses a challenge for certain teachers that want data on the student's instructional level so do you have any suggestions for those teachers who are somewhat bound to collecting those grade level probes but who also want
66:30 - 67:00 instructional level data maybe how frequently should they be collecting each type or do you have any suggestions for working with a district or state that has that guideline well i think it's hard to make a dent in the state system without great effort and i think the thing that i would do in that situation is to monitor at grade level
67:00 - 67:30 once a month or once every six weeks depending on how frustrating it is for the child and i would progress monitor on a weekly basis at the instructional level and i would also have a well-framed argument in my notes for why it makes sense to be doing this combination of progress monitoring to provide a full picture of the student's progress
67:30 - 68:00 and why working on foundational skills is an important target for producing grade level outcomes and why we can't just skip to the grade level content and ignore the deficits that exist in the foundational skills for that child thank you i think we have time for about
68:00 - 68:30 one more question so i will read one of the ones that was submitted during registration before the webinar do you have any suggestions for addressing fluctuating scores that may be due to the progress monitoring measure for instance maybe fluctuating oral reading fluency scores that may be due to a lack of background knowledge on the passages well we know that reading passages produce
68:30 - 69:00 a fair amount of variability and one source of that variability is the content knowledge that the children have on the specific passage that's one reason why if we're making trend decisions we require eight data points because we find that with our passages we can get reliable accurate information about the
69:00 - 69:30 need for an instructional change when we have a trend line that runs across eight passages that's a lot of passages so we're going to have some unusually high and low scores in there but when we do our trend line that's going to essentially minimize the problem that those very high and very low scores create when we get four consecutive scores above or below the line
69:30 - 70:00 then you know that's despite whatever variability is attributable to the passages and we find that when we have that consecutive data point pattern it provides accurate information but because of the variability that is inevitably going to occur when we have passages with content in them
70:00 - 70:30 we want to be very careful about making any decisions based on too few data points so for example in screening when we're using oral reading fluency from passages we need to have several scores to make a sound decision or we want to be using passages that the authors of the data system tell us are pretty robust across
70:30 - 71:00 kids' variations in background knowledge well thank you so much dr fuchs and dr kern this concludes our webinar today we strongly encourage you to take the survey about your experience with today's webinar and for everyone who had any tech issues with the sound we'll definitely work to address those next time so we encourage you to take the survey that will pop up in your browser after the webinar ends by taking the
71:00 - 71:30 survey you will be helping us to improve our future webinars and thank you in advance for your feedback as a reminder this webinar was recorded and will be archived on our website and the powerpoint will also be posted along with a q a document so if we didn't get to your specific question today please feel free to email the center and our email address is up on the screen ncii@air.org thank you so much for joining today and thank you to our panelists dr fuchs and dr kern have a nice day