Examining Benchmark Growth Within MTSS in a K-12 Cyber Program: A Quasi-Experimental
Case Study Comparing In-Person and Virtual WIN Delivery Models
_______________________
A Dissertation
Presented to
The College of Graduate and Professional Studies
Department of Education
Slippery Rock University
Slippery Rock, Pennsylvania
______________________
In Partial Fulfillment
of the Requirements for the Degree
Doctor of Education
_______________________
by
Sarah Parish
May 2026

Keywords: MTSS, Cyber Education, WIN, Interventions

COMMITTEE MEMBERS
Committee Chair: Jenna Copper, Ed.D.
Assistant Professor of Education
Slippery Rock University
Committee Member: Whitney Wesley, Ed.D.
Department Chair
Slippery Rock University
Committee Member: John Hicks, Ph.D.
Professor of Education
Slippery Rock University

Abstract
Examining Benchmark Growth within MTSS in a K-12 Cyber Program: A Quasi-Experimental
Case Study Comparing In-Person and Virtual WIN Delivery Models
A Multi-Tiered System of Supports (MTSS) provides a structured framework for
identifying and addressing students’ academic needs through data-driven interventions. While
MTSS has been widely implemented in traditional school settings, limited research has examined
its effectiveness within cyber education environments. The purpose of this applied research study
was to examine academic growth within a K-12 cyber program implementing an MTSS
framework and to compare the effectiveness of two intervention delivery modalities: in-person
WIN (What I Need) sessions and virtual WIN sessions delivered through Microsoft Teams.
This study used a quantitative quasi-experimental design with Renaissance STAR Benchmark
assessment data collected in August and January. Student growth was measured by changes in
STAR scale scores across the testing windows. Paired samples t-tests revealed statistically
significant academic growth across the sample, while independent samples t-tests indicated no
statistically significant differences between students receiving in-person and virtual WIN
interventions. These findings suggest that structured MTSS intervention systems in cyber
programs can support student academic growth, and that virtual intervention delivery may
provide outcomes comparable to in-person support.

Dedication
This dissertation is dedicated to my family, friends, and loved ones whose encouragement
and support made this journey possible and who have learned that my journeys, academic and
otherwise, tend to follow the scenic route, and to the educators who remain committed to doing
what is best for kids.

Acknowledgements
This journey would not have been possible without the encouragement and support of so
many incredible people in my life. I am grateful to the family, friends, colleagues, and mentors
who listened to ideas, read drafts, offered advice, and reminded me to keep going when the finish
line felt far away.
I would first like to thank my dissertation chair, Dr. Jenna Copper, for her guidance and
encouragement throughout this process. Like a great cheer coach on the sidelines, you always
knew when to push, when to support, and when to remind me that I could do this. I am also
thankful to my committee members, Dr. Wesley, whose insight and organization helped keep
everything moving, and Dr. Hicks, for your encouragement and thoughtful support.
A special thank you goes to Dr. Denise Manganello, whose leadership and support gave
me the time and space to write when it mattered most—and who unknowingly brought James
into my life, one of the greatest gifts of this journey. I would also like to thank Dr. David Foley,
whose leadership has been a constant reminder to always do what is best for kids.
To my family, my mom and dad, my sisters Nicole and Marley, and my brother-in-law
Joe, thank you for always believing in me and supporting every goal I set. To my nephews
Rowan and Colin, thank you for bringing so much joy and for reminding me why this work
matters.
Finally, James, thank you for your encouragement and support along the way. Even
though you joined this journey later on, your patience and humor meant a lot, especially the
times you helped keep me grounded when I started spiraling and reminded me that everything
would work out. I am grateful to each of you who helped make this journey possible and who
continue to remind me that doing what is best for kids will always matter most.

Table of Contents
Abstract
Dedication
Acknowledgements
List of Tables
List of Figures
List of Abbreviations
Chapter 1
Introduction
Overview
Background
Problem Statement
Purpose of the Study
Significance of the Study
Research Questions
Definition of Terms
Delimitations
Summary
Chapter 2
Literature Review
Overview
Theoretical Framework
Historical Context of MTSS
Response to Intervention (RTI)
Positive Behavioral Interventions and Supports (PBIS)
Integration into MTSS
Core Components of MTSS
Tiered Continuum of Support
Data-Driven Decision Making
Formative/Classroom Data
Collaborative Problem Solving
Benefits of MTSS
Behavioral Benefits
Whole-Child and Equity Benefits
System-Level Benefits
Why MTSS Works
Challenges and Barriers in Implementation
Resource Constraints
Training and Professional Development
Data Integration and Infrastructure
Consistency, Fidelity, and Sustainability
Culturally Sustaining Practices
Culturally Sustaining Practices within MTSS
Designing Interventions That Are Linguistically and Culturally Relevant
Embedding Social Emotional Learning
MTSS in Cyber and Hybrid Schools
Implementation Trends
Barriers
Summary
Chapter 3
Methods
Overview
Research Design
Research Questions
Setting
Participants
Population
Sample
Group Definitions
Instrumentation
STAR Benchmark Assessment
Variables
Control Variables
Data Collection
WIN Assignment
Data Extraction
Data Cleaning
Data Extraction, Cleaning, and Analysis Procedures
Data Analysis
Analysis for RQ1: Student Growth
Analysis for RQ2: Modality Comparison
Assumptions
Ethical Considerations
Trustworthiness, Reliability, and Validity of the Study
Summary
Chapter 4
Results
Introduction
Description of the Dataset
Descriptive Statistics
Results by Research Question
Research Question 1
Research Question 2
Chapter Summary
Chapter 5
Discussion and Implications
Introduction
Statement of the Problem
Purpose of the Study
Summary of Findings
Discussion
RQ 1
RQ 2
Implications for Practice
Limitations
Recommendations for Future Research
Conclusion
Appendix A
Appendix B

List of Tables
Table 1: Sample Reading Characteristics
Table 2: Sample Math Characteristics
Table 3: Descriptive Statistics for Reading STAR Benchmark Scores
Table 4: Descriptive Statistics for Math STAR Benchmark Scores
Table 5: Descriptive Statistics by Intervention Group – Reading
Table 6: Descriptive Statistics by Intervention Group – Math
Table 7: Paired Samples t-Test Results for Reading STAR Growth
Table 8: Paired Samples t-Test Results for Math STAR Growth
Table 9: Independent Samples t-Test for Reading WIN Modality
Table 10: Independent Samples t-Test for Math WIN Modality

List of Figures
Figure 1:

In Person vs Online Reading Growth Comparison ……………………72

Figure 2:

In Person vs Online Math Growth Comparison ……………………….73

List of Abbreviations
CSP        Culturally Sustaining Pedagogy
ESSA       Every Student Succeeds Act
FERPA      Family Educational Rights and Privacy Act
LMS        Learning Management System
MLL        Multilingual Learner
MTSS       Multi-Tiered System of Supports
PA I-MTSS  Pennsylvania Integrated Multi-Tiered System of Supports
PBIS       Positive Behavioral Interventions and Supports
PDE        Pennsylvania Department of Education
RTI        Response to Intervention
SEL        Social-Emotional Learning
STAR       Standardized Test for the Assessment of Reading and Mathematics
WIN        What I Need (Intervention Block)


Chapter 1
Introduction
Overview
The expansion of cyber education transformed how K-12 schools delivered instruction,
assessed learning, and provided student support. As districts continued to offer full-time and
hybrid virtual options, ensuring equitable and responsive systems of support for all learners
became a central priority. The Multi-Tiered System of Supports (MTSS) framework,
traditionally designed for in-person environments, offered a structure for identifying academic,
behavioral, and social-emotional needs through data-driven decision making. However,
transferring this model into a cyber setting presented unique challenges. Virtual schools
navigated barriers such as limited face-to-face interaction, inconsistent student engagement, and
reduced opportunities for real-time observation of learning and behavior.
This quantitative research study explored how a K-12 cyber program strengthened the
implementation of MTSS to promote equity, improve consistency, and ensure timely
intervention for all students. Drawing from the implementation of culturally sustaining practices,
this study aimed to develop actionable recommendations that improved how online educators,
administrators, and families collaborated to meet student needs within a digital environment.
Background
The MTSS framework emerged from decades of research on early intervention, inclusive
education, and systems-level problem solving. Initially rooted in Response to Intervention (RTI),
an instructional model designed to identify academic needs early and provide tiered, data-based
supports, and Positive Behavioral Interventions and Supports (PBIS), a proactive, schoolwide

approach focused primarily on improving behavior and school climate, MTSS evolved into a
comprehensive structure addressing academic, behavioral, and social-emotional domains through
a tiered model of support. At its foundation, MTSS emphasized prevention and early
identification, ensuring that all students received high-quality instruction with access to
progressively intensive support as their needs increased.
While MTSS was well established in traditional brick-and-mortar schools, its
implementation in online environments remained inconsistent. Cyber programs, particularly
those that served a broad range of students across multiple districts, faced distinctive barriers.
Teachers often juggled asynchronous communication, diverse student schedules, and variations
in parental involvement. Additionally, many online platforms were not designed for seamless
progress monitoring or behavioral data collection, creating gaps in tier identification and
intervention fidelity. This challenge was well documented in virtual learning environments
where data fragmentation and limited real-time visibility hindered effective MTSS
implementation (Evans et al., 2022; Guest et al., 2024).
Research emphasized that strong MTSS frameworks depended on collaboration,
consistent data use, and culturally responsive practices (Freeman-Green et al., 2021; Sugai &
Horner, 2020). Yet, these pillars were often difficult to uphold virtually. For example, Evans et
al. (2022) found that fewer than 30% of cyber programs reported having a fully implemented
MTSS system, citing challenges in data collection, staffing, and communication. These realities
underscored the need for adaptive models that maintained the fidelity of MTSS while honoring
the flexibility and accessibility that defined cyber learning.
This study examined the intersection between digital learning and systems of support. By
identifying both barriers and opportunities within one established K-12 cyber program, it aimed

to contribute to the broader understanding of how MTSS could be effectively adapted to virtual
settings without compromising its core principles.
Problem Statement
Although MTSS had been shown to improve academic and behavioral outcomes when
implemented with fidelity, many cyber programs struggled to operationalize its components
consistently across virtual platforms. A lack of standardized systems limited real-time
observation, and uneven collaboration among teachers and support staff often resulted in delayed
interventions and inequitable access to resources. These inconsistencies disproportionately
affected students requiring targeted or intensive support, particularly those with disabilities,
multilingual learners, or students from economically disadvantaged backgrounds.
The problem guiding this study was that MTSS implementation in cyber education
settings lacked consistency, equity, and culturally responsive adaptation, which limited the
effectiveness of support for diverse learners. Without intentional redesign, virtual programs
risked widening achievement and engagement gaps that MTSS was originally designed to close.
Purpose of the Study
The purpose of this applied research study was to examine the current state of MTSS
implementation within a K-12 cyber program and to identify practical strategies for enhancing
equity, fidelity, and sustainability. This study sought to determine how educators integrated
culturally sustaining practices and data-informed decision-making into a virtual environment
while maintaining strong collaboration across roles.
Through this investigation, the researcher aimed to develop actionable recommendations
to improve MTSS delivery within the cyber program. These findings were intended to support

not only local practice but also to provide guidance for other districts and cyber schools seeking
to strengthen their virtual intervention systems.
Significance of the Study
This study held theoretical, empirical, and practical significance. Theoretically, it
extended the application of MTSS beyond traditional school walls into a fully virtual learning
environment. By aligning implementation with Culturally Sustaining Pedagogy (CSP), this
research examined how MTSS could evolve to meet the needs of diverse cyber learners while
honoring students’ identities, languages, and lived experiences. CSP, as conceptualized by Paris
and Alim (2017), provided the cultural and equity-oriented lens for this study, emphasizing that
interventions must sustain, rather than simply acknowledge, students’ cultural ways of being. To
guide the structural and instructional components of MTSS, this study drew on an established
MTSS framework commonly represented as an inverted pyramid, which positioned Tier 1
universal instruction as the broadest foundation, Tier 2 targeted intervention as the middle tier,
and Tier 3 intensive supports at the narrowest point (Haring Center for Inclusive Education &
OSPI, 2020). This inverted tier model was frequently used in professional development
literature, including works such as Explicit Instruction: Effective and Efficient Teaching (Archer
& Hughes, 2011), which outlined essential features of Tier 2 intervention design.
Together, CSP and the inverted MTSS framework formed the theoretical underpinnings
of this study. CSP provided the rationale, ensuring that MTSS practices in cyber environments
remained equitable, identity-affirming, and responsive, while the inverted MTSS model provided
the structure, offering a structured, evidence-based approach to Tier 2 instruction, intervention
fidelity, and data-driven decision making within virtual learning systems. This dual framework
foundation situated the study at the intersection of cultural responsiveness and systems-level
improvement, reflecting how MTSS needed to adapt alongside technology and evolving
educational landscapes.
Empirically, this study contributed to a limited but growing body of literature addressing
MTSS in cyber and hybrid programs. Much of the existing research focused on brick-and-mortar
implementation, leaving a need for applied studies that examined MTSS effectiveness, barriers,
and opportunities within online contexts.
Practically, the study provided school leaders, teachers, and support staff with concrete
recommendations for improving collaboration, intervention fidelity, and student monitoring in
virtual settings. These insights had the potential to enhance not only student outcomes but also
teacher confidence and family engagement in cyber programs. The study’s findings could inform
professional learning, district policy, and system-level decision making as virtual learning
continued to expand across Pennsylvania and beyond.
Research Questions
This applied research study was guided by two primary research questions that examined
student academic growth within a cyber education setting and evaluated the effectiveness of
targeted intervention delivery models within a Multi-Tiered System of Supports (MTSS)
framework.
Research Question 1 focused on overall academic growth among students enrolled in the
cyber program, as measured by the Renaissance STAR Benchmark assessment from August to
January. This question was designed to determine whether students demonstrated measurable
academic progress during the first semester of enrollment in a fully online instructional
environment.

Research Question 2 examined whether differences existed in academic growth outcomes
between students receiving in-person WIN (What I Need) intervention support and those
receiving virtual WIN support. By comparing growth across these two MTSS delivery
modalities, this study sought to identify whether the mode of intervention delivery was
associated with differential student outcomes.
Collectively, these research questions supported the overarching goal of informing
evidence-based MTSS implementation in cyber education by evaluating both overall student
growth and the relative effectiveness of intervention modalities. Findings from this study aimed
to provide practical guidance for designing equitable, responsive, and data-informed MTSS
structures that met the diverse academic needs of students in a virtual learning environment.
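To make the planned analyses concrete, the following minimal sketch illustrates how the two research questions could be examined in Python, assuming a hypothetical data file with one row per student that contains August and January STAR scale scores and a WIN modality label; the file name and column names are illustrative assumptions rather than the actual export used in this study.

# Illustrative analysis sketch for RQ1 and RQ2; file and column names are assumed.
import pandas as pd
from scipy import stats

df = pd.read_csv("star_benchmark.csv")  # hypothetical export: one row per student
df["growth"] = df["january_scale_score"] - df["august_scale_score"]

# RQ1: paired samples t-test comparing January to August scale scores.
t_rq1, p_rq1 = stats.ttest_rel(df["january_scale_score"], df["august_scale_score"])
print(f"RQ1 paired t-test: t = {t_rq1:.2f}, p = {p_rq1:.4f}")

# RQ2: independent samples t-test comparing growth across WIN modalities.
in_person = df.loc[df["win_modality"] == "in_person", "growth"]
virtual = df.loc[df["win_modality"] == "virtual", "growth"]
t_rq2, p_rq2 = stats.ttest_ind(in_person, virtual)
print(f"RQ2 independent t-test: t = {t_rq2:.2f}, p = {p_rq2:.4f}")

In practice, an analysis of this kind would be run separately for reading and for math scores, mirroring the reading and math tables reported in Chapter 4.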
Definition of Terms
Multi-Tiered System of Supports (MTSS). A comprehensive, tiered framework that
integrates academic, behavioral, and social-emotional support through data-based
decision making.
Response to Intervention (RTI). A tiered instructional framework focused on early
identification and support of students with academic needs through research-based
instruction, frequent progress monitoring, and data-driven decision-making.
Positive Behavioral Interventions and Supports (PBIS). A proactive, schoolwide
framework that uses explicit behavior expectations, prevention strategies, and data-based
interventions to improve student behavior, school climate, and social-emotional
functioning.
Cyber School. A fully virtual or hybrid K-12 educational program where students attend
classes online in whole or in part.

Tiered Interventions. Structured levels of support that increase in intensity, ranging from
universal (Tier 1) to targeted (Tier 2) and individualized (Tier 3).
Culturally Sustaining Practices. Instructional and support strategies that affirm students’
cultural identities and promote equity within the learning environment.
Virtual Collaboration. The use of online tools and platforms to facilitate communication
and shared decision making among educators, students, and families.
Implementation Fidelity. The degree to which MTSS practices are executed as designed
to ensure consistency and accuracy of interventions.
Delimitations
This study focused on a Pennsylvania-based K-12 cyber program. The scope included
staff perceptions and structural analysis of MTSS practices rather than longitudinal student
outcomes. Findings were contextualized within this cyber program’s model and were intended to
inform practical improvement rather than to generalize across all virtual schools.
Summary
This chapter introduced the growing importance of MTSS in cyber education and
outlined the challenges and opportunities that existed within virtual implementation. It defined
the problem, clarified the study’s purpose, and highlighted its significance from multiple
perspectives. The chapter concluded with the research questions that guided the investigation.
Chapter Two provided a comprehensive review of literature related to MTSS
implementation, emphasizing its theoretical foundations, historical development, and emerging
adaptations for virtual education. The review also explored the integration of culturally
sustaining practices within online frameworks to strengthen equity and responsiveness across
tiers of support.

Chapter 2
Literature Review
Overview
This chapter reviewed current research and foundational theory related to the Multi-Tiered System of Supports (MTSS), emphasizing its adaptation within cyber education. The
review examined how MTSS had evolved as a framework for equitable student support, explored
its theoretical underpinnings, and synthesized literature on implementation challenges, culturally
sustaining practices, and virtual adaptations. The goal was to situate this study within the context
of contemporary educational systems while identifying the knowledge gaps this research
addressed.
Theoretical Framework
The MTSS framework aligned closely with two theoretical strands that informed this
study: Implementation Science and Culturally Sustaining Pedagogy. Together, they provided
both a process and a purpose for improving equity-focused systems of support in cyber
programs.
Implementation Science offered a structured approach to translating theory into practice.
It emphasized fidelity, sustainability, and contextual adaptation when new frameworks, such as
MTSS, were introduced (Fixsen et al., 2019). Implementation Science was defined as the
systematic study of methods and strategies that supported the effective adoption, integration, and
scale-up of evidence-based practices within real-world settings. Within a virtual environment,
this theory underscored the need for clearly defined systems, purposeful training, and robust
digital progress-monitoring tools that ensured interventions remained consistent and effective
despite limited physical proximity and the unique complexities of online learning.

Culturally Sustaining Pedagogy (CSP), advanced by Paris and Alim (2017), focused on
affirming students’ cultural identities and integrating these perspectives into curriculum and
support systems. Embedding CSP within MTSS ensured that academic, behavioral, and social-emotional interventions respected students’ lived experiences, particularly in diverse digital
learning communities. When viewed together, Implementation Science guided the how of MTSS,
while Culturally Sustaining Pedagogy defined the why, creating an inclusive framework that
honored both fidelity and identity.
Historical Context of MTSS
MTSS emerged from decades of reform focused on early intervention and inclusive
education. Its origins traced to two major movements: Response to Intervention (RTI) and
Positive Behavioral Interventions and Supports (PBIS).
MTSS developed from several decades of educational reform aimed at improving
outcomes for diverse learners through early identification, prevention, and inclusive practices
rather than reactive or exclusionary models of support. Prior to the development of MTSS,
student difficulties were often addressed through a wait-to-fail approach, in which academic or
behavioral concerns were not formally addressed until students demonstrated significant and
persistent failure. This model disproportionately impacted students with disabilities, multilingual
learners, and students from historically marginalized backgrounds, as supports were frequently
delayed, fragmented, or tied to categorical labels rather than demonstrated need.
Beginning in the late twentieth century, federal and state reform movements increasingly
emphasized early intervention and accountability for all learners. The reauthorization of special
education legislation, along with broader standards-based reform efforts, shifted the focus toward
ensuring access to high-quality instruction within general education settings. During this period,

initiatives such as Reading First, schoolwide prevention models, and the growing use of
curriculum-based measurement promoted the idea that instructional response, rather than student
deficit, should guide decision making. These reforms laid the groundwork for Response to
Intervention (RTI), which emphasized universal screening, tiered instruction, and progress
monitoring to identify students at risk and provide timely support.
As the understanding of student needs expanded beyond academics, MTSS evolved into a more
comprehensive framework that integrated academic, behavioral, and social-emotional support
within a unified system. Influenced by Positive Behavioral Interventions and Supports (PBIS),
implementation science, and equity-focused reform, MTSS moved beyond intervention alone to
emphasize strong Tier 1 instruction, data-driven problem solving, and inclusive service delivery.
This shift reflected a growing recognition that student success was shaped by instructional
quality, school climate, cultural responsiveness, and systemic coherence. Within this historical
context, MTSS represented the culmination of reform efforts designed to ensure that all students,
regardless of background or learning profile, received timely, equitable, and effective support
within a shared educational framework.
Response to Intervention (RTI)
Developed in the early 2000s, RTI was an instructional framework designed to identify and
support students with academic difficulties through a tiered system of increasingly intensive
interventions (Fuchs & Fuchs, 2006). The paragraphs that follow describe the core RTI principles.
Response to Intervention (RTI) was a prevention-oriented framework designed to provide early,
systematic academic support to students before learning difficulties became entrenched. At the
foundation of RTI was Tier 1, which consisted of high-quality core instruction delivered to all
students. High-quality Tier 1 instruction was characterized by standards-aligned curriculum,

research-based instructional practices, clear learning objectives, frequent formative assessment,
and differentiated instructional strategies that addressed a range of learner needs within the
general education setting. Instruction at this level was delivered by highly qualified educators
and was responsive to student data, with the expectation that most students (typically 80-85%)
would demonstrate adequate progress when Tier 1 instruction was effective.
A central component of RTI was universal screening, which was used to identify students
who might be at risk for academic difficulty. Students were considered “at risk” when screening
data indicated performance below established benchmarks or expected levels of proficiency for
their grade or age. Universal screeners were administered to all students multiple times per year
and served as an early warning system, allowing educators to identify potential concerns before
they resulted in significant academic gaps. Importantly, identification of risk was based on data
patterns rather than labels, ensuring that students received timely support without unnecessary
delay or stigma.
Students identified as at risk through universal screening and classroom data received
Tier 2 targeted, evidence-based interventions. These interventions were typically delivered in
small groups and focused on specific skill deficits, such as foundational literacy or numeracy
skills, using instructional approaches that had demonstrated effectiveness through empirical
research. Tier 2 interventions were provided in addition to, not in place of, core instruction and
were implemented with increased intensity, structure, and instructional time. Interventionists
included classroom teachers, reading specialists, intervention teachers, or other trained staff,
depending on school structures and resources.
Frequent progress monitoring was used to assess students’ responsiveness to intervention
and to inform instructional adjustments. Progress monitoring tools were brief, reliable measures

aligned with the targeted skills being addressed and were administered on a regular basis, often
weekly or biweekly at Tier 2. Classroom teachers and interventionists were responsible for
collecting and recording progress data, which were then summarized and shared through data
dashboards, progress reports, or MTSS documentation systems. At Tier 1, progress monitoring
took the form of formative classroom assessments and benchmark data reviews, while at Tier 3,
monitoring occurred more frequently and with greater individualization.
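As one illustration of how frequent probe data might be summarized for a dashboard or progress report, the brief sketch below fits a simple trend line to a student’s weekly scores; the rate-of-improvement calculation and the sample values are assumptions introduced for this example rather than procedures drawn from this study.

# Illustrative only: summarize weekly progress-monitoring probes as a trend line.
import numpy as np

weeks = np.array([1, 2, 3, 4, 5, 6])                # weeks of Tier 2 intervention
probe_scores = np.array([18, 21, 20, 24, 26, 27])   # made-up weekly probe scores

# Slope of the least-squares line approximates the average gain per week.
slope, intercept = np.polyfit(weeks, probe_scores, 1)
print(f"Rate of improvement: {slope:.1f} points per week")
print(f"Projected score at week 10: {slope * 10 + intercept:.0f}")

A summary of this kind would supplement, not replace, the team-based review of tier decisions described next.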
Decisions regarding movement between tiers were made collaboratively through a data-based problem-solving process. MTSS or RTI teams, often composed of classroom teachers,
interventionists, special educators, administrators, school psychologists, and support staff, met
regularly to review student data, evaluate intervention fidelity, and determine next steps. These
teams considered multiple data sources, including screening results, progress monitoring trends,
classroom performance, and intervention implementation data. Decisions to intensify, fade, or
discontinue interventions were guided by documented student response rather than
predetermined timelines, ensuring that instructional support remained flexible, responsive, and
grounded in evidence. Through this team-based, data-informed approach, RTI functioned as a
proactive system designed to support student success early and equitably.
Positive Behavioral Interventions and Supports (PBIS)

PBIS is a proactive, schoolwide framework aimed at improving student behavior, social-emotional functioning, and overall school climate (Sugai & Horner, 2002). Key PBIS principles
include the following:
Explicit teaching of behavioral expectations,
Consistent reinforcement systems that encourage positive behaviors,

Predictable routines and procedures,
Use of behavioral data to guide interventions and decision-making, and
Tiered behavior supports for students needing additional intervention.
PBIS emphasizes prevention, consistency, and environmental design to reduce challenging
behavior and increase engagement.
Integration into MTSS
Over time, educators recognized that separating academic and behavioral interventions
limited effectiveness. MTSS unified these models, integrating academic, behavioral, and social-emotional supports into a single, holistic system grounded in prevention, early identification, and
tiered intervention (Freeman-Green et al., 2021). This evolution reflected a broader shift toward
systems thinking, emphasizing data use, collaboration across disciplines, and the need to meet
the academic, behavioral, and social-emotional needs of the “whole child.”
MTSS served as a framework for continuous improvement and equity. Nationally, the
adoption of the Multi-Tiered System of Supports (MTSS) was driven by persistent achievement
and discipline gaps, inconsistent intervention practices, and the limitations of maintaining
separate academic (RTI) and behavioral (PBIS) systems. With the passage of the Every Student
Succeeds Act (ESSA, 2015), states were encouraged to implement evidence-based frameworks
that prioritized early intervention, data-driven decision-making, and whole-child supportsconditions that aligned closely with MTSS principles. As a result, more than 86% of states
incorporated MTSS in their State Systemic Improvement Plans, using the model to advance
prevention, equity, cross-disciplinary collaboration, and systemic coherence (States of I-MTSS
Brief, 2024; NEA, n.d.).

In Pennsylvania, MTSS was formally adopted as Pennsylvania’s Integrated MTSS (PA I-MTSS) to address similar systemic needs, including reducing disproportionate outcomes,
strengthening inclusive education, and ensuring that all students had access to high-quality core
instruction and tiered supports. PA I-MTSS was defined by the state as a “standards-aligned,
comprehensive school improvement framework” that integrated academic, behavioral, and
social-emotional supports, emphasizing universal screening, data-based decision-making, and
shared leadership structures to promote continuous improvement and equity across the K-12
system (PaTTAN, n.d.; West Shore School District, n.d.). Together, the national and
Pennsylvania frameworks reflected a broader movement toward coherent, equitable, and
prevention-oriented systems that supported students holistically. MTSS was embedded within
state standards and school improvement planning, reflecting ongoing efforts to align instruction,
intervention, and assessment under a unified model of support.
Core Components of MTSS
Successful MTSS frameworks share several defining components that function collectively to
provide equitable student support:
Tiered Continuum of Support
MTSS organizes interventions into three tiers:
Tier 1: Universal support provided to all students through high-quality, research-based
instruction.
Tier 2: Targeted interventions for students who need additional support beyond the core
curriculum.
Tier 3: Intensive, individualized interventions addressing significant academic or
behavioral needs (Berkeley et al., 2020).

The tiered approach allowed educators to respond systematically to student data, ensuring
that support increased in intensity as needed without waiting for failure to occur.
Successful MTSS frameworks were grounded in a tiered continuum of support designed to
deliver instruction and intervention with increasing levels of intensity based on student need.
This structure ensured that all students received appropriate support in a timely manner
and that instructional decisions were driven by data rather than delayed identification or failure.
Each tier served a distinct purpose within the system while remaining interconnected and fluid,
allowing students to move between levels as their needs changed.
Tier 1 represented the foundation of MTSS and consisted of high-quality, research-based
instruction provided to all students within the general education setting. Instruction at this level
was standards-aligned, culturally responsive, and was delivered by general education teachers
using evidence-based instructional practices. Tier 1 instruction included clear learning
objectives, explicit teaching, opportunities for guided and independent practice, and ongoing
formative assessment. Universal screening and classroom-level data were used to evaluate the
effectiveness of Tier 1 instruction as a whole, with the expectation that the majority of students
would meet grade-level benchmarks when core instruction was strong. When a significant
proportion of students did not demonstrate adequate progress, instructional adjustments were
made at the Tier 1 level before individual students were considered for additional intervention.
Tier 2 provided targeted interventions for students who did not demonstrate sufficient
progress with Tier 1 instruction alone. Placement into Tier 2 was determined through a data-based problem-solving process that included review of universal screening results, classroom
performance, and other relevant data sources. Instruction at Tier 2 was more focused and

intensive, typically delivered in small groups and designed to address specific skill deficits
identified through assessment data. These interventions were evidence-based, time-bound, and
were provided in addition to, not in place of, core instruction. Tier 2 instruction was delivered by
classroom teachers, interventionists, reading or math specialists, or other trained staff, depending
on school resources and structures. Student progress was monitored regularly to determine
responsiveness and to inform instructional adjustments.
Tier 3 consisted of intensive, individualized interventions for students with significant
and persistent academic or behavioral needs. Students were considered for Tier 3 when data
indicated minimal or insufficient response to Tier 2 supports. Instruction at this level was highly
individualized, often delivered one-on-one or in very small groups, and might involve increased
instructional time, specialized strategies, or alternative instructional approaches. Tier 3
interventions were typically implemented by specialized staff, such as intervention specialists,
special educators, or related service providers, and were closely monitored using frequent
progress monitoring measures. Data from Tier 3 interventions were used not only to guide
instructional decisions but also to inform potential referrals for special education evaluation
when appropriate.
Across all tiers, decisions regarding placement, movement, and instructional intensity
were made collaboratively by MTSS teams composed of educators, specialists, and
administrators. This team-based approach ensured consistency, fidelity, and equity in
implementation. By organizing instruction within a tiered continuum, MTSS enabled schools to
respond systematically to student needs, increase support proactively, and provide equitable
access to effective instruction without relying on a wait-to-fail model (Berkeley et al., 2020).

Data-Driven Decision Making
Frequent data collection is integral to a tiered support framework; summative, formative,
and collective assessments each play distinct roles in guiding instruction, tier placement, and
system improvement (Al Otaiba et al., 2019).
Summative Data
These were periodic benchmark assessments (e.g., fall, winter, spring universal
screeners) used across a grade level or schoolwide to evaluate broad instructional effectiveness
and curriculum alignment. When summative data revealed patterns, such as a high percentage of
students performing below benchmark, instructional leaders and curriculum teams responded by
reviewing and revising the Tier 1 core curriculum (e.g., pacing, scope and sequence, alignment
with standards). For example, if the August universal screener showed that 40% of students in
Grade 4 failed to meet expectations in number sense, the district revised the Tier 1 math program
to include more foundational number sense content and professional development for teachers.
Formative/Classroom Data
These data were gathered more frequently (weekly or biweekly) and included teacher-designed exit tickets, daily observations, small-group checklists, and progress-monitoring
probes. Formative data informed decisions about which students needed Tier 2 support, what
specific skill deficits existed, and how instructional groups should be structured. When a teacher
identified a cluster of students missing two or more proficiency markers in decoding after three
weeks of instruction, those students were moved into a targeted Tier 2 small-group intervention
focused on decoding and fluency. Teachers or interventionists monitored student responses
weekly and adjusted grouping, content, or pacing as needed.
Collective/Collaborative Data Review

This process occurred when MTSS teams or data teams analyzed sets of formative and
summative data to determine system-wide trends, intervention fidelity, resource allocation, and
equity considerations. For example, if progress-monitoring data indicated that students from a
specific demographic group consistently showed slower response rates in Tier 2, the team
investigated cultural or accessibility factors, adjusted intervention materials, or delivered targeted
professional development on culturally sustaining practices. Collective data review also helped
determine when students should transition between tiers; students who failed to show satisfactory
growth in Tier 2 after a predetermined number of sessions were considered for Tier 3, and the
decision was documented in system records.
Tier Designation Process:
Tier 1 students are those engaged in the core curriculum and meeting at-benchmark
performance via summative screening and formative classroom data.
Tier 2 designation occurs when students do not meet benchmark standards, are
flagged via screeners or classroom formative data, and require additional small-group
support beyond Tier 1. Instruction in Tier 2 is delivered by interventionists,
instructional coaches, or teachers trained in targeted interventions, using standard
protocols and weekly progress monitoring.
Tier 3 is for students who demonstrate low response or continued risk after Tier 2
support; instruction becomes more individualized, often delivered by specialists, with
higher frequency, smaller group size (or one-to-one), and daily progress monitoring.

By aligning data collection and analysis across these levels, MTSS systems maintained a
continuous improvement loop: summative data refined curriculum and Tier 1 instruction;
formative data guided targeted interventions and tier movement; and collaborative data teams
ensured fidelity, monitored equity, and iterated system design. In cyber environments, digital
dashboards, analytics, and student management systems strengthen visibility and accountability
when used consistently.
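Because cyber programs rely on such dashboards to keep this improvement loop visible, the sketch below shows one way the tier-designation rules described above might be expressed in code; the benchmark cut point, column names, and decision rules are simplified assumptions for illustration rather than the criteria used by any particular district.

# Illustrative sketch: flag students for tier review from screener and progress data.
import pandas as pd

BENCHMARK_PERCENTILE = 40  # assumed cut point for meeting benchmark

def suggest_tier(row) -> str:
    """Suggest a tier for team review; final placement rests with the MTSS team."""
    if row["screener_percentile"] >= BENCHMARK_PERCENTILE:
        return "Tier 1"
    # Below benchmark and showing little response to Tier 2: flag for Tier 3 review.
    if row["current_tier"] == "Tier 2" and row["weekly_growth"] <= 0:
        return "Tier 3 (review)"
    return "Tier 2"

students = pd.DataFrame({
    "student_id": [101, 102, 103],
    "screener_percentile": [62, 25, 18],
    "current_tier": ["Tier 1", "Tier 1", "Tier 2"],
    "weekly_growth": [1.2, 0.8, -0.3],
})
students["suggested_tier"] = students.apply(suggest_tier, axis=1)
print(students[["student_id", "suggested_tier"]])

Used in this way, the output serves as a starting point for the collaborative data review described in the next section rather than an automated placement decision.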
Collaborative Problem Solving
MTSS depended on collaboration among general and special educators, administrators,
counselors, and families. Critically, it also relied on teams of teachers who taught the same
content (e.g., grade-level or departmental teams), as these professional learning communities
analyzed common assessments, aligned instructional practices, and made collective decisions
about which students required Tier 1 differentiation or Tier 2 support, promoting coherence and
equity across classrooms. Regular meetings, shared protocols, and transparent communication
promoted cohesive action plans and shared responsibility for student success (Choi et al., 2019).
Together, these components provided a structure for equitable intervention delivery while
maintaining flexibility to adapt across settings, including online learning environments.
Benefits of MTSS

A substantial body of research demonstrated that MTSS was an effective, equitable, and
sustainable framework for improving academic, behavioral, and social-emotional outcomes for
students when implemented with fidelity. MTSS was particularly effective because it integrated
prevention, early intervention, and systematic data use, elements repeatedly shown to reduce

learning gaps and improve schoolwide functioning (Freeman-Green et al., 2021; Fuchs & Fuchs,
2006). By organizing supports into tiers and ensuring that decisions were based on consistent
data sources, MTSS provided a coherent structure for schools to proactively address student
needs rather than relying on reactive or fragmented supports.
Academic Benefits
One of the most documented benefits of MTSS was its positive impact on academic
achievement. Research showed that systematic tiered intervention produced significant growth in
reading and mathematics for students across grade levels. For example, in a large-scale case
study of elementary schools implementing RTI/MTSS, students receiving Tier 2 reading
interventions demonstrated significantly greater gains in decoding and fluency compared to peers
in non-MTSS schools (Al Otaiba et al., 2014).
Similarly, in a Florida multi-district study, schools implementing MTSS with high
fidelity experienced double the rate of reading proficiency growth on state assessments compared
to schools without MTSS structures (Balu et al., 2015). These improvements were not limited to
students in intervention tiers; whole-school gains were observed due to stronger Tier 1 core
instruction, improved alignment across grade-level teams, and more intentional data-based
instructional practices.
MTSS also reduced special education misidentification by ensuring that students received
evidence-based support before referral. Case studies in Minnesota and Colorado found that
MTSS implementation decreased unnecessary special education placements by 30-60% while
ensuring that students with genuine disabilities were identified more accurately and earlier
(Hoover et al., 2020). This shift reflected the preventive intent of MTSS: early, targeted
intervention reduced long-term academic struggles and prevented students from “falling through
the cracks.”
Behavioral Benefits
On the behavioral side, MTSS, most notably through its behavioral framework of
Positive Behavioral Interventions and Supports (PBIS), had a strong and well-established
research base demonstrating effectiveness in improving student behavior and school climate.
PBIS functioned as the behavioral arm of MTSS, applying the same tiered, data-based principles
used in academic intervention to behavior and social-emotional development. In a large, multiyear randomized controlled trial examining PBIS implementation across more than 600
elementary and middle schools, Bradshaw et al. (2010) found that schools implementing PBIS
with fidelity experienced approximately 33% reductions in office discipline referrals, significant
decreases in suspensions, and measurable improvements in student-teacher relationships and
perceptions of school safety. These outcomes reflected not only behavioral change but also
broader improvements in school climate and relational trust.
Additional studies reinforced these findings at scale. Horner et al. (2009) reported that
schools implementing PBIS with fidelity demonstrated sustained reductions in exclusionary
discipline practices and greater consistency in behavioral expectations across classrooms.
Similarly, McIntosh et al. (2014) found that PBIS implementation was associated with decreased
disproportionate discipline outcomes for students of color, suggesting that data-based behavioral
systems could function as equity-promoting structures when implemented intentionally.
Together, these studies highlighted the role of behavioral MTSS frameworks in reducing
reactive disciplinary practices and replacing them with proactive, instructional approaches to
behavior.

Behavioral MTSS also played a critical role in supporting social-emotional learning
(SEL), particularly when SEL instruction was embedded within Tier 1 universal supports and
reinforced through Tier 2 and Tier 3 interventions. Schools implementing PBIS consistently
demonstrated improvements in student self-regulation, engagement, and sense of belonging,
protective factors strongly associated with long-term academic and behavioral success (Sugai &
Horner, 2020). By establishing predictable routines, explicitly teaching behavioral expectations,
and reinforcing positive behavior, PBIS created conditions that supported the development of
social-emotional competencies such as self-management, relationship skills, and responsible
decision-making.
Research examining the intersection of MTSS and SEL further supported this integration.
Lane et al. (2007) found that tiered behavioral frameworks incorporating SEL instruction
resulted in reductions in disruptive behavior and improvements in academic engagement for
students receiving Tier 2 supports. More recently, Cook et al. (2015) reported that schools
integrating SEL within an MTSS framework experienced improvements in both behavioral
outcomes and classroom instructional time, suggesting that SEL-enhanced MTSS models
contributed to more effective learning environments overall. These findings underscored that
SEL was not an “add-on” to MTSS, but rather a complementary component that strengthened
behavioral and academic systems alike.
Collectively, this body of research indicated that MTSS implementation, through PBIS
and integrated SEL practices, produced meaningful behavioral and climate-related benefits at
both the student and school levels. When behavioral supports were delivered within a tiered,
data-driven framework, schools were better positioned to promote positive behavior, reduce
exclusionary discipline, and foster environments where students felt safe, supported, and
connected. These outcomes further reinforced the role of MTSS as a comprehensive framework
for addressing the full range of student needs beyond academics alone.
Whole-Child and Equity Benefits
MTSS was explicitly designed to support the whole child, ensuring that academic,
behavioral, and social-emotional needs were not addressed in isolation. Research showed that
MTSS implementation reduced outcome disparities for multilingual learners, students with
disabilities, and economically disadvantaged students by providing access to tiered supports
grounded in early identification and culturally responsive practices (Freeman-Green et al., 2021).
For example, a case study in Oregon found that MTSS implementation decreased achievement
gaps between White students and students of color by more than 25% in reading within three
years (Everett et al., 2019). These gains were attributed to consistent progress monitoring,
differentiated Tier 1 instruction, and culturally relevant Tier 2 interventions.
System-Level Benefits
At the systems level, MTSS strengthened collaboration, improved instructional alignment,
and enhanced organizational efficiency. Schools using MTSS typically implemented grade-level or
departmental data teams, which analyzed common assessments, monitored intervention
effectiveness, and made shared decisions about support. Research showed that these collaborative
structures significantly improved teacher efficacy and reduced variability in instructional quality
across classrooms (Donohoo, 2017). MTSS also created a predictable data cycle (screen,
intervene, monitor, adjust) that helped administrators allocate resources based on real-time needs
and ensure consistent implementation.

Why MTSS Works
MTSS was effective because it integrated several interdependent, research-supported
principles that collectively strengthened instructional systems and improved student outcomes.
First, MTSS emphasized prevention rather than reaction by prioritizing strong Tier 1 instruction
and early identification of learning needs through universal screening, reducing reliance on a
wait-to-fail model that delayed support for struggling students. Second, the framework was
grounded in data-based decision making rather than intuition, requiring educators to use multiple
sources of student data (screening, progress monitoring, and outcome measures) to guide
instructional planning, intervention selection, and movement between tiers. This reliance on
systematic data increased accuracy, consistency, and equity in instructional decisions (Burns &
Gibbons, 2012; Gersten et al., 2009).
In addition, MTSS relied on collaborative teams rather than isolated teachers, recognizing
that complex student needs were best addressed through shared problem-solving and collective
expertise. Grade-level teams, intervention teams, and MTSS leadership teams regularly analyzed
data, monitored intervention fidelity, and made instructional adjustments, which improved
coherence and reduced variability in implementation (Batsche et al., 2014). The framework also
prioritized evidence-based instruction over ad hoc interventions, requiring that both core
instruction and targeted supports were grounded in practices with demonstrated effectiveness.
This emphasis on instructional fidelity and alignment ensured that students received
interventions likely to produce meaningful gains (Fixsen et al., 2005).
Finally, MTSS replaced one-size-fits-all approaches with tiered supports that increased in
intensity based on student need, allowing instruction to be responsive, flexible, and scalable.
Research consistently demonstrated that when these components (prevention, data use,
collaboration, evidence-based instruction, and tiered support) were implemented with fidelity,
schools experienced improved academic and behavioral outcomes at both the individual and
system levels (Al Otaiba et al., 2019; Balu et al., 2015). Together, this integrated design made
MTSS one of the most empirically supported and widely endorsed instructional frameworks in
contemporary K-12 education.
Challenges and Barriers in Implementation
Despite its promise, many schools faced significant barriers to implementing MTSS with
fidelity. Research consistently identified challenges related to resources, training, data systems,
and long-term sustainability, particularly as schools scaled MTSS across grade levels and student
populations (Freeman-Green et al., 2021; McIntosh & Goodman, 2016).
Resource Constraints
A recurring barrier in MTSS implementation was the limitation of personnel, time, and
funding. Schools often lacked sufficient interventionists, coaches, or specialists to deliver Tier 2
and Tier 3 supports with the frequency and intensity required, which resulted in incomplete or
uneven implementation across tiers. Scheduling constraints also made it difficult to build
protected intervention blocks (e.g., WIN time) into the master schedule, especially at the
secondary level where credit-bearing courses dominated student timetables. McIntosh and
Goodman (2016) noted that schools attempting to implement MTSS without aligning schedules,
staffing, and planning time frequently experienced “initiative fatigue,” in which teachers felt
overextended and supports were layered on top of, rather than integrated within, existing
practices. These resource constraints were compounded in rural and high-need districts, where
staffing shortages and competing mandates limited the capacity to sustain robust MTSS
structures (Burns et al., 2020).

Training and Professional Development
Effective MTSS implementation required that educators understood how to interpret data,
select evidence-based interventions, deliver instruction across tiers, and apply behavioral support
strategies. However, many teachers and school leaders reported limited preparation in these
areas. Newell et al. (2021) found that insufficient professional development in data-based
decision-making and intervention design was one of the most cited barriers to MTSS fidelity.
Similarly, Freeman-Green et al. (2021) highlighted that when teachers were not explicitly trained
to integrate culturally responsive practices within MTSS, inequities in referrals and supports
could persist. Ongoing coaching, collaborative data meetings, and targeted training in both
academic and behavioral interventions were essential; without them, MTSS risked becoming a
procedural checklist rather than a robust instructional framework.
Data Integration and Infrastructure
MTSS depended on timely, accurate data across academic, behavioral, and social-emotional domains. In practice, however, schools often struggled with fragmented data systems.
Academic data might reside in one platform, behavior incidents in another, and SEL screeners in
a third. Managing and synthesizing these multiple data sources was time-consuming and
technically challenging, particularly for schools without dedicated data support personnel
(McIntosh & Goodman, 2016).
These integration issues were amplified in virtual and hybrid learning environments,
where engagement metrics (e.g., log-ins, assignment completion, time-on-task) also became
critical indicators. Evans et al. (2022) found that many online programs lacked unified
dashboards that brought together academic performance, attendance, and behavioral indicators,
which made it difficult for MTSS teams to form a complete picture of student needs or
intervention effects. As a result, tier placement and monitoring decisions were often delayed or
based on incomplete information.
Consistency, Fidelity, and Sustainability
Even when MTSS was initially implemented with enthusiasm, maintaining consistency
and fidelity over time posed another major challenge. Without clear leadership structures,
ongoing coaching, and systemwide communication, MTSS efforts often lost momentum or
became compliance-driven, focusing more on completing forms than on improving instruction
(McIntosh & Goodman, 2016). Turnover in leadership or key MTSS champions exacerbated this
problem; new initiatives emerged, and MTSS was sometimes deprioritized or fragmented.
Research also indicated that fidelity tended to be higher at Tier 1 than at Tiers 2 and 3, where
interventions were more intensive and complex to coordinate (Burns et al., 2020). Freeman-Green et al. (2021) further noted that sustainability was closely tied to equity; when MTSS was
not maintained as a continuous improvement framework, historically marginalized students
might once again experience inconsistent access to support. Long-term success, therefore,
depended on embedding MTSS into district policies, leadership practices, and professional
learning systems rather than treating it as a short-term initiative.
In online programs, these challenges were amplified by reduced opportunities for real-time observation and the need to translate traditional face-to-face practices into digital formats.
According to Guest et al. (2024), nearly half of virtual teachers reported difficulty determining
when students were disengaged or experiencing academic challenges, and this lack of immediate
feedback made it harder to initiate the early, preventative supports MTSS required. Addressing
these barriers required intentional design and collaboration across all levels of the system.

Culturally Sustaining Practices
Culturally Sustaining Practices (CSP) provided a critical equity-focused lens within
MTSS, ensuring that academic and behavioral supports honored, sustained, and elevated the
cultural identities, languages, and community knowledge of all students. Rooted in the
foundational work of Paris and Alim (2017), CSP moved beyond traditional approaches to
cultural responsiveness by asserting that students’ cultural ways of being were not simply assets
to acknowledge; rather, they were resources that needed to be continually supported and
expanded within instructional and intervention frameworks. As schools and cyber programs
worked to address diverse learner needs, CSP offered a powerful complement to MTSS by
promoting identity-affirming instruction, preventing biased referral patterns, and strengthening
engagement across cultural contexts.
Culturally Sustaining Practices within MTSS
Equity was a central tenet of MTSS, yet many implementations focused primarily on data
and logistics rather than cultural responsiveness. Integrating Culturally Sustaining Practices
(CSP) ensured that interventions honored students’ identities, languages, and communities (Paris
& Alim, 2017). CSP within MTSS took several forms.
Designing Interventions That Are Linguistically and Culturally Relevant
One of the core applications of CSP within MTSS involved designing instructional and
intervention practices that affirmed students’ cultural and linguistic identities. Paris and Alim
(2017) emphasized that CSP moved beyond mere cultural responsiveness by sustaining students’
home languages, communication patterns, and cultural knowledge as essential components of
instruction. Within an MTSS framework, this meant that Tier 1 curriculum materials, examples,
texts, and problem-solving tasks reflected students’ cultural backgrounds, while Tier 2 and Tier 3
interventions adapted instructional language and examples so they were accessible and
meaningful for multilingual learners and culturally diverse students.
Research showed that when interventions integrated culturally relevant materials, such as
bilingual texts, community-based examples, or culturally familiar narrative structures, students
demonstrated higher engagement and improved reading and math outcomes (Aronson &
Laughter, 2016; Gay, 2018). For instance, Freeman-Green et al. (2021) found that MTSS
interventions incorporating culturally sustaining language scaffolds resulted in stronger Tier 2
responsiveness among multilingual learners. These adaptations ensured that interventions did not
unintentionally alienate students who might already be marginalized within traditional academic
systems.
Including Family and Community Voices in Data Discussions and Intervention Planning
CSP emphasized that families and communities were essential partners in the educational
process rather than peripheral stakeholders. Within MTSS, this principle translated into
involving families in goal setting, intervention selection, progress monitoring discussions, and
data-based decision making. Research consistently showed that meaningful family engagement
improved academic achievement, attendance, and behavior outcomes, particularly for students
from historically marginalized groups (Jeynes, 2018). From a CSP perspective, this meant
schools created space for families to share cultural knowledge, communication norms, and
contextual insights that might shape how students learned or responded to interventions. Castro-Olivo (2014) found that culturally attuned collaboration with families led to improved SEL
outcomes for Latinx students receiving Tier 2 supports. MTSS teams that incorporated family
voice also reduced cultural misunderstandings in behavior referrals, improved intervention
match, and fostered greater trust between schools and communities (Henderson & Mapp, 2002).
In cyber settings, incorporating family voice included virtual data conferences,
multilingual communication tools, and opportunities for caregivers to shape intervention plans
based on cultural norms at home.
Providing Implicit Bias Training
Implicit bias training was essential in MTSS because Tier 2 and Tier 3 referral patterns
were often influenced by teachers’ perceptions, which might unintentionally reflect cultural bias.
Research repeatedly documented disparities in referrals for behavior supports, special education,
and discipline, particularly for Black, Latinx, Indigenous, and multilingual learners (Skiba et al.,
2011). Culturally sustaining MTSS required educators to examine discipline data by subgroup,
analyze disproportionality, and adapt behavioral expectations to reflect varied cultural norms for
communication, emotional expression, and social interaction. Gregory et al. (2016) found that
schools implementing culturally responsive behavioral supports experienced reductions in
exclusionary discipline and improvements in teacher-student relationships.
Professional development that helped educators identify implicit bias, use strength-based
language, and interpret behavior within cultural context contributed to more accurate Tier 2
identification and equitable access to supports (McIntosh & Goodman, 2016). When educators
were trained to recognize cultural variability in classroom behavior, MTSS became a more
equitable framework, reducing the over-referral of students of color to intensive support.
Embedding Social Emotional Learning
CSP within MTSS also included embedding social-emotional learning (SEL) curricula
that reflected students’ cultural identities, lived experiences, and community values. Traditional
SEL programs often relied on norms rooted in dominant cultural perspectives, which could feel
misaligned for students from diverse backgrounds (Jagers et al., 2019). Culturally sustaining
SEL created explicit space for multiple ways of expressing emotions, communicating, resolving
conflict, and building community. For example, Jagers, Rivas-Drake, and Borowski (2018)
argued that SEL needed to incorporate sociopolitical awareness, cultural identity development,
and critical consciousness to truly support students emotionally and behaviorally. Schools that
embedded CSP within SEL (such as using storytelling from students’ cultures, community-based
mentors, or restorative practices rooted in cultural traditions) demonstrated stronger SEL
outcomes, reduced behavior incidents, and increased student engagement (Romero et al., 2019).
Within MTSS, culturally sustaining SEL strengthened Tier 1 behavioral expectations, improved
the accuracy of Tier 2 referrals, and provided culturally attuned strategies for Tier 3 emotional or
behavioral supports.
Freeman-Green et al. (2021) found that MTSS frameworks that explicitly integrated CSP
showed improved engagement and decreased discipline disparities. In their systematic review,
Freeman-Green et al. examined how culturally responsive and culturally sustaining
practices were embedded within MTSS across academic, behavioral, and social-emotional
domains. Their analysis included 36 empirical and conceptual studies in which MTSS
frameworks integrated CSP-related elements such as culturally relevant instruction, family
engagement, identity-affirming behavioral supports, and data-based decision making that
explicitly considered cultural context. The review found that MTSS frameworks integrating CSP
typically included several shared practices:
Culturally Sustaining Tier 1 Instruction
Schools implementing CSP-infused MTSS modified Tier 1 instruction to ensure that
curriculum materials, examples, and discourse patterns reflected students’ cultural backgrounds.
Teachers incorporated community stories, home languages, and culturally meaningful contexts
into core instruction. Freeman-Green et al. found that when Tier 1 became more culturally
sustaining, fewer students, particularly multilingual learners and students of color, were
inappropriately referred to Tier 2 or Tier 3 supports.
Equity-Oriented Tier 2 and Tier 3 Decision-Making
The study highlighted how MTSS teams that used CSP principles did not rely solely on
universal screening scores for Tier 2 decisions. Instead, they contextualized data using students’
cultural and linguistic backgrounds, ensuring that differences in dialect, language acquisition, or
cultural communication norms were not misinterpreted as deficits. This approach reduced biased
referrals to intensive interventions and special education.
Integration of Family and Community Knowledge
CSP-oriented MTSS systems incorporated structured opportunities for families to
contribute cultural insight during team meetings, intervention planning, and progress monitoring
conversations. Freeman-Green et al. reported that when schools systematically included
caregiver voice in MTSS processes, intervention match improved, and student engagement in
both academic and behavioral supports increased.
Behavior Supports Rooted in Cultural Context
In schools where PBIS was adapted using CSP principles, behavioral expectations were
co-created with students and families, acknowledging cultural differences in communication,
interaction, and community norms. Teachers were trained to interpret behaviors through a
cultural lens, reducing misinterpretations that often led to disproportionate discipline. Freeman-Green et al. found that these CSP-informed PBIS models significantly decreased discipline
disparities, particularly for Black and Latinx students.

Professional Development on Bias, Identity, and Culture
A key finding from Freeman-Green et al. was that MTSS systems integrating CSP
provided ongoing professional development focused on implicit bias, cultural identity
development, and asset-based views of student behavior. These trainings helped teachers
distinguish between cultural differences and actual behavioral concerns, leading to more accurate
Tier 2 behavioral referrals. Freeman-Green et al. (2021) found that MTSS frameworks explicitly
integrating culturally sustaining practices, such as culturally aligned Tier 1 instruction, equity-focused referral processes, family-centered decision making, and culturally grounded behavioral
supports, demonstrated improved academic engagement, stronger intervention responsiveness,
and decreased discipline disparities.
For cyber schools, where students learned from varied geographic and cultural
backgrounds, maintaining culturally sustaining approaches was essential for fostering connection
and belonging. Virtual spaces could also be leveraged creatively through multilingual resources,
culturally responsive content, and flexible communication methods to ensure inclusion and
equitable access to support.
MTSS in Cyber and Hybrid Schools
The literature on MTSS in online education was emerging but remained limited. Existing
studies highlighted both persistent gaps and emerging opportunities.
Implementation Trends

Research showed that virtual schools were adapting MTSS elements, such as tiered
academic interventions and data dashboards, but often lacked full integration of behavioral and
social-emotional components (Evans et al., 2022). Collaboration across remote staff remained
inconsistent, and intervention tracking tools varied widely among programs.
Barriers

Online learning introduced both obstacles and advantages for MTSS. Barriers included
inconsistent student attendance, communication delays, and limited visibility into nonacademic
needs.
Resource Constraints
Resource constraints remained one of the most pervasive barriers to MTSS
implementation across both traditional and virtual settings. In cyber environments, schools often
faced limited personnel to provide Tier 2 and Tier 3 supports, restricted funding for intervention
programs, and insufficient time built into virtual schedules for collaboration or intervention
delivery. These issues directly undermined MTSS fidelity because Tier 2 and Tier 3 support
required increased instructional intensity, frequent progress monitoring, and consistent
intervention blocks that were often difficult to sustain without adequate staffing. Burns et al.
(2020) found that rural, high-poverty, and virtual schools were significantly more likely to report
shortages of interventionists, making it difficult to deliver timely supplemental instruction.
Inadequate Training and Professional Development
Effective MTSS implementation depended heavily on educators’ ability to interpret data,
design interventions, and integrate behavioral supports. However, many educators reported
limited preparation in these areas, especially in virtual contexts where engagement metrics, SEL
indicators, and attendance data also needed to be analyzed. Newell et al. (2021) identified
inadequate professional development as one of the most significant impediments to MTSS
fidelity, noting that teachers often lacked training in progress monitoring and culturally
responsive intervention design. In virtual schools, this gap widened because teachers also had to
learn how to deliver interventions through tele-instruction and interpret data from digital learning
platforms.
Data Integration and Fragmented Information Systems
MTSS depended on efficient data collection, analysis, and interpretation across academic,
behavioral, SEL, and attendance indicators. Virtual schools often used multiple platforms (LMS
systems, behavior trackers, benchmark assessments, and communication logs) that were not
integrated into a single dashboard. This fragmentation created delays in decision making and
increased the risk of incomplete or inaccurate tier placements. McIntosh and Goodman (2016)
found that schools with fragmented data systems exhibited lower MTSS fidelity because staff
lacked a unified view of student needs. In virtual environments, Evans et al. (2022) noted that
teachers struggled to triangulate engagement data, screeners, and communication records,
complicating the identification of Tier 2 needs.
Inconsistent Engagement and Attendance
Student engagement was a core early-warning indicator in MTSS, but it was uniquely
volatile in virtual settings. Students might fail to log in, leave sessions early, remain off-camera,
or fail to submit work, making it difficult for teachers to gauge learning readiness or behavioral
concerns. Guest et al. (2024) reported that nearly half of virtual teachers experienced difficulty
identifying disengagement, which led to delays in Tier 2 interventions. Turnaround for Children
(2020) similarly found that inconsistent engagement disrupted the MTSS data cycle because
attendance and task-completion metrics were often the first indicators of academic risk online.

Limited Visibility into Non-Academic Needs
Unlike traditional environments where teachers observed students daily, virtual teachers
had limited visibility into students' emotional well-being, home life, or environmental stressors.
This lack of insight constrained Tier 1 SEL support and could lead to under-identification of
students needing Tier 2 behavioral or emotional interventions. Evans et al. (2022) found that
virtual teachers expressed lower confidence in recognizing signs of distress or behavioral
escalation in online learning spaces. This gap increased the risk that students’ nonacademic
needs went unaddressed until they manifested as academic failure or disengagement.
Communication Delays Between Students, Families, and Teachers
MTSS relied on rapid, clear communication to coordinate supports, but virtual learning
introduced delays due to asynchronous messaging, email backlogs, and inconsistent caregiver
availability. Communication lags made it harder to deliver timely interventions, coordinate Tier
2 plans, and maintain fidelity. Chow et al. (2020) highlighted that communication breakdowns
impeded progress monitoring because teachers could not promptly clarify misunderstandings or
adjust instruction. This led to slower intervention cycles and missed opportunities for early
support.
Consistency, Fidelity, and Sustainability Challenges
Even when MTSS was implemented effectively, maintaining fidelity over time remained
challenging. Schools might lose momentum without leadership stability or continuous
professional development. McIntosh and Goodman (2016) documented that fidelity was
particularly difficult to maintain in Tier 2 and Tier 3, where interventions were more complex.
Virtual schools faced additional challenges due to staff turnover, inconsistent professional
development participation, and the need for constant adaptation of virtual materials.

Opportunities
However, digital platforms also provided unique opportunities for personalization and
responsiveness that could strengthen MTSS implementation in virtual contexts. Automated
alerts, progress dashboards, and predictive analytics enabled teachers to identify disengagement
patterns earlier and intervene more efficiently. Many systems integrated embedded SEL modules
and real-time engagement metrics, allowing educators to monitor academic, behavioral, and
affective indicators simultaneously (Guest et al., 2024). These tools, when used with fidelity,
supported timely Tier 1 and Tier 2 responses and increased the visibility of student needs in
environments where traditional cues were less observable (Guest et al., 2024).
Cyber programs also needed to proactively address persistent inequities in technology
access, internet connectivity, device reliability, and home learning conditions. Research
continued to show that digital learning environments could unintentionally widen gaps for
students from historically marginalized groups when these structural factors were not
intentionally addressed (Borup et al., 2020; Rice et al., 2021). Integrating culturally sustaining
practices (CSP) within virtual MTSS frameworks helped mitigate these disparities by centering
students’ lived experiences, linguistic backgrounds, and community identities. CSP-aligned
MTSS systems promoted flexible pathways for instruction, honored multiple ways of
demonstrating learning, and created relational practices that built belonging even when students
were not physically present. When virtual MTSS intentionally incorporated CSP, it expanded
opportunities for academic access and emotional support, particularly for students who might
otherwise remain invisible in online settings.

Overall, the literature emphasized that MTSS could succeed in cyber schools when it was
systematically adapted, data-rich, and collaboratively maintained rather than simply transplanted
from brick-and-mortar models. Research showed that while tiered frameworks such as MTSS,
RTI, and PBIS remained valuable in virtual environments, they required intentional restructuring
to account for differences in communication, engagement, and progress monitoring (Institute of
Education Sciences [IES], 2021). Studies of virtual Tier 2 and Tier 3 systems similarly noted that
many core in-person MTSS components could be transferred online, but only when digital-specific tools, workflows, and supports were used to deliver interventions effectively (Hanover
Research, 2020). Broader MTSS scholarship also reinforced that high-quality implementation, across physical or virtual settings, depended on ongoing collaboration, shared ownership, and
systematic use of academic and behavioral data to guide decisions (Great Schools Partnership,
2023). Collectively, these sources underscored that virtual MTSS thrived not by replicating
traditional methods but by adapting them to the affordances and constraints of online learning.
Summary
The literature revealed that while MTSS was a proven framework for improving
academic and behavioral outcomes, its translation to cyber education remained incomplete.
Implementation Science provided a pathway for fidelity and sustainability, while Culturally
Sustaining Pedagogy ensured that MTSS remained equitable and inclusive. The intersection of
these theories offered the foundation for this applied research study.
This chapter synthesized historical, theoretical, and practical perspectives on MTSS and
identified key gaps, including the need for:
• Culturally responsive adaptations within virtual contexts
• Consistent data integration across online systems
• Structured collaboration for remote educators and families
Chapter Three outlined the methods used in this applied research study, including the
design, participants, data collection procedures, and analysis plan for examining MTSS
implementation in a K-12 cyber program.

Chapter 3
Methods Overview
This chapter outlined the quantitative design, setting, participants, instrumentation, data
collection procedures, variables, and statistical analyses used to examine academic growth within
the Multi-Tiered System of Supports (MTSS) framework in a K-12 cyber program. The study
investigated student growth on the STAR Benchmark assessment from August to January and
compared outcomes across two WIN (What I Need) intervention modalities: in-person WIN
sessions and virtual WIN sessions delivered via Microsoft Teams. Ethical considerations,
reliability, and validity were also addressed.
Research Design
This study employed a quantitative, quasi-experimental design with:
• A repeated measures component evaluating student growth between two time points (August vs. January STAR scores), and
• A comparative component analyzing differences between two naturally occurring groups (in-person WIN vs. virtual WIN).

A quasi-experimental design was appropriate because students were not randomly assigned to
intervention groups; instead, they participated based on scheduling, family preference,
transportation, or staffing availability. This design was common in applied educational settings
where randomization was not feasible, but empirical evaluation was needed.
Two statistical procedures were used:
1. A paired-samples t-test measured significant growth from August to January for the
entire sample.
2. An independent-samples t-test compared mean STAR growth between in-person
WIN and virtual WIN students.
These quantitative techniques aligned directly with the research questions and allowed the
researcher to determine whether MTSS structures and intervention modalities supported
measurable academic improvement.
Research Questions
RQ1:
To what extent did students enrolled in the cyber program demonstrate academic growth on the
STAR Benchmark assessment from August to January?
H1₀: There was no statistically significant difference in STAR scores from August to January.
H1ₐ: Students demonstrated significant growth in STAR scores from August to January.
RQ2:
Was there a statistically significant difference in STAR Benchmark growth between students
receiving in-person WIN support and those receiving virtual WIN support?
H2₀: There was no difference in growth between in-person WIN and virtual WIN groups.
H2ₐ: Students who received in-person WIN support demonstrated greater growth than students
who received virtual WIN support.

Setting
The study took place in a large Pennsylvania-based K-12 cyber program within a public school
district. The program served students across multiple partner districts and offered a blend of
asynchronous coursework and synchronous support sessions. MTSS structures included
universal screening, weekly WIN time, and targeted interventions aligned with academic needs.
WIN was delivered in two modalities:
• In-person WIN: Small-group academic intervention at designated physical sites
• Virtual WIN: Intervention delivered synchronously via Microsoft Teams

The STAR Benchmark assessment was administered universally three times per year and served
as the program’s primary academic screening tool.
Participants
Population
The target population included approximately 50 K-6 students enrolled in the cyber program
who completed both the August (fall) and January (winter) STAR Benchmark assessments.
Sample
A convenience sample was used consisting of students who:
• Completed both assessments, and
• Participated in WIN intervention (either in-person or virtual).

Students who did not participate in WIN or who had missing assessment data were excluded to
ensure accurate measurement of growth and modality effects.
Group Definitions
In-person WIN group: Students who received face-to-face intervention
Virtual WIN group: Students who received intervention via Microsoft Teams

Final sample sizes were reported in the results chapter.
Instrumentation
STAR Benchmark Assessment
Renaissance STAR Reading and STAR Math were computer-adaptive assessments
developed by Renaissance Learning and were widely used in K-12 settings for universal
screening, progress monitoring, and growth measurement. The assessments adjusted item
difficulty in real time based on student responses, allowing for precise measurement of
achievement with fewer test items.
Both assessments were administered online through Renaissance’s secure platform using
district-approved devices. Students completed the assessments independently in both virtual and
in-person settings, making STAR appropriate for cyber learning environments.
STAR Reading assessed English Language Arts skills, including reading comprehension and
vocabulary. STAR Math assessed mathematical skills such as number operations, algebraic
reasoning, geometry, and data analysis. The assessments measured reading and mathematics only
and did not assess science or social studies content.
Although STAR assessments were not globally timed, each item had an embedded time
limit to support test validity. Total administration time varied by student but typically ranged
from 20 to 30 minutes. All items were multiple choice, and no constructed-response items were
included.
Reliability and Validity
According to the Renaissance Technical Manuals, STAR assessments demonstrated strong
psychometric properties:
• Internal consistency reliability: Cronbach’s alpha ranged from .86 to .92
• Test-retest reliability: approximately r = .80
• Validity: STAR demonstrated strong concurrent and predictive validity with state assessments
STAR assessments were appropriate for this study due to their vertically scaled scores,
which allowed performance to be measured across grade levels and over time on a continuous
developmental scale. This feature supported accurate growth analysis and aligned with MTSS
progress monitoring practices (Renaissance, 2023).
STAR assessments were administered using standardized digital procedures across
instructional settings. Renaissance specified that administration conditions, adaptive algorithms,
and scoring procedures remained consistent regardless of student location, supporting score
comparability across virtual and in-person contexts (Renaissance, 2023).
Additionally, STAR aligned with Pennsylvania Department of Education MTSS
guidelines for universal screening, including reliability, validity, and efficiency requirements
(PDE, 2023).
Variables
The independent variable for this study was WIN (What I Need) intervention modality,
which represented the format through which targeted instructional supports were delivered to
students within the MTSS framework. This variable was operationalized as a dichotomous
indicator, distinguishing between virtual and in-person delivery models. Students assigned a
value of 0 received WIN interventions delivered virtually through synchronous online platforms,
such as Microsoft Teams, while students assigned a value of 1 participated in WIN interventions
delivered in person within a physical school setting. This operational definition allowed for clear
comparison between intervention modalities and supported quantitative analysis of differences in
student growth outcomes associated with the mode of WIN intervention delivery.
Dependent Variable
STAR Growth Score
Calculated as:
January STAR Score - August STAR Score
Student growth for this study was calculated as the difference between assessment
scores across two time points, specifically by subtracting the August STAR score from the
January STAR score. This calculation provided a direct measure of academic growth over the
designated instructional period and allowed for comparison of student progress across intervention
modalities. By focusing on change over time rather than absolute performance, this approach
accounted for individual starting points and aligned with MTSS practices that emphasize growth
monitoring and responsiveness to intervention.
Control Variables
Depending on data availability, several control variables may be included in the analysis
to account for factors that could influence student outcomes independent of the WIN intervention
modality. These covariates may include grade level, which helps control for developmental and
curricular differences across grades, and baseline performance level, which accounts for initial
achievement prior to intervention. Attendance may also be included, as consistent participation
in instructional and intervention activities can affect student growth. Additionally, demographic
characteristics, such as multilingual learner (MLL) status and special education status, may be
incorporated to control for variability related to student support needs and to support more equitable
interpretation of results. Including these variables, when accessible, strengthens the validity of
the analysis by reducing potential confounding effects and providing a more accurate estimate of
the relationship between WIN intervention modality and student growth.
If accessible, the following may be included as covariates:
• Grade level
• Baseline performance level
• Attendance
• Demographic characteristics
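Although the analyses specified for this study were t-tests, a covariate-adjusted model is one plausible way such control variables could be incorporated if the data allowed. The brief sketch below is illustrative only: the column names (growth, win_modality, grade_level, august_score) and the data file are hypothetical, and the OLS/ANCOVA-style model is an assumption rather than the study's stated procedure.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical de-identified dataset; file name and column names are illustrative only.
df = pd.read_csv("star_growth_deidentified.csv")

# Growth regressed on WIN modality (0 = virtual, 1 = in-person) while controlling
# for grade level and baseline (August) performance; attendance or demographic
# indicators could be added to the formula in the same way when available.
model = smf.ols("growth ~ C(win_modality) + C(grade_level) + august_score",
                data=df).fit()
print(model.summary())
```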

Data Collection
Universal benchmarking was a foundational component of this study and served as the
primary mechanism for measuring student achievement and growth within the MTSS
framework. All students included in the study completed the STAR universal benchmark
assessment during two district-established assessment windows: once in August at the beginning
of the school year and again in January at the conclusion of the fall instructional period. These
benchmarking windows were consistent across both groups in the study-students participating in
virtual WIN interventions and those participating in in-person WIN interventions-ensuring
uniformity in timing, expectations, and assessment conditions.
The STAR assessments were administered digitally using standardized procedures
established by the school district. Students assigned to the virtual WIN group completed the
assessments remotely through secure online access, while students assigned to the in-person
WIN group completed the assessments on district devices within a supervised school setting.
Although the physical testing environments differed, the assessment itself remained identical
across groups, including item pools, adaptive algorithms, timing, scoring procedures, and
reporting metrics. All students received the same directions and completed the assessments
independently, consistent with STAR administration guidelines. This standardized administration
allowed for valid comparison of results across both instructional modalities.
Benchmark results were used by educators to inform WIN (What I Need) instructional
planning and MTSS decision-making at all tiers. At Tier 1, STAR data were reviewed to
evaluate the effectiveness of core instruction and to identify students who required additional
support. At Tier 2, students identified through benchmark scores as needing targeted intervention
were assigned to WIN supports based on skill gaps identified in STAR domain-level data. At
Tier 3, students demonstrating significant academic risk received more intensive, individualized
interventions, with STAR data serving as one of multiple data points used to guide instructional
decisions. While the structure and goals of WIN remained consistent across both groups, the
primary difference lay in the delivery modality: virtual WIN interventions were delivered
synchronously through online platforms, while in-person WIN interventions occurred face-to-face within the school building. The content, focus on targeted skills, frequency of intervention,
and progress monitoring expectations remained aligned across modalities to the greatest extent
possible.
Data for this study were collected through district assessment systems following
established data governance protocols. Only de-identified student data were extracted for
research purposes, and no personally identifiable information was included in the dataset
provided to the researcher. Each student was assigned a unique study identifier to allow for
longitudinal analysis while preserving confidentiality. All data were stored on secure, password-protected district systems, and access was limited to authorized personnel. Data analysis focused
on comparing growth outcomes between August and January using STAR scale scores, with
analyses conducted at the group level rather than the individual level to further protect student
privacy. Findings were reported only in aggregate form, ensuring that no individual student,
teacher, or subgroup could be identified. Through these procedures, the study maintained strong
ethical protections while leveraging universal benchmarking data to examine the effectiveness of
WIN intervention modalities within a cyber MTSS framework.
WIN Assignment
Students identified as requiring targeted academic support through the universal
benchmarking and MTSS review process were assigned to a WIN (What I Need) intervention
group based on instructional need and program structure. Following analysis of August STAR
benchmark data, educators and intervention teams reviewed student performance, including
overall scale scores and domain-specific skill data, to determine eligibility for targeted supports
beyond core instruction. Students meeting criteria for Tier 2 or Tier 3 intervention were then
assigned to one of two WIN intervention modalities: in-person WIN or virtual WIN delivered via
Microsoft Teams.
Assignment to WIN modality was determined by the student’s instructional program of
enrollment and access to in-person services rather than by student choice or academic
performance alone. Students participating in the in-person model received targeted WIN
instruction within the school building during designated WIN blocks, allowing for face-to-face
interaction with instructional staff. Students participating in the virtual model received WIN
instruction synchronously through Microsoft Teams, where interventionists provided live, small-group or individualized support using digital instructional resources. In both modalities, WIN
sessions were scheduled consistently and focused on addressing specific skill deficits identified
through STAR data and other supporting measures.

Although the mode of delivery differed, the purpose, structure, and expectations of WIN
remained consistent across both groups. All students received targeted instruction aligned to their
identified needs, followed similar intervention timelines, and were monitored using the same
assessment tools and data review processes. This parallel structure ensured that differences in
student outcomes could be examined in relation to intervention modality while maintaining
consistency in instructional intent, intensity, and MTSS alignment.
Data Extraction
The researcher extracted anonymized STAR scores from the district’s data warehouse.
Data Cleaning
a. Removing missing or incomplete cases
b. Ensuring matching August to January scores
c. Verifying correct group assignment
d. Checking for outliers
Computation of Growth Scores
Growth = January score minus August score for each student.
Statistical Analysis
Conducted using SPSS or Excel-based data tools.
Data Extraction, Cleaning, and Analysis Procedures
This applied internal improvement study relied exclusively on existing academic
performance data collected through routine instructional and assessment practices; no data were
collected directly from students for research purposes. Students completed the STAR universal
benchmark assessment during district-established windows in August and January as part of
standard MTSS procedures. These assessments were administered digitally in both instructional
models. Students assigned to the virtual WIN group completed the assessment remotely using
secure district-approved platforms, while students assigned to the in-person WIN group
completed the assessment on district devices in a supervised school setting. The assessment
content, administration procedures, adaptive algorithms, and scoring processes were identical
across both groups, ensuring comparability of results. Benchmark data were used instructionally
to guide WIN placement, skill grouping, and intervention planning at all tiers, independent of the
research study.
Following the completion of the January benchmarking window, the researcher extracted
anonymized STAR assessment data from the district’s secure data warehouse. Prior to researcher
access, all personally identifiable information was removed by the district in accordance with
data governance policies. The dataset included only variables necessary for analysis, such as
assessment scores, testing windows, grade level, WIN modality assignment, and approved
covariates when available. Each student was assigned a unique, non-identifiable study code to
allow matching of August and January records while preserving confidentiality. Data were stored
and accessed exclusively on password-protected district systems, and no data were downloaded
to personal devices or shared externally.
Once extracted, the dataset underwent a systematic data cleaning process to ensure
accuracy and integrity. This process included removing cases with missing or incomplete
assessment records, verifying that each included student had a matched August and January
STAR score, and confirming correct assignment to either the virtual or in-person WIN group
based on district records. Additional checks were conducted to identify potential data entry errors
or statistical outliers that could distort growth analyses. Decisions regarding data inclusion or
exclusion were documented to maintain transparency and replicability.
Student growth was calculated using a consistent and clearly defined formula: January
STAR scale score minus August STAR scale score for each student. This approach produced an
individual growth score that reflected progress over the fall instructional period while accounting
for baseline performance. Growth scores were then aggregated and analyzed at the group level to
compare outcomes between students receiving virtual WIN interventions and those receiving in-person WIN interventions. Analyses incorporated control variables, when available, to account
for grade-level differences, baseline achievement, attendance, or identified student support
needs.
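As a concrete illustration of the cleaning and growth-score steps described above, the following sketch assumes a de-identified extract with hypothetical column names (study_id, window, scale_score, win_modality) and a hypothetical file name; the study itself carried out these steps in SPSS or Excel-based tools.

```python
import pandas as pd

# Hypothetical de-identified extract: one row per student per testing window.
df = pd.read_csv("star_extract_deidentified.csv")

# Keep only students with a matched August and January record.
wide = (df.pivot_table(index=["study_id", "win_modality"],
                       columns="window", values="scale_score")
          .reset_index()
          .dropna(subset=["August", "January"]))

# Growth = January STAR scale score minus August STAR scale score.
wide["growth"] = wide["January"] - wide["August"]

# Flag growth scores more than three standard deviations from the mean for review.
z = (wide["growth"] - wide["growth"].mean()) / wide["growth"].std(ddof=1)
wide["flag_outlier"] = z.abs() > 3

# Group-level summary by WIN modality (0 = virtual, 1 = in-person).
print(wide.groupby("win_modality")["growth"].describe())
```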
Statistical analyses were conducted using SPSS or Excel-based data analysis tools,
depending on district access and analytical needs. Descriptive statistics were used to summarize
group characteristics and growth patterns, and comparative analyses were conducted to examine
differences in growth between WIN modalities. Throughout the analysis process, results were
reported only in aggregate form. No individual student, educator, or subgroup was identified in
any reports or findings.
This study was designed as an applied internal improvement effort using existing de-identified academic data; it presented minimal risk to participants. Student identities were
protected through anonymization, secure data storage, and restricted access, and participation in
the study had no impact on instructional decisions, grades, placement, or services. These
procedures ensured ethical compliance with district data governance standards, FERPA
requirements, and IRB expectations for confidentiality and data protection.

Data Analysis
Analysis for RQ1: Student Growth
A paired samples t-test was used to compare:
• Mean STAR score in August
• Mean STAR score in January
This test determined whether the change in scores was statistically significant.

Analysis for RQ2: Modality Comparison
An independent samples t-test was used to compare:
• Mean STAR growth for in-person WIN students
• Mean STAR growth for virtual WIN students

Assumptions
Both analyses were checked for:
• Normality
• Homogeneity of variance
• Independence of observations

If assumptions were violated, appropriate nonparametric tests or adjusted procedures (e.g.,
Welch’s t-test) were used.
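For illustration only, the sketch below shows how these two tests and the associated assumption checks could be reproduced with open-source tools, reusing the hypothetical de-identified extract and column names introduced earlier; the analyses reported in this study were run in SPSS or Excel.

```python
import pandas as pd
from scipy import stats

# Rebuild the hypothetical matched dataset (see the earlier sketch for details).
wide = (pd.read_csv("star_extract_deidentified.csv")
          .pivot_table(index=["study_id", "win_modality"],
                       columns="window", values="scale_score")
          .reset_index()
          .dropna(subset=["August", "January"]))
wide["growth"] = wide["January"] - wide["August"]

# RQ1: paired samples t-test comparing August and January scores for the full sample.
t_rq1, p_rq1 = stats.ttest_rel(wide["January"], wide["August"])

# RQ2 groups: 0 = virtual WIN, 1 = in-person WIN (per the study's coding scheme).
virtual = wide.loc[wide["win_modality"] == 0, "growth"]
in_person = wide.loc[wide["win_modality"] == 1, "growth"]

# Assumption checks: normality (Shapiro-Wilk) and homogeneity of variance (Levene).
print(stats.shapiro(virtual), stats.shapiro(in_person))
print(stats.levene(virtual, in_person))

# RQ2: independent samples t-test on growth; equal_var=False yields Welch's t-test,
# the adjusted procedure noted above for unequal variances.
t_rq2, p_rq2 = stats.ttest_ind(in_person, virtual, equal_var=False)
print(f"RQ1: t = {t_rq1:.2f}, p = {p_rq1:.3f}")
print(f"RQ2: t = {t_rq2:.2f}, p = {p_rq2:.3f}")
```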
Ethical Considerations
This study relied exclusively on existing, de-identified student data and did not involve
the collection of any new information from participants. At no point were student names,
identification numbers, dates of birth, addresses, or any other personally identifiable information
(PII) accessed, recorded, or analyzed. Prior to the researcher’s access, all data were stripped of
direct and indirect identifiers by the school district in accordance with established data
governance protocols, ensuring that individual students could not be identified by the researcher
either directly or through data triangulation.
All research data were stored and accessed solely within secure, password-protected
district systems that met institutional data security requirements. Access to these systems was
limited to authorized personnel, and the researcher adhered to district policies regarding data
handling, storage, and confidentiality. No data were downloaded to personal devices, shared
externally, or stored on non-district platforms. Data were used only for the purposes outlined in
this study and were not retained beyond the approved research timeline.
Research procedures complied fully with the school district’s data governance policies as
well as the requirements of the Family Educational Rights and Privacy Act (FERPA). The use of
archival, de-identified data ensured that student privacy was protected and that no individual
student could be singled out or adversely affected as a result of participation in the study.
Additionally, participation in this research was entirely independent of instructional decision-making. The data analyzed were not used to inform grading, placement, intervention eligibility,
or any educational services, and findings were reported only in aggregate form. As a result, the
study posed no risk to students, preserved confidentiality, and maintained clear ethical separation
between research activities and instructional practices.
Approval from the Institutional Review Board (IRB) was obtained prior to analysis.
Trustworthiness, Reliability, and Validity of the Study
Although quantitative research does not rely on qualitative criteria such as “credibility” or “trustworthiness,” the following quantitative reliability and validity principles applied to this study:

Reliability
The assessment data used in this study demonstrated strong reliability based on
established psychometric evidence from the STAR assessment technical documentation.
According to the STAR Reading and STAR Math Technical Manuals published by Renaissance,
the assessments exhibited high internal consistency and test-retest reliability across grade levels,
with reported reliability coefficients generally meeting or exceeding accepted thresholds for
educational decision-making (Renaissance, 2023). These reliability metrics indicated that STAR
assessments produced stable and consistent results over time when measuring student
achievement and growth. In addition, all data extraction procedures for this study followed
standardized, district-documented protocols. Data were pulled using consistent parameters across
time points, ensuring uniformity in reporting windows, student inclusion criteria, and variable
definitions. This standardization further strengthened the reliability of the dataset and reduced
the likelihood of procedural error influencing results.
Internal Validity
Several common threats to internal validity, such as history, maturation, and selection
bias, were minimized through the design of this study. The same standardized assessment tool
(STAR) was used consistently across all measurement points, reducing instrumentation threats
and ensuring comparability of scores over time. Although the comparison groups were naturally
occurring rather than randomly assigned, they were comparable in purpose, as all students were
enrolled in the same cyber program and participated in the same overarching MTSS framework.
Additionally, growth calculations accounted for baseline differences by examining student
progress relative to their initial performance levels rather than relying solely on raw scores. This

approach strengthened causal inferences by focusing on growth trajectories and reduced the
influence of preexisting achievement differences between groups.
External Validity
The generalizability of the findings was intentionally bounded. Results were most
applicable to cyber students participating in similar virtual MTSS structures, particularly those
using adaptive online curricula and operating within virtual programs that included WIN (What I
Need) periods or structured intervention blocks. While these parameters limited broad
generalization to all educational settings, the findings remained relevant to a growing number of
cyber, hybrid, and blended learning programs that were expanding nationally. As virtual
education models continued to scale and refine MTSS implementation, the results of this study
offered practical insights that informed decision-making and program design in comparable
contexts.
Summary
This chapter described the quantitative methodology used to examine academic growth
and the impact of WIN intervention modality within a K-12 cyber MTSS framework. Using
STAR Benchmark data, the study evaluated both within-group growth from August to January
and between-group differences between in-person and virtual WIN interventions. The selected
design, instruments, variables, procedures, statistical analyses, and ethical safeguards aligned
with the study’s purpose of improving equitable MTSS implementation in cyber education.

Chapter 4
Results
Introduction
The purpose of this applied research study was to examine academic growth among
students enrolled in a K-12 cyber program implementing a Multi-Tiered System of Supports
(MTSS) framework. All participants in the study were cyber students; however, they received
intervention support through two different delivery modalities. Some students attended in-person
WIN (What I Need) intervention sessions at the program’s drop-in center, while others
participated in virtual WIN sessions delivered through Microsoft Teams. The study compared the
effectiveness of these two intervention delivery formats. Academic growth was measured using
the Renaissance STAR Benchmark assessment administered at two time points during the
academic year (August and January).
The study addressed the following research questions:
RQ1: To what extent did students enrolled in the cyber program demonstrate academic growth
on the STAR Benchmark assessment from August to January?
RQ2: Was there a statistically significant difference in STAR Benchmark growth between
students receiving in-person WIN support and those receiving virtual WIN support?
To answer the research questions, two statistical procedures were used. First, a paired
samples t-test was conducted to examine overall academic growth across the entire sample of
cyber students by comparing STAR Benchmark scores from the August administration (pretest)
to the January administration (posttest). This analysis determined whether students enrolled in
the cyber program demonstrated statistically significant academic growth over the course of the
semester, regardless of intervention delivery format.

Second, an independent samples t-test was conducted to compare mean growth scores
between the two intervention groups: cyber students who attended in-person WIN (What I Need)
sessions at the drop-in center and cyber students who participated in virtual WIN sessions
delivered through Microsoft Teams. This analysis examined whether differences in academic
growth were associated with the intervention delivery modality.
This chapter presents the results of the statistical analysis. First, the dataset and sample
characteristics are described. Next, descriptive statistics for STAR scores and growth outcomes
are reported. Finally, results are presented by research question.
Description of the Dataset
The dataset consisted of student STAR Benchmark assessment records collected from the
cyber program during the fall semester. Student scores from the August universal screening
window and the January benchmark window were extracted and paired to measure academic
growth over time.
After removing incomplete records and students without both benchmark scores, the final
dataset included N = 28 student records. Students represented multiple grade levels within the
cyber program, including grades K through 6. Students were categorized into two
naturally occurring intervention groups based on their participation in weekly WIN sessions:
In-Person WIN: Students attending regularly scheduled WIN (What I Need) intervention sessions at the district’s Drop-In Center during the school week. These sessions occurred weekly on Wednesdays and provided students with direct, in-person academic support aligned to identified learning needs.
Virtual WIN: Students receiving regularly scheduled WIN intervention support remotely through Microsoft Teams. These sessions also occurred weekly on Wednesdays

and provided targeted academic support in a virtual format aligned to students’ identified
learning needs.
Group placement was determined by scheduling availability, family preference, transportation
access, and program logistics rather than random assignment.
Table 1
Sample Reading Characteristics
Characteristic     n     Percentage
In-Person WIN      16    57.1%
Virtual WIN        12    42.9%
Total Students     28    100%
Grade K-2          11    39.3%
Grade 3-4          7     25.0%
Grade 5-6          10    35.7%

The final dataset included 28 students with matched fall and winter STAR Reading
scores. Of these students, 16 (57.1%) participated in in-person WIN sessions, while 12 (42.9%)
participated in virtual WIN sessions. Students represented grades K through 6, with 39.3% in
grades K-2, 25.0% in grades 3-4, and 35.7% in grades 5-6. No students in grades 7 or 8 were
included in the dataset.

Table 2
Sample Math Characteristics
Characteristic     n     Percentage
In-Person WIN      15    57.7%
Virtual WIN        11    42.3%
Total Students     26    100%
Grade K-2          10    38.5%
Grade 3-4          6     23.1%
Grade 5-6          10    38.5%

The final math dataset included 26 students with matched fall and winter STAR Math
scores. Of these students, 15 (57.7%) participated in in-person WIN sessions, while 11 (42.3%)
participated in virtual WIN sessions. Students represented grades K through 6, with 38.5% in
grades K-2, 23.1% in grades 3-4, and 38.5% in grades 5-6.
Descriptive Statistics
Descriptive statistics were calculated for August STAR scores, January STAR scores, and overall growth scores. Growth scores were calculated by subtracting each student’s August benchmark score from that student’s January benchmark score. Overall, students demonstrated measurable growth between the two testing windows. Tables 3 and 4 provide descriptive statistics for STAR Reading and STAR Math scores across both benchmark administrations.
Descriptive statistics indicated that the mean August STAR Reading score was 1012.7 (SD = 118.6), while the mean January STAR Reading score increased to 1056.9 (SD = 113.2). Students demonstrated an average reading growth of 44.2 points across the assessment period, with individual growth scores ranging from -2 to 238.

Table 3
Descriptive Statistics for Reading STAR Benchmark Scores
Measure                       Mean      Standard Deviation   Minimum   Maximum
August Reading STAR Score     1012.7    118.6                615       1150
January Reading STAR Score    1056.9    113.2                853       1178
Growth Score                  44.2      56.9                 -2        238

Table 4
Descriptive Statistics for Math STAR Benchmark Scores
Measure                    Mean      Standard Deviation   Minimum   Maximum
August Math STAR Score     1001.4    114.8                801       1155
January Math STAR Score    1019.7    118.2                887       1164
Growth Score               18.3      42.6                 -55       111

Descriptive statistics were also calculated separately for each intervention group to compare
baseline performance and growth patterns between students participating in in-person and virtual
WIN sessions.
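Tables 5 and 6 report these group-level statistics. As an illustrative aside, a summary of this kind could be generated with a short grouping script such as the hypothetical sketch below (Python with pandas; not the SPSS/Excel tools used in the study, and with assumed column names).

```python
# Hypothetical sketch: group-level descriptives by WIN modality, mirroring the
# quantities reported in Tables 5 and 6. Assumes the matched DataFrame from the
# earlier growth-score sketch; the study itself used SPSS/Excel.
import pandas as pd

wide = pd.read_csv("star_growth.csv")  # hypothetical matched records

group_stats = (wide
               .groupby("modality")
               .agg(august_mean=("August", "mean"),
                    january_mean=("January", "mean"),
                    growth_mean=("growth", "mean"),
                    growth_sd=("growth", "std")))
print(group_stats.round(1))
```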
Table 5
Descriptive Statistics by Intervention Group – Reading
Group           August Mean   January Mean   Growth Mean   SD Growth
In-Person WIN   1008.9        1056.5         47.6          39.5
Virtual WIN     1017.8        1057.1         39.3          42.8

Figure B1
In-Person vs. Online Reading Growth Comparison

Note. Mean STAR Reading growth scores from August to January are displayed for cyber
students receiving in-person WIN interventions at the Drop-In Center and students receiving
virtual WIN interventions through Microsoft Teams. While students attending in-person WIN
sessions demonstrated slightly higher average growth than those participating in virtual WIN
sessions, the difference between groups was not statistically significant based on the independent
samples t-test.

Table 6
Descriptive Statistics by Intervention Group – Math
Group           August Mean   January Mean   Growth Mean   SD Growth
In-Person WIN   999.5         1018.2         18.7          39.4
Virtual WIN     1004.1        1021.6         17.5          46.5

Figure B2
In-Person vs. Online Math Growth Comparison

Note. Comparison of mean STAR Math growth between students receiving in-person WIN
support and those receiving virtual WIN support. Students participating in in-person WIN
demonstrated slightly higher mean growth (M = 18.7) compared to students receiving virtual
WIN support (M = 17.5), though the difference was not statistically significant.
As shown in Table 6, students receiving in-person WIN support demonstrated a mean growth of 18.7 points in mathematics, while students receiving virtual WIN support demonstrated a mean growth of 17.5 points between the August and January benchmark assessments.
Results by Research Question
Research Question 1
RQ1: To what extent did students enrolled in the cyber program demonstrate academic
growth on the STAR Benchmark assessment from August to January?
To examine overall academic growth, paired samples t-tests were conducted comparing
August and January STAR Benchmark scores for both Reading and Math.
Reading Growth
A paired samples t-test was conducted to examine changes in STAR Reading scores for
the entire sample of students enrolled in the cyber program who participated in the study. This
analysis compared students’ STAR Reading benchmark scores from the August administration
(pretest) to their scores from the January administration (posttest) in order to determine whether
overall academic growth occurred across the assessment period.
Results indicated a statistically significant increase in student scores, t(27) = 4.12, p
< .001, suggesting that the full sample of cyber students demonstrated measurable academic
growth between the two benchmark administrations. On average, students experienced an
increase of 44.2 points in STAR Reading scores across the assessment period. These findings
indicate that, when considered as a whole group, students participating in the cyber program
demonstrated positive academic progress during the first semester of the academic year.
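As an added consistency check (an illustration, not part of the original analysis), the reported t value can be recovered from the descriptive statistics above, since the paired t statistic is the mean difference divided by the standard error of the differences:

\[
t = \frac{\bar{d}}{s_d / \sqrt{n}} = \frac{44.2}{56.9 / \sqrt{28}} \approx \frac{44.2}{10.75} \approx 4.11,
\]

which agrees with the reported t(27) = 4.12 within rounding.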
Table 7
Paired Samples t-Test Results for STAR Reading Growth

Variable            Mean Difference   SD     t      df    p
August vs January   +44.2             56.9   4.12   27    <.001

Math Growth
A paired samples t-test was conducted to examine changes in STAR Math scores for the
entire sample of students enrolled in the cyber program who participated in the study. This
analysis compared students’ STAR Math benchmark scores from the August administration
(pretest) to their scores from the January administration (posttest) to determine whether overall
academic growth occurred across the assessment period.
Results indicated a statistically significant increase in student scores, t(25) = 2.19, p = .038,
suggesting that the full sample of cyber students demonstrated measurable academic growth in
mathematics between the two benchmark administrations. On average, students experienced an
increase of 18.3 points in STAR Math scores across the assessment period. These findings
indicate that, when considered as a whole group, students participating in the cyber program
demonstrated positive academic progress in mathematics during the first semester of the
academic year.
Table 8
Paired Samples t-Test Results for STAR Math Growth
Variable            Mean Difference   SD     t      df    p
August vs January   +18.3             42.6   2.19   25    .038

A paired samples t-test was conducted to examine differences between August and
January STAR Math scores. Results indicated that students demonstrated statistically significant

growth across the benchmark period, t(25) = 2.19, p = .038, with an average increase of 18.3
points.
Research Question 2
RQ2: Was there a statistically significant difference in STAR Benchmark growth between
students receiving in-person WIN support and those receiving virtual WIN support?
Independent samples t-tests were conducted to compare growth outcomes between students
receiving in-person WIN support and those receiving virtual WIN support for both Reading and
Math benchmark assessments.
Reading Results
Students receiving in-person WIN support demonstrated a mean growth score of 47.6
STAR points (SD = 39.5), while students receiving virtual WIN support demonstrated a mean
growth score of 39.3 STAR points (SD = 42.8). An independent samples t-test indicated that the
difference in growth between the two groups was not statistically significant, t(26) = 0.56, p
= .58.
Math Results
Students receiving in-person WIN support demonstrated a mean growth score of 18.7
STAR points (SD = 39.4), while students receiving virtual WIN support demonstrated a mean
growth score of 17.5 STAR points (SD = 46.5). An independent samples t-test indicated no
statistically significant difference in growth between the two groups, t(24) = 0.07, p = .94.
Tables 9 and 10 present the results of the independent samples t-tests for Reading and Math STAR growth, respectively.

Table 9
Independent Samples t-Test for Reading WIN Modality
Group           Mean Growth   SD      t      df    p
In-Person WIN   47.6          39.5    0.56   26    .58
Virtual WIN     39.3          42.8

An independent samples t-test was conducted to compare mean growth scores between
students receiving in-person WIN support and those receiving virtual WIN support. Students
receiving in-person WIN support demonstrated a mean growth score of 47.6 STAR points (SD =
39.5), while students receiving virtual WIN support demonstrated a mean growth score of 39.3
STAR points (SD = 42.8). The difference between groups was not statistically significant, t(26)
= 0.56, p = .58, indicating that growth outcomes were similar across both intervention
modalities.
Table 10
Independent Samples t-Test for Math WIN Modality
Group           Mean Growth   SD      t      df    p
In-Person WIN   18.7          39.4    0.07   24    .94
Virtual WIN     17.5          46.5

An independent samples t-test was conducted to examine differences in STAR Math
growth between students receiving in-person WIN support and those receiving virtual WIN
support. Results indicated no statistically significant difference in growth between the two
groups, t(24) = 0.07, p = .94.

Chapter Summary
This chapter presented the statistical results of the study examining academic growth
within a K-12 cyber program implementing a Multi-Tiered System of Supports (MTSS)
framework. Descriptive statistics were first reported to summarize student STAR Benchmark
average scores and growth between the August and January assessment windows. Paired samples
t-tests were conducted to examine overall academic growth across the sample. Independent
samples t-tests were then used to compare growth outcomes between students receiving in-person WIN support and those receiving virtual WIN support.
The results provided quantitative evidence regarding student growth patterns and
differences between intervention modalities. Interpretation of these findings, along with
discussion of their implications for MTSS implementation in cyber learning environments, will
be presented in Chapter 5.

Chapter 5
Discussion and Implications
Introduction
The purpose of this applied research study was to examine academic growth within a K-12 cyber program implementing a Multi-Tiered System of Supports (MTSS) framework and to
evaluate the effectiveness of two intervention delivery modalities: in-person WIN (What I Need)
sessions and virtual WIN sessions delivered through Microsoft Teams. Student academic growth
was measured using Renaissance STAR Benchmark assessments administered during the August
and January testing windows.

This chapter interprets the findings presented in Chapter 4 and situates them within the
broader body of research on MTSS implementation, intervention systems, and online learning
environments. The chapter begins with a restatement of the problem and the purpose of the
study, followed by a summary of the key findings in accessible language. The results are then
discussed in relation to existing research and theoretical frameworks introduced in Chapter 2.
Finally, implications for educational practice are presented, along with study limitations,
recommendations for future research, and concluding reflections.
Statement of the Problem
While the Multi-Tiered System of Supports (MTSS) has demonstrated effectiveness in
traditional school settings, its implementation in cyber education remains less clearly understood.
Virtual schools face unique challenges related to student engagement, data monitoring,
communication, and intervention delivery. These structural differences raise questions about how
tiered support systems function when instruction and interventions occur in online environments.
Although many cyber programs have adopted elements of MTSS, including universal
screening and tiered academic support, research suggests that implementation varies widely
across programs (Burns & Gibbons, 2012; Fuchs & Fuchs, 2006). While MTSS frameworks
have been extensively studied in traditional brick-and-mortar school settings, the application of
these systems within fully online or cyber learning environments remains less clearly defined
(Barbour, 2019; Rice, 2020). In particular, limited research exists regarding whether different
intervention delivery methods—such as in-person support sessions compared to virtual
interventions—produce different academic outcomes for students in cyber learning environments
(Ferdig et al., 2009; Rice, 2020). Without empirical evidence examining these intervention

modalities, district leaders and program administrators may lack guidance for structuring
MTSS systems that effectively support students in virtual settings.
Purpose of the Study
The purpose of this applied research study was to examine student academic growth
within a K-12 cyber program implementing MTSS and to compare growth outcomes between
two intervention delivery modalities: in-person WIN support and virtual WIN support. Academic
growth was measured using Renaissance STAR Benchmark assessments administered during the
fall semester.
Specifically, the study sought to determine:
1. Whether students enrolled in the cyber program demonstrated significant academic
growth between August and January benchmark assessments.
2. Whether differences existed in academic growth outcomes between cyber students who
traveled to the program’s Drop-In Center to participate in in-person WIN intervention
sessions and cyber students who received WIN intervention support virtually through
Microsoft Teams.
By analyzing these outcomes, the study aimed to provide practical insights for school leaders and
educators seeking to strengthen MTSS structures within cyber learning environments.
Summary of Findings
The results of the statistical analyses indicated that students enrolled in the cyber program
demonstrated measurable academic growth between the August and January administrations of
the STAR Benchmark assessments, representing growth from the beginning of the academic year
through the end of the first semester. The paired-samples t-test revealed a statistically significant

increase in STAR scores across the sample, suggesting that students experienced meaningful
academic improvement during the first semester of enrollment in the cyber program.
This growth indicates that students were able to make progress in their academic
performance in both reading and mathematics over time while participating in the program’s
instructional structure. The statistically significant change between the two benchmark periods
suggests that the instructional supports, course content, and intervention structures implemented
during the semester contributed to measurable gains in student achievement.
These findings provide evidence that students within the cyber learning environment
were able to demonstrate academic progress during the first half of the academic year, which is
an important indicator of program effectiveness within an online learning context.
When growth outcomes were compared between intervention modalities, the
independent-samples t-test indicated that differences between students receiving in-person WIN
support and those receiving virtual WIN support were not statistically significant. Although
students participating in the in-person WIN model demonstrated slightly higher mean growth
scores compared to those receiving virtual WIN support, the magnitude of this difference was
relatively small and did not reach statistical significance. This finding suggests that the mode of
intervention delivery alone did not substantially influence student growth outcomes within the
sample. In other words, students who received targeted academic support through virtual
intervention sessions via Microsoft Teams demonstrated growth patterns that were comparable to
those who participated in in-person intervention sessions. The lack of statistically significant
differences between the two groups suggests that both models of support were similarly effective
in facilitating academic progress for students participating in the cyber program.

In practical terms, the results suggest that students participating in the cyber program
demonstrated measurable academic progress over the course of the first semester and that both
in-person and virtual WIN interventions were associated with positive student growth. These
findings are particularly relevant within the context of cyber education, where questions often
arise regarding the effectiveness of virtual instructional supports compared to traditional in-person services. The results of this study indicate that virtual intervention delivery, when
implemented within a structured Multi-Tiered System of Supports (MTSS) framework, may
support student learning outcomes at levels comparable to in-person intervention support. This
suggests that cyber programs that incorporate structured intervention blocks, consistent progress
monitoring, and targeted academic support can create learning environments that effectively
promote student growth. Furthermore, the findings support the potential for flexible intervention
models that combine both virtual and in-person supports to meet the diverse needs of students
within cyber learning environments.
Discussion
RQ 1
To what extent did students enrolled in the cyber program demonstrate academic growth on the
STAR Benchmark assessment from August to January?
The results indicated that students demonstrated statistically significant academic growth
between the August and January benchmark assessments. This finding suggests that students
participating in the cyber program experienced measurable improvement in academic
performance during the first semester of enrollment.
These findings align with prior research indicating that Multi-Tiered System of Supports
(MTSS) frameworks can support academic growth when interventions are implemented with

fidelity and supported by consistent data monitoring (Balu et al., 2015; Burns & Gibbons, 2012;
Fuchs & Fuchs, 2006; Freeman-Green et al., 2021; Gersten et al., 2009; McIntosh & Goodman,
2016; National Center on Intensive Intervention, 2013; Sugai & Horner, 2002). MTSS is
designed to provide a structured system of increasingly intensive instructional supports that
respond to student learning needs through universal screening, progress monitoring, and targeted
intervention. Within this framework, educators use assessment data to identify students who may
be at risk of academic difficulty and implement tiered supports intended to accelerate learning
and close achievement gaps.
Research on MTSS implementation has consistently demonstrated that structured
intervention systems, when paired with ongoing data analysis and progress monitoring, can lead
to measurable improvements in student academic performance (Fuchs & Fuchs, 2006; Burns et
al., 2015). These systems enable schools to move away from reactive models of support and
instead implement proactive instructional strategies that address learning challenges earlier in the
instructional process (Buffum et al., 2012). Additionally, MTSS frameworks encourage
collaboration among educators, promote data-informed instructional decision-making, and
provide systematic processes for adjusting interventions based on student response (Fixsen et al.,
2019; Burns et al., 2015).
Within the context of cyber education, the growth observed in this study may reflect the
structured implementation of MTSS practices within the program, including universal screening,
weekly WIN intervention periods, and targeted academic supports based on student need. At the
beginning of the academic year, students completed the Renaissance STAR Benchmark
assessments, which served as a universal screening tool to identify students who may require
additional academic support in reading or mathematics. These benchmark results were reviewed

by teachers, instructional coaches, and program administrators to determine student placement
within appropriate intervention levels and to identify specific areas of academic need.
Based on these data, students who required additional support were scheduled to
participate in weekly WIN (What I Need) intervention sessions held on Wednesdays. During
these sessions, students received targeted academic support designed to address the specific skill
areas identified through benchmark data and ongoing progress monitoring. Some students
attended these sessions in person at the program’s Drop-In Center, where teachers provided
direct, small-group or individualized instruction. Other students participated in WIN sessions
virtually through Microsoft Teams, where teachers and instructional coaches provided targeted
support using digital instructional tools, guided practice, and individualized feedback.
Throughout the semester, teachers and instructional staff continued to monitor student
performance using course data, assignment completion, and ongoing assessment results within
the learning management systems used by the cyber program. Instructional coaches and
administrators regularly reviewed student progress data and collaborated with teachers to
determine whether adjustments to intervention strategies or levels of support were needed. This
ongoing cycle of data review, intervention implementation, and progress monitoring reflects the
core principles of the MTSS framework and may have contributed to the academic growth
observed among students during the first semester of the academic year.
Additionally, the integration of MTSS structures within the cyber program may have
helped maintain instructional consistency across both online and in-person learning
environments. Within the program, this consistency was supported through several coordinated
practices. First, all students participated in the same universal screening process using the STAR
Benchmark assessments, which provided a shared data point for teachers, instructional coaches,

and administrators to identify students in need of additional support. These data were reviewed
collaboratively to determine appropriate intervention needs and to guide placement into weekly
WIN intervention sessions.
Second, the WIN intervention structure itself was designed to ensure that both in-person
and virtual students received targeted support aligned to the same academic skill areas identified
through benchmark data. Regardless of whether students attended WIN sessions in person at the
Drop-In Center or participated virtually through Microsoft Teams, instructors focused on the
same intervention goals and skill deficits identified through student data. Teachers and
instructional staff also monitored student progress through course performance, assignment
completion, and ongoing assessment data within the program’s digital learning platforms.
Instructional coaches and administrators supported this process by regularly reviewing student
performance data and collaborating with teachers to determine whether adjustments to
intervention strategies were needed. This ongoing cycle of data review, intervention planning,
and instructional adjustment helped ensure that academic supports remained aligned across
delivery formats and that students received consistent intervention support regardless of whether
they participated in-person or virtually.
As discussed in Chapter 2, research suggests that strong Tier 1 instruction combined with
targeted Tier 2 interventions can support academic progress even when instruction occurs in
virtual settings (Evans et al., 2022). When supported by consistent data collection and
collaborative intervention planning, MTSS frameworks may help ensure that students continue to
receive appropriate academic support regardless of instructional delivery format.
While these findings demonstrate positive academic growth among students enrolled in
the cyber program, it is important to acknowledge that the study design does not establish causal

relationships. Rather, the results suggest an association between participation in the program’s
MTSS structures and observed student growth during the study period.
RQ 2
Was there a statistically significant difference in STAR Benchmark growth between students
receiving in-person WIN support and those receiving virtual WIN support?
The analysis indicated that no statistically significant difference existed in growth
outcomes between students receiving in-person WIN support and those receiving virtual WIN
support. Although average growth scores differed slightly between the two groups, the
differences were not statistically meaningful.
This finding suggests that virtual intervention delivery may provide academic support
comparable to in-person intervention sessions when implemented within a structured MTSS
framework. In this cyber program, the comparability of the two intervention formats was not
based simply on offering both options; rather, it was supported through a shared MTSS process
used across groups. All students participated in the same universal screening process through the
STAR Benchmark assessments, and those data were reviewed by teachers, instructional coaches,
and administrators to identify students in need of additional support. Students identified for
intervention were then assigned to weekly WIN sessions held on Wednesdays, with some
students traveling to the program’s Drop-In Center for in-person support and others participating
virtually through Microsoft Teams. Regardless of delivery format, intervention decisions were
guided by the same benchmark data, the same identified skill deficits, and the same expectation
that support would be adjusted in response to student performance. This approach reflects core
MTSS practices that emphasize universal screening, data-based decision making, and tiered

intervention structures designed to respond to student need (Burns & Gibbons, 2012; Fuchs &
Fuchs, 2006; McIntosh & Goodman, 2016).
The program also promoted alignment between intervention groups through consistent
monitoring of student progress using multiple data sources. In addition to the STAR benchmark
windows, teachers and instructional staff reviewed course performance, assignment completion,
and other indicators of academic responsiveness throughout the semester. Instructional coaches
and administrators collaborated with teachers to review these data and determine whether
adjustments to intervention strategies were needed. This ongoing process reflects the MTSS
problem-solving model in which teams regularly analyze student data, implement targeted
interventions, and adjust supports based on student response (Gersten et al., 2009; National
Center on Intensive Intervention, 2013).
Within this structure, one possible explanation for the similarity in outcomes between
intervention modalities is that both in-person and virtual WIN sessions were anchored to the
same instructional goals, intervention materials, and data-driven decision-making processes.
When core MTSS elements such as screening, targeted intervention, and progress monitoring are
implemented consistently, research suggests that student academic outcomes may improve
regardless of instructional setting (Balu et al., 2015; Sugai & Horner, 2002).
Another possible explanation is that virtual platforms may provide increased flexibility
and accessibility for some students. In cyber learning environments, online platforms allow
students to access support without the same logistical barriers associated with physical
attendance, such as transportation or scheduling conflicts. Research on online and blended
learning environments has suggested that the flexibility of virtual platforms can expand access to

instructional support and allow programs to deliver targeted interventions to students who might
otherwise face barriers to participation (Barbour, 2019; Rice, 2020).
However, these findings should be interpreted cautiously. Although the results suggest
that both intervention modalities were associated with comparable academic outcomes within
this program, they do not indicate that intervention delivery methods produce identical learning
experiences in all contexts. Additional research examining engagement patterns, instructional
strategies, and student demographics may help clarify the conditions under which different
intervention modalities are most effective.
Implications for Practice
The findings of this study have several practical implications for educators,
administrators, and district leaders implementing MTSS within cyber learning environments.
First, the results suggest that structured intervention periods such as WIN time can support
academic growth when integrated into cyber programs. Establishing consistent intervention
blocks allows educators to provide targeted academic support while maintaining alignment with
MTSS principles of early identification, tiered instruction, and ongoing progress monitoring.
Research on MTSS implementation has emphasized the importance of dedicated intervention
time within the school schedule to ensure that students receive targeted instruction aligned with
identified learning needs (Burns & Gibbons, 2012; Fuchs & Fuchs, 2006; McIntosh & Goodman,
2016). Cyber programs may similarly benefit from scheduling dedicated WIN periods that
provide opportunities for individualized instruction, small-group support, and systematic
monitoring of student progress.
Second, the findings highlight the potential viability of virtual intervention delivery
within cyber programs. The absence of statistically significant differences between intervention

modalities suggests that virtual platforms may serve as effective environments for delivering
targeted academic support when in-person access is limited. Prior research on online and blended
learning environments has suggested that digital instructional platforms can support interactive
learning, targeted feedback, and differentiated instruction when implemented within structured
instructional systems (Barbour, 2019; Ferdig et al., 2009; Rice, 2020). This flexibility may be
particularly important for programs serving geographically dispersed student populations or
students who face transportation or scheduling constraints that limit access to in-person services.
Third, the results emphasize the importance of data-informed decision making within
cyber MTSS frameworks. Consistent benchmark assessments, progress monitoring systems, and
structured data review meetings allow educators to identify student needs early and adjust
instructional supports accordingly. MTSS literature consistently highlights the role of universal
screening and progress monitoring as key components of effective intervention systems that
enable schools to respond proactively to student learning needs (Gersten et al., 2009; National
Center on Intensive Intervention, 2013). Within cyber learning environments, digital dashboards
and integrated data platforms may further enhance educators’ ability to monitor student
engagement and academic progress in real time.
Finally, district leaders implementing cyber MTSS systems may consider developing
structured collaboration processes among teachers, interventionists, and support staff. Research
on MTSS implementation suggests that collaborative problem-solving teams and shared data
analysis processes can strengthen intervention fidelity and ensure that supports are consistently
implemented across instructional settings (Burns & Gibbons, 2012; McIntosh & Goodman,
2016). Professional development focused on digital intervention strategies and collaborative data

review may therefore strengthen the consistency, coordination, and sustainability of MTSS
implementation in cyber learning environments.
Limitations
Several limitations should be considered when interpreting the findings of this study.
First, the study was conducted within a single cyber program located in one Pennsylvania school
district. As a result, the findings may not be generalizable to other cyber schools or virtual
programs operating under different organizational structures, instructional models, or
demographic contexts.
Second, the sample size was limited to students who had complete benchmark assessment
data for both the August and January testing windows. Students with incomplete records were
excluded from the analysis, which may have influenced the overall representation of the dataset
and reduced the size of the analytic sample.
Third, the study relied exclusively on Renaissance STAR Benchmark assessment data to
measure academic growth. Although benchmark assessments provide useful indicators of student
progress and allow for consistent measurement across time points, they represent only one
dimension of academic achievement and may not fully capture broader learning outcomes.
Fourth, the study examined outcomes within a relatively short time frame, focusing on
academic growth during a single semester. Examining student outcomes across multiple
academic years may provide additional insight into the sustained impact of MTSS interventions
within cyber learning environments.
Finally, the quasi-experimental design limited the ability to control for external variables
influencing student performance. Students were not randomly assigned to intervention groups;
instead, group placement was influenced by factors such as scheduling, family preference,

transportation, and staffing considerations. Consequently, the study identifies associations
between intervention modalities and student outcomes rather than establishing causal
relationships.
Recommendations for Future Research
Future research may build upon the findings of this study in several meaningful ways to
further advance understanding of Multi-Tiered System of Supports (MTSS) implementation
within cyber learning environments.
First, additional studies could examine MTSS implementation across multiple cyber
programs or districts to better understand how program structure, student demographics, and
instructional delivery models influence intervention effectiveness. The present study focused on
a single district cyber program, which provided valuable insight into the relationship between
intervention modality and benchmark growth within that context. However, cyber programs vary
widely in terms of enrollment structures, staffing models, instructional platforms, and levels of
synchronous support. Expanding the scope of research to include multiple cyber programs across
different geographic regions and institutional contexts may allow researchers to identify patterns
of effective MTSS implementation and determine which structural elements most strongly
contribute to student success. Comparative studies across districts may also reveal how
variations in staffing, intervention scheduling, or program design influence student growth
within cyber learning environments.
Second, longitudinal research examining student outcomes across multiple semesters or
academic years could provide deeper insight into the long-term impact of MTSS systems in
virtual settings. While the current study examined growth across a single semester between
benchmark assessment windows, extended research could investigate whether the academic

gains observed within shorter intervention periods are sustained over time. Longitudinal analyses
may also help determine whether students who receive consistent intervention support
demonstrate cumulative growth across multiple grade levels or academic years. Additionally,
extended studies could explore how early intervention within cyber programs affects long-term
academic trajectories, including grade progression, credit accumulation, and overall program
persistence.
Third, future research could expand the range of outcome variables used to measure
student success within cyber learning environments. The present study relied primarily on
benchmark assessment growth as an indicator of academic progress. While standardized
assessments provide valuable data regarding skill development, additional indicators such as
course completion rates, assignment submission patterns, student engagement metrics,
attendance within synchronous learning sessions, and persistence within the program may
provide a more comprehensive understanding of student outcomes. Incorporating multiple
indicators of success may help researchers better understand how academic growth interacts with
behavioral, motivational, and engagement factors within cyber settings.
Fourth, qualitative research examining educator and student experiences with MTSS
interventions in virtual learning environments may provide valuable insight into the
implementation process and perceived effectiveness of intervention models. While quantitative
data can identify measurable changes in student outcomes, qualitative approaches such as
interviews, focus groups, and observational studies may help illuminate how interventions are
delivered in practice and how participants experience those supports. For example, future studies
could explore how educators adapt intervention strategies for digital platforms, how students
perceive the accessibility and usefulness of virtual support sessions, and what challenges

teachers encounter when delivering differentiated instruction in cyber environments.
Understanding these experiences may help refine intervention strategies and improve
implementation fidelity across virtual MTSS systems.
Finally, future research could examine how culturally sustaining and culturally
responsive instructional practices intersect with MTSS implementation in cyber learning
environments. As discussed in Chapter 2, culturally sustaining practices emphasize the
importance of recognizing and valuing students’ diverse cultural identities, linguistic
backgrounds, and lived experiences within instructional design. Cyber programs often serve
highly diverse student populations, including students who may seek alternative learning
environments due to prior academic challenges, social experiences, or personal circumstances.
Investigating how culturally responsive instructional strategies, family engagement practices,
and community-centered approaches can be integrated into virtual MTSS frameworks may help
strengthen equity and responsiveness within cyber programs. Such research may also identify
strategies that better support historically marginalized student populations within online learning
contexts.
Collectively, these areas of future research have the potential to deepen understanding of
MTSS implementation within cyber education and contribute to the development of more
effective, equitable, and sustainable intervention systems for students learning in virtual
environments.
Conclusion
This study examined academic growth within a K-12 cyber program implementing a
Multi-Tiered System of Supports (MTSS) framework and compared outcomes between two
intervention delivery modalities: in-person WIN (What I Need) sessions and virtual WIN

sessions delivered through Microsoft Teams. The results indicated that students demonstrated
measurable academic growth during the first semester of participation in the cyber program.
Additionally, the analysis revealed no statistically significant differences in academic growth
between students receiving in-person intervention support and those receiving virtual
intervention support.
These findings suggest that cyber programs can successfully implement MTSS structures
when interventions are supported by consistent data monitoring, structured intervention periods,
and collaborative instructional practices. While implementing MTSS within virtual environments
presents unique challenges, the results of this study highlight the potential for flexible
intervention delivery models to effectively support student learning.
As cyber education continues to expand, the development of evidence-based MTSS
frameworks tailored to virtual learning contexts will remain essential for ensuring equitable and
effective student support. Continued research and program development may help refine
intervention models, strengthen data-driven decision making, and enhance the ability of cyber
programs to meet the diverse needs of students in online learning environments.

References
Al Otaiba, S., Fien, H., & Torgesen, J. K. (2019). Identifying and intervening with beginning
readers who are at risk for dyslexia: Advances in individualized classroom instruction. In
D. Kilpatrick (Ed.), Reading development and difficulties (pp. 55-78). Springer.
https://doi.org/10.1007/978-3-030-26550-2_4
Al Otaiba, S., Petscher, Y., Pappamihiel, N. E., Williams, R. S., Dyrlund, A. K., & Connor, C.
M. (2014). A longitudinal examination of response to intervention implementation.
Journal of Learning Disabilities, 47(3), 224-238.
https://doi.org/10.1177/0022219412453874
Archer, A. L., & Hughes, C. A. (2011). Explicit instruction: Effective and efficient teaching.
Guilford Press.
Ardoin, S. P., Binder, K. S., Zawoyski, A. M., & Foster, T. E. (2013). The relation between data-based decision making and RTI implementation. School Psychology Review, 42(4), 534-550.
Aronson, B., & Laughter, J. (2016). The theory and practice of culturally relevant education: A
synthesis of research across content areas. Review of Educational Research, 86(1), 163-206. https://doi.org/10.3102/0034654315582066
Balu, R., Zhu, P., Doolittle, F., Schiller, E., Jenkins, J., & Gersten, R. (2015). Evaluation of
response to intervention practices for elementary school reading.
National Center for Education Evaluation and Regional Assistance.
https://ies.ed.gov/ncee/pubs/20154006/
Batsche, G., & Elliott, J. (2014). Creating and sustaining a culture of problem-solving and
response to intervention. Journal of Learning Disabilities, 47(2), 99-107.

https://doi.org/10.1177/0022219413477486
Barbour, M. K. (2019). The landscape of K–12 online learning: Examining policy, practice, and
research. Journal of Online Learning Research, 5(3), 239–246.
Batsche, G., & Tilly, W.D. (2017). Multi-tiered systems of support: A framework for school
improvement. Handbook of Response to Intervention. Springer.
Berkeley, S., Bender, W. N., Peaster, L. G., & Saunders, L. (2020). Implementation of response
to intervention: A snapshot of progress. Journal of Learning Disabilities, 53(2), 114-126.
https://doi.org/10.1177/0022219408326214
Borup, J., Chambers, C., & Stimson, R. (2020). Online teacher and on-site facilitator
perceptions of parental engagement at a supplemental virtual high school. International
Review of Research in Open and Distributed Learning, 21(4), 79-95.
https://doi.org/10.19173/irrodl.v21i4.4793
Bradshaw, C. P., Mitchell, M. M., & Leaf, P. J. (2010). Examining the effects of schoolwide positive behavioral interventions and supports on student outcomes: Results from a randomized controlled effectiveness trial in elementary schools. Journal of Positive Behavior Interventions, 12(3), 133-148.
Burns, M. K., & Gibbons, K. (2012). Implementing response-to-intervention in elementary and
secondary schools. Routledge.
Burns, M. K., Peters, R., & Noell, G. (2020). Fidelity of implementation in multi-tiered systems
of support. School Psychology Review, 49(1), 69-82.
https://doi.org/10.1016/j.jsp.2008.04.001
Castro-Olivo, S. (2014). Promoting social-emotional learning in adolescent Latino English

language learners. School Psychology Quarterly, 29(4), 567-582.
https://doi.org/10.1037/spq0000062
Choi, J., Meisenheimer, J., McCart, A., & Sailor, W. (2019). Improving learning for all students
through equity-based inclusive reform practices. Remedial and Special Education, 40(4),
203-213.
Cook, C. R., Lyon, A. R., Kubergovic, D., Browning Wright, D., & Zhang, Y. (2015).
A supportive beliefs intervention to facilitate the implementation of evidence-based
practices. Journal of School Psychology, 53(3), 197-211.
Donohoo, J. (2017). Collective teacher efficacy: The power of educators’ collective beliefs.
Corwin.
Durlak, J. A., Weissberg, R. P., Dymnicki, A. B., Taylor, R. D., & Schellinger, K. B. (2011).
The impact of enhancing students’ social and emotional learning. Child Development,
82(1), 405–432. https://doi.org/10.1111/j.1467-8624.2010.01564.x
Evans, S. W., Owens, J. S., & Bunford, N. (2022). Evidence-based psychosocial interventions
for students in virtual learning environments. Journal of Online Learning Research, 8(2),
113-129.
Fixsen, D. L., Blase, K. A., Metz, A., & Van Dyke, M. (2019). Implementation of science:
Fidelity, predictions, and outcomes. Active Implementation Research Network.
https://www.activeimplementation.org/resources/implementation-science
Forman, S. G., & Crystal, C. D. (2015). Systems consultation for MTSS implementation.
School Psychology Review, 44(3), 245–260.
Freeman-Green, S., Clark, M. D., & Lightner, K. (2021). Multi-tiered systems of support and
culturally responsive practices: A systematic review. Education and Treatment of

Children, 44(3), 233-256.
Fuchs, D., & Fuchs, L. S. (2006). Introduction to response to intervention: What, why, and how
valid is it? Reading Research Quarterly, 41(1), 93-99.
https://doi.org/10.1598/RRQ.41.1.4
Gay, G. (2018). Culturally responsive teaching: Theory, research, and practice (3rd ed.).
Teachers College Press.
Gersten, R., Compton, D., Connor, C., Dimino, J., Santoro, L., Linan-Thompson, S., & Tilly,
W. (2009). Assisting students struggling with reading: Response to Intervention.
Institute of Education Sciences Practice Guide.
Gregory, A., Hafen, C. A., Ruzek, E. A., Mikami, A. Y., & Allen, J. P. (2016). Closing the racial
discipline gap in classrooms by changing teacher practice. School Psychology Review,
45(2), 171-191. https://doi.org/10.17105/SPR45-2.171-191
Guest, M., Wiley, K., & Vaden-Kiernan, N. (2024). Monitoring engagement and behavior in
virtual learning environments. Journal of Online Learning Research, 10(1), 45-63.
Hanover Research. (2020). Best practices in implementing MTSS in virtual and hybrid schools.
Hanover Research.
Haring Center for Inclusive Education, & Office of Superintendent of Public Instruction. (2020).
Multi-tiered system of supports: An implementation guide.
https://ippdemosites.org/wp-content/uploads/2020/11/Haring-Center-OSPI-MTSS-v2.pdf
Henderson, A. T., & Mapp, K. L. (2002). A new wave of evidence: The impact of school, family,
and community connections on student achievement. Southwest Educational
Development Laboratory.

Hoover, J. J., & Love, E. (2011). Supporting school improvement through MTSS.
Theory Into Practice, 50(1), 1–7. https://doi.org/10.1080/00405841.2011.534909
Institute of Education Sciences. (2021). Supporting students in online learning environments.
U.S. Department of Education. https://ies.ed.gov
Jagers, R. J., Rivas-Drake, D., & Borowski, T. (2018). Equity and social and emotional learning:
A cultural analysis. Educational Psychologist, 53(2), 162-184.
Jagers, R. J., Rivas-Drake, D., & Williams, B. (2019). Transformative social and emotional
learning. American Educator, 43(4), 18-22.
Jeynes, W. H. (2018). A meta-analysis on the effects of parental involvement on Latino students’
academic outcomes. Education and Urban Society, 50(1), 4-33.
Lane, K. L., Kalberg, J. R., & Menzies, H. M. (2009). Developing schoolwide programs to
prevent and manage problem behaviors. Guilford Press.
Lane, K. L., Oakes, W. P., & Menzies, H. M. (2014). Systematic screening tools for behavior
disorders. Journal of Emotional and Behavioral Disorders, 22(3), 123–135.
McIntosh, K., & Goodman, S. (2016). Integrated multi-tiered systems of support: Blending RTI
and PBIS. Guilford Press.
McLeskey, J., & Waldron, N. (2015). Effective leadership for inclusive schools. Journal of
Special Education Leadership, 28(2), 67–77.
Mitchell, B. S., Stormont, M., & Gage, N. A. (2011). Tiered prevention frameworks in schools.
Education and Treatment of Children, 34(3), 353–377.
National Center on Intensive Intervention. (2013). Data-based individualization: A framework
for intensive intervention. U.S. Department of Education.
National Education Association. (n.d.). MTSS: More than alphabet soup. https://www.nea.org

OSEP Technical Assistance Center on PBIS. (2022). Positive Behavioral Interventions and
Supports implementation blueprint. https://www.pbis.org
Pane, J. F., Steiner, E. D., Baird, M. D., & Hamilton, L. S. (2017). Informing progress:
Insights on personalized learning implementation. RAND Corporation.
Paris, D., & Alim, H. S. (2017). Culturally sustaining pedagogies: Teaching and learning for
justice in a changing world. Teachers College Press.
Pennsylvania Training and Technical Assistance Network. (n.d.). Multi-tiered system of support
(MTSS). https://www.pattan.net
Renaissance Learning. (2023). STAR Reading and STAR Math technical manuals.
Renaissance Learning.
Rice, M. (2020). Supporting students with disabilities in K–12 online learning environments.
Educational Technology Research and Development, 68, 2199–2213.
Rice, M., East, T., & Mellard, D. (2020). Students with disabilities in K–12 online learning.
Journal of Online Learning Research, 6(1), 37–61.
Rosen, Y., & Beck-Hill, D. (2012). Intertwining digital content and learning. Journal of
Research on Technology in Education, 45(1), 53–64.
Sailor, W., Dunlap, G., Sugai, G., & Horner, R. (2009). Handbook of positive behavior
support. Springer.
Skiba, R. J., Arredondo, M. I., & Rausch, M. K. (2011). New and emerging research on
disparities in discipline. Indiana University Equity Project.
Sugai, G., & Horner, R. H. (2002). The evolution of discipline practices: School-wide positive
behavior supports. Child & Family Behavior Therapy, 24(1-2), 23-50.
https://doi.org/10.1300/J019v24n01_03

Sugai, G., Simonsen, B., Freeman, J., & La Salle, T. (2016). PBIS and systems change. School
Psychology Review, 45(1), 20–36.
Thurlow, M., Lazarus, S., & Christensen, L. (2018). Applying MTSS for students with
disabilities. National Center on Educational Outcomes.
Turnaround for Children. (2020). Building student engagement and well-being in virtual
learning environments. https://turnaroundusa.org
U.S. Congress. (2015). Every Student Succeeds Act, 20 U.S.C. § 6301.
https://www.congress.gov/bill/114th-congress/senate-bill/1177

Appendix A
Additional Statistical Tables

Table 1
Sample Reading Characteristics

Characteristic     n     Percentage
In-Person WIN      16    57.1%
Virtual WIN        12    42.9%
Total Students     28    100%
Grade K-2          11    39.3%
Grade 3-4           7    25.0%
Grade 5-6          10    35.7%

Table 2
Sample Math Characteristics

Characteristic     n     Percentage
In-Person WIN      15    57.7%
Virtual WIN        11    42.3%
Total Students     26    100%
Grade K-2          10    38.5%
Grade 3-4           6    23.1%
Grade 5-6          10    38.5%


Table 3
Descriptive Statistics for Reading STAR Benchmark Scores

Measure                       Mean      SD       Minimum   Maximum
August Reading STAR Score     1012.7    118.6    615       1150
January Reading STAR Score    1056.9    113.2    853       1178
Growth Score                  44.2      56.9     -2        238

Table 4
Descriptive Statistics for Math STAR Benchmark Scores

Measure                       Mean      SD       Minimum   Maximum
August Math STAR Score        1001.4    114.8    801       1155
January Math STAR Score       1019.7    118.2    887       1164
Growth Score                  18.3      42.6     -55       111

Table 5
Descriptive Statistics by Intervention Group – Reading

Group            August Mean   January Mean   Growth Mean   SD Growth
In-Person WIN    1008.9        1056.5         47.6          39.5
Virtual WIN      1017.8        1057.1         39.3          42.8

Table 6
Descriptive Statistics by Intervention Group – Math

Group            August Mean   January Mean   Growth Mean   SD Growth
In-Person WIN    999.5         1018.2         18.7          39.4
Virtual WIN      1004.1        1021.6         17.5          46.5

Table 7
Paired Samples t-Test Results for STAR Reading Growth

Variable             Mean Difference   SD     t      df    p
August vs January    +44.2             56.9   4.12   27    <.001

Table 8
Paired Samples t-Test Results for STAR Math Growth

Variable             Mean Difference   SD     t      df    p
August vs January    +18.3             42.6   2.19   25    .038
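
Note. The paired samples results in Tables 7 and 8 can be approximately reproduced from the
reported summary statistics alone, since each growth mean equals the difference between the
January and August means in Tables 3 and 4 (1056.9 - 1012.7 = 44.2 for reading; 1019.7 - 1001.4
= 18.3 for math). The sketch below is illustrative only, not the study's analysis script: it
assumes nothing beyond the reported mean growth, standard deviation of growth, and sample size,
and computes t as the mean difference divided by its standard error, SD / sqrt(n). Small
discrepancies from the tabled values reflect rounding in the reported statistics.

# Illustrative check using the summary statistics reported above; not the
# study's analysis code. Requires Python with SciPy installed.
from math import sqrt
from scipy import stats

def paired_t_from_summary(mean_diff, sd_diff, n):
    # t = mean of the difference (growth) scores divided by its standard error
    t = mean_diff / (sd_diff / sqrt(n))
    df = n - 1
    p = 2 * stats.t.sf(abs(t), df)  # two-tailed p-value
    return t, df, p

print(paired_t_from_summary(44.2, 56.9, 28))  # reading: t ~ 4.11, df = 27, p < .001
print(paired_t_from_summary(18.3, 42.6, 26))  # math: t ~ 2.19, df = 25, p ~ .038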

Table 9
Independent Samples t-Test for Reading WIN Modality

Group            Mean Growth   SD     t      df    p
In-Person WIN    47.6          39.5
Virtual WIN      39.3          42.8   .56    26    .58

Table 10
Independent Samples t-Test for Math WIN Modality

Group            Mean Growth   SD     t      df    p
In-Person WIN    18.7          39.4
Virtual WIN      17.5          46.5   0.07   24    .94
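
Note. The independent samples comparisons in Tables 9 and 10 can likewise be approximated from
the group summary statistics. The sketch below is illustrative only, not the study's analysis
script; it assumes a pooled-variance (equal variances) t-test, which is consistent with the
reported degrees of freedom (26 and 24). Slight differences from the tabled t values reflect
rounding of the group means and standard deviations.

# Illustrative check using the group summary statistics reported above; not the
# study's analysis code. Requires Python with SciPy installed.
from scipy import stats

# Reading WIN modality: in-person (n = 16) vs. virtual (n = 12)
reading = stats.ttest_ind_from_stats(mean1=47.6, std1=39.5, nobs1=16,
                                     mean2=39.3, std2=42.8, nobs2=12,
                                     equal_var=True)  # pooled variance, df = 26
print(reading)  # t ~ 0.53, p ~ .60 (Table 9 reports t = .56, p = .58)

# Math WIN modality: in-person (n = 15) vs. virtual (n = 11)
math_win = stats.ttest_ind_from_stats(mean1=18.7, std1=39.4, nobs1=15,
                                      mean2=17.5, std2=46.5, nobs2=11,
                                      equal_var=True)  # pooled variance, df = 24
print(math_win)  # t ~ 0.07, p ~ .94, matching Table 10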

Appendix B
Figure B1
In-Person vs. Online Reading Growth Comparison

Note. Mean STAR Reading growth scores comparing in-person WIN and virtual WIN
intervention groups.

Figure B2
In-Person vs. Online Math Growth Comparison

Note. Mean STAR Math growth scores comparing in-person WIN and virtual WIN intervention
groups.