You've collected data for six months. You have spreadsheets, graphs, and numbers. But when the IEP team asks "How's the student doing?"—can you tell a coherent story? Data doesn't speak for itself. Your job is to be its translator, turning raw numbers into insights that guide decisions.
Why Data Needs a Story
Research on data-based decision making in education reveals a troubling gap: while 87% of special educators collect behavior data, only 34% report feeling confident analyzing it for decision-making (Espin et al., 2017). The data exists—but its potential goes unrealized.
Data Without Story vs. Data With Story
Without Story:
"Marcus had 47 incidents in September, 52 in October, 38 in November, and 41 in December."
Team response: "So... is that good?"
With Story:
"Marcus's calling out behavior peaked in October during our geometry unit but dropped 27% when we introduced scheduled check-ins. December's slight uptick coincided with holiday schedule changes—a pattern we also saw last year."
Team response: "What adjustments should we make for January?"
The second version doesn't have more data—it has context. It connects numbers to events, identifies patterns, and points toward action.
The Four Questions Every Data Review Must Answer
Before your mid-year review, make sure your data presentation answers these four essential questions:
Where did we start?
Reference baseline data from the beginning of the year or when the goal was written. Without a starting point, progress is meaningless.
Example: "At the start of the year, Marcus was calling out an average of 18 times per class period. His goal is to reduce to 5 or fewer."
Where are we now?
Report current performance using the same measurement as the goal. Include recent trend data, not just the most recent data point.
Example: "Over the past two weeks, Marcus has averaged 8 call-outs per class period, down from his October peak of 13."
Are we on track?
Calculate whether current progress trajectory will meet the year-end goal. Be honest about pace.
Example: "To meet his May goal of 5 or fewer, Marcus needs to reduce by approximately 0.6 incidents per month. His current rate of improvement is 0.8 per month—we're ahead of schedule."
What's influencing the data?
Identify factors that explain patterns—both positive and negative. This is where your observational expertise matters most.
Example: "The November improvement coincided with starting scheduled attention breaks. The December regression aligns with substitute teachers and schedule disruptions."
Trend Analysis: Seeing the Forest for the Trees
Day-to-day data fluctuates. A student might have 2 incidents Monday, 8 Tuesday, and 4 Wednesday. If you report daily numbers, you'll confuse everyone—including yourself. What matters is the trend.
The Weekly Average Rule
Never report single days in isolation. Calculate weekly averages to smooth out daily variation. Compare week-over-week or month-over-month for meaningful patterns.
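If your daily counts live in a spreadsheet export, a few lines of Python can do the smoothing for you. This is a minimal sketch with made-up dates and counts (echoing the Monday 2, Tuesday 8, Wednesday 4 example above), not a feature of any particular tool:

```python
from collections import defaultdict
from datetime import date

# Made-up daily call-out counts; replace with your own export.
daily_counts = {
    date(2024, 9, 2): 2,  date(2024, 9, 3): 8,  date(2024, 9, 4): 4,
    date(2024, 9, 5): 3,  date(2024, 9, 6): 5,
    date(2024, 9, 9): 1,  date(2024, 9, 10): 6, date(2024, 9, 11): 4,
    date(2024, 9, 12): 2, date(2024, 9, 13): 3,
}

# Group days into ISO weeks, then report one number per week.
weekly_totals = defaultdict(int)
for day, count in daily_counts.items():
    year, week, _ = day.isocalendar()
    weekly_totals[(year, week)] += count

for (year, week), total in sorted(weekly_totals.items()):
    print(f"Week {week}: {total} incidents")
```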
Three Trend Types to Identify
↓ Decreasing Trend
Problem behavior is reducing over time. This is what you want to see for reduction goals.
Report: "Average weekly incidents decreased from 15 (Sept) to 8 (Dec)—a 47% reduction."
↑ Increasing Trend
For replacement behaviors, this is positive. For problem behaviors, it signals needed intervention changes.
Report: "Hand-raising increased from 20% of opportunities (Sept) to 65% (Dec)."
→ Flat Trend
No significant change over time. May indicate intervention isn't working OR that the behavior is at a stable level.
Report: "Incidents have remained between 10-12 per week since October—intervention may need modification."
Calculating Trend Lines
For a quick trend check, compare the first half of your data to the second half (a simplified take on the classic "split-middle" technique):
- Divide your data in half (e.g., Sept-Oct vs. Nov-Dec)
- Calculate the average for each half
- Compare: Is the second half higher, lower, or the same?
- Calculate percentage change: ((New - Old) / Old) × 100
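Here are the same steps as a short Python sketch, using the monthly totals from the Marcus example at the top of this article (illustrative numbers only):

```python
# First-half vs. second-half comparison using the monthly totals from the
# Marcus example above (illustrative numbers only).
monthly_incidents = [47, 52, 38, 41]          # Sept, Oct, Nov, Dec

midpoint = len(monthly_incidents) // 2
old_avg = sum(monthly_incidents[:midpoint]) / midpoint                              # 49.5
new_avg = sum(monthly_incidents[midpoint:]) / (len(monthly_incidents) - midpoint)   # 39.5

percent_change = (new_avg - old_avg) / old_avg * 100
print(f"First half: {old_avg:.1f}  Second half: {new_avg:.1f}")
print(f"Percentage change: {percent_change:.0f}%")    # about -20%
```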
Adding Context: The "Why" Behind the Numbers
Numbers tell you WHAT happened. Context tells you WHY. Document these contextual factors throughout the year so you can reference them during reviews:
| Context Type | Examples | How to Document |
|---|---|---|
| Intervention Changes | Started break cards, modified schedule, new reinforcement system | Note start date on graph; mark as "phase line" |
| Environmental Factors | Substitute teachers, schedule changes, seating moves | Add annotations to data points |
| Setting Events | Illness, family events, medication changes | Document in notes (respecting privacy) |
| Instructional Context | New unit, assessment week, preferred vs. non-preferred content | Track alongside behavior data |
Context Documentation Tip
Modern data collection tools let you add notes alongside each data entry. Take 5 seconds to add context: "substitute today," "fire drill during math," "came in upset from bus." These notes become invaluable during reviews.
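If you track data in your own spreadsheet instead of a dedicated tool, the same idea works: keep a notes column right beside the counts. A minimal sketch follows; the column names and values are made up for illustration, not taken from any specific product's export format:

```python
import csv

# Keeping context next to the counts: a "note" column in the same file.
rows = [
    {"date": "2024-12-09", "call_outs": 11, "note": "substitute today"},
    {"date": "2024-12-10", "call_outs": 4,  "note": ""},
    {"date": "2024-12-11", "call_outs": 9,  "note": "fire drill during math"},
]

with open("call_out_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["date", "call_outs", "note"])
    writer.writeheader()
    writer.writerows(rows)
```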
Presenting Data to the IEP Team
Your audience matters. Parents, administrators, and related service providers have different data literacy levels and priorities. Structure your presentation to serve everyone.
The 3-2-1 Presentation Format
3 Key Findings (1 minute)
Lead with the headline. What are the three most important things to know?
Example: "1) Call-outs down 55% from baseline. 2) Hand-raising replacing call-outs. 3) Math class remains most challenging."
2 Visual Supports (2 minutes)
Show, don't just tell. Simple graphs communicate trends faster than numbers.
Example: Line graph showing weekly averages + bar chart comparing baseline to current
1 Recommendation (1 minute)
Based on the data, what should happen next?
Example: "Data supports continuing current intervention with additional math-specific support."
Graph Best Practices
Do:
- Include baseline and goal lines (see the sketch after these lists)
- Label axes clearly
- Use trend lines for clarity
- Mark intervention phase changes
- Keep it simple—one behavior per graph
Don't:
- Overload with multiple data series
- Use confusing color schemes
- Show raw daily data (too noisy)
- Present without verbal context
- Forget to celebrate wins
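If you build your own graphs, here is a minimal matplotlib sketch that follows the "Do" list: weekly data, a baseline line, a goal line, and a marked phase change. All numbers are hypothetical:

```python
import matplotlib.pyplot as plt

# Hypothetical weekly totals for one behavior; the check-in intervention
# is assumed to start at week 5.
weeks = list(range(1, 9))
weekly_incidents = [15, 14, 16, 13, 11, 9, 8, 8]

plt.plot(weeks, weekly_incidents, marker="o", label="Call-outs per week")
plt.axhline(15, linestyle="--", color="gray", label="Baseline")   # baseline line
plt.axhline(5, linestyle="--", color="green", label="Goal")       # goal line
plt.axvline(4.5, linestyle=":", color="black")                    # intervention phase change
plt.text(4.6, 15.5, "Check-ins started")

plt.xlabel("Week")
plt.ylabel("Call-outs")
plt.title("Weekly call-outs (one behavior per graph)")
plt.legend()
plt.show()
```

Swap in your own numbers and labels; the point is that the baseline, goal, and phase-change markers do most of the storytelling before you say a word.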
Common Data Review Pitfalls
Pitfall #1: Cherry-Picking Good Days
"He had zero incidents on Tuesday!" ignores that he had 15 on Wednesday.
Fix: Always report averages or ranges, not single data points.
Pitfall #2: Ignoring Missing Data
Gaps in data collection are information. Why wasn't data collected?
Fix: Acknowledge gaps honestly. "Data was collected on 78% of school days—the 22% gap was primarily during assessment weeks."
Pitfall #3: Comparing Apples to Oranges
Comparing September frequency to December duration is meaningless.
Fix: Use consistent measurement throughout. If you must change, restart baseline.
Pitfall #4: All Problems, No Progress
Focusing only on what's not working demoralizes the team and parents.
Fix: Use "celebrate and problem-solve" structure. Lead with wins, then address challenges.
The Bottom Line
Your data has a story to tell. Your job is to be the translator—connecting numbers to meaning, patterns to causes, and findings to actions.
The best data presentations answer "So what?" before anyone asks it. They show where you started, where you are, and where you're going.
Invest time before your mid-year review to analyze trends, add context, and practice your narrative. The IEP team will thank you—and more importantly, the data will actually drive better decisions for your student.
Take Action
Put what you've learned into practice with these resources.
Key Takeaways
- Data without context is noise—always connect numbers to real-world impact
- Trend lines matter more than daily fluctuations; look for patterns over 4-6 weeks
- Compare current performance to baseline AND to goal criteria
- Address data gaps honestly; missing data is still information
- Present both celebration (progress) and problem-solving (challenges) in every review
Mid-Year Data Review Template
A structured template for preparing behavior data presentations for IEP meetings. Includes prompts for trend analysis, context notes, and next steps.
About the Author
The Classroom Pulse Team consists of former Special Education Teachers and BCBAs who are passionate about leveraging technology to reduce teacher burnout and improve student outcomes.
Related Articles
The Four Functions of Behavior: A Complete Guide for Educators
Understanding WHY students behave the way they do is the foundation of effective intervention. Learn the four functions of behavior—attention, escape, tangible, and sensory—and how to identify them in your classroom.
How to Graph Behavior Data: A Visual Guide for Educators
Learn how to create clear, effective behavior data graphs for IEP meetings, progress monitoring, and data-driven decision making. Includes line graphs, bar charts, and trend analysis techniques.
Evidence-Based FBA & BIP Best Practices: The Complete 2025 Guide
Master the gold standard for Functional Behavior Assessments and Behavior Intervention Plans. Learn proper FBA methodology, function-based intervention planning, treatment fidelity requirements, and when to modify interventions for optimal student outcomes.
