Effective behavior intervention relies on accurate interpretation of collected data. This comprehensive guide provides educators and behavior analysts with the tools and techniques necessary to transform raw behavioral data into actionable insights that drive student success.
Data interpretation is not about finding what we expect to see, but rather discovering what the data actually tells us about student behavior and intervention effectiveness.
Proper data interpretation enables us to evaluate whether interventions are working, make timely data-based decisions, and communicate progress clearly to families and team members. The process typically moves through five stages:
| Stage | Key Activities | Common Pitfalls |
|---|---|---|
| 1. Data Preparation | Clean data, check for missing values, organize by date/session | Including unreliable data points |
| 2. Visual Inspection | Create graphs, identify obvious patterns | Over-interpreting random variation |
| 3. Systematic Analysis | Apply visual analysis rules, calculate statistics | Ignoring clinical significance |
| 4. Contextualization | Consider environmental factors, student history | Overlooking external variables |
| 5. Decision Making | Determine intervention effectiveness, plan next steps | Making hasty decisions |
Visual analysis is the primary method for interpreting single-subject design data in behavior analysis. This systematic approach allows practitioners to identify meaningful patterns and changes in behavior over time without relying solely on statistical tests.
**Level**: The mean or average value of data within a phase
**Trend**: The direction and slope of the data path (increasing, decreasing, or flat)
**Variability**: The degree of fluctuation or bounce in the data
**Immediacy of effect**: How quickly behavior changes after intervention introduction
**Overlap**: The proportion of data points that overlap between phases
**Consistency**: Similar patterns across similar phases or conditions
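To make these components concrete, here is a minimal sketch (assuming Python with NumPy and made-up on-task data; the function names `phase_summary` and `percent_overlap` are illustrative, not from any specific package) that computes level, trend, variability, and overlap for a baseline and an intervention phase.

```python
import numpy as np

def phase_summary(data):
    """Summarize one phase: level (mean), trend (slope per session), variability (SD)."""
    x = np.arange(len(data))
    slope, _ = np.polyfit(x, data, 1)          # least-squares trend line
    return {"level": np.mean(data),
            "trend": slope,
            "variability": np.std(data, ddof=1)}

def percent_overlap(baseline, intervention):
    """Percentage of intervention points that fall inside the baseline range."""
    lo, hi = min(baseline), max(baseline)
    inside = [p for p in intervention if lo <= p <= hi]
    return 100 * len(inside) / len(intervention)

# Hypothetical data: percent of intervals on task per session
baseline = [35, 20, 40, 30, 45]
intervention = [60, 65, 70, 72, 80, 85]

print(phase_summary(baseline))
print(phase_summary(intervention))
print(f"Overlap: {percent_overlap(baseline, intervention):.0f}%")   # 0% here: no overlapping points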
| Graph Element | Best Practice | Purpose |
|---|---|---|
| X-Axis | Time dimension (sessions, days, dates) | Shows progression over time |
| Y-Axis | Behavior measure (0 to max observed + 20%) | Displays behavior magnitude |
| Phase Lines | Solid vertical lines between conditions | Indicates intervention changes |
| Data Points | Connected with lines, different shapes for conditions | Shows individual measurements |
| Labels | Clear, descriptive phase labels | Identifies conditions |
Always include at least 5 data points in baseline and intervention phases for reliable visual analysis. Fewer points make it difficult to establish patterns and assess stability.
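As one way to follow these graphing conventions, the sketch below uses matplotlib with hypothetical session data; the axis ranges, markers, phase line, and labels mirror the table above.

```python
import matplotlib.pyplot as plt

# Hypothetical on-task data (percent of intervals) across sessions
sessions_base = [1, 2, 3, 4, 5]
baseline      = [35, 20, 40, 30, 45]
sessions_int  = [6, 7, 8, 9, 10, 11, 12, 13, 14, 15]
intervention  = [60, 62, 65, 70, 72, 75, 78, 80, 83, 85]

fig, ax = plt.subplots()
ax.plot(sessions_base, baseline, "o-", label="Baseline")
ax.plot(sessions_int, intervention, "s-", label="Self-monitoring")  # different marker per condition
ax.axvline(x=5.5, color="black", linestyle="-")                     # solid phase line between conditions
ax.set_xlabel("Session")                                            # x-axis: time dimension
ax.set_ylabel("% of intervals on task")                             # y-axis: behavior measure
ax.set_ylim(0, max(intervention) * 1.2)                             # 0 to max observed + 20%
ax.set_title("On-Task Behavior Across Baseline and Intervention")
ax.legend()
plt.show()
```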
Recognizing patterns in behavioral data is crucial for understanding the function of behaviors and predicting future occurrences. This chapter explores common patterns and their implications for intervention.
**Cyclical (day-of-week) pattern**
Example: Monday spikes in challenging behavior after weekends
Intervention Implication: Focus on transition supports and Monday morning routines

**Extinction burst**
Example: Tantrums increase initially when attention is withheld
Intervention Implication: Prepare team for increase; maintain consistency

**Reinforcement anticipation (fixed-interval scallop)**
Example: Work completion increases just before scheduled breaks
Intervention Implication: Consider variable-ratio reinforcement schedule
| Strategy | When to Use | How to Apply |
|---|---|---|
| Moving Average | High variability data | Calculate 3-5 point average to smooth data |
| Split-Middle Line | Determining trend | Divide phase in half, find median of each, connect |
| Envelope Analysis | Assessing variability | Draw lines connecting peaks and valleys |
| Lag Analysis | Delayed effects | Compare behavior to events from previous sessions |
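The following sketch, assuming Python with NumPy and hypothetical baseline counts, shows one way to compute a 3-point moving average and a split-middle trend line (here the middle point is dropped when the phase has an odd number of sessions).

```python
import numpy as np

def moving_average(data, window=3):
    """Smooth noisy data with a simple 3-5 point moving average."""
    return np.convolve(data, np.ones(window) / window, mode="valid")

def split_middle_line(data):
    """Split-middle trend line: connect the (mid-session, median) point of each half of the phase."""
    n = len(data)
    half = n // 2
    first_idx, second_idx = range(0, half), range(n - half, n)   # middle point dropped when n is odd
    x1, y1 = np.median(first_idx), np.median(data[:half])
    x2, y2 = np.median(second_idx), np.median(data[n - half:])
    slope = (y2 - y1) / (x2 - x1)
    intercept = y1 - slope * x1
    return slope, intercept

data = [35, 20, 40, 30, 45, 38, 42]      # hypothetical baseline observations
print(moving_average(data, window=3))    # smoothed series
print(split_middle_line(data))           # (slope, intercept) of the split-middle line
```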
Avoid "seeing" patterns that don't exist. Random variation can sometimes appear meaningful. Look for patterns that repeat across multiple phases or conditions before drawing conclusions.
Trend analysis reveals the direction and rate of behavior change over time, which is essential for evaluating intervention effectiveness and predicting future behavior patterns.
| Trend Type | Description | Interpretation |
|---|---|---|
| Accelerating ↗ | Increasing slope over time | Behavior increasing rapidly |
| Decelerating ↘ | Decreasing slope over time | Behavior decreasing rapidly |
| Stable → | Minimal slope (±5% of mean) | No significant change |
| Variable ↝ | Inconsistent direction | Unstable pattern |
| Curvilinear ⤴ | Changing slope direction | Initial change then plateau |
A trend is considered stable if at least 80% of data points fall within 15% of the trend line. This criterion helps determine when to change phases or modify interventions.
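One way to automate this check is sketched below, interpreting "within 15% of the trend line" as within ±15% of the trend-line value at each session (an assumption; some texts use a band around the phase median instead).

```python
import numpy as np

def trend_is_stable(data, band=0.15, required=0.80):
    """Check whether at least 80% of points fall within +/-15% of the fitted trend line."""
    x = np.arange(len(data))
    slope, intercept = np.polyfit(x, data, 1)
    predicted = slope * x + intercept
    within = np.abs(np.asarray(data) - predicted) <= band * np.abs(predicted)
    return within.mean() >= required

print(trend_is_stable([20, 22, 25, 27, 30, 31, 34]))   # hypothetical counts -> True/False
```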
Celeration values describe how quickly behavior changes over time, expressed as the factor by which frequency multiplies (×) or divides (÷) each week; for example, ×2.0 means the behavior doubles weekly, while ÷2.0 means it is cut in half weekly.
Project trend lines from baseline into intervention phases (dashed line) to visualize what would have happened without intervention. This helps assess intervention impact.
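A minimal sketch of this projection, assuming Python with NumPy and hypothetical data, fits a straight line to the baseline and extends it across the intervention phase.

```python
import numpy as np

def project_baseline_trend(baseline, n_intervention_sessions):
    """Fit a straight line to baseline data and project it into the intervention phase."""
    x = np.arange(len(baseline))
    slope, intercept = np.polyfit(x, baseline, 1)
    future_x = np.arange(len(baseline), len(baseline) + n_intervention_sessions)
    return slope * future_x + intercept   # predicted values if the baseline trend had continued

baseline = [35, 20, 40, 30, 45]                      # hypothetical data
intervention = [60, 62, 65, 70, 72, 75, 78, 80, 83, 85]
projected = project_baseline_trend(baseline, len(intervention))
gap = np.array(intervention) - projected             # how far observed data sit above the projection
print(np.round(projected, 1))
print(np.round(gap, 1))
```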
Understanding the level (central tendency) and variability (spread) of data is fundamental to determining intervention effectiveness and making phase change decisions.
| Measure | Calculation | When to Use |
|---|---|---|
| Mean | Sum of all values / Number of values | Stable data without outliers |
| Median | Middle value when ordered | Data with outliers or trends |
| Mode | Most frequent value | Discrete categories |
| Trimmed Mean | Mean excluding top/bottom 10% | Occasional extreme values |
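The sketch below computes each measure with Python's standard statistics module and made-up frequency counts; the `trimmed_mean` helper is illustrative.

```python
import statistics as st

data = [12, 15, 14, 13, 40, 15, 14, 13, 16, 15]   # hypothetical counts with one extreme value

mean   = st.mean(data)
median = st.median(data)
mode   = st.mode(data)               # most frequent value

def trimmed_mean(values, trim=0.10):
    """Mean after dropping the top and bottom 10% of ordered values."""
    ordered = sorted(values)
    k = int(len(ordered) * trim)
    return st.mean(ordered[k: len(ordered) - k]) if len(ordered) > 2 * k else st.mean(ordered)

print(mean, median, mode, trimmed_mean(data))
```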
Data are generally considered stable when most points cluster near the phase mean with no systematic trend. A convenient index of stability is the coefficient of variation (CV = standard deviation ÷ mean):
| Pattern | Concern Level | Action |
|---|---|---|
| Low bounce (CV < 20%) | Low | Proceed with analysis |
| Moderate bounce (CV 20-50%) | Medium | Look for patterns |
| High bounce (CV > 50%) | High | Investigate measurement issues |
| Extreme outliers | Very High | Verify data accuracy |
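A quick sketch, using hypothetical session counts, of how the CV thresholds from the table might be applied:

```python
import statistics as st

def coefficient_of_variation(data):
    """CV = standard deviation / mean, expressed as a percentage."""
    return 100 * st.stdev(data) / st.mean(data)

def bounce_category(cv):
    """Map a CV percentage onto the concern levels used in the table above."""
    if cv < 20:
        return "Low bounce - proceed with analysis"
    if cv <= 50:
        return "Moderate bounce - look for patterns"
    return "High bounce - investigate measurement issues"

data = [12, 30, 8, 25, 14, 28]   # hypothetical session counts
cv = coefficient_of_variation(data)
print(f"CV = {cv:.0f}% -> {bounce_category(cv)}")
```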
High variability often indicates inconsistent intervention implementation, uncontrolled environmental factors (schedule changes, staffing, sleep, health), or measurement problems such as unclear behavior definitions or observer drift. Address these sources of variability before concluding that an intervention is ineffective.
Analyzing changes between phases (baseline to intervention, or between different interventions) is critical for determining intervention effectiveness and making data-based decisions.
**Change in level**: Compare median or mean values between phases. A meaningful change typically exceeds 25% of the baseline level.
**Change in trend**: Compare trend lines between phases. Look for changes in direction or slope steepness.
**Change in variability**: Compare stability envelopes. Reduced variability often indicates better behavioral control.
**Immediacy of effect**: Examine the first 3-5 data points after the phase change. Immediate changes suggest strong intervention effects.
**Overlap**: Calculate the percentage of data points that overlap between phases. Less overlap indicates stronger effects.
**Consistency across replications**: In reversal or multiple baseline designs, look for similar effects across replications.
| Method | Formula | Interpretation |
|---|---|---|
| Percentage of Non-overlapping Data (PND) | % of intervention points exceeding baseline max/min | >90% very effective, 70-90% effective, <70% questionable |
| Percentage of All Non-overlapping Data (PAND) | % of all data points that don't overlap | >75% large effect, 50-75% medium, <50% small |
| Improvement Rate Difference (IRD) | Improvement in intervention - Improvement in baseline | >0.70 large, 0.50-0.70 moderate, <0.50 small |
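As an example of one of these metrics, the sketch below computes PND for hypothetical data; the `pnd` function is illustrative and assumes higher values represent improvement unless told otherwise.

```python
def pnd(baseline, intervention, goal="increase"):
    """Percentage of Non-overlapping Data: share of intervention points beyond the baseline extreme."""
    if goal == "increase":
        threshold = max(baseline)
        exceeding = [p for p in intervention if p > threshold]
    else:  # behavior we want to reduce
        threshold = min(baseline)
        exceeding = [p for p in intervention if p < threshold]
    return 100 * len(exceeding) / len(intervention)

baseline = [35, 20, 40, 30, 45]                      # hypothetical on-task percentages
intervention = [60, 62, 65, 70, 72, 75, 78, 80, 83, 85]
print(f"PND = {pnd(baseline, intervention):.0f}%")   # 100% -> very effective by the table above
```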
Document all phase change decisions with rationale. Include environmental events, team decisions, and any protocol modifications to aid future interpretation.
While visual analysis remains primary in single-subject designs, statistical methods can supplement interpretation and provide additional confidence in decision-making.
| Test | Purpose | Requirements |
|---|---|---|
| C Statistic | Detect trend in baseline | 8+ data points |
| Tau-U | Measure effect controlling for baseline trend | 5+ points per phase |
| Regression Analysis | Model trend and level changes | 20+ total data points |
| Time Series Analysis | Account for autocorrelation | 50+ data points |
Behavioral data often shows autocorrelation (today's behavior predicts tomorrow's). This violates assumptions of many statistical tests. Always check for autocorrelation before applying traditional statistics.
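A simple way to screen for this is to compute the lag-1 autocorrelation, as in the hypothetical sketch below.

```python
import numpy as np

def lag1_autocorrelation(data):
    """Pearson correlation between each observation and the one that follows it (lag-1)."""
    x = np.asarray(data, dtype=float)
    return np.corrcoef(x[:-1], x[1:])[0, 1]

data = [12, 14, 13, 15, 18, 17, 19, 21, 20, 22]   # hypothetical daily counts
r1 = lag1_autocorrelation(data)
print(f"Lag-1 autocorrelation: {r1:.2f}")
# Values near 0 suggest little serial dependence; larger absolute values mean
# standard significance tests may overstate confidence in the result.
```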
For standardized effect sizes, a common interpretation guideline is: values above 0.8 are large, 0.5 to 0.8 medium, and below 0.5 small.
For significance tests such as the C statistic, z values greater than 1.96 indicate a statistically reliable change (two-tailed p < .05).
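For illustration, here is a sketch of Tryon's C statistic and its z approximation, using hypothetical data; treat the implementation as a sketch rather than a validated tool.

```python
import math

def c_statistic_z(data):
    """Tryon's C statistic for serial dependency/trend, with its z approximation."""
    n = len(data)
    mean = sum(data) / n
    ss_total = sum((x - mean) ** 2 for x in data)
    ss_successive = sum((data[i] - data[i + 1]) ** 2 for i in range(n - 1))
    c = 1 - ss_successive / (2 * ss_total)
    se = math.sqrt((n - 2) / ((n - 1) * (n + 1)))
    return c, c / se

data = [12, 14, 13, 15, 18, 17, 19, 21]   # hypothetical baseline; 8+ points recommended
c, z = c_statistic_z(data)
print(f"C = {c:.2f}, z = {z:.2f}")        # z > 1.96 suggests a reliable trend/change
```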
A statistically significant decrease from 100 to 95 aggressive incidents may be less meaningful than a non-significant decrease from 10 to 2 incidents. Always consider:
The practical impact of the change on the student's daily life and learning
Whether the change is noticeable and meaningful to families, teachers, and the student
Whether the size of the change justifies the effort required to sustain the intervention
Transforming data interpretation into actionable decisions requires systematic evaluation processes and clear decision rules that balance multiple considerations.
| Decision Type | Data Indicators | Action Criteria |
|---|---|---|
| Continue Current | Positive trend, decreasing variability | 3+ consecutive points in desired direction |
| Modify Intervention | Plateau after initial progress | No change for 5+ sessions |
| Intensify Support | Insufficient progress rate | Below aimline for 3 consecutive points |
| Fade Support | Sustained improvement | At goal level for 10+ sessions |
| Change Strategy | No improvement or worsening | 3+ points in wrong direction |
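These rules can be drafted as a simple decision helper; the sketch below uses illustrative thresholds, assumes a behavior we want to increase, and applies the 3-consecutive-point rules from the table.

```python
def recommend_action(points, aimline=None):
    """Apply simple increase-goal decision rules like those in the table above (illustrative only)."""
    last3 = points[-3:]
    rising  = all(b > a for a, b in zip(last3, last3[1:]))   # 3 consecutive points moving up
    falling = all(b < a for a, b in zip(last3, last3[1:]))   # 3 consecutive points moving down

    if aimline and all(p < target for p, target in zip(last3, aimline[-3:])):
        return "Intensify support: below aimline for 3 consecutive points"
    if falling:
        return "Change strategy: 3+ points in the wrong direction"
    if rising:
        return "Continue current intervention: positive trend"
    return "Review data: no clear pattern in the most recent points"

points = [55, 60, 66, 71]          # hypothetical task-completion percentages
print(recommend_action(points))
```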
Current: 20% task completion
Goal: 80% task completion
Timeline: 30 school days
Daily progress needed: (80-20)/30 = 2% per day
Draw line from current level to goal, check actual progress against this line weekly.
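A minimal sketch of the aimline calculation and a weekly on-track check, using the numbers from this example:

```python
def aimline(current_level, goal_level, timeline_days):
    """Expected level for each day along a straight line from current level to goal."""
    daily_gain = (goal_level - current_level) / timeline_days   # (80 - 20) / 30 = 2% per day
    return [current_level + daily_gain * day for day in range(timeline_days + 1)]

expected = aimline(current_level=20, goal_level=80, timeline_days=30)

# Weekly check: is the observed value at or above the aimline for that day?
observed_day10 = 45                      # hypothetical measurement on school day 10
print(expected[10])                      # 40.0 expected
print(observed_day10 >= expected[10])    # True -> on track
```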
When data are ambiguous, err on the side of less restrictive interventions while increasing data collection frequency to clarify patterns more quickly.
Effective communication of data interpretation ensures all stakeholders understand student progress and support consistent implementation of interventions.
| Audience | Focus | Format |
|---|---|---|
| Parents/Families | Progress toward goals, celebrations, next steps | Simple graphs, narrative summary, concrete examples |
| Administrators | Compliance, outcomes, resource needs | Dashboard summary, trend data, brief bullets |
| Team Members | Implementation details, data patterns, decisions needed | Detailed graphs, phase analysis, action items |
| Students | Personal progress, achievements, goals | Visual progress charts, sticker charts, self-monitoring |
"During baseline (Sessions 1-5), Johnny's on-task behavior averaged 35% of observed intervals, with high variability (range: 20-50%). Following implementation of the self-monitoring intervention (Sessions 6-15), on-task behavior immediately increased to 60% and showed a steady upward trend, reaching 85% by Session 15. Variability also decreased substantially (SD reduced from 12% to 5%). This 50% improvement exceeds our goal of 70% on-task behavior. We recommend continuing the current intervention while beginning to fade prompts for self-monitoring."
This chapter provides practical tools and resources to support accurate data interpretation and professional development in behavior analysis.
| Tool Category | Options | Best For |
|---|---|---|
| Data Collection | Classroom Pulse, BehaviorSnap, Catalyst | Real-time data entry and organization |
| Graphing | Excel, Google Sheets, GraphPad | Creating publication-quality graphs |
| Statistical Analysis | R, SPSS, SAS | Advanced statistical calculations |
| Visual Analysis | Tau-U Calculator, PND Calculator | Effect size calculations |
| Problem | Possible Causes | Solutions |
|---|---|---|
| Can't detect pattern | High variability, insufficient data | Collect more data, examine environmental factors |
| Conflicting trends | Multiple behaviors, competing interventions | Separate data by context, simplify intervention |
| Plateau in progress | Ceiling effect, motivation loss | Adjust goals, modify reinforcement |
| Data drift | Observer bias, definition creep | Retrain observers, clarify definitions |