Track Student Progress and Inform Continuous Improvement with the NAEP Data Explorer - PowerPoint PPT Presentation


SLIDE 1

Track Student Progress and Inform Continuous Improvement with the NAEP Data Explorer

Robert Finnegan, Educational Testing Service Jason Nicholas, Westat Julie Williams, California Department of Education

SLIDE 2

NAEP Data Explorer

History and Overview of the Redesign

SLIDE 3

History of Online Data Analysis Tools

  • Early 2000s: NAEP Data Tool
    – Two iterations
    – Allowed users to look up pre-summarized data tables

SLIDE 4

History of Online Data Analysis Tools

  • Released in 2005
  • First iteration of using plausible values to generate summary results
  • Included user-defined significance testing and charting capabilities
  • Redesigned in 2009

SLIDE 5

Overall Goals for NAEP Reporting

SLIDE 6

Key Goals for the NDE Redesign

  • Improve user interface for efficient data requests
  • Save settings and reports
  • Copy reports across subject and grades
  • Integrate existing NAEP tools with a search interface
  • Update charts and graphs
  • Use NAEP brand styles

SLIDE 7

Select Criteria

SLIDE 8

Select, Combine, View Jurisdictions

SLIDE 9

Select Variables, Create Crosstabs

SLIDE 10

Save Reports, View Saved Reports

SLIDE 11

Copy and Edit Reports

SLIDE 12

Integrated Search Function

SLIDE 13

“Poor interpretations of reports are harmful even if they are based on carefully designed tests.”

John Hattie, Educational Measurement: Issues and Practice, Winter 2014, 33(4), pp. 34–35.

SLIDE 14
NAEP Program

  • Key to understanding the NAEP program and its design
  • Uniquely different from state assessment systems and serves a different purpose
  • Data from NAEP are based on complex samples of schools and students; interpretations need to account for this aspect

SLIDE 15
NAEP State Coordinators

  • Every state has a NAEP State Coordinator who is the “local” expert in all things NAEP
    – Makes NAEP happen
  • This individual is the key resource for you in accessing NAEP data and conducting analyses
  • Trained and supported in NAEP analyses and NAEP data tools
  • Provided valuable feedback and insights into the redesign of the NDE

SLIDE 16
NAEP State Coordinators

  • The NAEP Support & Service Center supports the Coordinators with training and assistance
  • There are also TUDA Coordinators in select large urban districts
    – California
      • Los Angeles
      • San Diego
      • Fresno
    – Non-California
      • New York City, Chicago, Dallas, Las Vegas, Miami-Dade, …

SLIDE 17
NAEP Results and Data

  • With NAEP state results available every other year, there are insightful data on student progress
    – Trends not available from state assessment data
    – In addition, contextual information related to the environments in which students learn
  • NAEP data can provide valuable pathways for state agencies to efficiently “inform” their improvement plans
    – Provide the spark

SLIDE 18
Best Practices for Analyzing Data with the NDE

  • Familiarize yourself with what data are available
    – Nation’s Report Card website (nationsreportcard.gov)
    – Survey Questionnaires (search: NAEP questionnaires)
  • Have a question in mind
    – Subject, grade, year, jurisdiction
  • Play with the tool
    – Don’t get too deep too soon

SLIDE 19
NDE Hidden Gems

  • Subscales
    – Algebra, Geometry, Measurement, Probability & Statistics, Number & Operations
    – Physical, Earth, Life
    – Literary, Informational
  • Multiple statistics
    – Means & percentiles for scale scores
    – Percentages for achievement levels
    – Confidence intervals

SLIDE 20
NDE Hidden Gems

  • Cross tabulation
    – NSLP participation × teacher tenure
  • Trend
    – All states have trend back to 2002 or 2003
    – Some back to 1992
  • Statistical significance testing
    – Provides the check

SLIDE 21
Example

  • What is the impact of teachers’ years of experience on elementary Reading achievement?
  • Do we know? What is the relationship?

SLIDE 22

Example

Grade 4 – Reading – National Public – 2017

                      2 Years or Less   3 to 10 Years   More than 10 Years
Percent of Students   11%*              34%*            56%
Average Scale Score   213*              220*            223

* Significantly different (p < .05) from More than 10 Years.
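The significance flags in tables like the one above can be approximated with a simple two-sample z-test on group means and their standard errors. This is a minimal sketch: the standard errors below are hypothetical placeholders (the NDE reports the real ones), and actual NAEP testing also accounts for plausible values and the complex sample design, which this omits.

```python
import math

def significantly_different(mean_a, se_a, mean_b, se_b, z_crit=1.96):
    """Simplified two-group comparison: flags a difference when the z
    statistic exceeds the p < .05 critical value. Real NAEP testing also
    folds in plausible-value and jackknife variance components."""
    z = (mean_a - mean_b) / math.sqrt(se_a ** 2 + se_b ** 2)
    return abs(z) > z_crit

# Hypothetical standard errors for the 213 vs. 223 scale-score comparison:
print(significantly_different(213, 1.1, 223, 0.9))  # prints True
```

With small standard errors the 10-point gap is flagged; with large ones (say 4.0 each) the same gap would not be, which is why the NDE's built-in testing, rather than eyeballing means, "provides the check."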

SLIDE 23

Example

  • Grade 4 – Reading – National Public – 2017

[Chart values: 4, 18, 30, 73]

SLIDE 24

Example

Grade 4 – Reading – National Public – 2017

                      2 Years or Less   More than 10 Years   Gap
Average Scale Score   213               223                  11

* Significantly different (p < .05) from National Public Gap.

[Chart: additional gap values 4* and 23*]

SLIDE 25
Example

  • Differences by state exist
  • NAEP cannot infer causality
    – New teachers don’t teach as well as old teachers?
    – New teachers get jobs in places with lower achievement?
  • What other characteristics might be involved?
    – Parental education
    – NSLP eligibility
    – School location
    – School Title I status

SLIDE 26

Example

Grade 4 – Reading – National Public – 2017

                      NSLP           2 Years or Less   More than 10 Years
Percent of Students   Eligible       13%               53%
                      Not Eligible   9%*               59%*
Average Scale Score   Eligible       204               210
                      Not Eligible   228*              238*

* Significantly different (p < .05) from Eligible.
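A crosstab like the one above can be mimicked in plain Python. The records below are hypothetical single-student stand-ins chosen to reproduce the slide's cell means; the NDE actually averages weighted plausible values over the full sample.

```python
from collections import defaultdict

# Hypothetical (nslp, tenure, score) records; real NDE cells are weighted
# averages over plausible values, which this sketch omits.
records = [
    ("Eligible", "2 Years or Less", 204),
    ("Eligible", "More than 10 Years", 210),
    ("Not Eligible", "2 Years or Less", 228),
    ("Not Eligible", "More than 10 Years", 238),
]

cells = defaultdict(list)
for nslp, tenure, score in records:
    cells[(nslp, tenure)].append(score)

# Each crosstab cell is the mean score for that (row, column) pair.
crosstab = {cell: sum(s) / len(s) for cell, s in cells.items()}
print(crosstab[("Not Eligible", "More than 10 Years")])  # prints 238.0
```

The point of the two-way breakdown is visible even in this toy version: within each NSLP group the tenure gap shrinks (6 and 10 points) relative to the overall 10-point gap confounded with eligibility.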

SLIDE 27

Example

Grade 4 – Reading – National Public – 2017

               2 Years or Less
Eligible       13%
Not Eligible   9%
Gap            4%

* Significantly different (p < .05) from Eligible or NP gap.

[Chart: additional gap values 2% and 11%*]