Inside NAEP: Developing NAEP Test Questions
Peggy G. Carr
National Center for Education Statistics
November 17, 2007

Item Development Process and Schedule
Explain the item development process and schedule
Complexity
Inclusiveness of the process
Attention to quality

New vs. Existing Frameworks
For new frameworks
Create all new items
More than 4 years from development to implementation

For existing frameworks
Ongoing development for replacement blocks (3 per assessment per grade)
Nearly 3 years from development to implementation

For both types of frameworks
More than 300 people involved in each assessment
Twice as many items as needed are created for paper-and-pencil assessments

How It Works
Framework
Plan - 3 months
Develop - 7 months
Expert Input - 3 months
Develop - 7 months
Field Test - 8 months
Develop - 4 months
Expert Input - 2 months
Develop - 3 months
Operational

Contractors
Educational Testing Service (ETS)
NAEP Education Statistics Services Institute (NESSI)
Pearson
Westat
Fulcrum (logistics)

Outside Experts
Planning Committee (12)
Representatives from framework and standing committees

Standing Committee (12-18)
Members from framework committee
Curriculum specialists, university education faculty, etc.

State and District Review Panel (100-150)
Representatives from each state or district
Assessment directors, content specialists

Academic Review Panel (5-9)
Experts in the subject area (mathematicians, scientists)

Ensuring Quality
The number of reviews
The qualifications of our reviewers
Examination of empirical data collected during both pilot and operational data collections

Quality control measures at each stage of the process

Plan
Ensure that we are following both the letter and intent of the framework
An implementation plan for the entire process
Guidelines/blueprint for the item development contractor
Inventory and evaluation of items in current pools

Develop
Selection of stimuli
Reading passages, political cartoons, interactive computer task (ICT) topics
Governing Board approval

Item writing
Writers trained on frameworks and specifications
4-5 professional staff per content area
At least one has advanced degree in content area
At least one trained in how students learn the content
Bias and fairness review

Develop (continued)
Small Scale Study
About 50 responses per item
For special items (ICTs, hands-on tasks (HOTs), etc.):
Cutting-edge items require additional time and research
Platform and administrative processes and procedures
Special working groups of experts

Expert Input
Ensure diversity of opinions and input
Standing Committee
State and District Reviewers

Expert Input (continued)
Standing Committee Goals
Consistency with framework
Grade or age appropriate
Alignment with the scoring rubric
Pool of items matches the blueprint
Appropriate and adequate coverage of:
Content
Difficulty
Dimensions
Item type

Expert Input (continued)
State and District Review Panel Goals
Consistency with framework
Variance from what would be assessed in the state
Perspectives from around the country
Consideration of what students are being taught

Develop
Further Item Development
Resolution of comments from reviews
Items revised
Quality control
Governing Board comments

Pilot Test
Includes production time, administration, and data analysis
At least 1500 responses per item
Administered at the same time as operational assessments

Develop
Further Item Development
Proposed selection of items for the operational assessment
Revisions based on empirical data from the pilot assessment

Expert Input
Standing Committee
Review of the proposed item pool in its entirety
Review of individual items for quality

Academic Panel
Mathematicians, scientists, etc.
Accurate representation of the subject matter
Balance of accuracy and grade-level appropriateness

Develop
Further Item Development
Resolution of comments from reviews
Items revised
Quality control
Governing Board comments

Governing Board Input
Framework: frameworks are developed and approved by the Governing Board
Develop: Governing Board approval of stimuli
Develop (following each Expert Input stage): Governing Board comments
Critical NCES Decisions and Activities
Plan: implementation plan and guidelines
Develop stages: resolution of comments and concerns, resolution of review comments, and quality control

NAEP Related Resources
For further information regarding NAEP item development, visit these NAEP and NAGB websites:
http://nces.ed.gov/nationsreportcard/tdw/item_development/
http://nagb.org/policies/pl-index.htm