Friday, January 18th, 2013
The Center for Information & Research on Civic Learning and Engagement (CIRCLE) has just released two new fact sheets, both looking at the claim that today’s students are not properly prepared for citizenship once they graduate from high school.
Only eight states currently test students on American government or civics, and only about a quarter of students nationwide earn a “proficient” score on the National Assessment of Educational Progress (NAEP) civics exam. This statistic is often cited as proof that more civic education is needed.
However, while civic education certainly deserves increased attention, the CIRCLE study notes that bemoaning the state of civic education and students’ civic knowledge based on the NAEP results would be to seriously misunderstand the national assessment. CIRCLE director Peter Levine explains:
Typically, the release of NAEP Civics results is treated as evidence that students know far too little about civics. […] These interpretations of the NAEP are misleading. The test is designed to produce the overall results that it yields. When the current NAEP Civics Framework was developed in 1998, a committee of teachers and other knowledgeable citizens decided how difficult each proposed question ought to be for students at each grade level, and then decided what overall score should qualify a student as having “basic,” “proficient,” or “advanced” knowledge. If members of the committee had agreed on a more—or less—demanding idea of what qualifies as success in civics, they would have set the cutoff scores either higher or lower, and those decisions would have been equally valid. The committee was informed by empirical data, but ultimately it had to make a value judgment about what questions to ask and how much knowledge is satisfactory. Each subsequent NAEP Civics assessment has then been carefully constructed to be as difficult as the 1998 assessment. Thus 24 percent of 12th graders were rated “proficient” in 2010 because the NAEP 12th grade test had been designed to yield roughly that proficiency rate.
For a citizen or policymaker who wants to decide whether students know enough, there is no substitute for looking at individual NAEP items and making an independent decision about whether most students at a given grade level should know the answers.
However, Levine notes that the test can be used to show which students perform better or worse than the norm for their grade, how students’ knowledge has changed over time, which educational practices are related to higher scores, and how well students understand various specific topics.
On some topics, young people were informed. More than three in four young voters could correctly answer at least one factual question about the candidates’ position on a campaign issue that they had chosen as important. And on questions about the structure of the US government, they performed as well as or better than older adults who have been asked similar questions in other polls.
On other topics, most young people were misinformed. For instance, a majority (51.2%) believed that the federal government spends more on foreign aid than on Social Security, when in fact Social Security costs about 20 times more. But again, older adults have also been found to be widely misinformed on the same topics.
About one quarter of young voters were poorly informed about the campaign’s issues, and young people who did not vote were generally uninformed.
Young people who recalled high-quality civic education experiences in school were more likely to vote, to form political opinions, to know campaign issues, and to know general facts about the US political system. That does not mean that civics causes higher turnout and more knowledge, because students who experience better civics may also have other advantages in their schools and communities. But the correlations are very strong and at least demonstrate that active and informed citizens tend to be people who had good civic education.
Learn more about the results of both of these studies over at CIRCLE.