by Andrea M. Patawaran-Hickman and Rachael Winterling
Histories in Public Education
The emphasis upon what students should learn, how teachers should teach, and the role of the American school has long been debated. Throughout the 18th and 19th centuries, the common schools dominated the new republic. In 1838, Abraham Lincoln, in “The Perpetuation of Our Political Institutions,” like other speakers of the era of the creation of American schools, echoed the goal of education. They argued that the aim of schools ought not to:
make our children and youth either partisans in politics, or sectarians in religion; but to give them education, intelligence, sound principles, good moral habits, and a free and independent spirit; in short, to make them American free men [and women] and American citizens, and to qualify them to judge and choose for themselves in matters of politics, religion, and government (Hirsch 8).
The reaction against the common schools began in the 20th century with a new approach, progressive education, which placed the child or the individual at the center of the experience rather than societal welfare. Its lasting legacy is a greater “empathy with childhood,” but “its fatal flaw” is its assumption that children would learn naturally without restrictions in content (Hirsch 35). By the end of the 1950s, the open classroom had taken over public schools. Reports on SAT verbal and math scores depict a “precipitous decline” in the mid-1960s among those who were graduating under the new ideologies. The graph plots verbal and math SAT scores over a long span, from 1965 to 2005. It shows another decline in the 1980s in both areas and a slow, gradual increase in math scores after 1980. “Starting in the 1960’s, a decline in reading and writing skills of high school graduates began to be alarmingly apparent. Recently they have even declined somewhat further, despite the frantic emphasis on reading skills under No Child Left Behind. Scores on the verbal Scholastic Aptitude Test (SAT) declined from a peak in 1963 to a low point around 1990. Since then they have remained rather flat” (Hirsch 28). Much of the 21st century’s educational controversy has centered on No Child Left Behind, the 2001 federal initiative to improve teacher quality and test scores throughout the nation.
Meritocracy and Quantitative Data
In 2001, NC schools, through a “collaborative effort of the NC Education Research Council, the Governor’s Office, and the NC Department of Public Instruction,” created the NC School Report Card to provide “nuts and bolts” information to “parents and others interested in the public schools of NC with information about schools, district and state-level data in a number of areas. These include student performance on tests, teacher qualifications, school safety, class and school size, and many others” (NC School Report Cards, FAQs). The 21st-century shift in education focuses on quantifiable data while simultaneously attaching that data to teacher quality. The numbers game and the influx of informational graphics have become commonplace components of the American educational experience. Parents now receive NC School Report Cards along with their children’s report cards, to judge not only their child’s progress but also that of the school, the district, and the state. Although the NC Department of Public Instruction website argues that the data should not be used to “rank” schools within the district or even the state, ranking becomes a difficult temptation to resist when higher-education reports like those from U.S. News & World Report emphasize that scoring is one way to effectively measure institutional merit.
A Snapshot: University Meadows Elementary
The NC School Report Card is designed to communicate the performance of the students, the teachers, and the school in its entirety. The Report Card Snapshot, on the other hand, is the two-sided, one-page document that parents actually receive. The Snapshot privileges specific information and redirects its audience to the NC public schools website for detailed information. It fails as a visualization because it is information-intensive and heavy in its use of numerical data. The front page is laid out in a two-column format. The power zone is used to highlight the Report Card title, academic year, and school-specific information, which is repeated in the top-left power zone with details about the location of the school, contact information, school type, and district affiliation. This information seems trivial, as it would already be known to the parent. The School Profile sits in the bottom column of the power zone and focuses strictly on population and average class sizes in two small tables. The multiples of the document are the school, district, and state. The right column headlines itself as “High Student Performance,” with a table titled “Performance of Students in Each Grade on the NC End-of-Grade Test” and subtitled “Percentage of Students’ Scores At or Above Grade Level”; the table lists grade-level testing areas and results in percentages, repeating the same categories of school, district, and state. The table does not, “like a good story… draw you in, provoke questions, and offer a sense of exploration and discovery” (Steele 157).
A simple bar graph created in Microsoft Word in grayscale would have made a more visually enticing piece of evidence. It would allow the data to speak for itself; however, it would also highlight poor school performance (see image below). Of course, school officials would not want a document that revealed such harsh realities.
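Such a grouped comparison is trivial to produce even without charting software. The sketch below builds a plain-text version of the proposed graph; the percentages are hypothetical placeholders, not the actual Snapshot figures, and are chosen only to mirror the grade 3-to-5 decline the paragraph describes.

```python
# Hypothetical passing percentages (the real Snapshot figures are not
# reproduced here). Each grade lists (school, district, state) values.
data = {
    "Grade 3 Reading": (68, 70, 71),
    "Grade 4 Reading": (60, 67, 68),
    "Grade 5 Reading": (48, 62, 64),
}

def bar(pct, scale=2):
    """One text bar: each '#' represents `scale` percentage points."""
    return "#" * round(pct / scale)

lines = []
for grade, (school, district, state) in data.items():
    lines.append(grade)
    lines.append(f"  School   {school:3d}% {bar(school)}")
    lines.append(f"  District {district:3d}% {bar(district)}")
    lines.append(f"  State    {state:3d}% {bar(state)}")
chart = "\n".join(lines)
print(chart)
```

Even in this crude form, the shrinking school-level bars make the decline visible at a glance, which is exactly the comparison the Snapshot’s table buries.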
The bar graph makes it easier to see where the school is performing poorly on the state End-of-Grade Tests, which may have alarmed parents whose children were in or rising to the 5th grade, since the bars depict a significant decline in both tested areas in the middle of the chart. It may even have invited questions from parents as to why the school is not performing on the same level as the district in Grades 4 and 5 in Reading, and why the overall numbers do not align with the district’s. The “multiples,” data points for the school, district, and state, directly invite comparisons, “the essence of statistical thinking.” In the table, the lowest-performing section, the 5th-grade indicators, is lost among the other percentages provided for the school and the state and does not appear as visually alarming as the evidence actually suggests. Audience members would have to read the performance numbers carefully. The decline from 3rd to 5th grade is substantial, but the table makes this evidence easy to overlook. The bar graph, on the other hand, makes the evidence hard to ignore.
The second table, titled “Performance of Each Student Group on the N.C. End of Grade Tests,” with an even longer subtitle, “Percentage of Students, Grouped by Gender, Ethnicity, and Other Factors, Who Passed BOTH the Reading and Math Tests,” pushes the audience toward quick generalizations about student performance and gender, race, and socio-economics at the school, district, and state levels. At first glance, the second table seems like an extension of the first, but upon closer examination it is a breakdown of a compilation of all the data given in the first table. It does not correlate with the first table because it is an overall snapshot of all grades and all tests; it is not specific to either testing area or to grade level. The only real information graphic on the document is found on the back page, and it is completely useless. It is a bar graph showing that the school is 100% connected to the internet whereas the state and district are 99.8% connected. The graph does not answer questions such as what internet speed is available in the classroom, nor does it explain how or through which device the classroom connects to the internet, or whether a device is connected at all. The percentage difference between the school and the district and state is so insignificant that it is not really a statistic at all; however, it allows the school to appear to be on top of at least one area of the NC School Report.
The how-to manuals on the website instruct the audience to communicate directly with the schools for more specific information or with questions regarding any of the measures. They warn the audience away from using the NC Report Cards as a comparison tool to rank individual schools within the district, but they do not explain why or how such a comparison would be inaccurate or faulty. The How to Read Guide provides one-line descriptions of why the information is useful and notes that the information within the reports can explain why specific schools have more or fewer activities and/or funding.
What Do the Numbers Suggest?
Some Americans are not buying into these quantitative educational measures. Some feel that the current trend of ranking schools places an unwarranted value on particular schools, especially in higher education. Kristin Hiemstra states: “I do not like the rankings.” The Chapel Hill guidance counselor adds:
“The same way designer labels define the perceived value of the popular high school clique, designer college rankings define the perceived value of a particular school’s education. In the same way poor-fitting designer clothes are a waste of money so is a poor (college) match. … When a student is a good match for a school, they will learn more about themselves. … This highly personal aspect of education is not measurable.”
Are these graphics a way to blindly mislead the general public into false assumptions? The purpose of this paper is to explore how information graphics are currently used throughout the American educational experience. We will explore local school report cards within the state and examine college admission materials provided by state universities in comparison to college rankings provided by magazines such as U.S. News & World Report. “I share some widely held concerns about the ability of the magazine to determine what is ‘best.’ This is not unique to U.S. News,” said Sexton, vice president of enrollment management at Santa Clara University. “In America, we want to keep score. We want winners and, therefore, ‘lessers.’ We want simple answers to often complex questions, the college search being a prime example” of an American educational hierarchy.
U.S. News & World Report averages 15 million page views on its website when its new rankings come out (Diamond 1). The U.S. News 2015 Best Colleges rankings data collection will include three new statistical surveys: main, financial aid, and finance. “These surveys solicit information on such factors as enrollment, faculty, tuition, room and board, SAT and ACT scores, admissions criteria, graduation and retention rates, college majors, school finances, activities, sports and financial aid. This data is used in the Best Colleges rankings that will be published on usnews.com and in the print guidebook that will be available on newsstands” (Morse 1). In addition, U.S. News includes questions about differential graduation rates based on family income levels. “U.S. News has also started mailing out the undergraduate academic reputation surveys to colleges, universities and high school counselors nationwide. The results of these peer reputation surveys play an important part in the Best Colleges ranking methodology” (Morse 1).
Intentional Data Fudging
Numerous schools have misrepresented admissions data to raise their positions in various college ranking systems. Being highly ranked gives schools an advantage: parents want their children to attend a highly ranked university in hopes of their getting a great education and securing a financial future. “Colleges that nab a top spot advertise it in their promotional materials. A strong ranking brings academic prestige, bragging rights and higher achieving students.” “There is pressure to lie about the data or manipulate it,” said Mark Schneider, a vice president with the American Institutes for Research. “Colleges say, ‘I want the best students in the country to apply,’ and ‘the best way to do that is to be a Top 20 school’” (Diamond 1). In February 2014, Flagler College’s president announced that the school had misrepresented admissions data for incoming freshmen admitted from fall 2010 through fall 2013:
In a Feb. 17, 2014, statement on the school’s website, President William T. Abare Jr. said that “a senior admissions officer has resigned after taking sole responsibility for misreporting test scores, grade point averages, and class ranks of entering freshmen. We have notified appropriate organizations, agencies, and accrediting bodies of the situation and have informed them that we have commissioned an independent investigation. We will disclose the findings once we receive the final report. We are reviewing what happened and are developing policies, procedures, and systems to ensure this does not occur again” (Morse 1).
Emory University also intentionally misrepresented data about its students to groups that rank colleges nationwide, and did so for more than a decade (Omer 1). “Emory is routinely listed as one of the nation’s top colleges by national publications, such as U.S. News & World Report, Forbes and Peterson’s, The Atlanta Journal-Constitution reported. Families and students across the nation often rely on rankings when deciding where to apply for college. University officials say they do not know if the wrong data affected Emory’s rankings” (Omer 1). Emory President James W. Wagner reported that SAT/ACT scores were overstated, as were class rankings. Emory reported SAT and ACT data for admitted students instead of enrolled students, artificially inflating its test scores, and it overstated the number of incoming students who had graduated in the top 10 percent of their high school class (Diamond 1).
The following chart is taken from the University of North Carolina Chapel Hill’s report Statistical Abstract of Higher Education in North Carolina. The horizontal bar graph misrepresents data in numerous ways. First, it displays the 18 North Carolina affiliated schools with equal visual weight, as if they all had the same population, which skews every percentage represented. Because the populations are not privileged, a typical user would assume all the data is directly comparable, leading to faulty comparisons among the 18 schools.
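The distortion described above is easy to demonstrate with invented numbers. In the sketch below, the school names, enrollments, and percentages are all hypothetical; the point is only that averaging percentages as if each school counted equally gives a very different answer than weighting by population, which is why a bar graph that hides enrollment invites faulty comparisons.

```python
# Two hypothetical schools reporting the same metric as a percentage.
schools = {
    "School A": {"enrolled": 30_000, "pct": 40.0},  # large school
    "School B": {"enrolled": 1_000,  "pct": 90.0},  # small school
}

# Equal weighting (what bars of identical visual weight imply):
# each school counts the same regardless of size.
equal_avg = sum(s["pct"] for s in schools.values()) / len(schools)

# Population weighting: large schools dominate the system-wide figure.
total = sum(s["enrolled"] for s in schools.values())
weighted_avg = sum(s["pct"] * s["enrolled"] for s in schools.values()) / total

print(equal_avg)               # 65.0
print(round(weighted_avg, 1))  # 41.6
```

The equal-weight average (65.0) nearly doubles the population-weighted one (41.6), even though nothing about the underlying data changed; only the hidden enrollments differ.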
The following table, SAT Test Scores for New Freshmen, Fall 2013, is taken from UNC Charlotte’s Office of Institutional Research Fact Book. Not many individuals know that universities make fact books available for public viewing; they can be analyzed for enrollment and retention statistics. The table below serves as an example that can be related to undergraduate admissions at other universities. The table has a “Not Available” category, indicated to mean students who took the ACT instead of the SAT, but this is never clarified. To make the table more informative, it should include ACT scores as well, so that every freshman’s test score is represented; ACT scores are not reported in any fall 2013 fact book provided by UNC Charlotte. In addition, a section showing how many students were accepted and enrolled for each score interval would provide the most accurate statistical picture. Also, in the “High School Senior Class Rank of New Freshmen” section, it is peculiar that 1,332 individual ranks (43% of enrolled freshmen) are not available. UNC Charlotte should explain the 43 percent of its data not represented in the table if it wants to create a transparent ethos for the university.
Open Enrollment in Higher Education
The G.I. Bill “has been heralded as one of the most significant pieces of legislation ever produced by the federal government—one that impacted the United States socially, economically and politically” (U.S. Department of Veterans Affairs 1). But it almost never came to pass. The first version of the G.I. Bill was signed into law on June 22, 1944, by President Franklin D. Roosevelt. Its goal was to give servicemen and women the opportunity to resume their education after discharge and to help address the “persistent unemployment and underemployment of veterans” (Hansen 1). The importance of the G.I. Bill in higher education rests in its opening of ALL doors to an education whose access was once governed by social position. By implementing the bill, the federal government chose to promote “open enrollment” in public institutions of higher education. In 2008, the G.I. Bill was updated to cover veterans with active duty on or after September 11, 2001. The update “enhanced educational benefits that cover more educational expenses, provide a living allowance, money for books and the ability to transfer unused benefits to spouses or children” (U.S. Department of Veterans Affairs 1).
Source: Department of Defense, U.S. Census Bureau
In 2012, it was reported that only 36 percent of veterans were using the G.I. Bill, which is a startling number: why would someone give up a free education? (Hansen 1). In 2010, there were 2,266,883 active-duty Americans in the military, and 36 percent of 2,266,883 is roughly 816,000 individuals. The 36 percent figure may seem shockingly low, but 816,000 people using the G.I. Bill to attend college can affect college admissions across the nation. It adds more attendees to universities every year, because veterans have the opportunity for a free higher education; with more applicants, competition increases at all institutions of higher education, especially ranked ones. “The increasing amount of digital information in modern society has ushered in a golden age for data analysis. Ample data encourages users to conduct more frequent exploratory data analyses to explain scientific, social, cultural, and economic phenomena” (Steele 157).
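A quick back-of-the-envelope check of the arithmetic quoted above, using the 2010 active-duty total and the 36 percent usage rate from Hansen:

```python
# Figures cited in the text above.
active_duty_2010 = 2_266_883   # active-duty personnel, 2010
usage_rate = 0.36              # reported share using the G.I. Bill (Hansen)

users = active_duty_2010 * usage_rate
print(f"{round(users):,}")     # 816,078
```

The product comes to about 816,078, confirming the “roughly 816,000” figure in the paragraph.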
Careful and Laborious Analysis
The need for quantifiable data to evaluate American schools is deeply rooted in positivist traditions. From early elementary school to higher education, the norm-referenced ranking system places students and schools into an order in which numbers are indicators of intelligence and worth. Data is part of the making of the American educational tradition; however, that data is often buried under multiple layers of information, which can misguide the audience and force assumptions that are not necessarily accurate or worthy of being considered real evidence of truth. Another notable fact about this data is the difficulty of even locating it; organizations obscure it to limit the exposure of misrepresented figures. One must be very interested in a particular data set to locate and analyze it, and to do so effectively one would almost have to have prior knowledge of the subject and be privy to information that is often not disclosed to the masses.
Diamond, Laura. “Emory Scandal: Critics Doubt College Ratings.” The Atlanta Journal-Constitution. 26 Aug. 2012. Web. 15 Apr. 2014. <http://www.ajc.com/news/news/local/emory-scandal-critics-doubt-college-ratings/nRMSY/>.
“Education and Training.” History and Timeline. U.S. Department of Veterans Affairs, n.d. Web. 12 Apr. 2014. <http://www.benefits.va.gov/gibill/history.asp>.
“Fact Book | Office of Institutional Research | UNC Charlotte.” UNC Charlotte’s Office of Institutional Research. University of North Carolina at Charlotte, n.d. Web. 15 Apr. 2014. <http://ir.uncc.edu/fact-book>.
Hansen, Matthew. “Only 36% of Veterans Utilizing GI Bill’s Free Tuition.” Omaha.com. World Herald, 10 Dec. 2012. Web. 15 Apr. 2014. <http://www.omaha.com/apps/pbcs.dll/article AID=/20121210/NEWS/712109929/1201>.
Hirsch, E.D. The Making of Americans: Democracy and Our Schools. New Haven: Yale University Press, 2009. Print.
Hurt, Alyson, Erica Ryan, and JoElla Straley. “By The Numbers: Today’s Military.” NPR. Department of Defense, U.S. Census Bureau.
Morse, Bob. “Flagler College Misrepresents Admissions Data.” US News. U.S.News & World Report, 19 Feb. 2014. Web. 15 Apr. 2014. <http://www.usnews.com/education/blogs/college-rankings-blog/2014/02/19/flagler- college-misrepresents-admissions-data>.
Omer, Sevil. “Emory University: False Academic Data Sent to Ranking Groups.” NBC News. NBC News, 17 Aug. 2012. Web. 15 Apr. 2014. <http://usnews.nbcnews.com/_news/2012/08/17/13339734-emory-university-false- academic-data-sent-to-ranking-groups?lite>.
“Statistical Abstract of Higher Education in North Carolina 2012-2013.” University of North Carolina, a System of Higher Learning. University of North Carolina at Chapel Hill, Aug. 2013. Web. 15 Apr. 2014. <http://www.northcarolina.edu/?q=university-facts-and- figures/access>.
Steele, Julie, and Noah Iliinsky, eds. Beautiful Visualization. Sebastopol: O’Reilly Media, 2010. Print.
Tufte, Edward. Visual Explanations: Images and Quantities, Evidence and Narrative. Cheshire: Graphics Press, 1997. Print.
University Meadows Snapshot. NC School Report Cards. Public Schools of North Carolina: State Board of Education: Department of Public Instruction. <http://www.ncreportcards.org>.