The Worst Pie Chart Ever:
Problems with this chart:
A clear violation of Pie Chart Rule #2: pie charts are for data that add up to a meaningful total.
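Rule #2 can be checked mechanically before a pie is ever drawn: the slices must be exhaustive shares of one whole. A minimal sketch in Python, with invented category names and values for illustration:

```python
# Sketch: validate that data are suitable for a pie chart before plotting.
# A pie is only meaningful when the slices partition a single total.
def pie_ready(shares, tolerance=0.01):
    """Return True if the percentage shares sum to 100 (within tolerance)."""
    return abs(sum(shares.values()) - 100.0) <= tolerance

# Shares of one whole: fine for a pie.
budget = {"Instruction": 55.0, "Research": 25.0, "Administration": 20.0}

# Multiple-response survey answers: respondents could pick several options,
# so the numbers do not add to a meaningful total and a pie misleads.
multi_response = {"Cited cost": 62.0, "Cited quality": 48.0, "Cited access": 35.0}

print(pie_ready(budget))          # True
print(pie_ready(multi_response))  # False
```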
Backward in Time and Other Errors:
Problems with this chart:
Source: Illinois Board of Higher Education, Illinois Higher Education Annual Report (July 2002), 24
Figure 3.16: Revised Enrollment Chart:
From Bowling Alone
Problems with this chart:
Unnecessary 3-D effect.
The chart does not display meaningful data (the most important data are actually represented by the arrows on the bars, not by the height of the bar).
The chart is not self-explanatory [the letters on the bars reference the textual explanation].
Source: Robert D. Putnam, Bowling Alone (New York: Simon and Schuster, 2000), figure 47
Figure 3.5: Revised Chart, with Data from Text:
Just Plain ChartJunk:
Source: Kristin E. Smith, Loretta E. Bass, and Jason M. Fields, "Child Well-Being Indicators From the SIPP," Population Division Working Paper No. 24, U.S. Bureau of the Census, Washington, D.C., April 1998.
These charts have something to do with futures trading:
Source: New York Times:
3/14/08: "This article will appear in this Sunday's [3/16/08] Times Magazine."
Problems with this chart:
Charles Blow usually does a better job with the Times graphics. In this case, he asks the reader to draw comparisons across what are in effect 18 pie charts.
Roughly estimating the numbers from the chart, and just using Obama's share of the primary vote, this is my revision:
Notes: There is some method to the sorting here, but it might have been better to sort by the date of the primary to avoid the implication that the city/rural divide is increasing.
In his Senate testimony defending progress in the Iraq war on April 9, 2008, Army General David H. Petraeus presented several charts, including the one below summarizing the operational readiness of Iraqi army and police battalions. Glenn Kessler of the Washington Post comments on several of the charts here:
The stacked bars represent the number of battalions at each stage of readiness, with ORA 1 being the highest stage.
The problems with the chart are the following:
Estimating the numbers from the graphics, the following chart shows how the "army" side of the chart could have been constructed. Notice the relatively modest increase in the number of battalions at the two highest levels of readiness.
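A quick way to see why the stacked presentation hides this: in a stacked bar only the bottom band and the overall total can be read directly, and every middle segment must be judged by subtraction. A sketch with hypothetical battalion counts (the real figures can only be estimated from the chart):

```python
# Hypothetical readiness counts over three reporting periods.
# In a stacked bar, a reader sees the totals clearly but must
# subtract to recover the series that matters most: ORA 1.
battalions = {
    "ORA 1": [10, 12, 13],   # highest readiness (invented numbers)
    "ORA 2": [30, 34, 36],
    "ORA 3": [40, 38, 35],
}
periods = ["2006", "2007", "2008"]

# The only values a stacked bar displays directly: the overall totals.
totals = [sum(level[i] for level in battalions.values())
          for i in range(len(periods))]
print(totals)               # [80, 84, 84]
print(battalions["ORA 1"])  # plotting this series on its own makes
                            # the modest gain at the top visible
```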
Not content with the distractions and distortions made possible by the use of 3-D effects, chart makers sometimes feel the need to add all sorts of other ChartJunk to a graph. In the graphics on the left, Kevin Phillips is trying to make the point that income is more inequitably distributed in the United States than in other countries.
Note the extraneous features in this graphic.
While it might be possible to design a better graphical display for these data, a table does the job quite nicely:
Source: Kevin Phillips, The Politics of Rich and Poor (Harper Perennial, 1991), 9.
Time Series Data Distortion
This is a time series chart originally printed in a public policy textbook authored by four professors of political science employed by three public universities.
From these data they conclude:
"There is some evidence that the cost of higher education may not have escalated so much... Figure 9-12 reflect the average cost for tuition, room, and board as a percentage of median family income from 1964 to 1995. While private institutions have increased costs substantially, public university costs have remained constant. This indicates that the increased costs associated with higher education may be quite reasonable when compared to family income levels." (Cochran, 346-7)
Note the ways in which the authors have understated the rising cost of public university education. First, the costs are deflated not by the consumer price index but by median family income -- and in the years after 1982, median family income rose much faster than the consumer price index. Second, graphing the private and public data on the same chart stretches the scale to fit the much larger private values, flattening the public series. It's hard to tell from the graph, but between 1980 and 1995 public university costs appear to have increased from around 11% of family income to nearly 15% -- in effect, the share of family income going to public university costs increased by about a third. The third way of minimizing the cost increases that have occurred since 1980 is to extend the time series back to 1965.
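The "increased by about a third" arithmetic, using the shares estimated from the textbook's own figure:

```python
# Public university costs as a share of median family income,
# values estimated by eye from the textbook's Figure 9-12.
share_1980 = 0.11
share_1995 = 0.15

relative_increase = (share_1995 - share_1980) / share_1980
print(round(relative_increase, 2))  # 0.36 -- roughly a one-third increase
```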
A completely different picture emerges if one compares the rate of increase in public university costs to the rate of increase in other sectors of the economy. On the left, we see that from 1981 to 1999 -- over the lifetime of today's college student -- public university costs rose faster than any other sector of the economy, faster even than medical care costs. In addressing the topic of health care inflation, the same authors note that "Cost escalation in the medical field has been constant," and spend four pages of text addressing the reasons for the increases (pp. 268-72).
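Cross-sector comparisons like this are usually made by rebasing each sector's price series to 100 in the base year, so growth rates can be read off directly. A sketch in Python -- the tuition and medical levels below are hypothetical placeholders; only the CPI-U figures are approximate published annual averages:

```python
# Rebase each sector's price level so the base year (1981) equals 100.
# Tuition and medical levels are invented; CPI-U values are approximate
# BLS annual averages (1982-84 = 100 reference base).
levels = {
    "Public university tuition": {1981: 909.0, 1999: 3247.0},
    "Medical care":              {1981: 450.0, 1999: 1280.0},
    "All items (CPI-U)":         {1981: 90.9,  1999: 166.6},
}

def rebase(series, base_year):
    """Scale a {year: level} series so the base year reads 100."""
    base = series[base_year]
    return {year: 100.0 * value / base for year, value in series.items()}

indexed = {name: rebase(vals, 1981) for name, vals in levels.items()}
ranked = sorted(indexed, key=lambda name: indexed[name][1999], reverse=True)
print(ranked[0])  # the fastest-rising sector tops the list
```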
Clarke Cochran et al., American Public Policy: An Introduction (St. Martin's Press, 1999),
NYTimes Alphabetical Sorting
Charles Blow is responsible for this chart, which demonstrates two egregious charting errors: the countries are sorted alphabetically, and bubble size is used to represent the magnitude of the data. It's nice that the Times gives so much space to Blow's charts on its editorial page, but...
Wall Street Journal Data Distortion
The side-by-side charts used in a Wall Street Journal editorial, "No Politician Left Behind: Lack of money isn't the problem with education," are a classic example of data distortion. Note first that the spending data are not adjusted for inflation or for growth in the number of pupils. In theory, 500 is the maximum score on the NAEP scale-scored math tests, but no student ever reaches this standard. The average score for high school seniors on the same scale is just over 300.
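The adjustment the Journal skipped is straightforward: deflate nominal spending with a price index, then divide by enrollment. A sketch with hypothetical spending and enrollment figures (only the CPI-U values are approximate published annual averages; nothing below is the Journal's data):

```python
# Convert nominal education spending to real, per-pupil terms.
# Spending and enrollment figures are hypothetical placeholders;
# CPI-U values are approximate BLS annual averages (2000 = 172.2).
def real_per_pupil(nominal_spending, cpi, cpi_base, pupils):
    """Deflate to base-year dollars, then divide by enrollment."""
    real = nominal_spending * cpi_base / cpi
    return real / pupils

spend_1970 = real_per_pupil(nominal_spending=40e9,  cpi=38.8,  cpi_base=172.2, pupils=45.9e6)
spend_1999 = real_per_pupil(nominal_spending=360e9, cpi=166.6, cpi_base=172.2, pupils=46.9e6)

# The nominal series grew 9x; in real per-pupil terms the growth is
# closer to 2x -- a very different-looking chart.
print(round(spend_1999 / spend_1970, 2))
```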
Including some more recent data, and adjusting the reading score scale, we get quite a different picture:
Here is the fairest comparison, using 8th-grade data:
It takes some time to explain everything that is wrong with this chart. Fortunately, the esteemed Jorge Camoes has gone to all the trouble. He explains:
Illinois Board of Higher Ed.
Sorting data by the most significant variable greatly aids in the interpretation of graphical representations of data. Sorting data alphabetically, as is done throughout this IBHE report, generally works to hide significant facts about the data.
Problems with this chart:
Correcting these problems results in a chart (showing 2000 data below) that says more about the data.
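The sort-by-value fix described above is a one-liner in most tools. A sketch with invented enrollment figures for illustration:

```python
# Sorting by the variable of interest, not alphabetically, makes the
# ranking visible at a glance. Enrollment figures below are invented.
enrollment = {
    "Chicago State": 7_000,
    "Eastern Illinois": 10_500,
    "Northern Illinois": 23_000,
    "UIUC": 38_000,
}

alphabetical = list(enrollment)  # the IBHE ordering: hides the ranking
by_size = sorted(enrollment, key=enrollment.get, reverse=True)

print(alphabetical)
print(by_size)  # ['UIUC', 'Northern Illinois', 'Eastern Illinois', 'Chicago State']
```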
The Public Agenda
The Public Agenda, a wonderful site for information about American public policy, features "issue guides" on a wide range of policy issues. Unfortunately, over the years the quality of the charts in the "facts" section of the site has seriously deteriorated. Consider this pie chart:
At least they are to be commended for sorting the data from high to low (unlike this table, where the data are sorted alphabetically). One of the more perplexing practices on the site is the tendency to scale time series percentage charts from 0 to 100, often eliminating any variation in the trend. Variation is often lost when chart makers insist that charts always use a zero-based scale, but Public Agenda takes this standard to its logical conclusion and scales percentages all the way to 100. The title on this chart is correct, and, properly scaled, the data would show a dramatic rise in health spending. But here the graphic depiction falls flat:
Notice also that the X-axis contains decennial data for the first four bars, then skips to 1993 and 1997 before going to annual data. Here's an alternative:
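The uneven spacing arises because a categorical axis places every label one unit apart, so the decade from 1960 to 1970 gets the same width as the single year from 1998 to 1999. Using the year itself as the x coordinate restores proportional gaps; a sketch of the positioning logic, with years mirroring the Public Agenda chart:

```python
# A categorical axis spaces all labels evenly; a numeric axis spaces
# them in proportion to the actual elapsed time.
years = [1960, 1970, 1980, 1990, 1993, 1997, 1998, 1999, 2000]

categorical_gaps = [1] * (len(years) - 1)            # what the flawed chart implies
numeric_gaps = [b - a for a, b in zip(years, years[1:])]

print(categorical_gaps)  # every interval looks identical
print(numeric_gaps)      # [10, 10, 10, 3, 4, 1, 1, 1]
```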
ChartJunk examples:
From: The Drug Library