“ten best” rankings: the #1 way to confuse and disempower

When an article promises to reveal the "ten best" transit systems or colleges or corporations, why do we care?   Why do we even think such a list could be meaningful?

Well, how would you react if a major news magazine published definitive research on the "ten best fruits"?

Imagine it.  The timeless standards (apple, orange, grape) are in the #3-5 positions.  This confirms just enough of our long-standing assumptions to give the ranking system credibility.  But the fast-surging pomegranate is now #1, the bratty little blueberry is #2, and, more scandalously, some long-trusted fruits have crashed out of the top 10.  Where did the banana (#21) go wrong?

The list would also contain some entries meant to surprise you.  You're supposed to think: "Wow, the mushy little kiwifruit is now one of the ten best fruits?  When did that happen?  I'd better buy this magazine and get the details!  Maybe the kiwifruit deserves another look!"

No, that would be silly, because almost anyone can see that "best fruit" is a meaningless term.  You could do a list of the most popular fruits, the sweetest fruits, the most sustainably cultivated fruits, the best fruits for various kinds of nutrition, or the most important fruits for the Solomon Islands economy.  But a list of the "ten best fruits" would be nonsense.  We each have many different demands of a fruit, but those demands aren't all important in the same way at the same time.  Most of us couldn't even form our own absolute definition of "best fruit," let alone try to get anyone to agree with it.  In fact, we wouldn't even try.  The whole idea is obviously silly.

So why do we look twice at a list of top ten US transit systems?  Why would a major magazine think we would care?

Well, the fruit analogy suggests that when it comes to transit systems (or colleges) people either (a) assume that everyone's idea of a "good" transit system is the same or (b) just don't want to think about what they want from their transit system.

As in any business, journalists may think they're responding to their readers' desires, but they're also helping to forge those desires.

If ten-best lists are about something that's reasonably factual ("ten most reliable transit systems," "ten safest transit systems," "top ten in ridership per capita") then they can be useful.  They can encourage excellence and help people reward that excellence with ridership and investment.

But when you tell your readers that certain transit systems are the "best," and don't explain your criteria very well, you signal that everyone must have the same sense of what's good.  That encourages people to go into transit debates as bullies, assuming that if a transit agency doesn't deliver on their notion of the good, the agency must be incompetent or failing, so the only valid response is abuse.  It encourages people not to notice that "failing to do what I want" is often a result of "doing what someone else wants."

In short, it encourages people to think like three-year-old children, for whom "my needs" are rich and glorious and self-evident to any reasonable person while "other people's needs" are vague and tedious abstractions.  That, in turn, forces officials to act like parents managing their children's tantrums.  And then we're offended when those officials seem paternalistic?

Photo: Luc Viatour, from Wikipedia

17 Responses to “ten best” rankings: the #1 way to confuse and disempower

  1. Leigh Holcombe May 2, 2011 at 1:20 pm #

    I don’t think mass-market writers actually have nefarious agendas. They quickly throw together silly lists based on anecdotal evidence, expecting very little response. Most people, reading a list of the ten best fruits would say “Wow, I had no idea” and then immediately forget the whole article. If you’re interested in responsible reporting on important issues by professional journalists, you’re reading the wrong magazine.

  2. Sean May 2, 2011 at 1:36 pm #

    In my experience most Internet flame wars stem from the problem that people have different criteria for what’s “best” or desirable and are unwilling to admit that someone else’s criteria could be just as valid FOR THEM.

  3. EngineerScotty May 2, 2011 at 6:03 pm #

    Trolling for eyeballs.

  4. JJJJ May 3, 2011 at 1:12 am #

    “If ten-best lists are about something that’s reasonably factual (“ten most reliable transit systems,” “ten safest transit systems”, “top ten in ridership per capita”) then they can be useful.”
    Well, that entirely depends on the methodology.
    Last year, there was a list of “ten drunkest cities”. The number 1 city (Fresno) “won” because it had the highest rate of DUI arrests. In other words, a city that sends out DUI patrols and checkpoints 2 times a week gets called more drunk than a city that never arrests anyone.
    So in transit terms, a “safest transit” list might penalize the city with 5,000 arrests over the city with 500 arrests…..even though the first city might have a 100% capture rate, and the second a 1% rate.
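[Editor's note: the arithmetic behind that capture-rate point can be made explicit.  A minimal sketch using the hypothetical numbers from the comment above — the function name and figures are illustrative, not from any real report:]

```python
def implied_offenses(arrests: int, capture_rate: float) -> float:
    """Estimate the true number of offenses from the arrest count
    and the (hypothetical) fraction of offenses actually caught."""
    return arrests / capture_rate

# City A: 5,000 arrests, but enforcement catches essentially everything.
city_a = implied_offenses(5000, 1.00)
# City B: only 500 arrests, but enforcement catches just 1% of offenses.
city_b = implied_offenses(500, 0.01)

# By raw arrests, City A looks ten times "worse" than City B,
# yet City B's implied offense count is ten times higher.
print(city_a)  # 5000.0
print(city_b)  # 50000.0
```

In other words, a ranking built on arrest counts alone rewards weak enforcement; without an estimate of the capture rate, the raw numbers can invert the true ordering.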

  5. Jase May 3, 2011 at 4:14 am #

    Pomegranate the best? You are Out Of Your Mind. Apricot FTW and all you liberals can go whine about it to your precious ” government”
    ps this is what the rest of the Internet is actually like.

  6. Alex B. May 3, 2011 at 6:04 am #

    EngineerScotty is right – most outlets, particularly in the online age, do this kind of stuff to troll for eyeballs and boost their pageview count. Whether it’s a top ten list or a slideshow or something similar, the whole idea is get people to click through.

  7. GMichaud May 3, 2011 at 7:57 am #

    Aside from the carnival atmosphere any top ten list in the major media generates, I do think a top list of transit systems, based on reasonable markers could be a useful tool for study. One difficulty is considering so many transit systems, especially on a large scale, such as the US or globally. This might require a change in the title to something like ten well run transit systems worthy of study, rather than trying to rank the best.
    It would be useful to look at well done systems, to understand why some systems work better than others. Make no mistake though, it cannot be done with cursory evidence. To be useful it would take detailed analysis. The type of analysis that is done on this site on a regular basis.

  8. Alan Kandel May 3, 2011 at 10:18 am #

    Pacific RailNews (the magazine) had an article called: “Making the Grade: Does Your Favorite Pass, Pass?” I thought it was meaningless, but interesting nonetheless. I wanted to know what other railroad aficionados thought. PRN provided a list of the top 25 according to reader response. In case you’re confused, famed railroad passes include Tennessee Pass in Colorado (highest U.S. mainline railroad pass at around 10,000 ft. in elevation) and California’s Tehachapi Pass (which incorporates the world-famous Tehachapi Loop) – I think you get the picture.
    I subsequently wrote a letter to the editor explaining that my favorite pass (Altamont Pass in northern California) didn’t even make the top 25. It didn’t matter. Here’s some of what I wrote in that letter reprinted in the August 1996 PRN.
    “Was I up in arms because the pass I prefer wasn’t included among the top 25? No! I figured the ranking was all subjective anyway. It was nice to see how others selected the various grades and gain some insights as to grades I might not have been aware of before.”
    The point here, I believe, is there must be some value to the U.S. News article. Even if it just gets people tuned in to transit, gets them thinking along these lines (pun intended), that could be useful. But not to share criteria for what makes the best the best, seems counterintuitive.
    No matter. Was this another way of asking what your favorite transit system is? No doubt. Phoenix Metro Rail’s is right up there. I’ve ridden it twice. I can’t speak to ones I haven’t ridden.

  9. In Brisbane May 3, 2011 at 4:23 pm #

    OMG!!! BANANA, where did you go wrong!!

  10. GMichaud May 3, 2011 at 8:10 pm #

    There is no doubt that the fanciful, carnival atmosphere that pervades much of mass-market writing has little substance behind it.
    I have mentioned a book before “Vallingby and Farsta-from Idea to Reality” by David Pass, MIT Press as an excellent analysis of the creation of transit. (Suburbs of Stockholm Sweden).
    To me it is a good example of the type of analysis that should accompany any top ten ratings. This detail would be useful to transit professionals, but the interested observer can also take something from the discussion.
    Clearly if you are writing for the general public the level of detail may be different, however that does not excuse the journalist from being informed about all of the nuances of transit.
    A well informed journalist will write a better top ten list than one who is merely interested in the yellow journalism aspects of the subject in order to sell papers. It is important to distinguish the writer, the article and their general knowledge and goals.
    I do not disagree with Alan above that an article can get people interested in transit. However, a poorly done, uninformed journalist can do more harm than good.
    Unfortunately in America, the land that keeps its people in ignorance about city planning, architectural and aesthetic issues, stupidity seems to be the rule.

  11. Alon Levy May 4, 2011 at 12:35 am #

    Yes, there’s certainly a point in ranking systems on individual metrics, as long as one doesn’t have pretenses about overarching top 10 rankings. It’s useful to know many of the metrics used by US News for colleges, such as class size and selectivity, though these are far from exhaustive.
    I for one would be very happy if there were an international database telling me easily comparable numbers for transit systems. Unfortunately, the only metrics for which I’ve found easy comparisons are ridership, system size, the fare structure for one-zone or almost one-zone systems, and for a few networks the maximum rush hour frequency. I’ve been trying to compile decent numbers on construction and operating costs for a while, and never gotten beyond first-order figures.

  12. SpyOne May 13, 2011 at 8:05 am #

    I have both a good example of somebody who explains their methodology a bit better, and maybe also a clue as to where US News got some of their data: The Brookings Institute has just released a report on transit in the 100 largest US cities.
    http://www.brookings.edu/~/media/Files/Programs/Metro/jobs_transit/0512_jobs_transit.pdf
    And I notice some things like: people at US News’s site were expressing incredulity at Honolulu making the list. Well, the Brookings Institute says 97% of working age people in Honolulu live “near” transit, making it #1 in the US in that category.
    Honolulu is also #1 in the percentage of jobs that can be reached by a “typical” resident on transit in 90 minutes or less (59.8). And because of that, it is rated #1 overall.
    So perhaps US News was reading over somebody’s shoulder. Or perhaps they were merely using some of the same criteria.
    I haven’t read the report, just glanced at it, but it does seem like they are willing to explain how they got the numbers.

  13. EngineerScotty May 13, 2011 at 11:55 am #

    Some comments on the Brookings report, lifted from a comment thread at Portland Transport:
    The Brookings report is interesting, and its top 20 is rather surprising: The top 20 metros are:
    1) Honolulu, HI
    2) San Jose, CA
    3) Salt Lake City, UT
    4) Tucson, AZ
    5) Fresno, CA
    6) Denver, CO
    7) Albuquerque, NM
    8) Las Vegas, NV
    9) Provo, UT
    10) Modesto, CA
    11) Ogden, UT
    12) Portland, OR
    13) New York City, NY
    14) Milwaukee, WI
    15) Madison, WI
    16) San Francisco, CA
    17) Washington, DC
    18) Seattle, WA
    19) El Paso, TX
    20) Bakersfield, CA
    A few things to note:
    * One of the ranking criteria, “share of all jobs reachable via transit in 90 minutes”, likely benefits smaller, more compact cities. The only “large” cities in the Top Ten are San Jose (a city which gets half the ridership of TriMet despite having a larger population than the TriMet service area) and Denver. It would be interesting to instead compare transit commute times with similar trips made by driving–but land use is important, folks.
    * The regions of interest are metropolitan statistical areas, so many cities with excellent urban transit but poor transit to the ‘burbs do poorly–in particular, many large East Coast cities with well-regarded transit systems (such as Chicago, Los Angeles, and Boston) fell into this trap. And New York and Washington DC, while making the top 20, were ranked lower than Portland.
    * Quite a few college towns make the list–and I suspect that a few more (such as Eugene/Springfield) would also do well were they to have sufficient population to be considered for the study.
    Portland does well on two of the criteria (% working residents near a transit stop, and wait times), but does poorly on the “90-minute” criteria. I suspect the large number of jobs in places like Washington County has something to do with that.

  14. Alon Levy May 13, 2011 at 1:09 pm #

    San Jose has the USA’s worst light rail system expressed in boardings per route-km; jobs may be vaguely near commuter rail, but the streets are so pedestrian-hostile that employers run shuttles, and even then the transit mode share is low. Any metric placing it as the second best in the US is not worth bothering with.

  15. EngineerScotty May 13, 2011 at 1:22 pm #

    Of course, it is probably a stretch to call LA a “Large East-Coast City”, unless you live in Santa Monica. 🙂

  16. Alon Levy May 13, 2011 at 4:00 pm #

    If you’re a fish, then east coast is absolutely the correct term to describe LA.