Obama And McCain Sites Fail The Usability Test

I just published a report called Presidential Candidate Sites Fail Usability that examined the usability of John McCain's and Barack Obama's Web sites. We applied 5 of the 25 criteria from Forrester's Web Site Review methodology. Our evaluation tested two scenarios:

  • Make a $50 contribution on the candidate’s site
  • Quickly find an overview of the candidate’s energy policy (prompted by news coverage of the candidates’ differing energy policies)

Here’s how the sites fared:

These results are not very good. Interestingly, both sites failed our privacy and security criteria. This was a problem that I highlighted last November when I evaluated the six major candidates at that time (McCain was not included in that analysis).

The bottom line: Obama and McCain should make it easier for their online constituents to donate and to find policy information.

Written by 

I am a customer experience transformist, helping large organizations improve business results by changing how they deal with customers. As part of this focus, I examine strategy, culture, interaction design, customer service, branding and leadership practices. I am also a fanatical student of business, so this blog provides an outlet for sharing insights from my ongoing educational journey.

Simply put, I am passionate about spotting emerging best practices and helping companies master them. And, as many people know, I love to speak about these topics in almost any forum.

My “title” is Managing Partner of the Temkin Group, a customer experience research and consulting firm that helps organizations become more customer-centric. Our goal is simple: accelerate the path to delighting customers.

I am also the co-founder and Emeritus Chair of the Customer Experience Professionals Association (CXPA.org), a non-profit organization dedicated to the success of CX professionals.

7 thoughts on “Obama And McCain Sites Fail The Usability Test”

  1. I just glanced at Obama’s site, and I’m not sure how he ‘failed’ on either scenario for most of the criteria you displayed.

    – there’s a big, bright red Donate Now button in the top right, above the fold. And that’s even if you don’t get the Donate Now splash page first, which I did.
    – And for energy policy, there’s Energy and Environment under the Issues drop down, which is the first place I’d look if I was looking for policy statements on the key issues.
    – There’s a link to their Privacy Policy in the footer, which is a very common web convention. I didn’t see a link to a Security Policy, but I’ve rarely seen those explicitly laid out in navigation. Not saying that’s right, but it’s far more common to not have a security policy easily ‘findable’ on a site. (I just checked Amazon, for example, and didn’t see one in the usual spots.)
    – And I’m not really sure how text can be listed as not legible yet still supporting easy scanning. I could see text being legible but not easily scannable, but the other way around seems pretty difficult to accomplish.

    Anyway, I know a lot of usability testing is subjective and these are just my opinions… so take them for what they’re worth!

  2. Jason: Thanks for sharing your thoughts. When we find a design flaw on a site, it doesn’t mean that everything on the site is flawed. By applying a standard methodology to the evaluation, we remove a lot of the biases that individuals (all of us) have when we look at a site. There are more details about our findings in the report, but (unfortunately) that information is only available to Forrester clients. We’ve completed over 1,200 of these reviews (for our research as well as for consulting we do with individual companies), and our clients almost always find them extremely valuable for identifying areas where they can improve.

  3. This seems to me to highlight the problem with usability that we are trying to fix here at Foviance. Usability assessments vary enormously, and with much of the discipline vested in qualitative research, it is little wonder. We have been pioneering the use of quantitative evaluation alongside qualitative research, employing technologies such as web analytics and survey software to find out what people really are doing and thinking.

    If we actually knew how many people clicked Donate versus how many visited the site, or, even better, compared how many donated with how many recorded an intent to donate in a simple survey, we would know whether the presentation of the button really is an issue.
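    The comparison described above amounts to simple conversion arithmetic. A minimal sketch, using entirely hypothetical figures (none of these numbers come from the report or from any real analytics data):

    ```python
    # Comparing observed donation conversion against stated intent,
    # using made-up illustrative numbers.

    def conversion_rate(converted: int, visitors: int) -> float:
        """Fraction of visitors who completed the action."""
        return converted / visitors

    # Hypothetical figures for illustration only:
    visitors = 10_000        # unique visits (from web analytics)
    donate_clicks = 400      # clicks on the Donate button
    donations = 120          # completed donations
    intent_to_donate = 0.05  # 5% said they came to donate (from a survey)

    click_rate = conversion_rate(donate_clicks, visitors)
    donation_rate = conversion_rate(donations, visitors)
    # Gap between stated intent and completed donations: if this is
    # large, the problem may lie in presentation or flow, not demand.
    intent_gap = intent_to_donate - donation_rate

    print(f"Click rate:    {click_rate:.1%}")     # 4.0%
    print(f"Donation rate: {donation_rate:.1%}")  # 1.2%
    print(f"Intent gap:    {intent_gap:.1%}")     # 3.8%
    ```

    The point of pairing the two data sources is that analytics alone shows what happened, while the survey estimates what visitors meant to do; the gap between them is the quantitative signal that something on the page is getting in the way.
    
    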

  4. Paul: I’ve always recommended using a combination of tools to assess usability. All tools have their own strengths and weaknesses. Web analytics and surveys are extremely valuable, but they don’t replace expert reviews (which is what we’ve done for this research). The key to success is understanding where and how to use each tool.
