Yes, IA is rocket science

I was in a meeting recently to talk through a new website project and discuss its stages. I was suggesting my normal approach – understand the project goals, do some user research, analyse the content, draft the IA, and so on.

As we were talking through the process, I noticed one of the senior managers was clearly unsettled. After we talked a bit about the early steps, he finally said “Why do we need to do this? Why can’t we just come up with the IA? After all, it’s not rocket science.”

He, as a senior manager, had a fairly good idea of the domain. So he had a fairly clear idea of how it would best be represented on the website. His ideas weren’t bad at all, but I didn’t know if they were ‘right’. After a bit of discussion we agreed to make some quick changes based on his ideas, but reserved the right to change them once we had collected some information.

But it did make me think. Why do I think there is some complexity to creating a good IA for a website, when to others it appears simple? (I’ve noticed that people generally think their own field of expertise is complex, and assume that other fields are straightforward – I think that is just human.)

I don’t really think IA is as hard as rocket science. But I do think there are some hard parts:

  • We usually deal with messy problems
  • Our projects are all about language and concepts, which vary from person to person
  • A lot of what we do is pulling together different (often competing) inputs to try our best to create a balance
  • We have to work with opinionated people. And everyone has an opinion on how things should be grouped, labelled and what is most important!
  • There is no one right answer
  • Our individual experiences contribute to solutions – so the ‘answer’ depends on who creates it

But it is achievable. I think part of the trick to helping people understand the complexity is to better explain the pathway and rationale for decisions – show how inputs contributed to outputs, and how we’ve balanced priorities. Not just show the end result…

3 Responses to “Yes, IA is rocket science”

  1. Jessica Enders Says:

    Great post Donna.

As you can imagine, this happens a lot with forms too, and it’s something I ponder a lot. I agree with your summary of ‘hard parts’ and would particularly highlight the important role illustrated in your third dot point: “A lot of what we do is pulling together different (often competing) inputs to try our best to create a balance”. Your client may have had a really good knowledge of his domain, but this doesn’t mean he’s across the needs of *all* parts of the organisation, nor that he understands some of the nuances that arise at deeper levels.

I also think the big benefit that an ‘expert’ brings is familiarity with the many different ways that a particular design approach can fail, even if at face value it looks workable. This familiarity comes, of course, from experience working with and observing users.

    For example, a past client of mine wanted a part of their customer-facing IA to be “Sustainability”. While this made sense from a categorisation point of view, my work with users suggested that the label itself was going to be problematic for a portion of the target population. In the world where the client ‘lives’, “sustainability” is the hot-button word-of-the-moment and everybody knows what it means; this did extend to some of their customers, but definitely not all.


  2. Donna Spencer Says:

    Thanks Jessica, that certainly was the case with this client. To her/him it seemed that an audience-based approach would be completely logical, but I know how hard these are to make work…

  3. Glen Wallis Says:

    Hi Donna, interesting post. I think you nailed it when you said ‘show how inputs contributed to outputs’. I find real world examples really help.

    I conducted user testing of a draft site recently and found that only one user even noticed an element that was considered highly important. We re-worked it and the result was an obvious improvement. I now use that as a story to illustrate the importance of testing.

    Another story I tell is about a large intranet project from a few years ago. We started by conducting card sorting sessions with the existing labels to identify weaknesses.

    Users were given a bunch of cards and they sorted them according to the existing navigation. Cards they were unsure of went into a separate pile. One guy labelled his pile ‘WTF?’.

The WTF pile, as we called it, was the largest pile across numerous such tests. As we refined the draft IA we found fewer and fewer WTF resources, and when we tested the final IA we got none. Isn’t that a great advertisement for user-centred design?

    Keep up the good fight Donna. It may not be rocket surgery but it certainly isn’t easy. If it was easy everybody would do it.
