Published January 11th, 2017
by Steven Clark

Early last year, I knocked together a post about UX that threw out a pretty rough definition while also talking about how weird your dick looks in a Morphsuit:

“User Experience is the methodology of analysing and improving the overall experience of using a website or app for your average user by gathering information (e.g. surveys, click heat maps, analytics), trialling iterative improvements based on that information (e.g. beta testing, A/B tests) and improving your app/website based on the outcome of those trials.”
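The "trialling iterative improvements" part of that definition is the bit that actually involves numbers. As a hedged illustration of what checking an A/B test result can look like in practice, here's a minimal two-proportion z-test comparing conversion rates between two page variants. All figures and names are made up for illustration; a real test would also need a sample-size calculation up front.

```python
# A sketch of the "A/B test" step from the definition above:
# did variant B convert better than variant A, or is it just noise?
# Uses only the standard library; all numbers are hypothetical.
from math import sqrt, erf

def two_proportion_z_test(conv_a, visits_a, conv_b, visits_b):
    """Return (z, two-sided p-value) for conversions on variants A and B."""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical result: 120/2400 conversions on A vs 156/2400 on B.
z, p = two_proportion_z_test(conv_a=120, visits_a=2400, conv_b=156, visits_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests the lift isn't chance
```

The point isn't the statistics, it's that this step requires real traffic and real users, which is exactly the part that gets skipped.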

Weird dicks
Most of my blogs are about weird dicks if you think about it

While that blog wasn’t my finest or most intelligent writing, it at least provided a succinct summary that demonstrated a base understanding of something I’m only somewhat familiar with. Research isn’t something I find myself doing all that often, as SMEs rarely have budgets set aside for UX when they commission a website build. If I’m lucky, I’ll have a few months of analytics reports from an old website, a customer survey or two, or some vague half-remembered user feedback from the client. Any data is better than none at all, but I’d be lying if I said any of this drastically changes my design approach, rather than just helping to eliminate options or shape overall site structure.

Without legitimate research, there’s rarely justification to produce a series of wireframes prior to branded visuals, even when you can produce them quickly. While wireframing can be used to test functionality or layouts with real users, wireframes are more the outcome of user research than a core part of it (as a handy graph in this beginner’s guide to UX illustrates). Having never worked anywhere that sold wireframing to clients, I find it difficult to rationalise why they’re necessary, particularly for web projects. That’s not to say we didn’t use them for in-house product development, but that was more the result of merging all of our industry-specific UI components together into one framework than of any research-related outcome.

Early user testing
Early user testing experiments

Even if rudimentary data influences your design decisions or you produce wireframes before visuals, it’s still not, strictly speaking, UX – it’s just design.

This is the fundamental misunderstanding some seem to have with UX. Before user experience was the industry’s favourite wank rag, design still required basic information gathering followed by application. It was always about solving problems – even if the problem was as simple as not enough conversions – but some agencies and digital professionals seem to think that doing as little as what I’ve described above somehow applies a UX methodology to projects. This conveniently leaves out a pretty fundamental (and also presumably the most time-consuming and expensive) part of UX – the part where you test your product with actual users.

This confusion has seeped its way into many public orgs and businesses looking for in-house designers. Job roles you see thrown about by LinkedIn recruiters now have arbitrary “UX” or “UX/UI” prefixes stapled to their titles and include no new requirements besides proficiency in an enterprise-level wireframing program like Axure or Balsamiq.

To some in the world of agency UX, data-driven design is as simple as “focusing on an end goal or conversion”; UX design posts are mostly about making your site responsive, and what is supposedly a scientific and analytical process with definitive outcomes can somehow be compared to Disney animation and Stranger Things. This literal interpretation of user experience becomes about how users feel instead of how they act; less the deliberate analysis of user behaviour and more a user’s positive or negative emotional response to an interface.

Stranger Things
This is what real UX looks like, yes I am available for public speaking opportunities

This is why UX can be presented as a set of assumptions that were accepted as truth years ago. “Users get frustrated when your site doesn’t work properly on mobile”, “users get angry if a signup process takes more steps than is necessary” and “users get confused if your contact information isn’t obvious enough” are all common examples. These internalised beliefs – which aren’t based on actual research, remember – are then poorly communicated to clients, who begin to believe bullshit like “nobody scrolls on web pages”. This is why we need reports and entire websites to debunk popular UX myths.

You, your colleagues or your Twitter followers are not your users. Taking the time to consider how users may feel as they experience the site you’re designing isn’t really the same as testing how they’ll actually behave when browsing your site, nor are they equally valuable.

Note: This post originally appeared on the Habanero Digital blog, meaning certain parts are probably weirdly self-referential or talk about ongoing feuds that probably exist almost exclusively in the mind of the author.

Tags: Clients, Content, Digital Agencies, User Experience

by Steven Clark

Steven is a designer/developer and wannabe intellectual with an obsessive personality and too much spare time. Don’t follow him on Twitter.
