Context in UX – What It Is, What It Isn’t, and Why It’s Important

Big Brown Bat (Eptesicus fuscus) in Flight

Copyright © 2012 Bill Kraus. All rights reserved.

Our location is relentlessly tracked by our mobile devices. Our online transactions – both business and social – are recorded and stored in the cloud. And reams of biometric data will soon be collected by wearables. Mining this contextual data offers a significant opportunity to enhance the state of human-computer interaction. But this raises the question: what exactly is ‘context’?

Consider the following sentence:

“As Michael was walking, he observed a bat lying on the ground.”

Now take a moment and imagine this scene in your mind.

Got it? Good.

Now a few questions. First, does the nearby image influence your interpretation of this sentence? Suppose I told you that Michael was a biologist hiking through the Amazonian rain forest. Does this additional information confirm your assumptions?

Now, suppose I told you that the image has nothing to do with the sentence, but instead it’s just a photograph I took in my own backyard and inserted into this post because I have a thing for flying mammals. Furthermore, what if I told you that Michael actually works as a ball boy at Yankee Stadium? Do these additional facts alter your interpretation of the sentence? Finally, what if I confessed that I have been lying to you all along, that Michael is actually in Australia, his last name is Clarke, and that he was carrying a ball gauge? Has your idea of what I meant by ‘bat’ changed yet again? (Hint – Michael Clarke is a star cricket player.)

The point here is that contextual information – the who, what, where, and when of a situation – provides critical insights into how we interpret data. In pondering the sentence above, providing you with context – either as additional background statements or through presumed associations with nearby content – significantly altered how you interpreted that simple sentence.

At its essence, context allows us to resolve ambiguities. What do I mean by this? Think of the first name of someone you work with. Chances are good that there are many other people in the world (or at your company, if your company is as big as Oracle) with that same first name. But if I know who you are (and ideally where you are) and what you are working on, and I have similar information about your colleagues, then I can make a reasonably accurate guess as to the identity of the person you are thinking of without you having to explicitly tell me anything other than their first name. Furthermore, if I am wrong, my error is understandable to you, precisely because my selection was the logical choice. Were you thinking of the Madhuri in Mumbai with whom you worked remotely on a project six months ago, while I guessed the Madhuri who has an office down the hall from you in Redwood City and with whom you are currently collaborating? OK, I was wrong, but my error makes sense, doesn’t it? (In intelligent human-computer interactions, the machine doesn’t always need to be right as long as its errors are understandable. In fact, Chris Welty of IBM’s Watson team has argued that intelligent machines will do very well to be right 80% of the time – which of course was more than enough to beat human Jeopardy champions.)
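The disambiguation described above can be sketched in a few lines. This is a minimal illustration, not a real directory service: the `Colleague` records, field names, and ranking criteria (shared location first, then recency of collaboration) are all hypothetical assumptions chosen to mirror the Madhuri example.

```python
from dataclasses import dataclass

@dataclass
class Colleague:
    first_name: str
    location: str
    months_since_last_project: int  # 0 = currently collaborating

def disambiguate(first_name, colleagues, user_location):
    """Rank colleagues sharing a first name by contextual relevance."""
    candidates = [c for c in colleagues if c.first_name == first_name]
    # Sort key: prefer a shared location (False sorts before True),
    # then the most recent collaboration. Lower tuple = better match.
    return sorted(
        candidates,
        key=lambda c: (c.location != user_location, c.months_since_last_project),
    )

colleagues = [
    Colleague("Madhuri", "Mumbai", 6),
    Colleague("Madhuri", "Redwood City", 0),
]
best = disambiguate("Madhuri", colleagues, user_location="Redwood City")[0]
```

Note that even when the ranking picks the "wrong" Madhuri, the user can see why: the top candidate was the locally co-located, currently active collaborator – exactly the understandable-error property discussed above.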

So why is the ability to use context to resolve ambiguities important? Because – using our example – I can now take the information derived from context and provide you with a streamlined, personalized user experience that does not require you to explicitly specify the full name of your colleague – in fact, you might not need to enter any name at all if I have enough contextual background about you and what you are trying to do.

When it comes to UX, context is actually a two-way street. Traditionally, context has flowed from the machine to the user, where layout and workflow – the consequence of both visual and interaction design – have been used to inform the user as to what something means and what to do next. But as the availability of data and the complexity of systems have grown to the point of overwhelming the user, visualizations and interactions alone are not sufficient to stem the tide. Rather, context – this time emanating from the user to the machine – is the key to achieving a more simplified, personalized user experience.

Context allows us to ask the right questions and infer the correct intentions. But the retrieval of the actual answers – or the execution of the desired task – is not part of context per se. For example, using context based on user identity and past history (demographic category, movies watched in the past) can help a recommendation engine provide a more targeted search result. But context is simply used to identify the appropriate user persona – the retrieval of recommendations is done separately. Another way to express this is that context is used to decide which view to put on the data, but it is not the data itself.

Finally, how contextual information is mapped to appropriate system responses can be divided into two (not mutually exclusive) approaches, one empirical, the other deductive. First, access to Big Data allows the use of machine learning and predictive analytics to discern patterns of behavior across many people, mapping those patterns back to individual personas and transaction histories. For example, if you are browsing Amazon.com for a banana slicer and Amazon’s analytics show that people who spend a lot of time on the banana slicer page also tend to buy bread slicers, then you can be sure you will see images of bread slicers.
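The empirical approach above boils down to mining co-occurrence patterns across many users. Here is a toy sketch of that idea, assuming hypothetical browsing sessions; real recommendation systems use far richer models (matrix factorization, deep learning), but pairwise co-occurrence counting captures the "people who viewed X also bought Y" mechanic.

```python
from collections import Counter
from itertools import combinations

# Hypothetical sessions: each set holds the items one customer engaged with.
sessions = [
    {"banana slicer", "bread slicer"},
    {"banana slicer", "bread slicer", "egg slicer"},
    {"banana slicer", "garlic press"},
]

# Count how often each ordered pair of items appears in the same session.
co_counts = Counter()
for session in sessions:
    for a, b in combinations(sorted(session), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(item, k=2):
    """Return the k items most frequently seen alongside `item`."""
    scores = Counter({b: n for (a, b), n in co_counts.items() if a == item})
    return [b for b, _ in scores.most_common(k)]

recs = recommend("banana slicer")
```

With these sessions, the bread slicer co-occurs with the banana slicer most often, so it tops the recommendations – which is exactly why the shopper in the example keeps seeing bread slicers.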

But while Big Data can certainly be useful, it is not required for context to be effective. This is particularly true in enterprise, where reasonable assumptions can be made from a semantic understanding of the underlying business model, and where information-rich employee data can be mined directly by the company. Are you a salesperson in territory A with customers X, Y, and Z? Well then it is safe to assume that you are interested in the economic climate in A as well as news about X, Y, and Z without you ever having to explicitly say so.
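The deductive approach needs no statistics at all – just rules derived from the business model. A minimal sketch, assuming a hypothetical salesperson profile (in practice this would come from HR or CRM data; the field names and rule are illustrative):

```python
# Hypothetical employee record mirroring the territory-A example.
profile = {
    "role": "salesperson",
    "territory": "A",
    "accounts": ["X", "Y", "Z"],
}

def infer_interests(profile):
    """Derive likely topics of interest from the business model alone."""
    interests = []
    if profile["role"] == "salesperson":
        # A rep presumably follows their territory's economic climate...
        interests.append(f"economic news: territory {profile['territory']}")
        # ...and news about each of their accounts.
        interests += [f"company news: {a}" for a in profile["accounts"]]
    return interests

topics = infer_interests(profile)
```

No user ever has to say "show me news about X, Y, and Z" – the semantic understanding of the sales role makes the inference for them.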

So in closing, the use of context is essential for creating simple yet powerful user experiences – and like the term ‘user experience’ itself, there is no one single implementation of context. Rather, it is a concept that should pervade all aspects of human-computer interaction in its myriad forms.

7 comments

  1. How does context influence what I call “the creepiness factor”? Sometimes even the user doesn’t know what will creep them out until they experience it. For instance, I think about dating situations. Two different guys can use the exact same pickup line, with wildly different results. If the girl is attracted to one guy, she’ll think his line is clever or endearing, etc. If she’s not attracted to the other guy, she’ll think he’s creepy or weird or lame.

    It’s not logical, or maybe it is but the context is so subtle and complex that it seems very difficult to predict. In the same way, one user may love the fact that the ATM chides them for overspending when they withdraw cash sooner than their norm; another person could find that incredibly unnerving and creepy.

    I’m curious what Bill (and others) think about the creepiness factor. When does responsive technology cross the line?

  2. Another way to think of context is that it provides additional state information. To use your example regarding pickup lines and creepiness, contextual information includes things like attractiveness of the soliciting party (appearance, age, socioeconomic status, knowledge of past history), location and atmosphere (at a party or on a dark street corner), time of day, the emotional state of the solicited party, etc.

    If you could measure these parameters (context), you would be able to predict the likely success – and conversely, the ‘creepiness’ – of an encounter (predictive analytics). But note that context is not prediction – it can simply be used to make better predictions.

    As far as your ATM example, presumably with enough context one could predict the reaction of the user and respond appropriately based on user identity. As an aside, another very effective way of doing this is to use an adaptive system that learns from real time user feedback.

    Now people have argued that there is a ‘creepiness’ issue associated with predictive analytics – a well-known example being the case where the retailer Target was able to predict the pregnancy of a teen before her father even knew by analyzing her buying habits…

    http://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/

  3. One person’s creepiness is another’s wow factor. Like a good joke, UX without nuance ain’t funny. And if you have to explain it, it ain’t any good either.

  4. @Ultan: I agree w your assertion about creepy vs. wow factor. It’s tough to predict (pun!) how people will react. IMO in enterprise the creepy factor is less a concern bc work is work. And all that entails.

    There are policies to protect you and clear recourses if you feel violated.

    Not so much outside the office.

    If context helps you do your job more effectively, it’s good.

  5. @Ultan again: High-end hotels have been researching their guests on the ‘tubes for a while now. It’s just good business. Creepy doesn’t seem to matter as much when there’s a price tag attached.

  6. So this is where adaptation and learning on the machine side comes in. Think about when you meet someone for the first time – there is an evaluation ‘dance’ going on – a negotiation of protocols if you will – where you (and the other party) typically adjust the tone (and possibly content) of your conversation based on how you perceive the other person’s level of knowledge, comprehension of language, sense of humor, etc.

    Now, why can’t machines do the same, attempting to make you more comfortable as you develop a history together – adjusting how they deliver services to you so that they wow you instead of creeping you out?

    Oh, and to Ultan’s point, machines should have a modicum of humor – why not aspire to making HCI a fun experience as well as informative… even dare I say, in enterprise applications ;-).

    @Ultan – thanks for the pointer!
