Plans and Situated Actions, by Lucy Suchman

Posted: September 6, 2005 at 11:53 pm in conversation, nonfiction, socialsoftware

Amazon link

Subtitled “The problem of human-machine communication”, this book debunked the philosophy prevailing in artificial intelligence when it was written in 1987: the belief that people act by first making a plan and then executing it. Suchman examines this seemingly common-sensical idea and points out several of the flawed assumptions behind it. In particular, she notes that we never fully specify a plan, because to do so would involve an excruciating level of detail. She quotes from Boden’s article “The structure of intentions”:

If one intends to buy bread, for instance, the knowledge of which bakers are open and which are shut on that day of the week will enter into the generation of one’s plan of action in a definite way; one’s knowledge of local topography (and perhaps of map-reading) will guide one’s locomotion to the selected shop; one’s knowledge of linguistic grammar and of the reciprocal roles of shopkeeper and customer will be needed to generate that part of the action-plan concerned with speaking to the baker, and one’s financial competence will guide and monitor the exchange of coins over the shop counter. (Boden, p. 28, cited in Suchman, p. 44)

All of this detail is included in some sense when we make a plan to buy bread, but we rarely spell it out, because we don’t need to. Our familiarity with our surroundings lets us take for granted that we know how to navigate our neighborhood and how to negotiate with the shopkeeper. In fact, the only time we even think about this level of interaction is when we are trying to explain it to somebody unfamiliar with the situation, whether a stranger or a child. These details are evoked for us by the environment.

Suchman contrasts this sense of embedded detail with how people were trying to program robots at the time. She uses the example of a robot designed to “navigate autonomously through a series of rooms”, where the robot would first observe the rooms, plot a course through them, and then follow that course. Of course, if obstacles were moved after it had plotted its course, it didn’t take that into account. As humans, we take for granted our ability to continually evolve our plans in response to our situation, but computers illustrate how difficult such situation awareness is to describe. She points out that plans, rather than being a blueprint of action, make more sense as a resource for action. The idea is that we make plans before entering a situation, and we draw upon those plans while in the situation, but if circumstances change, we obviously do not continue blindly following the plan. They are a resource, not a complete description. The similarities to Klein’s Naturalistic Decision-Making are apparent.
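The contrast can be caricatured in code. Here is a minimal sketch of my own (not from the book, and assuming a toy grid world rather than any real robot): a “blueprint” executor that follows a precomputed route blindly, versus a “resource” executor that falls back on replanning when the situation changes underneath it.

```python
# Toy grid: 0 = open floor, 1 = obstacle. The robot plans with BFS.
from collections import deque

def plan(grid, start, goal):
    """Breadth-first search for a path of (row, col) cells, or None."""
    queue, seen = deque([(start, [start])]), {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None

def execute_blindly(grid, path):
    """Plan as blueprint: follow the precomputed route, no matter what."""
    for r, c in path:
        if grid[r][c] != 0:
            return False  # an obstacle moved after planning; the robot is stuck
    return True

def execute_situated(grid, pos, goal):
    """Plan as resource: keep a plan, but replan when the world has changed."""
    path = plan(grid, pos, goal)
    while path and pos != goal:
        nxt = path[path.index(pos) + 1]
        if grid[nxt[0]][nxt[1]] != 0:   # next cell is now blocked: replan here
            path = plan(grid, pos, goal)
            continue
        pos = nxt                        # otherwise, take one step of the plan
    return pos == goal

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
start, goal = (0, 0), (2, 0)
route = plan(grid, start, goal)  # the robot observes the rooms and plots a course
grid[1][2] = 1                   # then an obstacle is moved onto that course...
grid[1][0] = 0                   # ...and a different doorway opens up
blind = execute_blindly(grid, route)            # the blueprint fails
situated = execute_situated(grid, start, goal)  # replanning gets through
```

The blind executor marches into the moved obstacle; the situated one treats its route as a resource, notices the blockage at the point of action, and draws up a fresh plan from where it stands.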

Similarly, the specification of plans makes more sense after the action has occurred, because only afterwards can we figure out which actions were relevant to the goal we were trying to achieve:

Instructions serve as a resource for describing what was done not only because they guide the course of action, but also because they filter out of the retrospective account of the action, or treat as “noise,” everything that was actually done that the instructions fail to mention. (p. 102)

Because instructions (or plans) only mention the parts that are considered relevant at any given point in time, they provide an extremely filtered world view that depends on the reader sharing a similar situation. For example, a set of assembly instructions can refer to a specific screw and assume that the assembler will find the right screw, because it came with the kit. As a total aside, it seems like the field of studying instructions can be philosophically fruitful; I seem to remember that Pirsig rants about instructions for several pages in Zen and the Art of Motorcycle Maintenance, which spurs him on to making the classic/romantic split. Anyway.

Suchman also draws on the study of ethnomethodology, a branch of sociology. Mainstream sociology made the assumption that there was an objective social order, which could then be observed and described by trained sociologists. Ethnomethodology, as founded by Harold Garfinkel, insisted that social order was constructed, and that studying how that construction occurred should be the goal of sociologists. As Suchman notes, “The interest of ethnomethodologists … is in how it is that the mutual intelligibility and objectivity of the social world is achieved. … the objective reality of social facts is not the fundamental principle of social studies, but social studies’ fundamental phenomenon.” (p. 58)

She uses these concepts to study the art of conversation and communication. The thing I found interesting about her take on conversation is that it is not a simple transfer of information, where one person says something, and the other hears it. Because what is said is only the smallest part of the conversation, the listener must actively try to construct meaning from what the speaker is saying. The listener constructs a model in their head, using cues from the conversation to build that model. This can lead to unfortunate misunderstandings if the mental models of the speaker and listener diverge, despite both of them participating in the same conversation. I use the example of people talking past each other in meetings in this post. However, when it is working well, the meaning of the conversation is continually being constructed by its participants.

She points out that the advantage of having such a conversation with humans is that eventually inconsistencies in shared meaning become apparent and the listener stops and says “Wait a second, what do you mean by that?” Then the conversationalists can review their assumptions and confirm that their mental models are consistent. Computers do not have the same capability for mental model repair; they get stuck in a state and have no way of exiting, so the human has to do all the heavy lifting of mental model adjustment (a.k.a. trying to figure out what the heck the computer is doing).

Yikes. This post is completely out of control. Obviously, there’s lots of good stuff in this book. I actually read the first half of the book while up in Portland, and then when I dug it out for my vacation a couple weeks ago, I tried picking up where I left off and couldn’t, so I ended up re-reading the first half and taking a whole new set of notes. It’s tough sledding in spots, but ultimately rewarding. Although I wouldn’t blame anyone for skipping straight to Paul Dourish’s book, which builds on Suchman’s work and summarizes it in a couple of concise pages. Mad props to Jofish for recommending both books.

