The Intuition for Abstraction

I talked about a particular test technique and how intuition can fail us. I’ve also talked a lot about testing and abstraction. But is there an intuition for abstraction? I’m thinking there is. And I’m thinking the better testers I’ve known have shown it. But I struggle with how to make this a useful concept in terms of building up this skill in testers.

I do believe that good testers embrace the soul of testing. But, in and of itself, I haven’t made that concept operational enough yet, even to myself. Yet is that intuition somewhere in that idea? I mentioned in that post that key abilities of testers are that of spotting ambiguity, discovering inconsistency, and ferreting out discrepancies. Key to all of that, however, is a willingness to embrace uncertainty.

Requirements are often implicit and bugs are often hidden. It takes a combination of logic, technique, and intuition — and sometimes pure luck — to handle implicit and/or hidden things. It’s no coincidence that many testers are inordinately fond of puzzles and games. Testers like to hunt for and find stuff. The hunt is exciting, and finding new information, or an answer, or even a new question to ask is often the ultimate motivation.

Instincts and Evolution

Along these lines, I read the book The Black Hole War and in it, Leonard Susskind says something interesting:

“All complex life-forms have built-in instinctive physics concepts that have been hardwired into their nervous systems by evolution. No one really knows how much is hardwired and how much is learned in early life.”

And, quite frankly, the distinction may not matter all that much. Susskind continues:

“The point is that by the time our nervous systems are mature, experience, of either the personal or the evolutionary kind, has given us a lot of instinctual knowledge of how the physical world behaves. Whether hardwired or learned at a very young age, the knowledge is very difficult to unlearn.”

What Susskind is saying is that this preprogrammed physics software, for lack of a better term, is largely what made survival possible. Susskind continues:

“Mutation and natural selection have made us all physicists, even animals. In humans the large size of the brain has allowed these instincts to evolve into concepts that we carry at the conscious level.”

Yet, as we know, it’s quite possible to encounter things (and people!) that go counter to our instincts. In those cases, our preprogrammed software is now being asked to handle conditions that it was not programmed for. This is where we get a breakdown of our intuition. This breakdown is ultimately what Susskind is talking about in the book, regarding ideas like quantum mechanics and physics of objects like black holes.

For me the point is that a lot of times our projects at work mirror the situations Susskind is talking about: we have to work with equal parts equivocal speculation, unequivocal fact, and pure myth. This leads us to do both number-crunching and scenario-building. This leads us to create open-architected mental models that are supposed to act as elaborate heuristics, ways of making sense of things. These models become an idiosyncratic tool with which we try to hammer our work, as well as that of others, into some kind of coherence. The problem is when our heuristic models — the things we use as a way of making sense — stop making sense.

This is where we encounter the epistemological angst I was talking about a few years ago.

Instincts and (Survival?) Advantages

With concepts like big bangs and black holes, there is no way that evolutionary pressure could have created an instinctive comprehension of such radically odd concepts. If you think about it, the same thing applies to what we deal with daily: shoving electrons around by creating business abstractions that model a domain so that users can enact certain behaviors, behaviors that lead to certain observables, some of which are visible and some of which are not. Yet — in the case of both physics and software development — we often find that something in our neural networks is primed for a fantastic rewiring process, one that allows us not only to ask about these obscure concepts and phenomena but also to create mental abstractions — often deeply unintuitive new concepts — to describe, manipulate, and explain them.

It’s in that very spirit — in the uncertainty of the empirical, if you will — where testers thrive. It’s in this context that testers embrace a process of discovery. We become completely engrossed until a pattern starts to emerge and we are able to tease out details, often from apparently irrelevant facts, seemingly extraneous material, and unrelated commentary or casual asides. We are often forced to follow the principle of peripheral wisdom. We end up becoming quite good at recognizing stochastic resonance, which is where faint signals in “background noise” are made more distinct, more detectable. We live in a world where assertion tends to overwhelm evidence and claim easily trumps fact. This is a world where personality and ego collide with business knowledge and technology.

In a very fundamental way, all of this is exactly what we are dealing with as software and test engineers. I would argue that the best of us display a codified and practiced behavior but also a very instinctual one. That instinct for discovery and the communication of discovery is what primes our intuition to make further discoveries and learn even better ways to communicate them. This is one of the key value-adds of a tester mentality and it’s absolutely something I bring up when someone asks why developers can’t be the testers or why conflating the roles of testing and development is often counter-productive.

Subtle Differences

Somewhat along these same lines, I also read the book Eureka: Discovering Your Inner Scientist. There Chad Orzel says that scientists of his sort are sometimes thought to be “smarter” than so-called “normal” people. He says he feels that’s not true. He says:

“Scientists are not that smart — we don’t think in a wholly different manner than ordinary people do. What makes a professional scientist is not a supercharged brain with more processing power, but a collection of subtle differences in skills and inclinations.”

But those subtle differences and inclinations can make all the difference. That still leaves the question: what are those subtle differences and inclinations?

If I put it “technically”, I would say it’s the ability to design cognitive artifacts as well as the ability to design and use heuristics. But if I go beyond that, and distill it to the basics, it’s the art of how we communicate. More specifically, it’s our ability to impose discipline around our communication; it’s our ability to separate inference from observation; it’s our ability to pare down to just the relevant material that most concisely explains what we need to say; it’s our ability to communicate with tests in a way that is reflective of the domain.

In short, it’s about the ability to tell stories and build narratives.

Social Brains and Narratives

Sven Birkerts, in The Gutenberg Elegies: The Fate of Reading in an Electronic Age, said that “humanistic knowledge” (what something means or why it matters) is unlike “instrumental knowledge” (how it’s done), since the former aims to fashion a comprehensible narrative. Along those same lines, Steven Pinker, in The Language Instinct, says that human language grew out of the need to share knowledge with survival value. We know that spoken language, and with it the human mind, evolved precisely because planning — constructing and evaluating scenarios — is a survival skill.

That survival skill, however, has run into interesting barriers when it comes to being applied in a technology context. This is where that fascinating neural re-wiring I mentioned before seems to kick in. But it needs to be recognized and nurtured. We all have the instinct, but we don’t all have the intuition that can be built up from it.

Along similar, or at least related, lines of thought, in the book Scenarios, Stories, Use Cases Through the Systems Development Life-Cycle, I found this:

“Our social brain — the one that enjoys watching the scheming complexities of Shakespearean tragedies and soap operas — is adapted precisely to understand stories. It is adept at filling in details from scant evidence; at guessing and reasoning about people’s intentions; at predicting how people would respond if we chose a certain course of action.”

The instinct and intuition for narrative utilizes these mental abilities. In the testing context, this narrative impulse is largely satisfied by the notion of the scenario, whether in use cases or in BDD-style feature files. Regarding the idea of a scenario, the book says:

“[The scenario] focuses on agents and their (inter)actions. It is at root brief and abstract, but highly suggestive of context. It is helpful for predicting outcomes. It is all about courses of action.”

So my view on this is that approaches that focus on tests as expressive scenarios benefit from the inherent strengths of narrative, and from the intentional match of the story form, both in creation and presentation, to the story-processing capabilities of the human brain.
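To make that concrete, here is a sketch of what a scenario shaped this way might look like. This is a hypothetical BDD-style feature file; the domain, the names, and the steps are all invented for illustration, not taken from any real project:

```gherkin
Feature: Cash withdrawal
  As an account holder, I want to withdraw cash
  so that I can access my money outside banking hours.

  Scenario: Withdrawal is refused when funds are insufficient
    Given an account with a balance of 50 dollars
    When the account holder attempts to withdraw 80 dollars
    Then the withdrawal is refused
    And the account holder is told the balance is insufficient
```

Notice that this reads as a small story: it focuses on an agent and their actions, it is brief and abstract yet suggestive of context, and it predicts an outcome of a course of action — exactly the qualities attributed to the scenario above.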

Martin Fowler once famously wrote the following:

“Any fool can write code that a computer can understand. Good programmers write code that humans can understand.”

With the current emphasis — sometimes over-emphasis — on testers-as-developers and automation, I would say this:

“Any fool can write a test that automated tools can execute. Good testers write tests that provide a narrative that humans can use to explore a business domain.”

Okay, I’ll grant it’s not as concise as Mr. Fowler’s version, but my point is simply that I believe testers need to be thinking like this.

How do we build up our intuition for what makes us distinct as a discipline? How do we put that intuition to the test? How do we measure the results of our applied intuition? One of my concerns is that approaches like BDD have done nothing to foster this intuition to communicate; if anything proponents of BDD tools — like Cucumber, JBehave, SpecFlow, whatever — have stifled it by constantly assuming there is a “right” and “wrong” way to communicate. There are, to be sure, better and worse ways but we have to grade on a scale when we apply communication techniques within our particular situations. And if we are going to utilize tool solutions, those solutions must reinforce the communication techniques we have decided upon.

About Jeff Nyman

Anything I put here is an approximation of the truth. You're getting a particular view of myself ... and it's the view I'm choosing to present to you. If you've never met me before in person, please realize I'm not the same in person as I am in writing. That's because I can only put part of myself down into words. If you have met me before in person then I'd ask you to consider that the view you've formed that way and the view you come to by reading what I say here may, in fact, both be true. I'd advise that you not automatically discard either viewpoint when they conflict or accept either as truth when they agree.
