
Technology affects us

The term “technodeterminism,” like “utopian” or “wild-eyed socialist,” is rarely used by the people it describes to refer to themselves. But I’m willing to accept the characterization … so long as I then get to claim a moderate form of it.

At its most extreme, technodeterminism holds that technology causes all people to use it in particular ways, and perhaps even to hold particular beliefs. But I’ve never met anyone who believes that technology affects all people the same way, as a falling safe would. Clearly if technology affects us, it’s in part because of what we bring to it as individuals and as members of a culture: The Internet is a different sort of thing when it’s used by an educated Westerner than by a committed member of the Taliban.

But I do want to be able to have conversations about topics like “How did the introduction of the telephone affect American democracy?” and “What effect is the Internet having on social protest movements in the Mideast?” The answer to both questions might be that these technologies have had no effect, but I don’t want the questions ruled out as meaningless. Technology does affect us. Color me a technodeterminist.

It has not always been obvious that technology affects us in any core way. What we call technology today we used to call tools, and tools were assumed to be simply means for implementing human will. They thus weren’t an interesting topic when considering human history. In the 19th century, when people started to write histories of tools, those works were thought of as histories of inventions, interesting only because of the human ingenuity they displayed. Who’d waste time writing a history of the shovel? It was a snort-worthy idea.

In the 20th century, historians began to pay attention to how inventions had effects beyond the straightforward services they enabled. The telephone affected the willingness of family members to move away. National highways changed life in small towns and on farms even when the highways did not pass nearby. Cultural critics like Lewis Mumford and then the mighty Marshall McLuhan directly addressed the question of whether and how our tools have effects far beyond the soil they dug or the metal they hammered.

Shaped by need

But how do mere material objects affect us? They’re just lumps. We humans are the creatures with agency and free will. How could the tools we use also use us? Aren’t technodeterminists making a fundamental mistake about how the whole shebang of human consciousness works?

No, we’re not. Unless, of course, you’re assuming that we’re saying that tools shape us the way we shape a piece of wood with a tool.

In fact, it seems pretty straightforward to me how this works. Pardon me while I state what I think is blindingly obvious.

Someone invents a tool with a purpose in mind. That tool is literally shaped to address that need.

The creator expresses that purpose in the shape of the tool—it has a place you can grab, it seems to point in a direction, etc.—and in the instructions that come with it, in the name, in which sorts of stores are approached to stock it, etc.

That is, as some in the field would say, by the time almost all of us come upon a tool, it’s been “domesticated.” It’s not thrown at us from the side of a highway with no context. Rather, we see it in a hardware store with contextual information all around it, or we see a friend using it. We might even ask our friend how it works. We thus aren’t presented with a crowbar as just a lump of metal, but instead see a friend using it to pry something open. All of this occurs within a culture that provides a “domesticated” context for it. For example, we learn about keys within a cultural context that has strong norms about the sorts of things that should be locked. There’s no magic here.

Messy matrix

This is true of the Internet as well. We generally don’t stumble upon it in the wild. Rather, we (these days) grow up around people who are using it in particular ways that vary from culture to culture. We see our parents reading the newspaper on a tablet, looking up info on a mobile phone or playing a game. Our teachers show us how to use it to explore a topic. Our friends show us where the somewhat illicit material is. If you’re a child of the Taliban, you will be introduced to a Net that has been domesticated in other ways.

From this there are implicit lessons. In a typical Western introduction to the Net, we’ll learn that it’s massive, that we can post to it without asking permission of anyone, that what we post is open to everyone, that ideas and content are linked, that links lead us to unexpected places, that there is good and bad information on the Net, etc. Each of these discoveries brings with it implicit values. Apparently we in the West think that open access to information overall is a good thing, although it brings risks. We apparently see value in some types of collaborative efforts. We apparently believe that knowledge spreads out sidewise in unending and messy webs, rather than “really” living in sequential books.

There is nothing inevitable about how the Net has been domesticated within our environment, nor about the lessons we draw. For example, it is not only conceivable but common for children to be taught that the openness of the Net to unvetted ideas is a terrible failing. The lesson those children draw will likely be negative about uncontrolled information and positive about authority.

That means that technology doesn’t cause us to use it one way instead of another, much less to come away from it with particular beliefs or attitudes. But it does affect us, and it does so the way anything does: in a messy matrix of cultural and personal values, with a healthy helping of pure accident.

That’s a technodeterminism I can believe in.
