June 19, 2009
Early this spring I went to a lecture by designer John Thackara in New York. He spoke to a room of design students, stylish rain boots, and very wet umbrellas. Thackara’s book In the Bubble: Designing in a Complex World explores the following dilemma: “We’re filling up the world with technology and devices, but we’ve lost sight of an important question: What is this stuff for? What value does it add to our lives?”
He posed a related question in his lecture – is more technology always the answer to the challenges of daily life? He told a story about spending several months in a town with a team of designers. The town was having trouble getting children to school: busing was an expensive service for the town to pay for, and parents did not have the time to drop off and pick up their children every day. The designers proposed a number of gadgets and monitoring systems they thought could solve this problem by helping the community coordinate ridesharing.
The community was not comfortable with this – how much would such a system cost to build, what knowledge would be needed for its upkeep, and who would make sure that the people picking up the children were trustworthy? Ultimately, the community settled on a pen-and-paper solution. The school head, whom everybody trusted, created a bulletin board that helped her coordinate carpooling, saving parents and the school district a fair amount of travel time and expense.
Despite this low-tech success story, Thackara himself remains excited about the possible uses of information technology. He imagines, for example, sustainability “dashboards” that would give building managers, policy makers, homeowners, and other decision makers a cross-spectrum of the data they need in one interface – weather patterns, energy use, location of water, location of community centers, and so on. Technology can be useful; it just isn’t needed all the time.
In a related way, I have had conversations with a number of people lately around a single question: how can we better find out where our food comes from and how it was grown? How far away was it grown? Is it organic? Does the farm have fair labor practices? How much energy went into making your package of peanut butter cookies and your gallon of milk? While new labeling and certification schemes are making headway in this area, an actual trip to the grocery store still involves a lot of complexity. It can be hard to weigh whether conventionally grown fruit from a place closer to where you live is a better buy than organically grown fruit that comes from farther away, especially if you’ve only got 15 minutes to shop.
In a surprising number of these conversations, people suggested creating some kind of gadget to help shoppers make decisions – an iPhone application, for example, with data about which brands of food and which types of fruits, vegetables, and meats would be better for your health and/or the planet. Perhaps this would help. But maybe it is just a case of creating a solution more complex than the problem?
Rather than trying to wrap our minds around the global food system, maybe we’d be better off growing and making more food locally – where we know how it is made and who is growing it. In his book In Defense of Food, author and professor Michael Pollan suggests a similarly simple solution – if you want to eat well, don’t eat anything that your great-grandparents wouldn’t recognize as food.
So, when is it appropriate to use information technology to get a handle on a complex situation? And when might we actually solve the problem by forgetting technology and simply making things simpler?