Featured

No Bosses, and No Rewards

I have two personal mottoes:

The first is from a Talking Heads song, and the second is the title of this blog, because it applies to so many of the most interesting questions.

Many of the posts and favorite texts here were prompted by a book I’d read recently or been reminded of, most likely non-fiction—the history of science, crime investigations, important ideas. It helps me organize my thoughts and let the lessons sink in. For me, posting things here is like exercise, which is to say I consider it good for me whether anyone else notices or not.


In addition to reading books and writing this blog, I wrote a book about the former mayor of my small city because I believed his story was so compelling it deserved to be told. You can buy it here.

A lot of the other content here—linked from the items in the header—has migrated over from my previous personal sites. I brought it along because there’s a lot of me in those words, pictures and code. And I realize now, there always will be.

My personal site, At Speed, in 1997
In the early days of the World Wide Web, designing and creating a personal web site was a bit like building your own house on the prairie, as opposed to just moving into an established neighborhood, as is more often the case now. I began creating my first web site at a free hosting provider in 1996, just for fun, but many of the skills I learned in the process—HTML, CSS, JavaScript, digital image formats—worked their way over to my career and helped me make a living. Like any creative expression, a big part of building your own web site was deciding what it would be about: What do I have some knowledge of? What can I contribute?

Around that time, I attended the Canadian Grand Prix several years in a row, so Formula One racing became my site’s initial primary topic. I posted photographs and descriptions of my trip to Montreal each year, and I also set out to catalog the history of all the Formula One races in the United States, including the ones I’d been to in the 1970s. I bought books and made regular trips to the library to read race accounts in Road & Track magazine going back to the 1950s. I thought it was pretty cool that in the first few months, Yahoo! created a category just for me, and I received email comments from Formula One fans in 17 countries who visited my site. One guy cursed me in his message for all the time he’d spent reading the information there.
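
For anyone who never hand-rolled a page back then, a site like that was usually nothing more than a handful of static files written by hand in a text editor and uploaded to the hosting provider. A minimal 1996-style page might have looked something like the sketch below (the file name, links and copy are invented for illustration; this is not the actual At Speed markup):

  <!-- index.html: a hand-coded personal page on a free hosting provider -->
  <html>
  <head>
    <title>At Speed: Formula One in North America</title>
  </head>
  <body bgcolor="#ffffff">
    <h1>At Speed</h1>
    <p>Photos and race reports from the Canadian Grand Prix in Montreal.</p>
    <p><a href="usgp.html">A history of Formula One races in the United States</a></p>
    <!-- No frameworks and no build step: edit the file, upload it, and it is live. -->
  </body>
  </html>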

At that time, as the web was just beginning to be configured, explored and staked out, it was intoxicating to plant something in that fertile ground. It was a new forum for untamed organic expression, whether all you knew was the <p> tag or you could write JavaScript and Perl code. The server wasn’t mine, but the code, text and images were, and on that platform—unlike any previous medium—they could be viewed instantly by anyone anywhere in the world. Those personal sites were like plots in a community garden, planted and tended for their own sake.

Tim Berners-Lee, creator of the World Wide Web, said that at the beginning, “The spirit there was very decentralized. The individual was incredibly empowered. It was all based on there being no central authority that you had to go to to ask permission.”

“The vision,” said Louis Menand of The New Yorker, “was of the Web as a bottom-up phenomenon, with no bosses, and no rewards other than the satisfaction of participating in successful innovation.”

Alas, now—instead of doing our own chaotic designing, building and growing or adding something to the “sum of human knowledge”—we mostly just offer up our personal information to mammoth sites that exist for their own benefit and wonder how they were able to take advantage of us.

As Katrina Brooker said, “The power of the Web wasn’t taken or stolen. We, collectively, by the billions, gave it away with every signed user agreement and intimate moment shared with technology. Facebook, Google, and Amazon now monopolize almost everything that happens online, from what we buy to the news we read to who we like. Along with a handful of powerful government agencies, they are able to monitor, manipulate, and spy in once unimaginable ways.”

Nature is Beautiful That Way

The motto of the academic health system in my community is, “In Science Lives Hope.” Walter Isaacson’s book, The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race, affirms that assertion.

Isaacson said that the three great revolutions of modern times are the discoveries of the three fundamental kernels of our existence: the atom, the bit, and the gene. In his telling:

  1. The first half of the twentieth century featured a revolution driven by physics, which led to atom bombs and nuclear power, transistors and spaceships, lasers and radar.
  2. The second half of the twentieth century was an information-technology era, which led to microchips, computers and the internet.
  3. Now we have entered a life-science revolution, which began just in time to combat the COVID-19 pandemic.

Bacteria “have been battling viruses for more than a billion years,” Isaacson said, by developing “clustered repeated sequences, known as CRISPRs, that can remember and then destroy viruses that attack them. In other words, it’s an immune system that can adapt itself to fight each new wave of viruses.”

Jennifer Doudna, an American biochemist at the University of California, Berkeley, said, “CRISPR evolved in bacteria because of their long-running war against viruses. We humans don’t have time to wait for our own cells to evolve natural resistance to this virus, so we have to use our ingenuity to do that.” She pondered, “Isn’t it fitting that one of the tools is this ancient bacterial immune system called CRISPR? Nature is beautiful that way.”

Doudna’s work, said Isaacson, “illustrates, as Leonardo da Vinci’s did, that the key to innovation is connecting a curiosity about basic science to the practical work of devising tools that can be applied to our lives—moving discoveries from lab bench to bedside.”

“The invention of easily reprogrammable RNA vaccines was a lightning-fast triumph of human ingenuity,” Isaacson said, “but it was based on decades of curiosity-driven research into one of the most fundamental aspects of life on planet earth: how genes encoded by DNA are transcribed into snippets of RNA that tell cells what proteins to assemble.”

Echoing Doudna, he said, “Great inventions come from understanding basic science. Nature is beautiful that way.”

Find the Beautiful

Bill Bryson’s book, The Road to Little Dribbling: Adventures of an American in Britain, provided lots of interesting stories—anecdotal and historical—about a number of Britain’s small towns as he drove, walked and rode from the southern coast of the island to its northern tip. He said, “One of the things that I really, really like about Britain: it is unknowable. There is so much to it—more than any person can ever see or figure out or begin to know.”

It seemed to me that one could say that about a lot of places—nothing wrong with that—but when Bryson stood at the northern tip of Great Britain at the end of his trip, he said, “Now that I had reached the cape, I rather expected some feeling of finality and accomplishment to settle upon me…So I stood, hands clasped at my back, staring into the wind, patiently waiting, but no special feeling came.”

He affirmed what has become my default feeling about travel, best expressed—perhaps—by Ralph Waldo Emerson, who advised, “Though we travel the world over to find the beautiful, we must carry it with us, or we find it not.”

In retirement, my wife and I have traveled to a number of “exotic” (not!) locations, including:

And in each place, we have found the beautiful. It’s not the travelling, but the traveller that makes the difference.

Regulation, Huh? Who Would Have Thought?

Mississippi River pollution is 1% of what it was before the 1980s

In February, Future Crunch, that fount of “good news you probably didn’t hear about” and science news that is “indistinguishable from magic,” linked to a story reporting that “The Mississippi River is the cleanest it’s been in more than a century.” It’s another powerful repudiation of the blanket opposition to government regulation common since the Reagan administration.

The story in The New Orleans Advocate—“Mississippi River pollution plunged after passage of Clean Water Act”—said, “A new LSU study shows that the lowest downriver reaches of the river have been getting cleaner since the 1970s, when Congress passed the Clean Water Act, a landmark piece of legislation considered one of the most powerful environmental laws in U.S. history.”

Specifically, “A century’s worth of river testing…shows a clear and steady decline in bacteria, lead and other pollutants since the Clean Water Act was enacted in 1972.”

“The precipitous drop in bacteria—much of which stems from human and animal waste—was stunning,” Eugene Turner, an oceanography and coastal sciences professor and the study’s author, said.

“It’s 1% of what it was before the 1980s,” Turner said.

“The reason is simple,” according to Turner. “We have sewage treatment plants now.”

“Before the Clean Water Act, it was common for cities to pipe sewage into the nearest creek or river. Efforts to build or improve sewage treatment plants were often met with fierce resistance, usually over their high costs. Dumping is, after all, much cheaper than treating.”

“The Clean Water Act set minimum standards for waste discharges for each industry and municipal waste manager. And it developed regulations for specific problems, including chemical releases and oil spills,” according to Advocate environment reporter Tristan Baurick.

“Along with bacteria, the law took particular aim at lead, an element that’s especially harmful to the development of children’s brains and was once common in household paints and gasoline. In the lower Mississippi, lead concentrations are about 1,000 times lower than they were in 1979, according to water quality data Turner collected.”

“Turner’s study underlines the importance of preserving the Clean Water Act and expanding its reach, said Olivia Dorothy, a Mississippi River management expert with American Rivers.”

“‘The Clean Water Act has been tremendously effective at decreasing the amount of industrial and urban pollution, as this study shows,’ Dorothy said. ‘We need to protect the act and all of its authorities, [and] we also need to start looking at expanding it to cover the emerging public safety threats as they relate to water.’”

“Emerging threats include pharmaceuticals-laced sewage and viruses, including COVID-19, which can spread from partially treated wastewater, overwhelmed treatment systems and aging septic tanks, [Turner] said. Plastic waste is another growing concern.”

“‘We’re putting plastic in our river in incredible amounts,’ Turner said. ‘That’s what we used to do with sewage and lead. We just threw it in the river until we eventually realized that’s not good and we did something about it.’”

As Future Crunch concluded, “Regulation, huh? Who would have thought?”

What America Might Look Like

A Promised Land, the most recent of Barack Obama’s multiple memoirs, naturally provided lots of inside perspectives on familiar aspects and episodes from his background, campaigns and terms as President.

The last event he described was the killing of Osama bin Laden in 2011 and the lack of personal satisfaction he might have expected it to produce. He pondered:

“Was that unity of effort, that sense of common purpose, possible only when the goal involved killing a terrorist? The question nagged at me. For all the pride and satisfaction I took in the success of our mission in Abbottabad, the truth was that I hadn’t felt the same exuberance as I had on the night the healthcare bill passed. I found myself imagining what America might look like if we could rally the country so that our government brought the same level of expertise and determination to educating our children or housing the homeless as it had to getting bin Laden; if we could apply the same persistence and resources to reducing poverty or curbing greenhouse gases or making sure every family had access to decent day care. I knew that even my own staff would dismiss these notions as utopian. And the fact that this was the case, the fact that we could no longer imagine uniting the country around anything other than thwarting attacks and defeating external enemies, I took as a measure of how far my presidency still fell short of what I wanted it to be—and how much work I had left to do.”

As the chart above suggests, this failure—not unique to Obama’s administration, of course—is no surprise.

Far Less Attention

In his 1999 book, Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age, Michael Hiltzik said that “a certain quality [was] possessed by [the Palo Alto Research Center] in its extraordinary early years”: Magic. And PARC was the source of multiple seminal technologies, including the laser printer, Ethernet and object-oriented programming.

Malcolm Gladwell said, “If you were obsessed with the future in the seventies, you were obsessed with Xerox PARC,” and its unofficial credo was, as Alan Kay, head of the center’s Learning Research Group, said, “The best way to predict the future is to invent it!” About his colleagues at PARC, Kay said, “The people here all have track records and are used to dealing lightning with both hands.”

But the best-known story about the halcyon days of PARC concerns the graphical user interface that became ubiquitous on personal computers in the 1980s, when it was commercialized by Apple and Microsoft, not Xerox.

In early 1979, Xerox Development Corporation chief Abraham Zarem was exploring the idea of having “a young, hungry company with a modest cost structure” create a marketable product from PARC’s personal computer technology. In April of that year, Steve Jobs, co-founder of Apple Computer—which was less than four years old at the time—offered XDC an opportunity to purchase Apple stock in exchange for a demo of Xerox technology.

“Apple Computer was scarcely a blip on the radar screen of most PARC engineers,” according to Hiltzik. “They were Ph.D.s who had worked on some of the biggest computing projects the world had ever seen; Apple was a bunch of tinkers.” But Larry Tesler of the PARC Learning Research Group insisted that “if PARC did not change its attitude [about Apple and ‘the growing underground of youthful hackers’]…it was going to look back one of these days and discover it had been passed by.”

Most of the members of the Systems Science Lab at PARC—which was developing most of the innovative personal computer technology—still hoped, however, that “Xerox might eventually get around to bringing out the technology on its own.” And Adele Goldberg “felt adamantly that disclosing PARC’s intellectual property to a team of engineers capable of understanding it and, worse, exploiting it commercially would be a mortal error,” said Hiltzik.

They soon learned, however, that “the engineers could decide how to stage the demo, but Xerox headquarters had decreed that one way or another, it was going to happen,” Hiltzik said.

At PARC in December, 1979, members of Kay’s Learning Research Group—including Goldberg, Tesler, Dan Ingalls and Diana Merry—gave a thoroughly sanitized demo on an Alto computer to a group including Jobs, Apple president Mike Scott and programmer Bill Atkinson. “It was very much a here’s-a-word-processor-there’s-a-drawing-tool demo of what was working at the time,” said Goldberg. “No harm done, no problem. What they saw, everyone had seen. The conversation they had with us, everyone had. There was no reason not to do it, it was fine.”

Significantly, they had not shown Apple the Smalltalk programming language, which was the foundation of the user interface and functionality they had seen. Jobs, who had been “skeptical of what PARC might have to offer,” seemed satisfied when he left, but he quickly learned that much had been left out of the demo.

A team of ten or so Apple people appeared at PARC again two days later, determined to learn more. Scott and Xerox’s Harold Hall danced around the issues in “executive-speak” for several minutes while Jobs waited impatiently, according to Hiltzik. Then Jobs jumped from his chair, saying, “There’s no point trying to keep all these secrets.” He said to Scott, “These guys think we’re going to make the Xerox computer, but we all know we want them to help us with the Lisa!” Xerox knew nothing about any Apple computer called Lisa.

As the rest of the Apple team sat dumbstruck, an Apple engineer broke the awkward silence, explaining, “Lisa is an office computer we’ve designed with a bitmapped screen and a simple user interface. We think some of your technology would be useful in helping us make the machine easier to use.”

Tesler knew that the Smalltalk interface, parts of which Apple had not seen, would indeed make Apple’s computer easier to use, and he was eager to demonstrate it fully: “If Xerox was not going to market a personal computer, why should all the Learning Research Group’s work simply go to waste?” Hiltzik believed Tesler was thinking. But Goldberg, co-developer of Smalltalk with Alan Kay, insisted it would take a direct order from Xerox corporate to overcome her objections. With amazing speed, the order came from Bill Souders, executive vice president and head of Xerox’s business planning group in Stamford, Connecticut, who directed the demo team to give Apple the “confidential briefing.”

As they demonstrated the full power of Smalltalk via programs with “capabilities that had never been seen in a research prototype anywhere, much less in a commercial system,” the Apple engineers watched with rapt attention. Atkinson, particularly, “was asking extremely intelligent questions that he couldn’t have thought of just by watching the screen,” according to Tesler. “It turned out later that they had read every paper we’d published, and the demo was just reminding them of things they wanted to ask us…They asked all the right questions and understood all the answers. It was clear to me that they understood what we had a lot better than Xerox did.”

According to Tesler, “Jobs was waving his arms around, saying, ‘Why hasn’t this company brought this to market?’” Jobs claimed afterward that after seeing the demo, he knew that “every computer would work this way some day,” but the developers of the Lisa had already designed their own GUI. Theirs was “far more static than the Alto’s” and placed much less reliance on the mouse, but Atkinson, who’d been stuck for months on several programming problems, had his confidence boosted by the demo and was finally able to solve the problems in his own way. “That whirlwind tour left an impression on me,” Atkinson said. “Knowing it could be done empowered me to invent a way it could be done,” and in fact, several of Atkinson’s solutions proved to be much more effective than Xerox’s. According to Hiltzik, Atkinson “resented the importance others have attached to his visit to PARC: ‘In hindsight I would rather we’d never have gone,’ Atkinson said. ‘Those one and a half hours tainted everything we did, and so much of what we did was original research.’”

Apple’s work was not “serial reproduction…[but] the evolution of a concept,” Malcolm Gladwell said. “Jobs’s software team took the graphical interface a giant step further…It emphasized ‘direct manipulation.’ If you wanted to make a window bigger, you just pulled on its corner and made it bigger; if you wanted to move a window across the screen, you just grabbed it and moved it. The Apple designers also invented the menu bar, the pull-down menu, and the trash can—all features that radically simplified the original Xerox PARC idea.”

“The difference between direct and indirect manipulation,” Gladwell insisted, “is the difference between something intended for experts, which is what Xerox PARC had in mind, and something that’s appropriate for a mass audience, which is what Apple had in mind. PARC was building a personal computer. Apple wanted to build a popular computer.”

Indeed, Apple was influenced by its exposure to Xerox’s technology but, according to Hiltzik, the visits may have affected the PARC scientists even more. Jobs’ “fanatic enthusiasm for their work hit them like a lightning bolt. It was a powerful sign that the outside world would welcome all they had achieved within their moated palace while toiling for an indifferent Xerox.”

In 1980, Apple asked to license Smalltalk for use in the Lisa but was turned down. Instead, it hired Larry Tesler, who became the head of the Lisa interface team, helped design the Macintosh and eventually served as Apple’s chief scientist.

According to Malcolm Gladwell, the engineers at PARC “weren’t the source of disciplined strategic insights. They were wild geysers of creative energy.” The common characterization of Xerox as unable to bring new technologies to market, said Hiltzik, “presupposes that a corporation should invariably be able to recoup its investment in all its basic research,” and overlooks Xerox’s generous funding of PARC through decades of tumultuous change in the computer industry. And he insisted, “Apple was able to market the PC not in spite of its small size, but because of it.”

In Making the Macintosh, Alex Soojung-Kim Pang, Stanford University librarian and historian, said, “Turning expensive, hard-to-use, precision instruments into cheap, mass-producible, and reliable commercial products requires its own ingenuity and creativity. This marketplace intelligence is different from, but not inferior to, the intelligence of the laboratory; it just gets far less attention by journalists and historians.”

An Extra Life

Since the mid-1700s, advances initiated by scientists (vaccines, germ theory and antibiotics) and accompanied by new social movements, new forms of persuasion and new kinds of public institutions have combined to double the average human lifespan, essentially giving humanity, according to Steven Johnson, “an extra life.”

In 1796, British physician Edward Jenner created the first true vaccine—for smallpox—by using material from a related but less virulent disease, cowpox, to trigger an immune response. He was hailed as a hero but also mocked as a quack.

“The vaccine immediately had its skeptics,” according to David Motadel, historian at the London School of Economics and Political Science. “Clerics warned their congregations about contaminating the purity of the human body with animal matter and condemned it as unchristian. And many of Jenner’s peers who had forged careers on useless but lucrative ‘cures’ for smallpox were quick to denounce it as dangerous.”

“In 1879…the Anti-Vaccination Society of America was founded, and activists fought compulsory vaccination laws in several states.”

But the vaccine “quickly became standard medical practice in Britain, Europe and the United States,” said Motadel, “and by the middle of the 19th century, smallpox was a relatively minor cause of death in Europe.”

“Ever since the time of Jenner, whenever new vaccines have been approved, anti-vaccination campaigns have been part of the response,” Motadel noted. “They have been driven by exquisitely complex and diverse reasons: uncertainty, fear, science skepticism, anti-intellectualism and anti-elitism; sometimes they are motivated by profit.”

In mid-1800s America, unrefrigerated cow’s milk had become a “liquid poison,” killing thousands of young children, according to Johnson. Louis Pasteur discovered that “both fermentation and spoilage [in milk] were…a byproduct of living microbes,” and he invented a process in 1865 to kill the microbes before they could cause any harm.

But pasteurization was “ferociously opposed by the milk industry and its representatives in statehouses around the country…not just because it added an additional cost to the production process but also because they were convinced, with good reason, that it would hurt their sales.” As a result, it took a half century for it to have a meaningful effect on the safety of milk in the United States. “Progress,” Johnson insisted, “is never a result of scientific discovery alone; it also requires other forces: crusading journalism, activism, politics.” By the early 1920s, unpasteurized milk had been outlawed in almost every major American city. “The fight for pasteurized milk,” according to Johnson, “triggered the first truly egalitarian rise in life expectancy.”

In the first decades of the 20th century in Jersey City, New Jersey, typhoid was responsible for 30 deaths per 100,000 people per year. In 1908, physician, health officer and sanitary advisor John Leal quietly added chlorine to the public reservoirs to kill the harmful bacteria he had identified in the water. Leal was sued by the city for not supplying “pure and wholesome” water as his contract had stipulated, but the experiment proved successful, and soon city after city began implementing chlorine disinfectant systems in their waterworks. Within three decades, the death rate from typhoid was reduced by a factor of 10.

Regarding the COVID-19 pandemic, Motadel said, “There are growing concerns that the United States might soon reach what some experts call the ‘vaccine wall,’ when the problem stops being how to supply enough and starts being how to convince the holdouts.” But he said, “Opposition to vaccination is as old as vaccination itself. And despite consistent and often widespread hostility, vaccination campaigns have always, eventually, succeeded.”

“Enough people have accepted vaccines [for polio, measles, mumps, rubella] that they have always been effective in immunizing societies,” Motadel insisted. “And that’s likely to be true of this pandemic, too.”

Networks of Ideas

The genius of the glassmakers of Murano, Italy, was created as much by sharing as by competitive pressures

Steven Johnson’s book, How We Got to Now: Six Innovations That Made the Modern World, and its companion PBS and BBC series, examined the largely unplanned roles played in our lives by:

  • Glass—“A world without glass would strike at the foundation of modern progress.”
  • Cold—“Our mastery of cold is helping to reorganize settlement patterns all over the planet and bring millions of new babies into the world.”
  • Sound—“Sound…was the first of our senses to be electrified…And once those sound waves became electric, they could travel vast distances at astonishing speeds.”
  • Clean—“Between 1900 and 1930…[the implementation of] chlorination systems…led to a 43 percent reduction in total mortality in the average American city.”
  • Time—The development of reliable watches allowed the industrial revolution to satisfy its need for clock time, and demonstrated that “our ability to measure things [was] as important as our ability to make them.”
  • Light—“The lightbulb shines light on more than just our bedside reading; it helps us see more clearly the way new ideas come into being, and how to cultivate them as a society.”

Within each topic, according to Johnson, an innovation triggered changes in an entirely different and unintended area. He called this the “hummingbird effect,” after the way certain plants and animals may have evolved symbiotically. These traceable connections are in contrast to the well-known “butterfly effect,” which describes an unpredictable and “virtually unknowable chain of causality.”

This distinction is important, Johnson said, because “ideas are fundamentally networks of other ideas. We take the tools and metaphors and concepts and scientific understanding of our time, and we remix them into something new.”

But this typically requires having “the right building blocks,” Johnson insisted, without which “you can’t make the breakthrough, however brilliant you might be.” Therefore, “most innovation happens in the present tense of the adjacent possible, working with tools and concepts that are available in that time.”

This is why, he observed, many groundbreaking ideas come to multiple people simultaneously and independently: Calculus to Newton and Leibniz. Natural selection to Darwin and Wallace. The light bulb to Edison and many others.

He did admit, on the other hand, that there have been a few people like Ada Lovelace—whom Johnson called “time travelers”—who made great intellectual leaps beyond the ideas of their day. The leaps made by these individuals, according to Johnson, may have been possible because they “worked at the margins of their official fields, or at the intersection point between very different disciplines,” and they “remind us that working within an established field is both empowering and restricting at the same time.”

Examining these networks of ideas is important, Johnson said, because “learning from the patterns of innovation that shaped society in the past can only help us navigate the future more successfully.”

The Invention of Surgery

For 1,500 years, medical practice in the Western and Arab worlds, such as it was, was dominated by the doctrines of two ancient Greeks: Hippocrates and Galen, who explained the inner workings of the body with the theory of the four humors.

But in his book, The Invention of Surgery: A History of Modern Medicine: From the Renaissance to the Implant Revolution, surgeon David Schneider observed that the exalted position of these teachings was undeserved. He said, “Even the savants of the Renaissance, who were forced to contemplate the function of the body in a world without science, were powerless to resist the allure of Hippocratic musings. Because the philosophical foundation was a fraud, medicine was ineffectual, even lethal. The Hippocratics provided much explanation for why the therapies worked: it never occurred to them that they did not.”

Schneider defined the fifteenth century as the beginning of “modern medicine” in part because “so little had changed from the time of Hippocrates to the 15th century,” but when the scientific methods of the Renaissance belatedly began to be applied to medicine, the understanding of the human body and the ability to treat it began to grow.

In a new environment of discovery, “the invention of surgery was crafted by tinkerers, oddballs, lonely geniuses, inspiring mentors, and stubborn misfits,” according to Schneider. Over centuries, a series of remarkable men used empirical observation and analysis to make insights we now take for granted:

  • Andreas Vesalius—In the late 1530s, Vesalius was one of the first to learn from his own anatomical dissections, and he showed that Galen was not infallible. His book, De humani corporis fabrica (“On the Structure of the Human Body”), was “a visually stunning, didactic tour de force that…challenged 1,500 years of [Galen’s] authority.” Moreover, De fabrica “was not simply a book about the body…[but] an instruction manual for physicians.”

  • William Harvey—In the seventeenth century, Harvey correctly explained the function of the heart when he declared that “movement of the blood occurs constantly in a circular manner and is the result of the beating of the heart.” He followed Vesalius’s devotion to first-hand examination of the body, but applied his own background in physics to posit that the body operated as a machine with each organ performing a specific function that worked in relation to other organs in the body.

  • Giovanni Battista Morgagni—In the late eighteenth century, Morgagni first connected symptoms with anatomical conditions and initiated modern medical diagnosis. His work, De sedibus et causis morborum per anatomen indagatis (“Of the seats and causes of diseases investigated through anatomy”), “made pathological anatomy a science, and diverted the course of medicine into new channels of exactness or precision.”

  • John Dalton—Though he did not work in medicine, when Dalton formalized the atomic theory at the turn of the nineteenth century—explaining how atoms combine together to form chemical compounds—he prompted the rapid rise of chemistry and the ability to synthesize drugs; for the first time man could make chemicals that had “genuine and powerful effects on the human body.”

  • Carl von Rokitansky and Rudolf Virchow—In the nineteenth century, they performed organ-based and cellular-oriented autopsies, thus furthering the understanding of morbidity. Schneider said it was Virchow and his successors who “fathomed the significance of the cellular basis of life—forever destroying the ancient, mystical speculations about vital spirits, humors, and life forces.”

In the twentieth century, according to Schneider, “The antibiotic revolution [of the 1940s], building on the breakthroughs of the understanding of the organ and cellular basis of disease, and the founding of bacteriology meant for the first time ever that it was worth going to a doctor when you were sick.”

Nowadays, a child rightly learns to trust that, when something is wrong, the doctor can likely make them better. Knowing that this was not always the case, and that we owe our well-placed modern confidence in healthcare professionals to centuries of research, discovery and diagnosis, provides reassurance and comfort in times of crisis.

TV Worth Watching

David Bianculli, the TV critic for National Public Radio’s Fresh Air since 1987, is the founder and editor of a website called TV Worth Watching. The name is apt because, as David Byrne said, “People like to put the television down,” but television is no more monolithic than movies, magazines or books.

There is good TV and bad TV. And having more control than ever over the things we watch means that our choices say more than ever about us. There have been some amazing things—new and not so new—on TV in the last several years, if you were watching:

The majority of these programs are British; they are short on guns, profanity and sex; and they are long on strong characters, stunning photography and provocative ideas. Coincidence, or is that why they’re “good”?