Monday, July 9, 2007


What a tricky business it is trying to figure out how to stay safe these days. One scientific study says one thing, the next one says something else. And the scary parts are magnified by the 24/7 barrage of news reports screaming about the risk du jour. How are we supposed to make informed decisions about our health and safety?

A recent example illustrates the broader dilemma. The news media have reported at length that mercury can damage cognitive development in the fetus. The biggest source of exposure to mercury for pregnant women, we are warned, is consumption of seafood. The safe thing to do, then, is eat less fish, right?

But seafood is rich in all kinds of nutrients, particularly the fatty acids the fetal brain needs for healthy development. A study a few months back in the British medical journal The Lancet found that the less seafood pregnant women ate, the worse their kids did on a raft of developmental tests. Kids born to mothers who ate less than about three quarters of a pound of seafood per week were at risk of having lower verbal IQ scores and “increased risk of suboptimum outcomes for prosocial behaviour, fine motor, communication, and social development scores. For each outcome measure, the lower the intake of seafood during pregnancy, the higher the risk of suboptimum developmental outcome,” the authors write.

So what’s a pregnant mom to do? Some science says that more fish = more mercury = possible brain damage to the unborn child. Other science says less fish = fewer nutrients = possible brain damage to the unborn child. Conflicting scientific evidence. How are we non-scientists supposed to decide?

The news media could help, but in several ways they make things worse. There is generally more emphasis on the threatening side of things, so a story about how “Fish Is Bad For Your Kids” will get more play than one that says “Fish Is Good For Your Kids.” In the three days after the Lancet study was published, there was less reporting about it (fewer stories, smaller stories, buried-inside-the-newspaper stories) than there had generally been about the dangers of mercury in seafood. (The New York Times didn’t report on the Lancet study at all, based on a search for the words “Lancet,” “seafood,” and “mercury.”) That means some people won’t learn about these new findings. It’s hard to make an informed choice about conflicting scientific evidence if some of it, particularly the more reassuring information, is missing.

Sometimes the information is widely reported, but misleading. Many news reports about scientific findings suggest that the study being described offers THE definitive answer. Several stories about the Lancet study had headlines like this one from a Texas TV station’s website: “Study: Eating fish while pregnant leads to smarter children.” Case closed. Scientists know it takes a lot of evidence from a lot of studies to develop a clear answer. It shouldn’t be hard for journalists to acknowledge this. In fact, for the sake of accuracy, it is their obligation.

News reports about risks also usually fail to give both sides of the risk-benefit tradeoffs involved. Mercury and seafood is a classic example. There were lots of scary stories about the dangers of mercury in fish, but only some of them, usually late in the story, mentioned the benefits of seafood. The mercury story isn’t the only example. Stories about estrogen replacement therapy cite the cancer risks, but rarely mention the potential heart and bone protective benefits. Stories about the risks of nuclear power almost never mention the tens of thousands of people who get sick or die each year due to air pollution from burning coal and oil. There is no general right or wrong to any of these risk-benefit choices. It’s up to each individual to decide. But in order to make informed choices we need to know what the tradeoffs are.

Journalists are only part of the problem. Too many scientists trumpet their findings as THE answer. Some do it out of intellectual arrogance, some out of honest passion on their issue, many out of a desire for career advancement and more research money. And many scientists need to win the war of ideas. It matters to them personally and intellectually that they’re right, that their view prevails. Conflicting studies breed disagreements between scientists that can get really personal and nasty. The public and policy makers get caught in the confusion of this intellectual combat.

We deserve some of the blame too. In our rushed, short attention span world, we want things black and white. What’s safe and what’s not. Spare me the details, my cell phone is ringing. Even if the news story has all the relevant facts, if we don’t read more than the headline and the first few paragraphs, shame on us for not knowing what we need to know.

Many of the hazards of our modern world are complex, and our scientific understanding of them can only develop one piece at a time, each new study adding another piece to a giant jigsaw puzzle. Sometimes the new piece doesn’t seem to fit with the ones already laid down. But there are solutions to the confusion from conflicting scientific evidence. Scientists have to be honest about how they develop, interpret, and report their results. They have to be careful about claiming that their findings are THE answer. And we have to be smarter news consumers and collect more information before we make up our minds.

But in the end, a great deal of responsibility falls on the news media. Beyond our own personal daily experience, what we know of what’s going on in the larger world is determined by what the news media tell us and how they tell it. What stories get covered, how the reporter gets his or her information, how the story is written…they all involve decisions. There is a huge public trust involved in how journalists make those choices.

We can’t expect journalism to be some high-minded calling that serves only the public interest. Journalism is largely a for-profit affair and journalists are driven by self-interest. The bosses want news that will grab our attention. Reporters want news that will make the front page. Both motivations are inescapable realities, and both encourage coverage about threats and danger that is more alarming, not less.

But we are their customers. We can and should demand better. We should reward with our readership and viewership and listenership those news organizations that report risk well, with accuracy, balance, a bit of caution, and an occasional touch of context, so we can make sense of conflicting and incomplete scientific evidence about the risk-filled complexities of our modern world.


It was a nano-scale event – so small it was hard to detect. But the action taken recently by the city of Berkeley, California could have vast repercussions for nanotechnology and its incredible potential. For all of nanotechnology’s promise, there also may be serious risks, and Berkeley wants to know just what kind of genie we’re letting out of the bottle.

Berkeley has passed what may be the first law putting restrictions on nanotechnology. Municipal code 15.12.04 directs that “All facilities that manufacture or use manufactured nanoparticles shall submit a separate written disclosure of the current toxicology of the materials reported, to the extent known, and how the facility will safely handle, monitor, contain, dispose, track inventory, prevent releases and mitigate such materials.” Not a red light, to be sure, but for the first time a government is turning the go-go nanotech green light to cautious yellow. It would not be surprising to see other governments around the world, at all levels, follow suit. Cambridge, Massachusetts, the first community in the U.S. to put restrictions on recombinant DNA work, is considering a similar ordinance.

It will be interesting to see how companies in Berkeley comply. Firms have to report on the toxic potential of the nanomaterials they make or use, but the fact is that we know practically nothing about the possible human and environmental health impacts of these incredibly small materials. In fact, we don’t even have the tools to figure out what these materials do.

That’s because these particles can be as small as just a few atoms, which is why they hold such promise, and such potential for harm. You’d have to place 80,000 nanometer-sized particles next to each other to get from one side of a human hair to the other. Stack 100,000 of them on top of each other and you get the thickness of a piece of paper.
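Those size comparisons reduce to simple unit arithmetic. Here is a quick sanity check; the reference sizes (a human hair about 80 micrometers wide, a sheet of paper about 100 micrometers thick) are assumptions inferred from the numbers in the text, not figures the author gives:

```python
# Sanity-check the nanoscale comparisons.
# Assumed reference sizes (not stated in the text):
#   human hair width  ~ 80 micrometers
#   paper thickness   ~ 100 micrometers
NM_PER_UM = 1000  # nanometers per micrometer

hair_width_nm = 80 * NM_PER_UM        # 80,000 nm across a hair
paper_thickness_nm = 100 * NM_PER_UM  # 100,000 nm through a sheet of paper

particle_nm = 1  # a nanometer-sized particle

# Number of 1 nm particles laid side by side across a hair,
# and stacked to the thickness of a sheet of paper:
print(hair_width_nm // particle_nm)       # 80000
print(paper_thickness_nm // particle_nm)  # 100000
```

The arithmetic bears out the essay's figures: roughly 80,000 particles span a hair, and roughly 100,000 stack to paper thickness.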

At those sizes, matter behaves in new ways. Silver becomes a powerful antibacterial. Silver nanoparticles are already being used in refrigerators and plastic bags to keep food fresh. When exposed to ultraviolet light, nano-sized crystals called quantum dots light up as much as 1,000 times brighter than most medical dyes. They’re already being used to identify cancer cells in the human body and in commercial lighting to reduce the need for energy. Carbon nanotubes have unique properties that make them unbelievably strong, and much more efficient than copper at conducting electricity.

But at those sizes, nanoparticles are too small to capture in filters. Their properties are too novel for toxicologists to even figure out what harmful effects these materials might have. The science of developing nanomaterials has far outpaced the science of understanding their risks.

But that hasn’t stopped nano products from showing up in the marketplace daily. From car windscreens to stain-proof clothing to cosmetics to medical devices to tennis rackets to, yes, the iPod Nano, hundreds of products are on the market that use materials at sizes hard to imagine. More are in development, with promise that ranges from better medicines to safer food to huge reductions in our use of energy.

With the almost unimaginable potential that comes from the ability to manipulate matter at the atomic level, it’s no surprise that governments have been pouring money into research and development for years, particularly the U.S., the E.U., and Japan. But very little of that money - just 4% in the U.S. - is going toward research on the risks these materials might pose. The money being spent to figure out what these particles can do for us is vastly greater than what is being spent figuring out what nanotech can do to us.

And that’s dangerous. It will take just one newsmaking instance of nanotechnology causing real harm for the media to make the public aware that, when it comes to risk, governments have been putting the nano cart before the horse. And that would surely deal all of nanotechnology a blow that could set it back years. It would dramatically delay the application of this technology and, while reducing possible risks, deny us the remarkable benefits nanotech offers.

Berkeley has asked for more than can be reasonably produced. The European Commission’s Health and Consumer Protection Directorate-General says “…existing toxicological and ecotoxicological methods may not be sufficient to address all of the issues arising with nanoparticles.” In short, these materials are so small, their behaviors so novel, that we don’t even know what to test, or how, to see if they are safe. But Berkeley has asked the right questions. The governments of the U.S., the E.U., and Japan should be spending far more on answering them.

Slow Down, Doctors Frankenstein

The main character in Mary Shelley’s novel “Frankenstein” has always been known by that name. Frankenstein. But in the book, the new life form created by Dr. Victor Frankenstein has no name. In public talks, Shelley was said to have referred to the creature as “Adam”…the first human created in the Bible by God. The subtitle of the book is “The Modern Prometheus”…the Titan who, according to some versions of Greek myth, created Man.

Now we are a significant scientific step closer to the day when the creators of life will be us, not in the stories of novel and myth, but in our modern laboratories. A group of scientists has taken the DNA out of one bacterium, put it into a bacterium of a similar but different species, and watched as the inserted DNA took over and turned the host cell into a replica of the species from which the DNA first came. Now all we have to do is manufacture the DNA ourselves, add our own specifications, and we will have the ability to create new, completely synthetic, forms of life.

We’re not quite at the moment when Igor throws the switch and the voltage brings Dr. Frankenstein’s creation to life. But this work has brought us much closer. It has also brought us closer to a dangerous conflict between scientific progress and public perception. Public fear that scientists are creating new, synthetic forms of life…playing God in the lab…could bring this work, with its risks and its phenomenal potential benefits, to a screeching halt.

Recent history bears this out. There was public apprehension about and resistance to recombinant DNA research in the 70’s. Uncertainty was high, and fear followed close behind. Scientists were alleged to be “toying with life”. Legislative restrictions started to limit such work. Experiments were shut down. Progress in the field slowed dramatically.

The relatively tepid resistance to that earlier work will pale compared to the public uproar sure to erupt when scientists announce they have manufactured DNA, put it in a cell, and created life. Reaction to that could interfere with progress in life sciences for years.

As was the case with recombinant DNA research in the 70’s, the science of manipulating life is charging ahead with all the energy of human curiosity, the seduction of ego, the lure of riches, and the promise of solutions to many of our most pressing health and environmental problems. And as was the case with recombinant DNA science, while the ethical implications of synthetic life science are being considered, the public perception implications are not. Far too little is being done to communicate to the public about this work: the safety controls under which it is done, the great benefits it can bring, the respect that scientists have (or should have) for public concerns, and their willingness to develop and live by self-controls.

Again, the recombinant DNA episode instructs. As the pressure mounted back then, researchers gathered with lawyers and doctors at a conference near Asilomar State Beach in California. They came up with a long list of biological safety procedures, self-imposed legal restrictions, and the vital acknowledgment that for scientific knowledge to advance, scientists also have to develop guidelines for how their work should be regulated.

The participants at Asilomar also recognized the need to help the public understand their work, to demystify their science, to respect and address public apprehension. Many of them engaged much more actively with the press and accepted the responsibility that communicating about their work to the general public was nearly as important as the work itself.

Asilomar was in many ways an act of self-interest. Nonetheless, by recognizing and responding to public apprehension in tangible ways, the participants at Asilomar took a vital step in re-establishing public trust in science, which in turn has allowed for decades of progress in biology that has put us on the brink of being able to create synthetic life.

But the lessons of Asilomar seem to have faded. The leaders in the field of synthetic life science need to recognize the concerns the public is starting to have about their work…ethical concerns, safety concerns…and deal with our apprehensions actively, now, before Igor throws the switch and some lab creates a life form in a glass beaker that has never existed on earth before. They need to tell us what they are doing to keep their work safe. They need to tell us what they are doing to try to develop new ways of improving safety. They need to tell us how they, and we as a society, might oversee their work in ways that will allow progress but avoid harm. They need to simply explain what they’re doing, to reduce uncertainty and the fear that goes with it. They need to demonstrate that they take our worries seriously, and not just give those worries lip service while they chase their Nobel Prizes, patents, and personal fortunes.

The ability to construct DNA to our specifications and insert it into living cells, the ability to powerfully influence all biological life, has profound ethical and safety concerns. But it also offers almost unimaginable promise, to eliminate hunger, clean the environment, cure disease. Far less of that promise will be realized if the men and women doing this work don’t recognize and address our concerns about what they are doing. Otherwise they may learn how to create their Adam, only to find that, out of fear, we want to chase down what they have done and kill it.

Tuesday, March 13, 2007

Breast Implants and Chicken Little - A Cautionary Tale

Remember the story of Chicken Little, the one where the chicken gets hit on the head by a falling acorn she doesn’t see, and leaps from that partial evidence to her panicked cry “The Sky is Falling! The Sky is Falling!” The other animals turn into a frightened pack and follow Chicken Little in a mass flight to safety. Now consider the analogous story of silicone breast implants. Not a fable, but certainly a cautionary tale.

To learn the lessons from the silicone saga you have to go back in time to the day the acorn fell. In November 1988, The Health Research Group, associates of Ralph Nader’s Public Citizen, sounded the alarm that silicone breast implants cause cancer. They based their claim on documents from Dow Corning, the silicone manufacturer, indicating that in one test, a group of rats injected with silicone got 23% more tumors at the site of the injection than a control group. Cancer is a pretty big acorn.

The health advocates also claimed that Dow Corning had hidden those findings from the FDA and the public. Now you not only have Cancer, but a lying chemical company protecting their profits at the expense of public health. That would understandably feel like the sky is falling if you happened to be one of the tens of thousands of women who had these devices in your chest.

About two years later, the fear multiplied significantly when the press…we’ll call them Turkey Lurkey…added their loud voice to the alarm after a high-profile piece on a CBS news program. Most of the newspapers and TV stations in America joined the Sky is Falling chorus, as they usually do with the latest risk du jour. (Mea culpa. I was one of those reporters who did numerous stories on this awful new bogeyman, without really checking out the evidence.)

Women who were suffering scarring and immune system problems and other ailments got understandably scared, and angry. The lawsuits started. The public hearings started. Advocacy groups formed. Politicians took up the cause. And the federal government did what it has to do in a democracy. It responded to the Sky is Falling fears, and in 1992 banned the devices for almost all uses.

No matter that the kind of tumors those lab rats got don’t occur in humans. Never mind that dozens of other studies indicated silicone is not a carcinogen. Never mind that Dow Corning had indeed reported its findings to the government and hadn’t kept them secret. Never mind the far more plausible explanations for why breast implants were harming women.

Time and again this is how we react to the first hints that something is dangerous. We jump to the worst-case possibility on sketchy evidence, and protect ourselves with fear and precaution. Fear is not automatically a bad thing. It helps protect us. And precaution is a great idea and should be the starting place of all government policies.

But too much fear and precaution, too fast, based on not enough information, can do us more harm than good. Thousands of women experienced additional suffering because too little attention was paid to the far more likely reasons for their health problems. Many, afraid of cancer, had their implants removed, running a far more likely risk of serious infection and scarring. How about the excruciating stress hundreds of thousands of women endured? Chronically elevated levels of stress can cause cardiovascular damage, weaken the immune system, contribute to depression, impair fertility, weaken bones. Finally, millions were spent researching the cancer claim, and much less was spent researching the other ways breast implants harm women.

Remember what happened at the end of one version of the Chicken Little story? The frightened animals ran into a cave where they figured they’d be safe from the falling sky. Only to be eaten by Foxy Loxy. Their fear, based on the powerful drive for self-protection but only the sketchiest evidence, did them more harm than good.

Bravo to the advocates who sound the alarm that something out there might be dangerous. But shame on those Chicken Littles who immediately jump from small bits of evidence to “The Sky is Falling.” And shame on all of us, the press and the public, if we blindly subscribe to such alarmism and run to hide in some cave without giving at least a little careful thought to how real the latest peril might be.

Tuesday, February 13, 2007

Boston, Lite Brites, and the risk of being TOO afraid

Remember Lite Brite, that toy for little kids where they plug colored plastic pegs into a grid of holes and make faces and houses and...terrorist bombs? Welcome to the new normal, post 9/11, where risk is in the eye of the beholder and a lot of people are having trouble seeing straight. And a lot of us are suffering as a result.

Boston suffered in all sorts of ways recently as a result of this impaired vision, as roads, subways, and major facilities were shut down because of a possible terrorist threat that turned out to be innocuous. The response was triggered by vigilance, yes. With hindsight, that vigilance was excessive, and costly. But it can teach us something.

Let’s start with the highway maintenance guy who noticed a ‘suspicious device’ hanging under a highway overpass. Like any cautious citizen these days, he assumed that something with lights and wires in a public place might be a bomb. He called it in and the system went immediately from ‘It’s a normal day’ to ‘The Sky is Falling!’ First lesson: how about an intermediate step, like ‘Let’s check this out before we overreact.’

Parts of Boston’s subway system were shut down. Local TV news leapt into alarmist live coverage. Reports of similar devices started coming in from other locations. By early afternoon, major roadways were shut down. The Coast Guard blocked off access to the Charles River. The Department of Homeland Security and the FBI were called in, and the emergency response systems in Boston were in high gear. The devices were removed by heavily armored law enforcement explosives experts. Some were taken to a range and blown up. Some were neutralized with high pressure water cannons.

That’s a pretty frightened response to a bunch of Lite Brite boxes depicting a cartoon character brandishing his middle finger that had been hanging around for the last couple weeks as part of a marketing campaign for a TV cartoon program. Oh, and they’d been hanging around in several other cities too, provoking absolutely no concern. But these days it just takes one Chicken Little and we all turn into Henny Penny and Turkey Lurkey and head for the nearest cave. The result of which, in this case, was a really big mess that badly inconvenienced tens of thousands of people, endangered some, cost the economy a lot of money, and took a healthy bite out of the public safety budgets of the city, state, and federal governments.

Fumed Boston Mayor Tom Menino, “It is outrageous, in a post 9/11 world, that a company would use this type of marketing scheme.” State Attorney General Martha Coakley promised an investigation into “…the roots of how this happened to cause panic in this city.”

If that investigation is thorough it needs to look at how the government handled things. Maybe the official response had just a little to do with that panic, don’t you think? In these jittery post-9/11 days, governments have the responsibility to be careful, absolutely. But don’t they also have to be careful about how quickly they go to red alert? That highway worker, or his bosses, or somebody up the decision-making ladder, should have taken a closer look at these devices before jumping to worst-case assumptions and contributing mightily to the mess Boston suffered.

Our senior officials, from police chiefs and Mayors up to the Secretary of Homeland Security and the President, have to deliver responsible vigilance, or they play right into the terrorists’ hands. Jennifer Mason, a 26 year-old local resident, had it about right when she told one news organization, “It’s almost too easy to be a terrorist these days. You stick a box on a corner and you can shut down a city.”

The marketing people who put this stuff up deserve their share of the blame too. Hanging boxes full of lights and wires in public places these days is pretty dumb. Terrorism is real. We are in a new normal. The people trying to create a little buzz for their product have to be careful that they don’t create a lot more than that. (Of course even as they publicly apologize and agree to compensate the city for its costs, the people behind this are probably chuckling over the great exposure they and their program are getting. Except for the head of the Cartoon Network, which was behind the ad campaign, who ultimately resigned after paying the city $2 million for its expenses.)

But the Boston scare should remind us of what happened in the Chicken Little fable. The animals were so jumpy that they readily followed Chicken Little into that cave for protection. Where Foxy Loxy was waiting, to turn them into lunch. Vigilance is fine. But hyper-vigilance can be dangerous. Just ask all the people who suffered in Boston last week. Shame on us, and our officials, if we let our worries get out of hand, and in a well-intended effort to make things better, we make them worse.