Stories are transformative. They have the power to change our minds and open our hearts, to help us experience the world through other eyes and walk a mile in other shoes. For example, I would argue that stories, as much as any political action, boosted popular support for LGBT equality and led to legalization of gay marriage in 19 states and counting. Living season after season with sympathetic gay characters in Will & Grace and Modern Family, or watching two men fall hopelessly in love in Brokeback Mountain, taught mainstream Americans what decades of argument and invective couldn’t—that gay people are just people.
That’s why I’m excited about a new TV drama that premieres this week as WE tv’s first scripted original series. It’s called The Divide and it stars Marin Ireland as Christine Rosa, a third-year Philadelphia law student and caseworker with the Innocence Initiative, which, like the real-life Innocence Project on which it is modeled, uses DNA testing to exonerate wrongfully convicted prisoners. Christine unearths new evidence in the case of Jared Bankowski (Chris Bauer), a white convict awaiting imminent execution for the brutal murder of the African-American Butler family twelve years earlier. Christine pushes for a stay of execution and a DNA retest of Bankowski, and encounters resistance from her boss Clark Rylance (Paul Schneider), prosecutor Adam Page (Damon Gupton), and surprisingly from the convicted man himself. But her own family history with the criminal justice system inflames her passion, and she’s willing to break the rules to press on. “I don’t like it when the law gets manipulated by people who think they matter more than other people,” she tells Bankowski. “I hate their arrogance. I hate that they feel safe. I hate that they feel entitled to feel safe. I want to make them sweat, even if they win. Don’t you?”
If the stars sound unfamiliar, don’t worry. The acting is excellent, led by Ireland’s understated intensity; she served an internship with the Innocence Project after she was cast. Production values are solid and the show’s pedigree is sterling. The Divide is the brainchild of actor/director/producer Tony Goldwyn (you might know him as President Fitzgerald Grant on ABC’s Scandal) and producer Andrew Sugerman, who collaborated in 2010 on the feature film Conviction. That movie starred Hilary Swank in the true story of Betty Anne Waters, a working-class mom who obtained her GED and put herself through college and law school to free her falsely convicted brother from prison. (Full disclosure: Sugerman is a friend and colleague.)
Barry Scheck, co-founder of the Innocence Project, served as an advisor to Conviction and was portrayed in it. When Scheck spoke at the movie’s Hollywood premiere, he introduced nine of the now 316 wrongfully convicted prisoners (18 of them on death row) who have been freed due to the organization’s work. Goldwyn became an active supporter of the Project and co-chairs its Artists’ Committee. “Every single story is inherently, extraordinarily dramatic,” he told Sugerman, who agreed: “There’s got to be a TV series here.”
He was right—in spades. I have a few quibbles after viewing the first hour of The Divide, such as the inelegant way Adam, the prosecutor, reveals an important secret. But pilot episodes are notoriously difficult—trust me, I recently wrote one—because they have to introduce the characters and explain the essential backstory while simultaneously telling the show’s first story. The Divide is eminently successful in drawing us in and making us care. More importantly, it has great potential to take us inside a world we haven’t seen.
I’ve had my own encounter with the criminal justice system in the last few years, helping an incarcerated relative appeal a conviction based on misinterpreted scientific evidence, so I’ve seen a bit of how it works from the inside. Most of our stories about the justice system portray prosecutors, judges, and cops as the Good Guys. And many of them are—but they’re good in a complex way. Even in the stories where these characters are corrupt, it’s not the same as showing us the system’s inherent bias against defendants.
The Divide doesn’t shy away from portraying the system’s complexity or fragility. In the first hour, Adam gets almost as much screen time as Christine. He made his reputation by convicting Bankowski in the first place, and we see the pressures on him to win, to appease his African-American supporters, and to achieve “justice” for the murdered family’s one surviving member, no matter the cost. The show is perched on this gray area of moral ambiguity.
Most of us keep the criminal justice system at a distance, on the expectation that if we haven’t done anything wrong, we have nothing to fear. But when hundreds—or more likely thousands—of people can be accused, tried, convicted, and imprisoned for crimes they demonstrably did not commit, no one is safe. After all, as the show’s tag line says, everyone is guilty of something.
There’s a rich trove to mine here—of morality, ambition, ethics, politics, and race. I hope that, over its eight-episode season, The Divide will take us deep into this world. If it does, it will be not just a good TV show but a transformative story, changing the way we think and feel about the complex, inexact, and very human matter of crime and punishment. The criminal justice system affects us all, and we need to do more than just avoid jury duty and vote for the candidate who promises us law and order.
The Divide premieres Wednesday, July 17, with a two-hour episode on WE tv. (Check cable and satellite listings.) Until then, the first hour is available on demand as well as on Roku and wetv.com.
Cross-posted in my “Power of Story” series on LinkedIn.
My grandfather loved horses and gambling more than he loved my grandmother, so he spent a lot of time at the racetrack. From time to time a small envelope would appear in our mail addressed to me, bearing the elegant raised blue return address of his butcher shop. Inside, I would find a brief note; a crisp, new hundred-dollar bill; and a clipping from the Racing Form with the winning horse circled. Its name would inevitably be Big John, or John Henry, or Johnny Diablo, or some other variant of my name.
I treasured these little missives, even though my mom always whisked the bill away immediately, to be applied to my college fund or piano lessons or the furnace repair. I liked that I received these winnings more often than my brother or sister—and not just because horses named John were more common than ones named Rick or Marcy. I liked that the money simply arrived, without my having to make my bed or earn an A or give a recital. I liked that my name alone was enough to bring my grandfather luck.
My grandfather has been gone almost forty years. Which might explain why a few years ago, when I first slipped a crisp, new hundred-dollar bill into my sister’s birthday card, she broke into tears.
An Epidemic of Overtreatment
By John Unger Zussman
Last month, I previewed Roger Weisberg’s new documentary about medical overtreatment, Money and Medicine, which premiered on PBS on September 25. You can view the film in full on its PBS website. In this post, I’d like to continue the discussion of the issues raised by the film and by readers’ thoughtful comments on my earlier post, with special emphasis on the politics of overtreatment.
In a particularly poignant scene in Money and Medicine, Dywane Stonum shows his elderly mother, Willie Stonum, black-and-white photos of her younger self on an electronic viewer. In one, she holds her first baby while her husband holds her. In another, she smiles proudly, decked out in an elegant jacket, a scarf tied in a bow, and a filigreed hat. “Here’s your favorite picture,” he tells her.
Willie Stonum doesn’t stir, or blink, or even acknowledge his presence.
Ten months earlier, she suffered a massive stroke. Now she languishes in her bed at UCLA Medical Center, occasionally opening her eyes. She is on a ventilator, uses a feeding tube, relies on dialysis, and needs constant medical support to maintain her blood pressure and fight pneumonia and other infections. Her Medicare bills exceed $5 million.
Dywane Stonum is his mother’s proxy for medical decisions. “I feel like my mom is my baby,” he says. “If there’s something that can sustain her medically, she would want that. Miracles happen if you believe in miracles. You do everything you can to preserve life. That’s what my mother would want.”
After ten months, UCLA’s Ethics Committee has decided that, should Willie Stonum experience another crisis, no heroic efforts should be undertaken to save her life. Dywane Stonum is incensed. “They’re pulling the plug,” he protests. “I call it a medical execution. It is euthanasia.”
“We do not practice euthanasia under any circumstances,” says Dr. Neil Wenger, chair of the Ethics Committee. But “it’s possible to use these advanced tools not to help patients, but to prolong a death, or to produce more suffering or less comfort. And under those circumstances, physicians may well say no.”
Nine years ago, my wife and I stood by my brother-in-law’s ICU bedside with his wife, daughter, and son-in-law. At age 59, after living with kidney disease for years, he had passed out and fallen in his bedroom. Although his wife quickly called 911, he had lost brain function by the time the paramedics arrived. Now we held hands and cried softly as the nurse removed his breathing tube. Minutes later, he died peacefully.
Long before his ultimately fatal illness, my brother-in-law had made this decision easy on his wife. “No way I want to live like a vegetable,” he told her, and us, and anyone who would listen. “Pull the plug. Put me out of my misery.” When he died, we felt relieved, not guilty. We were absolutely certain that’s what he would have wanted.
Leaving aside the question of why Willie Stonum is occupying a bed at UCLA—at a cost of over $16,000 a day—rather than at a nursing home, it is impossible not to wonder if she would want this kind of prolonged death—and if she ever had that conversation with her doctor or her son.
Of course, a proposal to compensate physicians to counsel their Medicare patients about end-of-life care options was originally included in the Affordable Care Act, often known as Obamacare. But when Sarah Palin falsely branded it a “death panel”—a claim that merited PolitiFact’s Lie of the Year award in 2009—it was removed from the bill.
Money and Medicine shows us two patients having just this conversation with their doctors at Intermountain Health Care in Salt Lake City. One, Davis Sargent, is suffering from end-stage congestive heart and kidney failure. “When it’s time, it’s time,” he says. “Of course I don’t want to die, but going out kicking and screaming doesn’t change the going out.” Sargent makes it clear that, instead of rescue care in an ICU, he wants hospice care at home. “I’m only 6 feet from a nice place to sit in the sun in the front yard, and I love that more than anything else.”
After his discharge, Sargent received comfort care at home for ten days before he died—as he wished—surrounded by his loved ones.
The film argues that reducing overtreatment at the end of life is not simply a question of reducing costs—it’s also what patients want when given the choice. “To deny people an opportunity to talk about death, to discuss how they want to die, to be given choices about dying, I think is a really cruel thing,” says Shannon Brownlee, acting director of the Health Policy Program at the New America Foundation, and the author of Overtreated: Why Too Much Medicine Is Making Us Sicker and Poorer. “And we have to start being able to talk about it. And not just because we’re spending a huge amount of money on it, but because a medicalized death is not what most people want.”
That sentiment is echoed by Sir Thomas Hughes-Hallett, former CEO of Marie Curie Cancer Care in the UK. “It’s not about hastening death,” Hughes-Hallett said in a recent New York Times Op-Ed. “It’s about recognizing that someone is dying, and giving them choices. Do you want an oxygen mask over your face? Or would you like to kiss your wife?”
Money and Medicine shows us overtreatment in a variety of settings and contexts. Filmmaker Weisberg argues that overtesting, overtreatment, and waste are inherent in the way we provide health care in the U.S. Our whole system is geared toward doing something rather than nothing, even when it doesn’t help or causes harm.
But there are alternatives, even within the current health-care system. Money and Medicine profiles Intermountain Medical Center, where Sargent received end-of-life counseling, as a case study in designing the best available science into care. IMC takes time to educate patients in the real risks and probabilities of both disease and treatment. And it regularly reviews practices and metrics in light of medical evidence.
For example, “elective induction of birth” by cesarean section has proliferated in much of the U.S., not for any medical reason, but for the convenience of patient and/or doctor. When IMC realized that more than a quarter of the patients obstetricians referred for elective C-sections were poor candidates for the procedure, they instituted team reviews of the cases. “Maybe I need my counselor who advises the surgery to not be the surgeon,” reasons Brent James, IMC’s chief quality officer. This resulted in a dramatic drop in C-sections, fewer babies in need of neonatal intensive care, and a savings of $50 million. Ironically, IMC suffered financially by receiving lower reimbursements because it performed fewer expensive procedures.
And this upside-down system of reimbursements—fee-for-service medicine—is at the core of the problem. Money and Medicine demonstrates that overtesting and overtreatment are not isolated or accidental, but integral parts of the American medical system. Every decision, every incentive—for patient, doctor, hospital, pharmaceutical and device manufacturer, insurer, and politician—is weighted toward doing more rather than less, even if it causes harm.
Of course, it’s wasteful—it consumes, by one estimate, 30% of U.S. health care spending, or $800 billion a year. But, as IMC’s James puts it, “one person’s waste is nearly always another person’s income.” In fee-for-service medicine, no one gets paid unless the test is ordered, the medication is prescribed, or the procedure is performed.
It would be one thing if overspending and overtreatment resulted in positive patient outcomes. But the U.S. achieves, at best, middling results when compared to other Western countries, while outspending them significantly. The system’s defenders like to point to foreign “medical tourists” who come to the U.S. for treatment, and the Deloitte Center for Health Solutions estimated there were more than 400,000 of them in 2008. But that number is dwarfed by the 1.5 million Americans who sought health care abroad in the same year.
Stanford geriatrician Dr. Walter Bortz blames overtesting and overtreatment on the collision of biology and capitalism in fee-for-service medicine:
I’m a capitalist, I believe in capitalism; it’s the best social contract we have to make the gears of society work. But it’s selling the wrong product. Our capitalism sells disease. We want you to bleed. Or we want you to have a spot that we don’t know, and that will generate X-ray after X-ray after X-ray … Stanford, where I love my life, is a repair shop. You go there to get fixed. Why? Because we can send you a bill for that.
The emphasis on profit means that we haven’t even done enough research to know which of our current treatments are actually working. Pharmaceutical and device manufacturers, who underwrite clinical trials, have no incentive to finance research on drugs and devices that are already making them money. “Medical research is dominated by research on the new: new tests, new treatments, new disorders, new fads, and new markets,” says Dartmouth professor Dr. H. Gilbert Welch. “We have to start directing more money toward evaluating standard practices—all the tests and treatments that doctors are already providing.”
I wish Money and Medicine had the time to show in detail the harm that can be caused by excessive screening tests and treatments, since many patients (and even doctors) discount it. What’s the downside, they reason, in getting an annual PSA or mammogram, or in receiving chemotherapy that reduces your chance of recurrence by a couple more percent? But as James says, “treatments that are powerful enough to heal can also harm.” This was illustrated last June, when Good Morning America co-anchor Robin Roberts, who had gone public with her successful battle with breast cancer in 2007, announced that she had been diagnosed with myelodysplastic syndrome (MDS). MDS is a bone marrow disease often caused by chemotherapy and radiation received in an earlier cancer treatment. “We always think of the drug as a double-edged sword,” says Otis Brawley, chief medical officer of the American Cancer Society. “It’s one of the reasons why I’m outspoken about only using chemotherapy when we absolutely need chemotherapy.” Roberts received a bone marrow transplant earlier this month.
With health care a central issue in the current presidential election, I asked Weisberg how the Obama and Romney campaigns would address the overtreatment and waste issues raised in his film. He began with Obamacare:
The Affordable Care Act that the Supreme Court recently upheld extends health care coverage to over 30 million uninsured Americans but actually does very little to make health care more affordable. The main thrust of the legislation was to expand access, not to contain costs. However, there are a number of provisions that fund demonstration projects that attempt to alter the reimbursement system in order to reward value instead of volume—to reward the quality instead of the quantity of medical services. One of the best-known initiatives involves the creation of Accountable Care Organizations or ACOs.
If the Affordable Care Act doesn’t actually do enough to make health care affordable, it’s tempting to blame Republicans and their lies about death panels. But let’s remember that the Obama administration gave away much of the store before the debate actually began, in its attempt to win support from industry and Congress. For example, in exchange for support and cost concessions from the Pharmaceutical Researchers and Manufacturers Association (PhRMA), the White House agreed not to use government leverage to bargain for lower drug prices or to import drugs from Canada.
But at least the ACOs encouraged by Obamacare take a shot at overturning the overtreatment incentives of fee-for-service medicine. Romney’s plan—to the extent he has revealed it—aims to reduce government expenditures for health care, but not the costs or structure of health care itself. Weisberg’s take:
The Romney plan, like many of his policies, is not terribly fleshed out. His mantra is “repeal and replace.” What we do know is that he would make Medicaid a block grant program, leaving states to struggle with declining budgets and decide who gets what kind of care. He would also turn Medicare into a voucher program, with the result that over time the voucher would cover a smaller and smaller portion of the medical bills of seniors.
As I write this, shortly after the final presidential debate, the campaigns have not seriously discussed medical overtreatment or cost control. In fact, so far they have not progressed beyond a fight between siblings. “You want to destroy Medicare!” “No, you do!”
Faced with governmental inaction, hospitals and professional medical organizations have begun to take responsibility for reducing overtreatment and waste in their own domains. Memorial Sloan-Kettering Cancer Center recently decided to drop an expensive new colorectal cancer drug (Zaltrap) from its formulary, despite the fact that Medicare would reimburse them for it. The reason: it works no better than a similar drug (Avastin), but costs more than twice as much.
Another ray of hope comes from the Choosing Wisely initiative, sponsored by the American Board of Internal Medicine Foundation. The initiative has recruited professional associations for major medical specialties, like the American College of Cardiology and the American Society of Clinical Oncology. Each association has identified “Five Things Physicians and Patients Should Question”—common tests or treatments that are expensive, overused, and unsupported by medical evidence. Working with these professional societies, Consumer Reports has compiled clear and objective guidelines for patients on such topics as heartburn, Pap tests, and lower back pain. As the film suggests, when patients are informed about the choices available to them and their risks and benefits, they are less likely to choose overtreatment. (My thanks to Cameron Ward for alerting me to this site in a comment to my earlier post.)
Some commentators maintain that, to truly control health care spending in the U.S., we need to ration health care. “In the famous ‘third rails’ of American politics,” argues Steven Rattner, former counselor to President Obama’s treasury secretary, invoking the spectre of death panels, “none stands taller than overtly acknowledging that elderly Americans are not entitled to every conceivable medical procedure or pharmaceutical.”
In the long run, that’s probably true. But we’re not there yet—not even close. There’s plenty of low-hanging fruit to pick first. “It’s not rationing to get rid of stuff that’s bad for you,” says Brownlee, author of Overtreated. “It’s not rationing to get rid of care that won’t benefit you.” In the film, she cites a recent study of late-stage cancer patients that compared palliative care—making the patient comfortable without actively trying to combat the disease—to standard, aggressive treatment. The patients who received only palliative care actually lived longer than those who received the standard treatment.
“This isn’t withholding necessary care,” echoes IMC’s James. “It’s withholding unnecessary injuries.” Eliminating overtesting and overtreatment have little downside and great upside. But to do that, we need to rely on science to sort out which tests and treatments are medically warranted. And we need to eliminate the incentives of fee-for-service medicine and embrace an ACO model in which healthy outcomes, not tests and treatments, are rewarded.
It’s time to have that “adult conversation” our leaders keep promising—whether they choose to participate or not.
I never knew Steve Jobs, but I almost worked for him twice. I interviewed for a technical writer position at Apple in 1980, and for the director of documentation role at NeXT in 1993.
In truth, I was both sorry and thankful that I didn’t end up with those jobs. Jobs was a true visionary, and he inspired great loyalty, but he also had a fearsome reputation as a manager. As two early Apple employees told me years ago, “I don’t know anyone who worked for Jobs who didn’t have a strong love/hate relationship with him.”
Still, I had many friends at Apple, and got to observe the company closely as a freelance computer journalist, notably as a contributing editor and regular columnist for InfoWorld in 1981–83 and for A+ Magazine in 1984–85. A+ was an independent Ziff-Davis publication devoted to covering Apple news, products, and culture. My bimonthly column was called “Electronic Brainstorming.”
Although there have been many tributes and eulogies published in the week since Steve Jobs’s death, most have focused on his transformation of Apple after returning to the company in 1997. I thought it might be illuminating to revisit the Apple of the mid-eighties, when Jobs was a brash, young C.E.O. whose early meteoric success was threatened by the market entry of the industry’s most fearsome competitor, IBM, and the concurrent rise of Microsoft. I republish this essay from 1984—a few months after the first Macintosh shipped—as a tribute to a man who, in the end, turned out to be both brilliant and smart.
Readers who weren’t around at the time will find references to inexplicable concepts like computer stores, typewriters, secretaries, MS-DOS (this was even before Windows), CP/M, 1-2-3, floppy disks, Apple computers before Macintosh (Apple II, III, and Lisa), and a raft of prominent computer and software companies that ceased to exist long ago. These were the days when 256K—K, not M—was a lot of memory. If you’re curious … ask your parents.
The Apple / IBM Difference
By John Unger Zussman
from A+ Magazine, May 1984
A war’s going on between Apple and IBM, and it’s not just a war of competing products. It’s a war of competing approaches, styles, and philosophies. The outcome may well determine the type of computers we’ll be using ten years from now—and how we’ll buy them and how much they’ll cost.
The differences between the two companies are apparent as soon as you enter their respective corporate offices. At Apple’s headquarters in Cupertino, California, there’s an Apple—sometimes two—on every desk. In addition, all employees get an Apple to take home on indefinite loan. Once employees have been with the company for a year, the Apple is theirs to keep.
Conditions are different at IBM’s Entry Systems division in Boca Raton, Florida, where the Personal Computer is made. As InfoWorld columnist Doug Clapp recently observed, Apple managers have computers; IBM managers have secretaries. The secretaries, in turn, have Selectric typewriters, not PCs. “I don’t believe that small computers are as pervasive, or as effective, at IBM as they are at Apple,” commented Clapp.
It’s a long-standing tradition in the microcomputer industry for the cobbler’s kids to go barefoot. Since start-up companies typically have little money, any equipment generally goes to the technical staff. Managers and clerks make do with manual systems—an arrangement that often continues, through sheer inertia, well past the early financial crises.
Still, IBM is hardly a start-up. It’s difficult to believe the company couldn’t put a PC on every employee’s desk if it wanted to. The fact is, even at Entry Systems, IBMers seem more oriented toward selling personal computers than toward using them. They work on micros; Apple people work with them. To IBM, personal computing is a business. To Apple, it’s a passion; the business aspect is almost a sideline.
How Did We Get Into This Mess?
Both companies entered the personal-computer business reluctantly, although they came from vastly different directions. Apple founders Steve Jobs and Steve Wozniak built their first computer, in that legendary garage, for their own amusement. They were surprised when their friends and fellow tinkerers also wanted computers; but, finally convinced, they set out to build a computer that others could use. The result, the venerable Apple II, still sells strongly going into its eighth year, in an industry where a five-year product life span is enviable.
IBM initially spurned the micro market, clinging to its mainframe and minicomputer product lines until it could no longer deny the reality of the personal-computer revolution. Ironically, the success of products like the Apple II compelled IBM to take notice. Once persuaded, IBM devoted its efficient, methodical, calculated approach—and its abundant resources—to developing a personal computer.
The result, the IBM PC, is a masterpiece of market and industry positioning, more than of technological sophistication. In fact, its engineering is notably conservative. The PC used mature, tried-and-true technology that other companies had already surpassed. But the PC matches precisely the needs of its intended market—corporate managers and operators of small businesses.
Moreover, for the first time in IBM’s history, the engineering was open. IBM included five expansion slots in the PC and published detailed technical specifications. Independent companies had no trouble developing PC-compatible hardware and software.
It’s significant that IBM’s technology has consistently lagged behind that of these third-party vendors. For example, the independents offered double-sided disk drives, hard disks, and color monitors for the PC long before they were available from IBM. Similarly, within three weeks of IBM’s announcement of the PCjr, two companies announced enhanced PCjr keyboards.
What IBM has built is less a computer than a bandwagon. It has made up for its late entry by actively fostering a market movement. In other words, IBM engineers have developed products that others want to use.
Apple engineers, in contrast, have developed products they want to use. They’ve relied on being brilliant—which is both Apple’s great strength and its great weakness. Apple’s major products to date—the Apple II, the Lisa, and the Macintosh—are bold, innovative, and technically advanced, almost experimental.
Apple’s engineers, trusting their own instincts, have occasionally guessed wrong. For example, Apple made several efforts to develop its own disk drives for the Lisa and Macintosh. It finally abandoned the project and customized a Sony drive instead.
Most often, Apple’s instincts have been right on target, though. That’s why the Apple II has lasted so long and why the Lisa and the Macintosh have received so much acclaim. It’s instructive that the Apple III—the only product Apple designed for others—is its least successful product.
Apple, designing for itself, tries to find its own engineering solutions to all foreseeable problems. For example, it has actively developed its own proprietary operating systems, starting with Apple DOS for the Apple II. Even when Apple supported industry-standard system options, such as UNIX on the Lisa, it published a long list of suggested programming rules for interaction between programs and users.
Brilliant vs. Smart
While Apple was busy being brilliant, IBM was busy being smart. IBM initially offered three standard operating systems (MS-DOS, CP/M, and the p-System) and recently announced a fourth (UNIX). IBM lets the market make the choices and the improvements.
Here is another example. Apple has tried to maintain absolute control over its computers. It has curtailed mail-order distribution channels, patented its technology, and actively prosecuted manufacturers of Apple-compatible machines. So far, IBM has taken no action against the PC-compatibles. In fact, by registering no patents and publishing its specifications, IBM has encouraged imitation.
These actions have made Apple somewhat of an innovative loner, an image the company is promoting in its recent advertising campaigns. In one TV commercial, an Apple user is working alone in a cavernous room ahead of (but also isolated from) a crowded roomful of other computer users. In another ad, a manager has clearly spent all night at the office, working alone with his Lisa computer. He calls home with an exhausted smile to report he’ll be back for breakfast.
Apple obviously wants to appeal to people who fit its corporate image—young (baby-boom generation), innovative, and independent. Apple users, the ads suggest, are loners too—they demand brilliance and aren’t content to use technology that isn’t thoroughly up to date.
But brilliance isn’t always smart. Consultants Barbara and John McMullen recently observed computer stores that carried both the Lisa and the PC. “Invariably there is a much larger crowd around Lisas than around PCs,” they reported, “yet the stores always sell more PCs than Lisas.”
Apple has recently shown encouraging signs that it is conscious of industry standards. It has dropped the price of the Lisa dramatically, increased its emphasis on UNIX on the Lisa, and announced support for Rana Systems’ innovative new 8086/2, which gives IBM PC compatibility to the Apple. These decisions may not be brilliant, but they are smart.
Take the Rana expansion option, for example. It contains an Intel 8086 co-processor, 256K bytes of memory, and two floppy-disk drives. Together, these allow an Apple II or IIe to run MS-DOS, the most popular PC operating system. Shortly after Apple blessed the 8086/2, Lotus Development Corporation announced a version of its integrated application program, 1-2-3—by far the best-selling PC software package—to run on Apples equipped with the Rana option. Score one for Apple.
The Macintosh, on the other hand, is unabashedly brilliant, even revolutionary. It’s also built from the ground up and ignores virtually every established standard in the business: no color, no cursor keys, no expansion slots, small disk drive, yet another proprietary operating system. “Who cares?” asks Apple. “These are the standards that have alienated millions of potential computer users.” A good point.
Apple means to set a whole new standard, to steal the standard-setting business away from the IBM PC (which stole it, in turn, from the Apple II). Is this brilliant? Of course. Is it smart? I don’t know. It’s risky. If it works, it’s smart.
By last year, every software company I know of had boarded the IBM bandwagon. All my programmer friends were working feverishly at their PCs, leaving their dusty Apples in the corner. Even software developers such as Mitch Kapor (Lotus), Dan Bricklin (Software Arts), and Fred Gibbons (Software Publishing), who’d made their first million on the Apple, seemed intent on making their second on the IBM PC.
Now they’re writing for the Macintosh. But they’ve still got one hand on the PC.
In the mainframe world, a long-standing joke had it that the market consisted of Snow White (IBM) and the seven dwarfs (Burroughs, Honeywell, NCR, Univac, RCA, General Electric, and Control Data). (This was an old joke—way before Apple, even before DEC.) If Apple abandons its brilliance and becomes merely smart, it will surely be destined for dwarfdom.
Conversely, should Apple continue to be brilliant but not smart, it might not survive at all. That would be a shame. We need an Apple that’s both brilliant and smart to keep IBM from dominating the industry and slowing the pace.
For Chelsea and Maz, whose wedding last weekend prompted me to dust this off.
And for Patti.
Pas de Deux on the High Wire
by John Unger Zussman
One Saturday, in our twenties,
We put up the tightrope.
Eyed it warily. No sweat,
We reassured each other.
Piece of cake.
At first, we could barely manage
A few quick steps on the rope,
Stretched taut, inches off the floor.
We’d push out tentatively,
Teeter, recover, flail, step off.
Laugh nervously, try again.
No instructor, no mentor,
Just the trying, and ourselves.
Slowly we learned to center our balance,
Arms extended, touching lightly for support.
The posts grew higher, year by year,
And our moves more intricate.
We took tumbles, gathered bruises.
When one wavers, the rope jiggles,
Endangering the other.
Once I toppled from eight feet.
She came too. It took years
To recover. The scars
Remind us how we learned.
Now we run, hand in hand, from peak to peak,
Skyscraper to skyscraper, triple pirouette, Grand jeté, entrechat huit. I partner her
In a deep penché, the tip of her pointe shoes
Balanced on the narrow wire. It is exhilarating
And marvelous. We are confident until,
In middle age, we look down. Then we seem
Unbelievably, foolishly precarious.
Just us two. No children, families distant,
A few friends gazing up from below.
We are working without a net. It is glorious,
Yet there is always the premonition
Of the inevitable gust of wind.
In the eight years since the human genome was sequenced, the search has been on for genes that underlie various diseases and disorders. We seem to be obsessed with genetic explanations for human physiology and behavior.
And when we find them, we often assume that biology is inescapable destiny. For example, some women with no evidence of cancer, but with one of the breast cancer mutations (BRCA1 or BRCA2), choose to have preventive mastectomies rather than accept the elevated risk of breast cancer they have inherited. In the race between nature and nurture, nature seems to be winning—at least in our minds.
I want to argue here that nature vs. nurture is the wrong way to think about this question. Not everyone with a BRCA mutation develops breast cancer. Something must be intervening between genetics and outcome. The problem with BRCA is we don’t know what.
There are other conditions, however, where we do know the intervening factors. So, to illustrate this new perspective, let’s examine a serious disease called phenylketonuria.
Never heard of it? Perhaps you’ve noticed the fine print on a can of diet soda that says Phenylketonurics: Contains phenylalanine. If you’ve wondered what those words mean and whether you should avoid diet soda—you should, but not because of the warning—read on.
Phenylketonuria (FEE-nil-KEE-tun-YUR-ia, but you can call it PKU) is a rare gene-linked condition that affects about 1 in 10,000 to 15,000 people in the U.S. Phenylketonurics have trouble producing an enzyme that metabolizes the essential amino acid phenylalanine (one of the building blocks of proteins). Phenylalanine is found in most proteins in the human diet (including breast milk) and in the artificial sweetener NutraSweet (aspartame), which is used in many “sugar-free” food products and over-the-counter medicines. If phenylalanine builds up in the blood, it can outcompete other amino acids in transport across the blood-brain barrier, starving the brain of amino acids that are necessary for development. Serious cognitive impairment can result, including mental retardation, ADHD, brain damage, and seizures.
In developed countries, most newborns are screened for PKU soon after birth by testing blood for phenylketones. If PKU is diagnosed, parents are advised to start the child on a special diet low in phenylalanine. Some phenylalanine is necessary, but it must be strictly limited. This means severe restrictions on meat, chicken, fish, eggs, nuts, cheese, legumes, milk, and other dairy products, and no aspartame. Infants’ diets are often supplemented with special formula; as the child grows, pills or special protein foods can substitute. By following this diet, phenylketonurics can avoid phenylalanine buildup and its serious effects, and lead fully normal lives.
When I first lectured about PKU years ago, it was believed that phenylketonurics could go off the special diet by age 5-6 and no further damage would result. However, later research confirmed the benefits of extending the diet to age 18, and now, “for life.”
Since PKU is caused by an anomaly in a single gene that interferes with enzyme production, you might think of it as a textbook example of a completely genetic disorder. (Technically, it’s an autosomal recessive disorder, in which the child inherits one copy of the mutation from each parent.) And from the point of view of a child fed a “normal” diet, you’d be right. Whether she suffers the cognitive damage of PKU depends completely on whether she has two copies of the mutation. It’s 100% nature and 0% nurture, right?
Well, no. Because diet mediates the effect of the gene. So consider the situation from the point of view of a child with two copies of the mutation. Whether she suffers the cognitive damage of PKU depends completely on her diet. It’s 0% nature and 100% nurture!
Something is wrong here, and I would argue that it’s our way of considering nature and nurture in “either/or” opposition. Nature and nurture always work together in “both/and” collaboration. It’s the interaction of nature and nurture that determines our outcomes. In other words, it’s 100% nature and 100% nurture.
When you think about it, that’s true almost by definition. Genes are not directly apparent in our physiology and behavior; they are always expressed through the physical environment of our bodies, brains, and their surroundings. In a recent blog post, California geriatrician Walter Bortz talks about the Pima Indians of Arizona, who have one of the highest rates of type 2 diabetes in the world.
When these folks pursue their very physically active lives south of the border, in Mexico, they have no diabetes. But as soon as they cross north of the Rio Grande, they find McDonald’s and a far more leisurely life. They soon develop diabetes. Their genes certainly have not changed during their trip north, but their lifestyle has.
If the Pima have an underlying genetic predisposition for diabetes—and no such genes have yet been found—it is masked until they enter the world of fast food and couch surfing.
(Bortz is an octogenarian marathon runner who just published his seventh book, Next Medicine, a dire diagnosis and optimistic prescription for the American health care system that is well worth reading.)
When you start to look for the nature/nurture interaction, you see it everywhere. We generally think of physical height as highly heritable. But the genes for tallness can be undermined by poor diet. Bortz cites research by John Komlos of the University of Munich, who has studied population height over time. Two hundred years ago, the average Dutch man was 3 inches shorter than the average American. Now, he is 3 inches taller, at 6’1”. Genes don’t change this quickly—but diet and nutrition do. “America has gone from being the tallest nation in the world,” observes Komlos, “to the fattest.”
In the last few decades, we have learned a great deal about the factors that influence genetic expression. Everything goes into the mix, from the prenatal environment of your mother’s womb, to the viruses you happen to be exposed to, to the billions of microbes that live in your gut, to the air you breathe, the water you drink, and the food you eat. (Of course, these factors can also cause new mutations in our genes, ones not part of our genetic inheritance, that can lead to cancer and other disorders.)
In addition, the whole science of epigenetics has emerged—the study of what turns genes on and off—telling us that factors like diet, behavior, and environmental toxins are key. I’ll leave epigenetics to a future post, but significantly, these epigenetic switch settings can be passed down to future generations. Not only does environment influence the way genes are expressed, it also directly influences those genes themselves.
So the next time you hear a claim that some characteristic or quality is 80% (or 50% or 20%) genetic, think twice. At best, it’s an oversimplification; at worst, it’s wrong. Many factors intervene between gene and outcome. Nature and nurture are invariably, inextricably intertwined. If we understand that, we can make wiser decisions about our health and our lives.
I’ve been asked to explain a bit about the personal essays I often contribute to this blog. I wrote most of them for the popular “Readers Write” feature in my favorite literary magazine, The Sun. Each month they propose a topic like “Rites of Passage” and invite readers to contribute their own stories. Of the submissions they receive—sometimes as many as a thousand—they publish the most interesting. An abridged version of this essay appears in the current print edition of The Sun (June 2011).
Rites of Passage
By John Unger Zussman
In seventh grade, my peer group began to play kissing games at parties. Spin the Bottle, Seven Minutes in Heaven—tame stuff, in retrospect, but to me it seemed intimidating and immoral and I wanted no part of it. Entering adolescence shortly after my father died, I had no adult male hand to guide me. (I did have an older friend who breathlessly explained that babies resulted when the boy peed into a little hole in the girl. I knew that couldn’t be right.)
It’s not that I wasn’t interested in girls; I was desperately interested, and spent many nights agonizing over how to get them to like me. No, it was sex I wasn’t interested in, even when I got the story straight. I had absorbed a strict moral code from my mother and was convinced that sex before marriage was wrong. I was after girls’ admiration and love, and I believed I would win that by respecting them.
I didn’t leave the parties when the games began; I would simply not partake. For a while, my best friend felt the same way, and we would watch awkwardly from the edge of the circle. But soon, he succumbed too, and I was left to uphold my moral code alone.
(Years later, I asked my mother what she thought of the way I abstained from those games. “I thought you were dumb,” she told me bluntly. Thanks a lot, Mom. Now you tell me. All I needed was someone to explain that girls were sexual beings too, and that they were just as curious about exploring those feelings as I was, if not quite so driven or tormented.)
By the time I started dating in tenth grade, I had decided that kissing, at least, was permissible. My dates and I spent hours necking, in my car or in their living room, at summer camp or youth group retreats. One girl, bored with kissing, urged me to go further. Her previous boyfriend had a serious disease, she explained, that had pushed them into early intimacy. Despite her clear invitation, I was immobilized by impending guilt.
And so the task was left to Wendy, my girlfriend at the beginning of senior year. Exasperated after yet another marathon make-out session, she took my hand and placed it gently on her breast. That act of mercy opened the floodgates, and for that, Wendy, my wife and I are forever grateful.
I’ve written about cancer previously in these pages. In Against Medical Advice, I recounted what I learned when someone I loved (I called her Bonnie) was diagnosed with breast cancer. In Poisoned, I traced Bonnie’s and my efforts, once her treatment was over, to identify the root causes of our cancer epidemic and comprehend why forty years of the “War on Cancer” have failed to dramatically reduce cancer rates. Finally, in Not Your Median Patient, I paid tribute to two of my scientific idols, evolutionary biologist Stephen Jay Gould and climate scientist Stephen Schneider, who applied their own scientific expertise and methods to understanding and fighting their own cancers.
One organization I commended in “Poisoned” was Breast Cancer Action for their efforts to eliminate the root causes of cancer in our environment. BCA’s seminal Think Before You Pink™ campaign urges consumers to resist buying pink-ribbon products from companies that actually worsen the cancer epidemic. BCA has recently stepped up their outreach, including a new blog and an informative monthly webinar series. For example, I learned in this month’s webinar that National Breast Cancer Awareness Month (NBCAM) was co-founded by the American Cancer Society and the pharmaceutical division of Imperial Chemical Industries. ICI is now part of pharmaceutical giant AstraZeneca, which manufactures not only several breast cancer drugs but also the herbicide Acetochlor, a known carcinogen, thus profiting from both causing and alleviating cancer.
So I’m pleased today to reprint an essay by BCA’s communications manager, Angela Wall, about the need to go beyond breast cancer “awareness” (as if we’re not already aware of breast cancer) to identify and eliminate the toxins that cause it. If the beginning of wisdom is to call things by their right names, as the Chinese proverb says, then calling NBCAM “Breast Cancer Industry Month” is wise indeed. My thanks to Ms. Wall and BCA for permission to reprint her essay here.
Good News? Not So Fast …
By Angela Wall
Good news on breast cancer, says Sadie Stein writing for Jezebel. Why? Well, because of pink ribbon awareness campaigns more women are getting screened and diagnosed earlier. Hold on. Does this ring false to anyone else?
Awareness only got them to make a screening appointment to detect the cancer that was already developing.
Ordinarily, I’d celebrate an article that tacitly suggests we’ve had enough pink awareness. I’d certainly celebrate the end of the pink noise and hypocrisy that accompany breast cancer industry month, because then, instead of having our attention distracted by pink awareness campaigns, we could all start addressing the real issues that increase our risk of developing breast cancer, and we might actually be able to focus on reducing diagnoses rather than celebrating them.
I doubt that’s going to happen, though. There’s too much money to be made every October from slapping a pink ribbon on a product. Plus, the feel-good rewards that accompany pink ribbons can really boost a company’s image, regardless of whether or not the product being sold actually contributes to breast cancer. Heaven forbid we make consumers aware that the products they are purchasing actually contain ingredients that might cause cancer. Awareness apparently doesn’t need to go that far. It’s no surprise, then, that awareness never prevented anyone from developing breast cancer.
Awareness campaigns have never addressed why more white women get diagnosed with breast cancer but more women of color die from it. Awareness and pink ribbon campaigns have only ever distracted us. They don’t demand tighter state and federal regulations on the manufacture of cancer-causing chemicals, or on their inclusion as “ingredients” in the products we use to clean our homes.
I’ve never seen anything to celebrate about breast cancer and I certainly get deeply troubled by the idea that we might have done enough simply because people are being screened more regularly even though more cancer is being detected. Surely, screening rates are only to be celebrated when fewer people receive a cancer diagnosis.
I would agree that awareness has served its purpose. Now it’s time to demand that chemical corporations stop manufacturing products known to cause cancer. I would celebrate if Eli Lilly announced that they were stopping production of their cancer-linked recombinant bovine growth hormone (rBGH), which contaminates a third of US dairy products. I would celebrate if the FDA declared that rather than meeting with Roche Pharmaceuticals to reconsider approving Avastin as a treatment for metastatic breast cancer (despite evidence demonstrating that it doesn’t work), they refined their approval guidelines and insisted that treatments cost less, do more than existing options, and improve the quality of life of the women with breast cancer who take them. So, if nobody minds, I think I’ll hold off on celebrating until studies show that real systemic changes are reducing breast cancer diagnoses over the long term.
Detroit was a great town for labor when I grew up there in the ’50s and ’60s. While I never worked on the assembly lines—my uncle’s steel warehouse had first dibs on my summer labor—I had friends who did. The work was stultifying, but the pay was good, and they socked away money for college.
The high pay was because of Henry Ford, and it was not an accident. Ford had no love of labor unions (or Jews, for that matter, but that’s another story)—quite the opposite. Yet in 1914, he stunned the business world by offering a wage of $5 a day—more than double the prevailing rate.
Ford’s bold move did more than attract skilled mechanics and workers to Detroit and reduce Ford Motor Company’s heavy employee turnover. It spearheaded the creation of a vast middle class, including blue- as well as white-collar workers. The canny Ford realized that, if he was going to sell his mass-produced cars, his own workers (and those of other companies) had to make enough money to afford them. “I believe in the first place,” he wrote in his memoir, “that, all other considerations aside, our own sales depend in a measure upon the wages we pay.” For seventy years, American industry and the American middle class grew in tandem.
That link has now been strained to the breaking point. For the last thirty years, the American middle class has been under assault. It started with the right’s beloved Saint Reagan, whose orgy of tax cuts and deregulation spawned not only the savings and loan crisis but also a massive transfer of wealth from workers to the rich. Real middle-class incomes have stagnated since Reagan. “Morning in America”—to the extent it was ever more than a campaign slogan—was confined to the wealthy.
Corporations—no longer American but global—can afford to underpay or lay off American workers because they are no longer dependent on them either as labor or as consumers. Automation and robotics have reduced the proportion of labor costs in manufacturing. Advances in telecommunications and transportation allow corporations to relocate operations to locations where wages are lowest. And a growing global middle class provides a market of eager customers with money to spend.
All this represents capitalism in the 21st century, as companies find new ways to compete and find their market. But satisfying the market becomes almost irrelevant when big corporations can lobby the government to keep themselves solvent. Automakers make SUVs that become impossible to sell when gas prices rise, so they run to the government for bailouts. Oil companies reap generous tax breaks even when they make record profits. Food conglomerates collect extravagant farm subsidies even when their factory farming practices endanger the environment and produce food that makes us obese and sick. This is corporate socialism, showering benefits on people who decry socialism.
Nowhere has the link between wealthy corporations and the struggling middle class been severed more completely than in the financial industry. In the wake of the financial meltdown, the big banks have eliminated the need to have markets at all. Matt Taibbi’s recent exposé of the financial bailouts in Rolling Stone describes the feeding frenzy of free money and guaranteed profits that followed the financial meltdown of 2008.
In one of Taibbi’s classic examples of corporate welfare to the financial industry—though by no means the most egregious—the Fed offered loans to banks at near-zero interest rates, intending to buoy up their balance sheets and encourage lending to businesses and consumers. But because the Bush (and later Obama) administrations attached few strings to those loans, the banks simply took the money and bought Treasury bills, realizing a 2% risk-free profit for essentially lending the government back its own—that is to say, our own—money!
The strained link between corporate and middle-class prosperity is why we have another jobless recovery, why corporate profits and GDP and the stock market have been on a tear for two years while the middle class struggles. It is class warfare and we ought to call it that. Unfortunately, right-wing politicians and media—who somehow still dominate political discourse in this country despite the debacle of the Bush administration—have co-opted the term, crying “class warfare” whenever someone proposes that tax rates be returned to those of the “oppressive” ’90s. They have somehow convinced the American middle class that their interests, as Bill Maher likes to point out, are the same as Steve Forbes’.
We need to take the term back. Class warfare is what we need to wage, and it ought to be our battle cry as we tell the American people who’s been waging war on whom.
No, Henry Ford was not a socialist. But he felt a responsibility to American workers. If he were alive today, he just might be on our side.
It’s telling that some of the best reporting on the financial meltdown has come from Taibbi’s groundbreaking reports in Rolling Stone, Planet Money and Pulitzer Prize-winning ProPublica’s features on public radio’s This American Life, and Charles Ferguson’s Oscar-winning documentary, Inside Job. Why have their startling revelations found so little resonance in the supposedly liberal mainstream media? Because mainstream media are big corporations too.
Astute reader Robert Mayer, professor of consumer studies at the University of Utah, observes that former secretary of labor Robert Reich makes similar points in his recent book, Aftershock. Reich has spoken out strongly to decry the right’s attack on the middle class. “After 1980,” he writes, “the pendulum swung backward.” Indeed.
A new study by the Congressional Research Service now confirms the extent to which the big banks borrowed at low interest rates from the Fed and then, instead of lending to consumers or businesses, simply turned around and bought Treasury bills. The report was requested by Senator Bernie Sanders (I-VT), who has taken a lead role in exposing such practices.
Last month, I posted an inside view of the American corrections system by Mark Unger. Today, I examine another aspect of our criminal justice system—the death penalty—with a preview of the documentary, No Tomorrow. The film premieres on PBS this Friday, March 25.
Think of the issues you’re most passionate about. If you’re reading this blog, they might include universal health care, our social safety net, climate change, civil rights, feminism, reproductive rights, gay marriage, war, nuclear proliferation, or capital punishment.
Now imagine that someone uses two years of your most intensive, committed work to argue, eloquently and effectively, against that issue.
That’s what happened to filmmakers Roger Weisberg and Vanessa Roth, veteran documentarians whose films air regularly on PBS. Their work has won numerous awards, including two Oscar nominations for Weisberg and one Oscar win for Roth. (Full disclosure: Weisberg is a long-time family friend.)