Thursday, February 28, 2013

Thinking About Chocolate Rather Than Eating It


As a Swiss citizen, I consider eating chocolate my favorite way of supporting the national economy. Alas, this Lenten season, I can only think about it rather than indulge in it.

And thinking about chocolate can be dispiriting. The truth is that the fabulous final product often hides an ugly sourcing story. I wrote about this a few months ago (here), when Whole Foods removed Hershey's high-end Scharffen Berger chocolate line from its shelves because of Hershey's "failure to assure that the cocoa is sourced without the use of forced child labor."

Forced child labor in West Africa's huge cocoa market is only one sad story from the chocolate front. Another is the continued exploitation of women workers, who routinely earn less than half of what men are paid for the same job.

A recent report from Oxfam (here; note: launches as PDF) notes that while global chocolate sales keep rising (breaking $100 billion in 2011), the tight control on the market by its largest players has kept that success from reaching the farmers who produce the base crop. Ninety percent of the world's cocoa is grown by small-scale farmers, and most -- especially in West Africa, from which 70% of cocoa is sourced -- live below the poverty line.

And it is the women who suffer the most.

One example should suffice:
While everyone struggles to survive on the meager income made from cocoa, it is the women laborers who appear to fare the worst. Agnes Gabriel, a 37-year-old migrant worker living in Ayetoro-Ijesa [Nigeria] says that one of the jobs she is hired to do on local cocoa farms is lug water to be mixed with pesticides. Other tasks include removing the beans from their pods during harvest time, carrying them to the site where they will ferment and then helping with the drying process. For her efforts, she earns 500 Naira a day, or just over $3. Farmers say women are paid $2–3 for a typical day’s work, while men earn about $7 per day.

What should the companies be doing? The Oxfam report recommends three steps: (1) "know and show" how women are treated in their supply chains; (2) commit to a "plan of action" to increase opportunities for women and address inequalities in women's pay and working conditions; and (3) influence other powerful and relevant public and private actors to address gender inequality (for example, by signing on to the UN Women's Empowerment Principles, here).

What can any one individual do? Know what you buy.

The Food Empowerment Project has a "Food Is Power" list of chocolate products that appear to be sourced from areas where child labor and slavery have been reduced, if not eliminated (here), but it does not address the issue of gender discrimination.

Oxfam has a "Behind the Brands" scorecard, not limited to chocolate, where consumers can learn more about the supply-chain ethics behind some of their favorite products. Sadly, one of my favorite non-Lenten indulgences, Toblerone, produced by Mondelez International, does not score well on any of Oxfam's seven vectors (land, women, farmers, workers, climate, transparency, water).

I'll happily pay more for chocolate that I know is ethically sourced. But I won't pay more for chocolate if it just means that the CEO's pay goes from $4.2 million per year to $4.4 million.

Transparency, folks. That's what I'm looking for.


Wednesday, February 20, 2013

A BA in Alphabetizing


Let's say you run an office that generates a lot of paper (side question: Whatever happened to the "paperless office"?). Perhaps it's a real estate office. Or a medical office. Or a law firm. Whatever the case, you decide to hire a file clerk to help you manage all that paper.

What's the critical qualification for that job? How about: Knowing one's way around the alphabet.

After that... well, there isn't much, is there?

But increasingly, according to Catherine Rampell's article in today's New York Times, the new minimum requirement is a B.A.

As one example, Rampell writes about a growing law firm in Atlanta which "hires only people with a bachelor's degree, even for jobs that do not require college-level skills."

Why does that bother me? After all, in a terrible job market, why shouldn't an employer hire the highest-quality employee possible?

First of all, there's no guarantee that a college graduate will make a more effective file clerk than a high-school graduate.

Secondly, as Rampell writes,
This up-credentialing is pushing the less educated even further down the food chain, and it helps explain why the unemployment rate for workers with no more than a high school diploma is more than twice that for workers with a bachelor's degree: 8.1 percent versus 3.7 percent.

Me, I would worry that my college-graduate file clerk would take off as soon as something even a little more challenging showed up, leaving me with the cost of finding yet another file clerk.

But that's not the heart of the issue for me. The heart is that, in an increasingly unequal society, we keep finding ways to make it even harder for the poor to break out of the poverty cycle. I've written before about employers who will only consider job applicants who are currently employed, adding insult to injury for the long-term unemployed, and about friend-of-a-friend hiring practices that can so easily slide into discrimination against people "not like us".

While many more Americans today earn undergraduate degrees than was the case, say, 50 years ago, the rapid rise in the cost of a higher education has made that rung of the ladder a much harder one for the poor to reach. And now they can't even get hired as a file clerk?

The managing partner of the Atlanta law firm likes his college-graduate requirement because "a floor of college attainment also creates more office camaraderie.... There is a lot of trash-talking of each other's college football teams, for example."

Rampell quotes him as saying,
You know, if we had someone here with just a GED or something, I can see how they might feel slighted by the social atmosphere here. There really is something sort of cohesive or binding about the fact that all of us went to college.

What's wrong with that? Just try replacing that final phrase "went to college" with: "are white" or "are Christians" or "belong to the country club". Now how does it sound?

But my "favorite" comment was from an executive recruiter, explaining the BAs-only requirement: "When you get 800 resumes for every job ad, you need to weed them out somehow."

Funny, I thought that was what you hired recruiters for.

Tuesday, February 19, 2013

If the Doctors Don't Speak Out, Who Will?

I hadn't planned to write anything about the first of what are likely to be several trials of Johnson & Johnson over the failure of the artificial hips manufactured and sold by its DePuy Orthopaedics division.

For one thing, I've written about J&J and DePuy before (e.g., here). For another, it was just too damn depressing.

Two years ago, I wondered whether the DePuy complaints were evidence of systemic problems at J&J. I'm not really wondering any more.

The news from the courtroom has been pretty bad. Late last month, in a New York Times article, reporter Barry Meier wrote that J&J's own internal reviews had "found that the company had not adequately assessed the device’s potential risks before it was used in more than 90,000 patients" (full article, here). More damaging still:
DePuy conducted the post-mortem review of the A.S.R. in November 2010, just three months after it recalled the all-metal implant, but it never released the analysis.
The metal-on-metal "Articular Surface Replacement" (ASR) device failed within five years in about 40% of the patients who received it. Among other problems, the metal-on-metal design shed minute metal fragments into the patient, in some cases causing serious internal damage. Problems with the DePuy device began to surface shortly after its release:
In previously recorded testimony presented in court on Wednesday, DePuy’s president, Andrew Ekdahl, was shown an e-mail in which he was warned about the A.S.R.’s problems nearly three years before it was recalled.

But what really got me thinking was an analysis that Meier wrote in the "Sunday Review" section of the 17 February New York Times, titled "Doctors Who Don't Speak Out" (full article, here).
It might not be surprising to find that executives acted to protect a company's bottom line. Still, the Johnson & Johnson episode is also illuminating a broader medical issue: while experts say that doctors have an ethical obligation to warn their peers about bad drugs or medical devices, they often do not do so.

Meier quotes a Yale School of Medicine professor: "Questioning the status quo in medicine is not easy."

To be fair, questioning the status quo is rarely easy -- whether you're talking about medicine, politics, academia, credit default swaps, or nutritional guidelines.

Meier outlines several reasons why doctors would be reluctant to "warn their peers about bad drugs or medical devices": fear of a lawsuit; belief that a problem is really an anomaly; aversion to reporting; and (I believe most importantly) financial ties to a drug or device manufacturer.

I've written before about the overly cozy relationship between doctors and drug and device companies, about dental-device manufacturers who underwrite an entire issue of the Journal of the American Dental Association (here), about pharmaceutical companies writing entire textbooks (here), and about plans to require drug companies to disclose payments made to doctors for research, consulting, travel, and entertainment (here).

It's everywhere, and it needs to stop. And doctors need to develop a culture where -- to borrow a phrase from high-quality assembly-line manufacturing -- it's more important to stop the line for a possible problem (that turns out not to be one) than to let a real problem roll by. Because the patients are the ones who are suffering from their doctors' silence.

Whatever happened to "First, do no harm"?

Monday, February 11, 2013

That "Free" Research You Get From Your Broker May Be Worth What You Paid for It


Conflicts of interest can be hard to spot from the inside, even when they seem obvious to those on the outside. That, of course, is one of the things that makes such conflicts so complicated.

I've been thinking about conflicts of interest, and how to structure organizations to reduce their frequency, since reading Saturday's New York Times column by James B. Stewart on analysts' bullish recommendations on Apple stock long after it began its current slide (the stock is now down by about one-third from its September 2012 peak). The entire column can be found here.

As Stewart noted, we have been aware of conflicts of interest in the banking and brokerage arenas for some time. A decade ago, "There were some notorious examples of analysts who curried favor with investment banking clients and potential clients by producing favorable research, and then were paid huge bonuses out of investment banking fees."

Those lapses seem outrageous to outsiders, but probably didn't to those on the inside at the time. It was just "how the business works". And we know that humans are stunningly bad at knowing how easily they are manipulated, how quickly they respond to even small rewards (hello, "gimme" caps!). So if you had asked an analyst at the time whether the fact that her bonus was paid by investment banking fees influenced her recommendations, she would have been insulted at the very thought.

Since then, thanks to Congressional and state investigations and the passage of several laws, including Sarbanes-Oxley, "investment banking and research operations were segregated. Conflicts had to be disclosed, and research and analyst pay was detached from investment banking revenues...."

So far, so good.

So how could analysts still be saying, "Buy, Buy," about Apple, when the stock is slipping? What else is going on?

A key problem, according to Prof. Stuart Gilson of Harvard Business School, is that "research is funded through the trading desks."

In other words,
If you're an analyst and one way your report brings in revenue is through increased trading, a buy recommendation will do this more than a sell. For a sell, you have to already own the stock to generate a trade. But anybody can potentially buy a stock. That's one hypothesis about why you still see a disproportionate number of buy recommendations.

But fixing all the conflicts -- even if that were truly possible -- might not solve all the problems. Analysts, it appears, fall into many of the same traps we ordinary investors do, such as extrapolating past performance into the future.

Moreover, like many of us, analysts "have a tendency to tell their audience what it wants to hear."

How can an organization fix that problem?

Carlo Besenius, "the only analyst who even came close to calling the peak in Apple's stock", isn't employed by an investment bank or a brokerage firm. Instead, he runs his own firm, founded a decade ago after many years in brokerage-house research.
I'm paid based on performance.... I have to go to my clients and explain why they should pay for my research when they can get it for nothing from the firms where they pay their trading commissions.

You know what they say about advice? It's worth what you pay for it....