The Economist (English Original), Complete 2014 Edition (3)


a way to bribe a generics firm to delay its introduction of a cut-price product. American antitrust officials worry this is to the detriment of the consumer. Another explanation is that the cost and legal uncertainty associated with patent trials are simply too great. Daniel Glazer of Shearman & Sterling, a big law firm, argues that even a firm convinced of the integrity of its patents may well settle “to avoid the all-or-nothing scenario”.

But there is a less charitable explanation. The big firm may know that its patent was mistakenly awarded, perhaps because the purported breakthrough was too minor or obvious. In Barr's ongoing case against Eli Lilly's Evista, the generic firm argues that a prior patent held by the University of Pennsylvania invalidates Lilly's claims. Kathleen Jaeger of America's Generic Pharmaceutical Association adds that branded firms try to extend their lucrative monopolies by filing less rigorous secondary patents designed “to block generics”. David Balto, a former official at America's Federal Trade Commission, says, “Branded pharmaceutical firms have been stretching the limits of what deserves a patent, and the courts are just catching up.”

Ready or not

Europe's financial sector is ill prepared for a coming upheaval

SOME of the most breathless commentary about Europe's financial markets in recent years has centred on the intrigues and dalliances of leading financial exchanges. All of them have flirted with, encouraged and snubbed various potential partners in both Europe and America, although no big deals have yet been completed. Amid the chatter, an important cause of all the matchmaking and matchbreaking has been largely overlooked: a piece of looming legislation that, for all its drab detail, will alter the European Union's financial markets profoundly.

Exchanges are not the only ones to feel the hot breath of the unenticingly labelled Markets in Financial Instruments Directive, known as MiFID, which is due to take effect from November 2007. An important element of the EU's plan for a single market in financial services, the directive embraces both wholesale and retail trading in securities, including shares, bonds and derivatives. As such, it will affect companies from investment banks to asset managers and stockbrokers. Some will benefit more than others.

Charlie McCreevy, the European commissioner in charge of forging a single market, jokes about the ugly moniker: “This is not a fearsome man-eating plant.” But he is evangelical about the directive's purpose. He expects MiFID to “transform” the trading of securities in Europe, reducing the cost of capital, creating growth and increasing Europe's competitiveness in the global economy.

The directive, which EU member states are supposed to weave into their own laws by January 2007, intends to accomplish all this in several ways. First, the rules aim to increase competition across borders, by extending the “single passport”, which allows financial firms to do business across Europe armed only with the approval of their home authorities. To make this possible, investor-protection rules are also to be harmonised, so as to provide a (theoretically) consistent standard in areas such as investment advice, order-handling and the completion of securities trades—“best execution”, in the jargon.

Second, MiFID aims to change the nature of competition in share trading. Although most shares in Europe are still traded on exchanges, there is growing interest in alternatives, such as off-exchange trading between investment banks. MiFID could accelerate this trend. In some countries—notably France, Italy and Spain—existing rules force all share trades through local bourses. The new rules will end those monopolies. No wonder exchanges, facing the threat of greater competition, are weighing up mergers.

A third intention of MiFID is more transparency. In future, investors should be able to subscribe to information services that let them see the whole market in certain shares, not only what is on offer at the local stock exchange. The goal is to let investors find the best prices in the market. This will mean competition for the London Stock Exchange, for example, which earns a healthy sum from selling such information. Investment banks are already banding together to develop alternative reporting services.

Checking the thermostat

Property prices are cooling fast in America, but heating up elsewhere

HOUSES are not just places to live in; they are increasingly important to whole economies, which is why The Economist started publishing global house-price indicators in 2002. This has allowed us to track the biggest global property-price boom in history. The latest gloomy news from America may suggest that the world is on the brink of its biggest ever house-price bust. However, our latest quarterly update suggests that, outside America, prices are perking up.

America's housing market has certainly caught a chill. According to the Office of Federal Housing Enterprise Oversight (OFHEO), the average price of a house rose by only 1.2% in the second quarter, the smallest gain since 1999. The past year has seen the sharpest slowdown in the rate of growth since the series started in 1975. Even so, average prices are still up by 10.1% on a year ago. This is much stronger than the series published by the National Association of Realtors (NAR), which showed a rise of only 0.9% in the year to July.

The OFHEO index is thought to be more reliable because it tracks price changes in successive sales of the same houses, and so unlike the NAR series is not distorted by a shift in the mix of sales to cheaper homes. The snag is that the data take time to appear. Prices for this quarter, which will not be published until December, may well be much weaker. A record level of unsold homes is also likely to weigh prices down. The housing futures contract traded on the Chicago Mercantile Exchange is predicting a fall of 5% next year.
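The repeat-sales approach described above can be sketched in a few lines. This is an illustrative simplification, not OFHEO's actual (regression-based) methodology: the point is simply that growth is measured only between successive sales of the same house, so a shift in the mix of sales towards cheaper homes cannot distort the result.

```python
def repeat_sales_growth(sales):
    """Average annual price growth measured only across successive
    sales of the same houses.

    sales: dict mapping house id -> list of (year, price) tuples,
    sorted by year. Each pair of successive sales contributes an
    annualised growth rate of (p1 / p0) ** (1 / years) - 1.
    """
    rates = []
    for records in sales.values():
        for (t0, p0), (t1, p1) in zip(records, records[1:]):
            years = t1 - t0
            rates.append((p1 / p0) ** (1 / years) - 1)
    return sum(rates) / len(rates)

# Two houses, each sold twice; whether cheap or expensive homes
# dominate the sample does not matter, only each house's own change.
sales = {
    "A": [(2000, 100_000), (2005, 150_000)],   # cheaper house
    "B": [(2001, 400_000), (2006, 600_000)],   # pricier house
}
print(round(repeat_sales_growth(sales), 4))
```

A median-price series like the NAR's, by contrast, would fall if the sample simply tilted towards house "A", even with no house losing value.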

Elsewhere, our global house-price indicators signal a cheerier story. House-price inflation is faster than a year ago in roughly half of the 20 countries we track. Apart from America, only Spain, Hong Kong and South Africa have seen big slowdowns. In ten of the countries, prices are rising at double-digit rates, compared with only seven countries last year.

European housing markets—notably Denmark, Belgium, Ireland, France and Sweden—now dominate the top of the league. Anecdotal evidence suggests that even the German market is starting to wake up after more than a decade of flat or falling prices, but this has yet to show up in the index that we use, which is published with a long lag (there are no figures for 2006). If any readers know of a more timely index, please let us know.

Some economists have suggested that Britain and Australia are “the canaries in the coal mine”, giving early warning of the fate of America's housing market. The annual rate of increase in house prices in both countries slowed from around 20% in 2003 to close to zero last summer. However, the canaries have started to chirp again. In Australia average prices have picked up by 6.4% over the past year, although this is partly due to a 35% surge in Perth on the back of the commodities boom. Likewise British home prices have perked up this year, to be 6.6% higher, on average, than they were a year ago. Thus it is claimed that housing markets in Britain and Australia have had a soft landing.

Mind the gap

Pay discrimination between male and female scientists

SEVEN years ago, a group of female scientists at the Massachusetts Institute of Technology produced a piece of research showing that senior women professors in the institute's school of science had lower salaries and received fewer resources for research than their male counterparts did. Discrimination against female scientists has cropped up elsewhere. One study—conducted in Sweden, of all places—showed that female medical-research scientists had to be twice as good as men to win research grants. These pieces of work, though, were relatively small-scale. Now, a much larger study has found that discrimination plays a role in the pay gap between male and female scientists at British universities.

Sara Connolly, a researcher at the University of East Anglia's school of economics, has been analysing the results of a survey of over 7,000 scientists, and she has just presented her findings at this year's meeting of the British Association for the Advancement of Science in Norwich. She found that the average pay gap between male and female academics working in science, engineering and technology is around £1,500 ($2,850) a year.

That is not, of course, irrefutable proof of discrimination. An alternative hypothesis is that the courses of men's and women's lives mean the gap is caused by something else; women taking “career breaks” to have children, for example, and thus rising more slowly through the hierarchy. Unfortunately for that idea, Dr Connolly found that men are also likely to earn more within any given grade of the hierarchy. Male professors, for example, earn over £4,000 a year more than female ones. To prove the point beyond doubt, Dr Connolly worked out how much of the overall pay differential was explained by differences such as seniority, experience and age, and how much was unexplained, and therefore suggestive of discrimination. Explicable differences amounted to 77% of the overall pay gap between the sexes. That still left a substantial 23% gap in pay, which Dr Connolly attributes to discrimination.
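Dr Connolly's split amounts to simple decomposition arithmetic. The figures below are the ones reported in the article; the calculation itself is only an illustration of how the "explained" and "unexplained" portions relate.

```python
# Decomposing the overall pay gap into an "explained" part
# (seniority, experience, age) and an "unexplained" residual,
# which is the part suggestive of discrimination.
total_gap = 1_500            # pounds per year: average male-female gap
explained_share = 0.77       # share attributable to measurable differences

explained = total_gap * explained_share
unexplained = total_gap - explained

print(f"explained: £{explained:.0f}, unexplained: £{unexplained:.0f}")
```

On these figures, roughly £1,155 of the gap is accounted for by career differences, leaving about £345 a year unexplained.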

Besides pay, her study also looked at the “glass-ceiling” effect—namely that at all stages of a woman's career she is less likely than her male colleagues to be promoted. Between postdoctoral and lecturer level, men are more likely to be promoted than women are, by a factor of between 1.04 and 2.45. Such differences are bigger at higher grades, with the hardest move of all being for a woman to settle into a professorial chair.

Of course, it might be that, at each grade, men do more work than women, to make themselves more eligible for promotion. But that explanation, too, seems to be wrong. Unlike the previous studies, Dr Connolly's compared the experience of scientists in universities with that of those in other sorts of laboratory. It turns out that female academic researchers face more barriers to promotion, and have a wider gap between their pay and that of their male counterparts, than do their sisters in industry or research institutes independent of universities. Private enterprise, in other words, delivers more equality than the supposedly egalitarian world of academia does.

Alpha betting

The industry is splitting in two—and investors are gambling on the expensive bit

IT HAS never been easier to pay less to invest. No fewer than 136 exchange-traded funds (ETFs) were launched in the first half of 2006, more than in the whole of 2005.

For those who believe in efficient markets, this represents a triumph. ETFs are quoted securities that track a particular index, for a fee that is normally just a fraction of a percentage point. They enable investors to assemble a low-cost portfolio covering a wide range of assets from international equities, through government and corporate bonds, to commodities. Morgan Stanley estimates that ETFs control some $487 billion of assets, up 16.7% from a year ago. It predicts they will have $2 trillion of assets by 2011. No longer must investors be at the mercy of error-prone and expensive fund managers.

But as fast as the assets of ETFs and index-tracking mutual funds are growing, another section of the industry seems to be flourishing even faster. Watson Wyatt, a firm of actuaries, estimates that “alternative asset investment” (ranging from hedge funds through private equity to property) grew by around 20% in 2005, to $1.26 trillion. Investors who take this route pay much higher fees in the hope of better performance. One of the fastest-growing assets, funds of hedge funds, charge some of the highest fees of all.

At first sight, this might seem like a typical market, with low-cost commodity producers at one end and high-charging specialists at the other. Buy a Rolls-Royce rather than a Trabant and you can expect a higher standard of luxury and engineering in return for the much greater price. But fund management is not like any other industry; paying more does not necessarily get you a better service.

An index represents the average performance of all investors, before costs are deducted. If the fee paid to the fund manager increases, the return achieved by the average investor must decline. After fees, hedge-fund returns this year have been feeble. From January 1st through to August 31st, the average hedge fund returned just 4.2%, according to Merrill Lynch, less than the S&P 500 index's 5.8% total return.
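The arithmetic behind that point can be made concrete. The sketch below assumes a typical "2 and 20" hedge-fund fee structure (2% of assets plus 20% of gains), which is an assumption, not a figure from the article, and it simplifies by charging the performance fee after the management fee:

```python
def net_return(gross, mgmt_fee=0.02, perf_fee=0.20):
    """Investor's net return under a stylised "2 and 20" fee schedule:
    a management fee on assets, then a performance fee on any
    remaining positive gain. A simplification; real schedules vary.
    """
    after_mgmt = gross - mgmt_fee
    perf = perf_fee * max(after_mgmt, 0.0)
    return after_mgmt - perf

index_return = 0.058                 # the S&P 500 total return cited
print(round(net_return(0.095), 4))   # 9.5% gross -> ~6.0% net
```

Because the index is the average gross return, a fund charging these fees must beat the market by several percentage points gross merely to match it net, which is why the 4.2% average hedge-fund return trails the index.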

So why are people paying up? In part, because investors have learned to distinguish between the market return, dubbed beta, and managers' outperformance, known as alpha. “Why wouldn't you buy beta and alpha separately?” asks Arno Kitts of Henderson Global Investors, a fund-management firm. “Beta is a commodity and alpha is about skill.”

The fund-management splits began with the decline of balanced managers, which took complete charge of an investor's portfolio, running everything from American equities through Japanese bonds to property. Clients became convinced that no one firm could produce good performance in every asset class, nor could they master the art of timing the switch from one asset to another.

Powering up

Improved devices may make better use of sunlight

MOST of the power generated by mankind originates from the sun. It was sunlight that nurtured the early life that became today's oil, gas and coal. It is the solar heating of the Earth's atmosphere and oceans that fuels wave power, wind farms and hydroelectric schemes. But using the sun's energy directly to generate power is rare. Solar cells account for less than 1% of the world's electricity production.

Recent technological improvements, however, may boost this figure. The root of the problem is that most commercial solar cells are made from silicon, and silicon is expensive. Cells can be made from other, cheaper materials, but these are not as efficient as those made from silicon.

The disparity is stark. Commercial silicon cells have efficiencies of 15% to 20%. In the laboratory, some have been made with an efficiency of 30%. The figure for non-traditional cells is far lower. A typical cell based on electrically conductive plastic has an efficiency of just 3% or 4%. What is needed is a way to boost the efficiency of cells made from cheap materials, and three new ways of doing so were unveiled this week in San Francisco, at the annual meeting of the American Chemical Society.

Solar cells work by the action of light on electrons. An electron held in a chemical bond in the cell absorbs a photon (a particle of light) and, thus energised, breaks free. Such electrons can move about and, if they all move in the same direction, create an electric current. But they will not all travel in the same direction without a little persuasion. With silicon, this is achieved using a secondary electrical field across the cell. Non-silicon cells usually have a built-in “electrochemical potential” that encourages the electrons to move away from areas where they are concentrated and towards places where they have more breathing space.

Kwanghee Lee of Pusan National University, in South Korea, and Alan Heeger of the University of California, Santa Barbara, work on solar cells made of electrically conductive plastics. (Indeed, Dr Heeger won a Nobel prize for discovering that some plastics can be made to conduct electricity.) They found that by adding titanium oxide to such a cell and then baking it in an oven, they could increase the efficiency with which it converted solar energy into electricity.

The trick is to put the titanium oxide in as a layer between the part of the cell where the electrons are liberated and the part where they are collected for dispatch into the wider world. This makes the electrically conductive plastic more sensitive to light at wavelengths where sunlight is more intense. Pop the resulting sandwich in the oven for a few minutes at 150°C and the plastic layer becomes crystalline. This improves the efficiency of the process, because the electrons find it easier to move through crystalline structures.

The technique used by Dr Lee and Dr Heeger boosts the efficiency of plastic cells to 5.6%. That is still poor compared with silicon, but it is a big improvement on what was previously possible. Dr Lee concedes that there is still a long way to go, but says that even an efficiency of 7% would bring plastic cells into competition with their silicon cousins, given how cheap they are to manufacture.

A second approach, taken by Michael Grätzel of the Swiss Federal Institute of Technology, is to copy nature. Plants absorb solar energy during photosynthesis. They use it to split water into hydrogen ions, electrons and oxygen. The electrons released by this reaction are taken up by carrier molecules and then passed along a chain of such molecules before being used to power the chemical reactions that ultimately make sugar.

Dolling up the dole

A better way to help America's jobless

“MANY of our most fundamental systems—the tax code, health coverage, pension plans, worker training—were created for the world of yesterday, not tomorrow. We will transform these systems.” With these words George Bush laid out an agenda of domestic reform at the Republican convention in 2004. That agenda, starting with last year's attempt to transform America's vast state pension system, has gone nowhere. But Mr Bush's basic argument is right. Much of the machinery of America's domestic economic policy dates from the 1930s and needs repair. Unemployment insurance is a case in point.

Created by Franklin Roosevelt in 1935, America's dole has barely changed since. It provides temporary income support to laid-off workers and is financed by a small tax on wages. The details vary from state to state, but full-time workers who lose their jobs get a cheque worth, on average, just over a third of their previous wage for up to six months. Benefits can be paid for longer if the economy is in recession, but only if Congress agrees. By European standards, America's dole is short-lived, a characteristic that encourages people to get a new job quickly.

As a macroeconomic tool, the dole works well. Unemployment cheques support spending when workers are laid off, helping to smooth the business cycle. But the cash is not aimed at those who need it most. That is because a rising share of the unemployed are not laid off temporarily, but have seen their jobs disappear for good.

Research by Erica Groshen and Simon Potter, of the Federal Reserve Bank of New York, suggests that whereas temporary lay-offs explained much of the jumps in unemployment during the recessions of the 1970s and early 1980s, nowadays structural job losses dominate. People who are unemployed because their job has gone permanently need to find new lines of work. It takes them longer to find a job and, when they do, they are often paid considerably less than before.

Jeffrey Kling, an economist at the Brookings Institution, argues that the unemployment-benefit system ought to distinguish those who are temporarily out of a job but may find similar, or higher-paid work, and those who face permanently lower income. In a paper for the Hamilton Project, a research programme at Brookings that seeks new policies for America's centre-left, Mr Kling suggests that the dole should become less like a handout from the government and more like an insurance policy that individual workers finance themselves.

The idea is to give every worker an account, unsnappily called a “temporary earnings replacement account” or TERA. While in work, people could set aside money in these accounts. Those who lose their jobs could take cash out. The level and duration of withdrawals would be set by the government and would be the same as under today's unemployment system.
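The mechanics of such an account can be sketched as a balance that fills during employment and drains, at a government-set level and duration, during joblessness. This is a hypothetical illustration only; the benefit and contribution figures below are invented, not taken from Mr Kling's proposal.

```python
class TERA:
    """Sketch of a "temporary earnings replacement account": workers
    save while employed and draw benefits when laid off. The weekly
    benefit and duration caps stand in for the government-set levels.
    """
    def __init__(self, weekly_benefit=300.0, max_weeks=26):
        self.balance = 0.0
        self.weekly_benefit = weekly_benefit  # withdrawal level (set by govt)
        self.max_weeks = max_weeks            # duration cap (~six months)
        self.weeks_drawn = 0

    def contribute(self, amount):
        """Set money aside while in work."""
        self.balance += amount

    def withdraw(self):
        """Draw one week's benefit while jobless, up to the cap.
        The balance may go negative: in effect a loan against
        future earnings rather than a handout."""
        if self.weeks_drawn >= self.max_weeks:
            return 0.0
        self.balance -= self.weekly_benefit
        self.weeks_drawn += 1
        return self.weekly_benefit

acct = TERA()
for _ in range(52):                  # a year of weekly contributions
    acct.contribute(50.0)
paid = sum(acct.withdraw() for _ in range(30))   # 30 weeks jobless
print(paid, acct.balance)            # benefits stop after 26 weeks
```

The design choice the proposal highlights is visible here: withdrawals mirror today's dole, but because they come out of the worker's own account, spells of unemployment beyond one's savings become a debt rather than a transfer.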

Bitter consequences

Green vegetables really do taste horrible

“EAT up your greens” is the exasperated cry of many a parent when faced with a fussy child. But the paradox of vegetables is that they are both good and bad for you. The cultivated plants consumed by all folks except hunter-gatherers have evolved an ambiguous relationship with people, in which they exchange the risk of being eaten by a human for the reproductive security that domestication brings. But the wild plants from which these cultivars are descended are very definite about the matter. They do not want to be consumed and they make that opinion known by deploying all sorts of poisonous chemicals to discourage nibbling herbivores. In many cases, those poisons have persisted into the cultivated varieties, albeit at lower levels.

Animals, of course, have evolved ways of dealing with these poisons. The best of these, from a plant's point of view, is when an animal can taste, and thus reject, a poisonous chemical. This has long been assumed to be the basis of the taste of bitterness, but that theory has only now been put to a clear test. In a paper just published in Current Biology, Mari Sandell and Paul Breslin, of the Monell Chemical Senses Centre, in Philadelphia, have looked at the phenomenon in that bête noire of presidents and parents alike: broccoli.

Bitter tastes are detected by receptor proteins that are, in turn, encoded by a family of genes known collectively as TAS2R. Humans have around 25 TAS2R genes, each sensitive to different groups of chemicals. That variety, in itself, indicates the range of the plant kingdom's weaponry. Dr Sandell and Dr Breslin, though, focused their attentions on just one of these receptor genes, called hTAS2R38. The protein derived from this gene is known, from laboratory experiments, to be sensitive to a substance called phenylthiocarbamide (PTC). This compound contains a molecular group called thiourea. And thiourea-containing substances are known from other studies to inhibit the function of the thyroid gland.

Cruciferous vegetables, such as watercress, turnips and—most pertinently—broccoli, are rich in a group of thiourea-containing compounds called glucosinolates. Dr Sandell and Dr Breslin wondered if there might be a connection. And, since different versions of hTAS2R38 code for proteins that have different levels of reaction to PTC, they wondered if that might be reflected in the fact that some people like broccoli, and others do not.

The two researchers assembled a group of volunteers and checked which versions of the hTAS2R38 gene they had. They then fed the volunteers vegetables and recorded their reactions. All of the vegetables were thought by at least some people to be bitter, but not all of them were cruciferous plants. The non-cruciferous ones were plants which, so far as is known, do not contain glucosinolates.

The results were clear. All volunteers found the non-cruciferous vegetables equally bitter, but their reactions to the cruciferous ones depended on their genes. Those with two copies of the version of hTAS2R38 coding for the protein that binds best to PTC (one copy having been inherited from each parent) thought broccoli and its cousins the most bitter. Those who had two copies of the poorly binding version thought they tasted fine. Those with one of each had an intermediate reaction.

Despite broccoli's bad reputation, the most bitter vegetables, according to this research, are swedes and turnips. That accords well with work which has shown that eating these vegetables suppresses the uptake of iodine into the thyroid gland. Iodine is an essential ingredient of thyroxine, a hormone produced by that gland.

The upshot of all this is that the complaints of children (and, indeed, of many adults) that green vegetables are horrid contain a lot of truth. There is no doubt that such vegetables are good for you. But they are not unequivocally good. As is often observed in other contexts, there is no free lunch.

Running rings round storms

Trees keep records of passing hurricanes

STUDYING the past is a good way to understand the present, and may even illuminate the future. But the past does not give up its secrets easily. Hurricane scientists, for instance, would like to know about long-term changes in the frequency and strengths of the storms they study. That would help to show whether the shifting pattern of hurricanes seen in the past few decades is cyclical, random or part of a trend that might be caused by global warming. Unfortunately, meteorologists have been keeping systematic tabs on the relevant data for only about 60 years. Before that, records are sporadic and anecdotal—and that is not enough to see the bigger picture.

Human records, however, are not the only sort available. Trees are popular with scientists who want to look at what happened a few hundred years ago, because their annual growth rings mean that their wood can be dated accurately. And Dana Miller, of the University of Tennessee, and her team have used that insight to search for hurricanes that humanity has failed to record. Their results, just published in the Proceedings of the National Academy of Sciences, have identified a number of previously unknown storms that hit the south-eastern coast of North America. The trick they used to do this was to look at the isotopic composition of the oxygen in the wood of local trees.

Water contains two isotopes of oxygen, one of which has two more neutrons than the other, making it heavier. When a hurricane forms, it tends, initially, to rain water molecules containing the heavier isotope. At that point it is still over the sea.

