Dataset columns:
- option: list
- question: string (length 11 to 354)
- article: string (length 231 to 6.74k)
- id: string (length 5 to 8)
- label: int64 (range 0 to 3)
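A single record in this dataset can be pictured as a plain dictionary matching the columns above. This is a minimal sketch: the field names come from the column header, while the values are illustrative placeholders, not a real row.

```python
# Hypothetical example of one record, shaped like the schema above.
# Values are placeholders; only the field names and types are taken
# from the column header.
record = {
    "option": [
        "Choice A.",
        "Choice B.",
        "Choice C.",
        "Choice D.",
    ],
    "question": "According to the passage, which statement is true?",
    "article": "Full passage text goes here...",
    "id": "1248.txt",
    "label": 3,  # int64 in [0, 3]: index of the correct option
}

# Basic checks implied by the column metadata.
assert isinstance(record["option"], list)
assert isinstance(record["article"], str)
assert 11 <= len(record["question"]) <= 354
assert 5 <= len(record["id"]) <= 8
assert 0 <= record["label"] <= 3
```

The `label` column is an index into the `option` list, so `record["option"][record["label"]]` recovers the keyed answer for each item.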
[ "The new fuel cells run on sugar that is easy to find.", "The new fuel cells are environment-friendly.", "The new fuel cells are biologically degradable.", "It will take some time before the new fuel cells can be used in popular products." ]
According to the last paragraph, which is NOT true of the new fuel cells?
Using enzymes commonly found in living cells, a new type of fuel cell produces small amounts of electricity from sugar. If the technology is able to succeed in mass production, you may some day share your sweet drinks with your cell phone. In fuel cells, chemical reactions generate electrical currents. The process usually relies on precious metals, such as platinum. In living cells, enzymes perform a similar job, breaking down sugars to obtain electrons and produce energy. When researchers previously used enzymes in fuel cells, they had trouble keeping them active, says Shelley D. Minteer of St. Louis University. Whereas biological cells continually produce fresh enzymes, there's no mechanism in fuel cells to replace enzymes as they quickly degrade. Minteer and Tamara Klotzbach, also of St. Louis University, have now developed polymers that wrap around an enzyme and preserve it in a microscopic pocket. "We tailor these pockets to provide the ideal microenvironment for the enzyme," Minteer says. The polymers keep the enzyme active for months instead of days. In the new fuel cell, tiny polymer bags of enzyme are embedded in a membrane that coats one of the electrodes. When glucose from a sugary liquid gets into a pocket, the enzyme oxidizes it, releasing electrons and protons. The electrons cross the membrane and enter a wire through which they travel to the other electrode, where they react with oxygen in the atmosphere to produce water. The flow of electrons through the wire constitutes an electrical current that can generate power. So far, the new fuel cells don't produce much power, but the fact that they work at all is exciting, says Paul Kenis, a chemical engineer at the University of Illinois at Urbana-Champaign. "Just getting it to work," Kenis says, "is a major accomplishment." Sugar-eating fuel cells could be an efficient way to make electricity. Sugar is easy to find. And the new fuel cells that run on it are biodegradable, so the technology wouldn't hurt the environment. The scientists are now trying to use different enzymes that will get more power from sugar. They predict that popular products may be using the new technology in as little as 3 years.
1248.txt
3
[ "illustrate that the new machine European physicists have invented is full of wonder.", "describe the mood of the European physicists facing the delay of the Large Hadron Collider", "tell people that the European physicists will put off the presentation of their new machine.", "reflect the public's eagerness in using the Large Hadron Collider." ]
The sentences in Alice in Wonderland are cited in the first paragraph in order to _
"OH DEAR! Oh dear! I shall be too late!" So muttered the White Rabbit just before he plunged into Wonderland, with Alice in pursuit. Similar utterances have been escaping the lips of European physicists, as it was confirmed last week that their own subterranean Wonderland, a new machine called the Large Hadron Collider, will not now begin work until May 2008. This delay may enable their American rivals to scoop them by finding the Higgs boson, which was predicted 43 years ago by Peter Higgs of Edinburgh University to be the reason why matter has mass, but which has not yet actually been discovered. The Large Hadron Collider is a 27km-long circular accelerator that is being built at CERN, the European particle-physics laboratory near Geneva, specifically to look for the Higgs boson. When it eventually starts work, it will be the world's most powerful particle collider. It will also be the most expensive, having cost SFr10 billion ($8 billion) to build. The laboratory had hoped it would be ready in 2005, but the schedule has slipped repeatedly. The most recent delay came at the end of March, with the dramatic failure of a magnet assembly that had been supplied by CERN's American counterpart, the Fermi National Accelerator Laboratory (Fermilab) near Chicago. This device was one of four designed to focus beams of particles before they collide in the experimental areas. Admittedly, it had been placed under extreme conditions when it failed, but such forces are to be expected from time to time when the machine is running normally. The magnets have yet to be fixed, although physicists think they know how to do it. Other, smaller hitches have compounded the problem. The collider has been built in eight sections, each of which must be cooled to temperatures only just above absolute zero. This is because the magnets used to accelerate the particles to the high energies needed for particle physics rely on the phenomenon of superconductivity to work, and superconductivity, in turn, needs extremely low temperatures. Unfortunately, the first of the eight sections took far longer to chill than had been expected. If, as the other seven sections are cooled, further problems emerge, the start date will have to be put back still further. It takes a month to cool each section, and a month to warm each one back up to normal temperatures again. If it took, say, a month to fix any problems identified as a section cooled, each cycle would postpone the start date by three months. To accelerate progress (as well as particles), CERN's management decided last week to cancel an engineering run scheduled for November. Instead of beginning slowly with some safe-but-dull low-energy collisions, the machine's first run will accelerate its particles to high energies straight away. Such haste may be wise, for rumours are circulating that physicists working at the Tevatron, which is based at Fermilab and is currently the world's most powerful collider, have been seeing hints of the Higgs boson. Finding it would virtually guarantee the discoverer a Nobel prize, shared jointly, no doubt, with Dr Higgs. Hence the rush, as hundreds of physicists head down the rabbit hole, seeking their own adventures in Wonderland.
3624.txt
2
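The cooling-cycle arithmetic in the collider passage above (a month to cool each section, a hypothetical month to fix a problem, a month to warm it back up) can be restated in a few lines. This is only a sketch of the passage's own numbers; the helper names are invented for illustration.

```python
# Delay arithmetic from the LHC passage: each troubled cooling cycle
# costs one month to cool the section, one month (the passage's
# "say, a month" assumption) to fix it, and one month to warm it up.
COOL_MONTHS = 1
FIX_MONTHS = 1
WARM_MONTHS = 1

def delay_per_cycle() -> int:
    """Months added to the start date by one cool-fix-warm cycle."""
    return COOL_MONTHS + FIX_MONTHS + WARM_MONTHS

def total_delay(problem_cycles: int) -> int:
    """Total postponement if that many sections each need a repair."""
    return problem_cycles * delay_per_cycle()

assert delay_per_cycle() == 3  # matches the passage's three-month figure
```

In the worst case sketched here, problems in all seven remaining sections (`total_delay(7)`) would push the start date back 21 months, which is why the passage treats each cycle as costly.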
[ "forestall.", "defeat.", "surpass.", "vanquish." ]
The word "scoop" (Line 6, Paragraph 1) most probably means _
3624.txt
0
[ "It was he who initiated the idea that there existed such a boson to make matter have mass.", "He made the famous prediction that human beings would find such a boson in the future.", "He claimed with convincing evidence that America would outrun Europe in discovering the Higgs boson.", "It was he who first discovered the boson which makes matter have mass." ]
Which one of the following statements is TRUE of Peter Higgs' contribution to this field?
3624.txt
0
[ "the device for focusing beams of particles malfunctioned.", "it took longer than expected to reach the low temperatures that superconductivity, and hence normal operation of the collider, requires.", "it took a very long period to cool the eight sections down or warm them back up.", "there was a dramatic failure of a magnet assembly, which was not beyond expectation." ]
The schedule of the Large Hadron Collider has slipped repeatedly for all of the following reasons EXCEPT _
3624.txt
1
[ "they are afraid that American researchers will get to see the Higgs boson ahead of them.", "the safe-but-dull low-energy collisions were too old-fashioned.", "they wanted to dispel the rumours that physicists at the Tevatron have been seeing hints of the Higgs boson.", "they wanted to try the new way of accelerating particles." ]
CERN's management decided to have the machine's first run accelerate its particles straight away because _
3624.txt
0
[ "They had a series of unusually good breeding seasons.", "They expanded into the Rocky Mountain West.", "Their population levels fell.", "They were harvested in significant numbers for the first time." ]
According to paragraph 2, what happened to raccoons in the 1930s?
Raccoons have a vast transcontinental distribution, occurring throughout most of North America and Central America. They are found from southern Canada all the way to Panama, as well as on islands near coastal areas. They occur in each of the 49 states of the continental United States. Although raccoons are native only to the Western Hemisphere, they have been successfully transplanted to other parts of the globe. Following a decline to a relatively low population level in the 1930s, raccoons began to prosper following their 1943 breeding season. A rapid population surge continued throughout the 1940s, and high numbers have been sustained ever since. By the late 1980s, the number of raccoons in North America was estimated to be at least 15 to 20 times the number that existed during the 1930s. By now, their numbers have undoubtedly grown even more, as they have continued to expand into new habitats where they were once either rare or absent, such as sandy prairies, deserts, coastal marshes, and mountains. Their spread throughout the Rocky Mountain West is indicative of the fast pace at which they can exploit new environments. Despite significant numbers being harvested and having suffered occasional declines, typically because of disease, the raccoon has consistently maintained high population levels. Several factors explain the raccoon's dramatic increase in abundance and distribution. First, their success has been partially attributed to the growth of cities, as they often thrive in suburban and even urban settings. Furthermore, they have been deliberately introduced throughout the continent. Within the United States, they are commonly taken from one area to another, both legally and illegally, to restock hunting areas and, presumably, because people simply want them to be part of their local fauna. Their appearance and subsequent flourishing in Utah's Great Salt Lake valley within the last 40 years appears to be from such an introduction. 
As an example of the ease with which transplanted individuals can succeed, raccoons from Indiana (midwestern United States) have reportedly been able to flourish on islands off the coast of Alaska. The raccoon's expansion in various areas may also be due to the spread of agriculture. Raccoons have been able to exploit crops, especially corn but also cereal grains, which have become dependable food sources for them. The expansion of agriculture, however, does not necessarily lead to rapid increases in their abundance. Farming in Kansas and eastern Colorado (central and western United States) proceeded rapidly in the 1870s and 1880s, but this was about 50 years before raccoons started to spread out from their major habitat, the wooded river bottomlands. They have also expanded into many areas lacking any agriculture other than grazing and into places without forests or permanent streams. Prior to Europeans settling and farming the Great Plains Region, raccoons probably were just found along its rivers and streams and in the wooded areas of its southeastern section. With the possible exception of the southern part of the province of Manitoba, their absence was notable throughout Canada. They first became more widely distributed in the southern part of Manitoba, and by the 1940s were abundant throughout its southeastern portion. In the 1950s their population swelled in Canada. The control of coyotes in the prairie region in the 1950s may have been a factor in raccoon expansion. If their numbers are sufficient, coyotes might be able to suppress raccoon populations (though little direct evidence supports this notion). By the 1960s the raccoon had become a major predator of the canvasback ducks nesting in southwestern Manitoba. The extermination of the wolf from most of the contiguous United States may have been a critical factor in the raccoon's expansion and numerical increase. 
In the eighteenth century, when the wolf's range included almost all of North America, raccoons apparently were abundant only in the deciduous forests of the East, Gulf Coast, and Great Lakes regions, though they also extended into the wooded bottomlands of the Midwest's major rivers. In such areas, their arboreal habits and the presence of hollow den trees should have offered some protection from wolves and other large predators. Even though raccoons may not have been a significant part of their diet, wolves surely would have tried to prey on those exposed in relatively treeless areas.
3274.txt
2
[ "typically", "predictably", "increasingly", "reliably" ]
The word "consistently" in the passage is closest in meaning to
3274.txt
3
[ "They were not easily transplanted there from Indiana.", "They were not found there prior to 40 years ago.", "They were often restocked because of illegal hunting.", "They expanded into that area from nearby suburban and urban settings." ]
According to paragraph 3, which is true of raccoons in Utah's Great Salt Lake valley?
Raccoons have a vast transcontinental distribution, occurring throughout most of North America and Central America. They are found from southern Canada all the way to Panama, as well as on islands near coastal areas. They occur in each of the 49 states of the continental United States. Although raccoons are native only to the Western Hemisphere, they have been successfully transplanted to other parts of the globe. Following a decline to a relatively low population level in the 1930s, raccoons began to prosper following their 1943 breeding season. A rapid population surge continued throughout the 1940s, and high numbers have been sustained ever since. By the late 1980s, the number of raccoons in North America was estimated to be at least 15 to 20 times the number that existed during the 1930s. By now, their numbers have undoubtedly grown even more, as they have continued to expand into new habitats where they were once either rare or absent, such as sandy prairies, deserts, coastal marshes, and mountains. Their spread throughout the Rocky Mountain West is indicative of the fast pace at which they can exploit new environments. Despite significant numbers being harvested and having suffered occasional declines, typically because of disease, the raccoon has consistently maintained high population levels. Several factors explain the raccoon's dramatic increase in abundance and distribution. First, their success has been partially attributed to the growth of cities, as they often thrive in suburban and even urban settings. Furthermore, they have been deliberately introduced throughout the continent. Within the United States, they are commonly taken from one area to another, both legally and illegally, to restock hunting areas and, presumably, because people simply want them to be part of their local fauna. Their appearance and subsequent flourishing in Utah's Great Salt Lake valley within the last 40 years appears to be from such an introduction. 
As an example of the ease with which transplanted individuals can succeed, raccoons from Indiana (midwestern United States) have reportedly been able to flourish on islands off the coast of Alaska. The raccoon's expansion in various areas may also be due to the spread of agriculture. Raccoons have been able to exploit crops, especially corn but also cereal grains, which have become dependable food sources for them. The expansion of agriculture, however, does not necessarily lead to rapid increases in their abundance. Farming in Kansas and eastern Colorado (central and western United States) proceeded rapidly in the 1870s and 1880s, but this was about 50 years before raccoons started to spread out from their major habitat, the wooded river bottomlands. They have also expanded into many areas lacking any agriculture other than grazing and into places without forests or permanent streams. Prior to Europeans settling and farming the Great Plains Region, raccoons probably were just found along its rivers and streams and in the wooded areas of its southeastern section. With the possible exception of the southern part of the province of Manitoba, their absence was notable throughout Canada. They first became more widely distributed in the southern part of Manitoba, and by the 1940s were abundant throughout its southeastern portion. In the 1950s their population swelled in Canada. The control of coyotes in the prairie region in the 1950s may have been a factor in raccoon expansion. If their numbers are sufficient, coyotes might be able to suppress raccoon populations (though little direct evidence supports this notion). By the 1960s the raccoon had become a major predator of the canvasback ducks nesting in southwestern Manitoba. The extermination of the wolf from most of the contiguous United States may have been a critical factor in the raccoon's expansion and numerical increase. 
In the eighteenth century, when the wolf's range included almost all of North America, raccoons apparently were abundant only in the deciduous forests of the East, Gulf Coast, and Great Lakes regions, though they also extended into the wooded bottomlands of the Midwest's major rivers. In such areas, their arboreal habits and the presence of hollow den trees should have offered some protection from wolves and other large predators. Even though raccoons may not have been a significant part of their diet, wolves surely would have tried to prey on those exposed in relatively treeless areas.
3274.txt
1
[ "motivated by a desire to have raccoons among the local wildlife.", "illegal", "carried out by hunters who wanted more raccoons to hunt.", "Unsuccessful" ]
According to paragraph 3, the introduction of raccoons into Utah's Great Salt Lake Valley appears to have been an example of an introduction that was
Raccoons have a vast transcontinental distribution, occurring throughout most of North America and Central America. They are found from southern Canada all the way to Panama, as well as on islands near coastal areas. They occur in each of the 49 states of the continental United States. Although raccoons are native only to the Western Hemisphere, they have been successfully transplanted to other parts of the globe. Following a decline to a relatively low population level in the 1930s, raccoons began to prosper following their 1943 breeding season. A rapid population surge continued throughout the 1940s, and high numbers have been sustained ever since. By the late 1980s, the number of raccoons in North America was estimated to be at least 15 to 20 times the number that existed during the 1930s. By now, their numbers have undoubtedly grown even more, as they have continued to expand into new habitats where they were once either rare or absent, such as sandy prairies, deserts, coastal marshes, and mountains. Their spread throughout the Rocky Mountain West is indicative of the fast pace at which they can exploit new environments. Despite significant numbers being harvested and having suffered occasional declines, typically because of disease, the raccoon has consistently maintained high population levels. Several factors explain the raccoon's dramatic increase in abundance and distribution. First, their success has been partially attributed to the growth of cities, as they often thrive in suburban and even urban settings. Furthermore, they have been deliberately introduced throughout the continent. Within the United States, they are commonly taken from one area to another, both legally and illegally, to restock hunting areas and, presumably, because people simply want them to be part of their local fauna. Their appearance and subsequent flourishing in Utah's Great Salt Lake valley within the last 40 years appears to be from such an introduction. 
As an example of the ease with which transplanted individuals can succeed, raccoons from Indiana (midwestern United States) have reportedly been able to flourish on islands off the coast of Alaska. The raccoon's expansion in various areas may also be due to the spread of agriculture. Raccoons have been able to exploit crops, especially corn but also cereal grains, which have become dependable food sources for them. The expansion of agriculture, however, does not necessarily lead to rapid increases in their abundance. Farming in Kansas and eastern Colorado (central and western United States) proceeded rapidly in the 1870s and 1880s, but this was about 50 years before raccoons started to spread out from their major habitat, the wooded river bottomlands. They have also expanded into many areas lacking any agriculture other than grazing and into places without forests or permanent streams. Prior to Europeans settling and farming the Great Plains Region, raccoons probably were just found along its rivers and streams and in the wooded areas of its southeastern section. With the possible exception of the southern part of the province of Manitoba, their absence was notable throughout Canada. They first became more widely distributed in the southern part of Manitoba, and by the 1940s were abundant throughout its southeastern portion. In the 1950s their population swelled in Canada. The control of coyotes in the prairie region in the 1950s may have been a factor in raccoon expansion. If their numbers are sufficient, coyotes might be able to suppress raccoon populations (though little direct evidence supports this notion). By the 1960s the raccoon had become a major predator of the canvasback ducks nesting in southwestern Manitoba. The extermination of the wolf from most of the contiguous United States may have been a critical factor in the raccoon's expansion and numerical increase. 
In the eighteenth century, when the wolf's range included almost all of North America, raccoons apparently were abundant only in the deciduous forests of the East, Gulf Coast, and Great Lakes regions, though they also extended into the wooded bottomlands of the Midwest's major rivers. In such areas, their arboreal habits and the presence of hollow den trees should have offered some protection from wolves and other large predators. Even though raccoons may not have been a significant part of their diet, wolves surely would have tried to prey on those exposed in relatively treeless areas.
3274.txt
0
[ "Raccoons thrive in suburban areas.", "Hunting raccoons has become illegal in most areas.", "People enjoy having raccoons as part of their environment.", "A transplanted raccoon will generally be able to succeed in its new environment." ]
All of the following are mentioned in paragraph 3 as helping to explain the raccoon's dramatic increase in abundance and distribution EXCEPT
Raccoons have a vast transcontinental distribution, occurring throughout most of North America and Central America. They are found from southern Canada all the way to Panama, as well as on islands near coastal areas. They occur in each of the 49 states of the continental United States. Although raccoons are native only to the Western Hemisphere, they have been successfully transplanted to other parts of the globe. Following a decline to a relatively low population level in the 1930s, raccoons began to prosper following their 1943 breeding season. A rapid population surge continued throughout the 1940s, and high numbers have been sustained ever since. By the late 1980s, the number of raccoons in North America was estimated to be at least 15 to 20 times the number that existed during the 1930s. By now, their numbers have undoubtedly grown even more, as they have continued to expand into new habitats where they were once either rare or absent, such as sandy prairies, deserts, coastal marshes, and mountains. Their spread throughout the Rocky Mountain West is indicative of the fast pace at which they can exploit new environments. Despite significant numbers being harvested and having suffered occasional declines, typically because of disease, the raccoon has consistently maintained high population levels. Several factors explain the raccoon's dramatic increase in abundance and distribution. First, their success has been partially attributed to the growth of cities, as they often thrive in suburban and even urban settings. Furthermore, they have been deliberately introduced throughout the continent. Within the United States, they are commonly taken from one area to another, both legally and illegally, to restock hunting areas and, presumably, because people simply want them to be part of their local fauna. Their appearance and subsequent flourishing in Utah's Great Salt Lake valley within the last 40 years appears to be from such an introduction. 
As an example of the ease with which transplanted individuals can succeed, raccoons from Indiana (midwestern United States) have reportedly been able to flourish on islands off the coast of Alaska. The raccoon's expansion in various areas may also be due to the spread of agriculture. Raccoons have been able to exploit crops, especially corn but also cereal grains, which have become dependable food sources for them. The expansion of agriculture, however, does not necessarily lead to rapid increases in their abundance. Farming in Kansas and eastern Colorado (central and western United States) proceeded rapidly in the 1870s and 1880s, but this was about 50 years before raccoons started to spread out from their major habitat, the wooded river bottomlands. They have also expanded into many areas lacking any agriculture other than grazing and into places without forests or permanent streams. Prior to Europeans settling and farming the Great Plains Region, raccoons probably were just found along its rivers and streams and in the wooded areas of its southeastern section. With the possible exception of the southern part of the province of Manitoba, their absence was notable throughout Canada. They first became more widely distributed in the southern part of Manitoba, and by the 1940s were abundant throughout its southeastern portion. In the 1950s their population swelled in Canada. The control of coyotes in the prairie region in the 1950s may have been a factor in raccoon expansion. If their numbers are sufficient, coyotes might be able to suppress raccoon populations (though little direct evidence supports this notion). By the 1960s the raccoon had become a major predator of the canvasback ducks nesting in southwestern Manitoba. The extermination of the wolf from most of the contiguous United States may have been a critical factor in the raccoon's expansion and numerical increase. 
In the eighteenth century, when the wolf's range included almost all of North America, raccoons apparently were abundant only in the deciduous forests of the East, Gulf Coast, and Great Lakes regions, though they also extended into the wooded bottomlands of the Midwest's major rivers. In such areas, their arboreal habits and the presence of hollow den trees should have offered some protection from wolves and other large predators. Even though raccoons may not have been a significant part of their diet, wolves surely would have tried to prey on those exposed in relatively treeless areas.
3274.txt
1
[ "it has been established.", "it has been incorrectly stated.", "it can be assumed.", "it can be demonstrated." ]
The word "presumably" in the passage is closest in meaning to
Raccoons have a vast transcontinental distribution, occurring throughout most of North America and Central America. They are found from southern Canada all the way to Panama, as well as on islands near coastal areas. They occur in each of the 49 states of the continental United States. Although raccoons are native only to the Western Hemisphere, they have been successfully transplanted to other parts of the globe. Following a decline to a relatively low population level in the 1930s, raccoons began to prosper following their 1943 breeding season. A rapid population surge continued throughout the 1940s, and high numbers have been sustained ever since. By the late 1980s, the number of raccoons in North America was estimated to be at least 15 to 20 times the number that existed during the 1930s. By now, their numbers have undoubtedly grown even more, as they have continued to expand into new habitats where they were once either rare or absent, such as sandy prairies, deserts, coastal marshes, and mountains. Their spread throughout the Rocky Mountain West is indicative of the fast pace at which they can exploit new environments. Despite significant numbers being harvested and having suffered occasional declines, typically because of disease, the raccoon has consistently maintained high population levels. Several factors explain the raccoon's dramatic increase in abundance and distribution. First, their success has been partially attributed to the growth of cities, as they often thrive in suburban and even urban settings. Furthermore, they have been deliberately introduced throughout the continent. Within the United States, they are commonly taken from one area to another, both legally and illegally, to restock hunting areas and, presumably, because people simply want them to be part of their local fauna. Their appearance and subsequent flourishing in Utah's Great Salt Lake valley within the last 40 years appears to be from such an introduction. 
As an example of the ease with which transplanted individuals can succeed, raccoons from Indiana (midwestern United States) have reportedly been able to flourish on islands off the coast of Alaska. The raccoon's expansion in various areas may also be due to the spread of agriculture. Raccoons have been able to exploit crops, especially corn but also cereal grains, which have become dependable food sources for them. The expansion of agriculture, however, does not necessarily lead to rapid increases in their abundance. Farming in Kansas and eastern Colorado (central and western United States) proceeded rapidly in the 1870s and 1880s, but this was about 50 years before raccoons started to spread out from their major habitat, the wooded river bottomlands. They have also expanded into many areas lacking any agriculture other than grazing and into places without forests or permanent streams. Prior to Europeans settling and farming the Great Plains Region, raccoons probably were just found along its rivers and streams and in the wooded areas of its southeastern section. With the possible exception of the southern part of the province of Manitoba, their absence was notable throughout Canada. They first became more widely distributed in the southern part of Manitoba, and by the 1940s were abundant throughout its southeastern portion. In the 1950s their population swelled in Canada. The control of coyotes in the prairie region in the 1950s may have been a factor in raccoon expansion. If their numbers are sufficient, coyotes might be able to suppress raccoon populations (though little direct evidence supports this notion). By the 1960s the raccoon had become a major predator of the canvasback ducks nesting in southwestern Manitoba. The extermination of the wolf from most of the contiguous United States may have been a critical factor in the raccoon's expansion and numerical increase. 
In the eighteenth century, when the wolf's range included almost all of North America, raccoons apparently were abundant only in the deciduous forests of the East, Gulf Coast, and Great Lakes regions, though they also extended into the wooded bottomlands of the Midwest's major rivers. In such areas, their arboreal habits and the presence of hollow den trees should have offered some protection from wolves and other large predators. Even though raccoons may not have been a significant part of their diet, wolves surely would have tried to prey on those exposed in relatively treeless areas.
3274.txt
2
[ "The spread of agriculture destroyed some of the raccoon's natural habitats and reduced their populations in Kansas and eastern Colorado.", "Because of the availability of corn and other cereal grains as a result of the spread of agriculture.", "The spread of agriculture may have contributed to some raccoon expansion but has not always caused raccoon populations to expand.", "The spread of agriculture to Kansas and eastern Colorado brought increased raccoon populations in the 1870s and 1880s." ]
According to paragraph 4, how has the spread of agriculture affected raccoon populations?
Raccoons have a vast transcontinental distribution, occurring throughout most of North America and Central America. They are found from southern Canada all the way to Panama, as well as on islands near coastal areas. They occur in each of the 49 states of the continental United States. Although raccoons are native only to the Western Hemisphere, they have been successfully transplanted to other parts of the globe. Following a decline to a relatively low population level in the 1930s, raccoons began to prosper following their 1943 breeding season. A rapid population surge continued throughout the 1940s, and high numbers have been sustained ever since. By the late 1980s, the number of raccoons in North America was estimated to be at least 15 to 20 times the number that existed during the 1930s. By now, their numbers have undoubtedly grown even more, as they have continued to expand into new habitats where they were once either rare or absent, such as sandy prairies, deserts, coastal marshes, and mountains. Their spread throughout the Rocky Mountain West is indicative of the fast pace at which they can exploit new environments. Despite significant numbers being harvested and having suffered occasional declines, typically because of disease, the raccoon has consistently maintained high population levels. Several factors explain the raccoon's dramatic increase in abundance and distribution. First, their success has been partially attributed to the growth of cities, as they often thrive in suburban and even urban settings. Furthermore, they have been deliberately introduced throughout the continent. Within the United States, they are commonly taken from one area to another, both legally and illegally, to restock hunting areas and, presumably, because people simply want them to be part of their local fauna. Their appearance and subsequent flourishing in Utah's Great Salt Lake valley within the last 40 years appears to be from such an introduction. 
As an example of the ease with which transplanted individuals can succeed, raccoons from Indiana (midwestern United States) have reportedly been able to flourish on islands off the coast of Alaska. The raccoon's expansion in various areas may also be due to the spread of agriculture. Raccoons have been able to exploit crops, especially corn but also cereal grains, which have become dependable food sources for them. The expansion of agriculture, however, does not necessarily lead to rapid increases in their abundance. Farming in Kansas and eastern Colorado (central and western United States) proceeded rapidly in the 1870s and 1880s, but this was about 50 years before raccoons started to spread out from their major habitat, the wooded river bottomlands. They have also expanded into many areas lacking any agriculture other than grazing and into places without forests or permanent streams. Prior to Europeans settling and farming the Great Plains Region, raccoons probably were just found along its rivers and streams and in the wooded areas of its southeastern section. With the possible exception of the southern part of the province of Manitoba, their absence was notable throughout Canada. They first became more widely distributed in the southern part of Manitoba, and by the 1940s were abundant throughout its southeastern portion. In the 1950s their population swelled in Canada. The control of coyotes in the prairie region in the 1950s may have been a factor in raccoon expansion. If their numbers are sufficient, coyotes might be able to suppress raccoon populations (though little direct evidence supports this notion). By the 1960s the raccoon had become a major predator of the canvasback ducks nesting in southwestern Manitoba. The extermination of the wolf from most of the contiguous United States may have been a critical factor in the raccoon's expansion and numerical increase. 
In the eighteenth century, when the wolf's range included almost all of North America, raccoons apparently were abundant only in the deciduous forests of the East, Gulf Coast, and Great Lakes regions, though they also extended into the wooded bottomlands of the Midwest's major rivers. In such areas, their arboreal habits and the presence of hollow den trees should have offered some protection from wolves and other large predators. Even though raccoons may not have been a significant part of their diet, wolves surely would have tried to prey on those exposed in relatively treeless areas.
3274.txt
2
[ "They were widely distribute throughout the region.", "There were found mostly in areas of open prairie.", "They were not found in most of Canada.", "They had not yet reached the wooded areas of the southeastern portion of the region." ]
According to paragraph 5, what was true about raccoons before the arrival of European settlers?
Raccoons have a vast transcontinental distribution, occurring throughout most of North America and Central America. They are found from southern Canada all the way to Panama, as well as on islands near coastal areas. They occur in each of the 49 states of the continental United States. Although raccoons are native only to the Western Hemisphere, they have been successfully transplanted to other parts of the globe. Following a decline to a relatively low population level in the 1930s, raccoons began to prosper following their 1943 breeding season. A rapid population surge continued throughout the 1940s, and high numbers have been sustained ever since. By the late 1980s, the number of raccoons in North America was estimated to be at least 15 to 20 times the number that existed during the 1930s. By now, their numbers have undoubtedly grown even more, as they have continued to expand into new habitats where they were once either rare or absent, such as sandy prairies, deserts, coastal marshes, and mountains. Their spread throughout the Rocky Mountain West is indicative of the fast pace at which they can exploit new environments. Despite significant numbers being harvested and having suffered occasional declines, typically because of disease, the raccoon has consistently maintained high population levels. Several factors explain the raccoon's dramatic increase in abundance and distribution. First, their success has been partially attributed to the growth of cities, as they often thrive in suburban and even urban settings. Furthermore, they have been deliberately introduced throughout the continent. Within the United States, they are commonly taken from one area to another, both legally and illegally, to restock hunting areas and, presumably, because people simply want them to be part of their local fauna. Their appearance and subsequent flourishing in Utah's Great Salt Lake valley within the last 40 years appears to be from such an introduction. 
As an example of the ease with which transplanted individuals can succeed, raccoons from Indiana (midwestern United States) have reportedly been able to flourish on islands off the coast of Alaska. The raccoon's expansion in various areas may also be due to the spread of agriculture. Raccoons have been able to exploit crops, especially corn but also cereal grains, which have become dependable food sources for them. The expansion of agriculture, however, does not necessarily lead to rapid increases in their abundance. Farming in Kansas and eastern Colorado (central and western United States) proceeded rapidly in the 1870s and 1880s, but this was about 50 years before raccoons started to spread out from their major habitat, the wooded river bottomlands. They have also expanded into many areas lacking any agriculture other than grazing and into places without forests or permanent streams. Prior to Europeans settling and farming the Great Plains Region, raccoons probably were just found along its rivers and streams and in the wooded areas of its southeastern section. With the possible exception of the southern part of the province of Manitoba, their absence was notable throughout Canada. They first became more widely distributed in the southern part of Manitoba, and by the 1940s were abundant throughout its southeastern portion. In the 1950s their population swelled in Canada. The control of coyotes in the prairie region in the 1950s may have been a factor in raccoon expansion. If their numbers are sufficient, coyotes might be able to suppress raccoon populations (though little direct evidence supports this notion). By the 1960s the raccoon had become a major predator of the canvasback ducks nesting in southwestern Manitoba. The extermination of the wolf from most of the contiguous United States may have been a critical factor in the raccoon's expansion and numerical increase. 
In the eighteenth century, when the wolf's range included almost all of North America, raccoons apparently were abundant only in the deciduous forests of the East, Gulf Coast, and Great Lakes regions, though they also extended into the wooded bottomlands of the Midwest's major rivers. In such areas, their arboreal habits and the presence of hollow den trees should have offered some protection from wolves and other large predators. Even though raccoons may not have been a significant part of their diet, wolves surely would have tried to prey on those exposed in relatively treeless areas.
3274.txt
2
[ "In the 1950s both coyotes and raccoons increased their populations.", "Coyotes are more difficult to control than raccoons are.", "Coyotes and raccoons both tend to prefer regions that have rivers, streams, and wooded areas.", "More evidence is needed to determine if controlling coyotes contributed to raccoon expansion in the 1950s." ]
What can be concluded from the discussion in paragraph 5 about coyotes and raccoons in Manitoba?
Raccoons have a vast transcontinental distribution, occurring throughout most of North America and Central America. They are found from southern Canada all the way to Panama, as well as on islands near coastal areas. They occur in each of the 49 states of the continental United States. Although raccoons are native only to the Western Hemisphere, they have been successfully transplanted to other parts of the globe. Following a decline to a relatively low population level in the 1930s, raccoons began to prosper following their 1943 breeding season. A rapid population surge continued throughout the 1940s, and high numbers have been sustained ever since. By the late 1980s, the number of raccoons in North America was estimated to be at least 15 to 20 times the number that existed during the 1930s. By now, their numbers have undoubtedly grown even more, as they have continued to expand into new habitats where they were once either rare or absent, such as sandy prairies, deserts, coastal marshes, and mountains. Their spread throughout the Rocky Mountain West is indicative of the fast pace at which they can exploit new environments. Despite significant numbers being harvested and having suffered occasional declines, typically because of disease, the raccoon has consistently maintained high population levels. Several factors explain the raccoon's dramatic increase in abundance and distribution. First, their success has been partially attributed to the growth of cities, as they often thrive in suburban and even urban settings. Furthermore, they have been deliberately introduced throughout the continent. Within the United States, they are commonly taken from one area to another, both legally and illegally, to restock hunting areas and, presumably, because people simply want them to be part of their local fauna. Their appearance and subsequent flourishing in Utah's Great Salt Lake valley within the last 40 years appears to be from such an introduction. 
As an example of the ease with which transplanted individuals can succeed, raccoons from Indiana (midwestern United States) have reportedly been able to flourish on islands off the coast of Alaska. The raccoon's expansion in various areas may also be due to the spread of agriculture. Raccoons have been able to exploit crops, especially corn but also cereal grains, which have become dependable food sources for them. The expansion of agriculture, however, does not necessarily lead to rapid increases in their abundance. Farming in Kansas and eastern Colorado (central and western United States) proceeded rapidly in the 1870s and 1880s, but this was about 50 years before raccoons started to spread out from their major habitat, the wooded river bottomlands. They have also expanded into many areas lacking any agriculture other than grazing and into places without forests or permanent streams. Prior to Europeans settling and farming the Great Plains Region, raccoons probably were just found along its rivers and streams and in the wooded areas of its southeastern section. With the possible exception of the southern part of the province of Manitoba, their absence was notable throughout Canada. They first became more widely distributed in the southern part of Manitoba, and by the 1940s were abundant throughout its southeastern portion. In the 1950s their population swelled in Canada. The control of coyotes in the prairie region in the 1950s may have been a factor in raccoon expansion. If their numbers are sufficient, coyotes might be able to suppress raccoon populations (though little direct evidence supports this notion). By the 1960s the raccoon had become a major predator of the canvasback ducks nesting in southwestern Manitoba. The extermination of the wolf from most of the contiguous United States may have been a critical factor in the raccoon's expansion and numerical increase. 
In the eighteenth century, when the wolf's range included almost all of North America, raccoons apparently were abundant only in the deciduous forests of the East, Gulf Coast, and Great Lakes regions, though they also extended into the wooded bottomlands of the Midwest's major rivers. In such areas, their arboreal habits and the presence of hollow den trees should have offered some protection from wolves and other large predators. Even though raccoons may not have been a significant part of their diet, wolves surely would have tried to prey on those exposed in relatively treeless areas.
3274.txt
3
[ "predictable", "crucial", "negative", "Contributing" ]
The word "critical" in the passage is closest in meaning to
Raccoons have a vast transcontinental distribution, occurring throughout most of North America and Central America. They are found from southern Canada all the way to Panama, as well as on islands near coastal areas. They occur in each of the 49 states of the continental United States. Although raccoons are native only to the Western Hemisphere, they have been successfully transplanted to other parts of the globe. Following a decline to a relatively low population level in the 1930s, raccoons began to prosper following their 1943 breeding season. A rapid population surge continued throughout the 1940s, and high numbers have been sustained ever since. By the late 1980s, the number of raccoons in North America was estimated to be at least 15 to 20 times the number that existed during the 1930s. By now, their numbers have undoubtedly grown even more, as they have continued to expand into new habitats where they were once either rare or absent, such as sandy prairies, deserts, coastal marshes, and mountains. Their spread throughout the Rocky Mountain West is indicative of the fast pace at which they can exploit new environments. Despite significant numbers being harvested and having suffered occasional declines, typically because of disease, the raccoon has consistently maintained high population levels. Several factors explain the raccoon's dramatic increase in abundance and distribution. First, their success has been partially attributed to the growth of cities, as they often thrive in suburban and even urban settings. Furthermore, they have been deliberately introduced throughout the continent. Within the United States, they are commonly taken from one area to another, both legally and illegally, to restock hunting areas and, presumably, because people simply want them to be part of their local fauna. Their appearance and subsequent flourishing in Utah's Great Salt Lake valley within the last 40 years appears to be from such an introduction. 
As an example of the ease with which transplanted individuals can succeed, raccoons from Indiana (midwestern United States) have reportedly been able to flourish on islands off the coast of Alaska. The raccoon's expansion in various areas may also be due to the spread of agriculture. Raccoons have been able to exploit crops, especially corn but also cereal grains, which have become dependable food sources for them. The expansion of agriculture, however, does not necessarily lead to rapid increases in their abundance. Farming in Kansas and eastern Colorado (central and western United States) proceeded rapidly in the 1870s and 1880s, but this was about 50 years before raccoons started to spread out from their major habitat, the wooded river bottomlands. They have also expanded into many areas lacking any agriculture other than grazing and into places without forests or permanent streams. Prior to Europeans settling and farming the Great Plains Region, raccoons probably were just found along its rivers and streams and in the wooded areas of its southeastern section. With the possible exception of the southern part of the province of Manitoba, their absence was notable throughout Canada. They first became more widely distributed in the southern part of Manitoba, and by the 1940s were abundant throughout its southeastern portion. In the 1950s their population swelled in Canada. The control of coyotes in the prairie region in the 1950s may have been a factor in raccoon expansion. If their numbers are sufficient, coyotes might be able to suppress raccoon populations (though little direct evidence supports this notion). By the 1960s the raccoon had become a major predator of the canvasback ducks nesting in southwestern Manitoba. The extermination of the wolf from most of the contiguous United States may have been a critical factor in the raccoon's expansion and numerical increase. 
In the eighteenth century, when the wolf's range included almost all of North America, raccoons apparently were abundant only in the deciduous forests of the East, Gulf Coast, and Great Lakes regions, though they also extended into the wooded bottomlands of the Midwest's major rivers. In such areas, their arboreal habits and the presence of hollow den trees should have offered some protection from wolves and other large predators. Even though raccoons may not have been a significant part of their diet, wolves surely would have tried to prey on those exposed in relatively treeless areas.
3274.txt
1
[ "were relatively safe from conflict with humans.", "had little trouble finding sufficient food.", "had some protection from wolves.", "could find a varied diet or prey." ]
According to paragraph 6, during the eighteenth century, raccoons were abundant only in forests and wooded bottomlands of major rivers mainly because those were the only places where raccoons
Raccoons have a vast transcontinental distribution, occurring throughout most of North America and Central America. They are found from southern Canada all the way to Panama, as well as on islands near coastal areas. They occur in each of the 49 states of the continental United States. Although raccoons are native only to the Western Hemisphere, they have been successfully transplanted to other parts of the globe. Following a decline to a relatively low population level in the 1930s, raccoons began to prosper following their 1943 breeding season. A rapid population surge continued throughout the 1940s, and high numbers have been sustained ever since. By the late 1980s, the number of raccoons in North America was estimated to be at least 15 to 20 times the number that existed during the 1930s. By now, their numbers have undoubtedly grown even more, as they have continued to expand into new habitats where they were once either rare or absent, such as sandy prairies, deserts, coastal marshes, and mountains. Their spread throughout the Rocky Mountain West is indicative of the fast pace at which they can exploit new environments. Despite significant numbers being harvested and having suffered occasional declines, typically because of disease, the raccoon has consistently maintained high population levels. Several factors explain the raccoon's dramatic increase in abundance and distribution. First, their success has been partially attributed to the growth of cities, as they often thrive in suburban and even urban settings. Furthermore, they have been deliberately introduced throughout the continent. Within the United States, they are commonly taken from one area to another, both legally and illegally, to restock hunting areas and, presumably, because people simply want them to be part of their local fauna. Their appearance and subsequent flourishing in Utah's Great Salt Lake valley within the last 40 years appears to be from such an introduction. 
As an example of the ease with which transplanted individuals can succeed, raccoons from Indiana (midwestern United States) have reportedly been able to flourish on islands off the coast of Alaska. The raccoon's expansion in various areas may also be due to the spread of agriculture. Raccoons have been able to exploit crops, especially corn but also cereal grains, which have become dependable food sources for them. The expansion of agriculture, however, does not necessarily lead to rapid increases in their abundance. Farming in Kansas and eastern Colorado (central and western United States) proceeded rapidly in the 1870s and 1880s, but this was about 50 years before raccoons started to spread out from their major habitat, the wooded river bottomlands. They have also expanded into many areas lacking any agriculture other than grazing and into places without forests or permanent streams. Prior to Europeans settling and farming the Great Plains Region, raccoons probably were just found along its rivers and streams and in the wooded areas of its southeastern section. With the possible exception of the southern part of the province of Manitoba, their absence was notable throughout Canada. They first became more widely distributed in the southern part of Manitoba, and by the 1940s were abundant throughout its southeastern portion. In the 1950s their population swelled in Canada. The control of coyotes in the prairie region in the 1950s may have been a factor in raccoon expansion. If their numbers are sufficient, coyotes might be able to suppress raccoon populations (though little direct evidence supports this notion). By the 1960s the raccoon had become a major predator of the canvasback ducks nesting in southwestern Manitoba. The extermination of the wolf from most of the contiguous United States may have been a critical factor in the raccoon's expansion and numerical increase. 
In the eighteenth century, when the wolf's range included almost all of North America, raccoons apparently were abundant only in the deciduous forests of the East, Gulf Coast, and Great Lakes regions, though they also extended into the wooded bottomlands of the Midwest's major rivers. In such areas, their arboreal habits and the presence of hollow den trees should have offered some protection from wolves and other large predators. Even though raccoons may not have been a significant part of their diet, wolves surely would have tried to prey on those exposed in relatively treeless areas.
3274.txt
2
[ "there is no great demand for chips.", "the technology advances fast.", "the demand is periodic.", "the production of silicon is unstable." ]
Profits in the chip industry are unsteady because _
The main ingredient of a semiconductor is silicon, but it might as well be pyrite, or fool's gold. That is because consistently making money out of chips is notoriously difficult. Cyclical demand means that profits are volatile, and new kinds of chips quickly become commoditised. The business is also highly capital-intensive: a new fabrication plant, or fab, costs $3 billion-5 billion, and new facilities must be built every few years as technology advances. Accordingly, many Western technology firms, such as Philips, Hewlett-Packard, Motorola and Siemens, long ago spun off their chipmaking units in order to focus on the final products, rather than the bits inside them. Japan's huge electronics conglomerates have largely resisted this "fab lite" strategy. This now seems to be changing, though the companies' willingness to let go fully is still in doubt. On October 18th Sony said it would put its processor-chip division into a joint venture with Toshiba, which will also buy Sony's chipmaking facilities. Sony will no longer have to make huge investments in chip technology, and will still be sure of a supply of processors for its PlayStation 3 games consoles and other products. Its chip division lost ¥10 billion ($90m) last year, and the company has been getting rid of non-core businesses. Last month it floated its financial-services arm, raising nearly $3 billion. For its part Toshiba, one of the world's biggest chipmakers, will gain economies of scale. Sanyo, another Japanese electronics firm, had hoped to do something similar. But its plan to sell its semiconductor unit for nearly $1 billion to Advantage Partners, a private-equity fund, fell through on October 16th. Sanyo is owned by a number of investors, including Goldman Sachs, which are doing their best to revive the struggling company by slimming it down to focus on its solar-panel and battery businesses. 
But the banks financing the purchase of Sanyo's chip unit balked at the price and at Advantage's plan to retain the existing management. Sanyo's announcement that it would keep the unit sent its share price plummeting. Meanwhile, NEC, which in 2002 turned its chip business into a separate, publicly listed subsidiary, NEC Electronics, has spurned an offer from Perry Capital, a New York fund. Perry is willing to pay $1.3 billion to raise its stake from roughly 5% to 30%, on the condition that NEC relinquishes control of the chipmaker's board. (That works out at a premium of 60% over the average share price in the past three months.) Now Perry is quietly trying to convince other shareholders of the merits of its offer. Taken together, this action (and inaction) adds up to a test of the willingness of managers at Japan's electronics firms to take rational but uncomfortable decisions. Spin-offs make sense because there are too many firms doing the same thing on too small a scale, and the need to finance new fabs is a drag on the firms' main businesses. In Japan, however, corporate pride often trumps economic logic. Electronics giants are used to being diversified and vertically integrated: they regard selling a subsidiary as akin to amputating an arm. Still, some now see the need for surgery.
3535.txt
2
[ "Japan's electronics conglomerates are now beginning to consider implementing this strategy.", "The attitude of Japan's electronics conglomerates towards the strategy is still quite dubious.", "The strategy is beneficial in helping companies focus their manufacturing more on final products.", "The strategy is applicable to the capital-intensive industry." ]
Which one of the following statements is TRUE of the "fab lite" strategy?
The main ingredient of a semiconductor is silicon, but it might as well be pyrite, or fool's gold. That is because consistently making money out of chips is notoriously difficult. Cyclical demand means that profits are volatile, and new kinds of chips quickly become commoditised. The business is also highly capital-intensive: a new fabrication plant, or fab, costs $3 billion-5 billion, and new facilities must be built every few years as technology advances. Accordingly, many Western technology firms, such as Philips, Hewlett-Packard, Motorola and Siemens, long ago spun off their chipmaking units in order to focus on the final products, rather than the bits inside them. Japan's huge electronics conglomerates have largely resisted this "fab lite" strategy. This now seems to be changing, though the companies' willingness to let go fully is still in doubt. On October 18th Sony said it would put its processor-chip division into a joint venture with Toshiba, which will also buy Sony's chipmaking facilities. Sony will no longer have to make huge investments in chip technology, and will still be sure of a supply of processors for its PlayStation 3 games consoles and other products. Its chip division lost ¥10 billion ($90m) last year, and the company has been getting rid of non-core businesses. Last month it floated its financial-services arm, raising nearly $3 billion. For its part Toshiba, one of the world's biggest chipmakers, will gain economies of scale. Sanyo, another Japanese electronics firm, had hoped to do something similar. But its plan to sell its semiconductor unit for nearly $1 billion to Advantage Partners, a private-equity fund, fell through on October 16th. Sanyo is owned by a number of investors, including Goldman Sachs, which are doing their best to revive the struggling company by slimming it down to focus on its solar-panel and battery businesses. 
But the banks financing the purchase of Sanyo's chip unit balked at the price and at Advantage's plan to retain the existing management. Sanyo's announcement that it would keep the unit sent its share price plummeting. Meanwhile, NEC, which in 2002 turned its chip business into a separate, publicly listed subsidiary, NEC Electronics, has spurned an offer from Perry Capital, a New York fund. Perry is willing to pay $1.3 billion to raise its stake from roughly 5% to 30%, on the condition that NEC relinquishes control of the chipmaker's board. (That works out at a premium of 60% over the average share price in the past three months.) Now Perry is quietly trying to convince other shareholders of the merits of its offer. Taken together, this action (and inaction) adds up to a test of the willingness of managers at Japan's electronics firms to take rational but uncomfortable decisions. Spin-offs make sense because there are too many firms doing the same thing on too small a scale, and the need to finance new fabs is a drag on the firms' main businesses. In Japan, however, corporate pride often trumps economic logic. Electronics giants are used to being diversified and vertically integrated: they regard selling a subsidiary as akin to amputating an arm. Still, some now see the need for surgery.
3535.txt
0
[ "float its financial-services arm by saving huge investments in chipmaking.", "help Toshiba's strategy of gaining economies of scale.", "focus on its main business by cutting off non-core branches.", "supply processors for its PlayStation 3 games consoles." ]
Sony hands over its processor-chip division to Toshiba in order to _
The main ingredient of a semiconductor is silicon, but it might as well be pyrite, or fool's gold. That is because consistently making money out of chips is notoriously difficult. Cyclical demand means that profits are volatile, and new kinds of chips quickly become commoditised. The business is also highly capital-intensive: a new fabrication plant, or fab, costs $3 billion-5 billion, and new facilities must be built every few years as technology advances. Accordingly, many Western technology firms, such as Philips, Hewlett-Packard, Motorola and Siemens, long ago spun off their chipmaking units in order to focus on the final products, rather than the bits inside them. Japan's huge electronics conglomerates have largely resisted this "fab lite" strategy. This now seems to be changing, though the companies' willingness to let go fully is still in doubt. On October 18th Sony said it would put its processor-chip division into a joint venture with Toshiba, which will also buy Sony's chipmaking facilities. Sony will no longer have to make huge investments in chip technology, and will still be sure of a supply of processors for its PlayStation 3 games consoles and other products. Its chip division lost ¥10 billion ($90m) last year, and the company has been getting rid of non-core businesses. Last month it floated its financial-services arm, raising nearly $3 billion. For its part Toshiba, one of the world's biggest chipmakers, will gain economies of scale. Sanyo, another Japanese electronics firm, had hoped to do something similar. But its plan to sell its semiconductor unit for nearly $1 billion to Advantage Partners, a private-equity fund, fell through on October 16th. Sanyo is owned by a number of investors, including Goldman Sachs, which are doing their best to revive the struggling company by slimming it down to focus on its solar-panel and battery businesses. 
But the banks financing the purchase of Sanyo's chip unit balked at the price and at Advantage's plan to retain the existing management. Sanyo's announcement that it would keep the unit sent its share price plummeting. Meanwhile, NEC, which in 2002 turned its chip business into a separate, publicly listed subsidiary, NEC Electronics, has spurned an offer from Perry Capital, a New York fund. Perry is willing to pay $1.3 billion to raise its stake from roughly 5% to 30%, on the condition that NEC relinquishes control of the chipmaker's board. (That works out at a premium of 60% over the average share price in the past three months.) Now Perry is quietly trying to convince other shareholders of the merits of its offer. Taken together, this action (and inaction) adds up to a test of the willingness of managers at Japan's electronics firms to take rational but uncomfortable decisions. Spin-offs make sense because there are too many firms doing the same thing on too small a scale, and the need to finance new fabs is a drag on the firms' main businesses. In Japan, however, corporate pride often trumps economic logic. Electronics giants are used to being diversified and vertically integrated: they regard selling a subsidiary as akin to amputating an arm. Still, some now see the need for surgery.
3535.txt
2
[ "the high price it charged.", "the dispute among the investors, some of whom are making great efforts to revive the company.", "the difficulty of retaining the existing management of the chip unit.", "the sudden fall of its share price." ]
Sanyo's plan to sell its semiconductor unit miscarried mainly due to _
The main ingredient of a semiconductor is silicon, but it might as well be pyrite, or fool's gold. That is because consistently making money out of chips is notoriously difficult. Cyclical demand means that profits are volatile, and new kinds of chips quickly become commoditised. The business is also highly capital-intensive: a new fabrication plant, or fab, costs $3 billion-5 billion, and new facilities must be built every few years as technology advances. Accordingly, many Western technology firms, such as Philips, Hewlett-Packard, Motorola and Siemens, long ago spun off their chipmaking units in order to focus on the final products, rather than the bits inside them. Japan's huge electronics conglomerates have largely resisted this "fab lite" strategy. This now seems to be changing, though the companies' willingness to let go fully is still in doubt. On October 18th Sony said it would put its processor-chip division into a joint venture with Toshiba, which will also buy Sony's chipmaking facilities. Sony will no longer have to make huge investments in chip technology, and will still be sure of a supply of processors for its PlayStation 3 games consoles and other products. Its chip division lost ¥10 billion ($90m) last year, and the company has been getting rid of non-core businesses. Last month it floated its financial-services arm, raising nearly $3 billion. For its part Toshiba, one of the world's biggest chipmakers, will gain economies of scale. Sanyo, another Japanese electronics firm, had hoped to do something similar. But its plan to sell its semiconductor unit for nearly $1 billion to Advantage Partners, a private-equity fund, fell through on October 16th. Sanyo is owned by a number of investors, including Goldman Sachs, which are doing their best to revive the struggling company by slimming it down to focus on its solar-panel and battery businesses. 
But the banks financing the purchase of Sanyo's chip unit balked at the price and at Advantage's plan to retain the existing management. Sanyo's announcement that it would keep the unit sent its share price plummeting. Meanwhile, NEC, which in 2002 turned its chip business into a separate, publicly listed subsidiary, NEC Electronics, has spurned an offer from Perry Capital, a New York fund. Perry is willing to pay $1.3 billion to raise its stake from roughly 5% to 30%, on the condition that NEC relinquishes control of the chipmaker's board. (That works out at a premium of 60% over the average share price in the past three months.) Now Perry is quietly trying to convince other shareholders of the merits of its offer. Taken together, this action (and inaction) adds up to a test of the willingness of managers at Japan's electronics firms to take rational but uncomfortable decisions. Spin-offs make sense because there are too many firms doing the same thing on too small a scale, and the need to finance new fabs is a drag on the firms' main businesses. In Japan, however, corporate pride often trumps economic logic. Electronics giants are used to being diversified and vertically integrated: they regard selling a subsidiary as akin to amputating an arm. Still, some now see the need for surgery.
3535.txt
0
[ "the bid by Perry is not ideal.", "it is afraid of losing its chip unit.", "the condition raised by Perry is unreasonable.", "it regards its chip unit as an inseparable part." ]
NEC refused the offer from Perry Capital because _
The main ingredient of a semiconductor is silicon, but it might as well be pyrite, or fool's gold. That is because consistently making money out of chips is notoriously difficult. Cyclical demand means that profits are volatile, and new kinds of chips quickly become commoditised. The business is also highly capital-intensive: a new fabrication plant, or fab, costs $3 billion-5 billion, and new facilities must be built every few years as technology advances. Accordingly, many Western technology firms, such as Philips, Hewlett-Packard, Motorola and Siemens, long ago spun off their chipmaking units in order to focus on the final products, rather than the bits inside them. Japan's huge electronics conglomerates have largely resisted this "fab lite" strategy. This now seems to be changing, though the companies' willingness to let go fully is still in doubt. On October 18th Sony said it would put its processor-chip division into a joint venture with Toshiba, which will also buy Sony's chipmaking facilities. Sony will no longer have to make huge investments in chip technology, and will still be sure of a supply of processors for its PlayStation 3 games consoles and other products. Its chip division lost ¥10 billion ($90m) last year, and the company has been getting rid of non-core businesses. Last month it floated its financial-services arm, raising nearly $3 billion. For its part Toshiba, one of the world's biggest chipmakers, will gain economies of scale. Sanyo, another Japanese electronics firm, had hoped to do something similar. But its plan to sell its semiconductor unit for nearly $1 billion to Advantage Partners, a private-equity fund, fell through on October 16th. Sanyo is owned by a number of investors, including Goldman Sachs, which are doing their best to revive the struggling company by slimming it down to focus on its solar-panel and battery businesses. 
But the banks financing the purchase of Sanyo's chip unit balked at the price and at Advantage's plan to retain the existing management. Sanyo's announcement that it would keep the unit sent its share price plummeting. Meanwhile, NEC, which in 2002 turned its chip business into a separate, publicly listed subsidiary, NEC Electronics, has spurned an offer from Perry Capital, a New York fund. Perry is willing to pay $1.3 billion to raise its stake from roughly 5% to 30%, on the condition that NEC relinquishes control of the chipmaker's board. (That works out at a premium of 60% over the average share price in the past three months.) Now Perry is quietly trying to convince other shareholders of the merits of its offer. Taken together, this action (and inaction) adds up to a test of the willingness of managers at Japan's electronics firms to take rational but uncomfortable decisions. Spin-offs make sense because there are too many firms doing the same thing on too small a scale, and the need to finance new fabs is a drag on the firms' main businesses. In Japan, however, corporate pride often trumps economic logic. Electronics giants are used to being diversified and vertically integrated: they regard selling a subsidiary as akin to amputating an arm. Still, some now see the need for surgery.
3535.txt
3
[ "They discovered a new kind of clay.", "They were compensating for the loss of an overseas supplier.", "They studied new techniques in Europe.", "The pottery they had been producing was not very strong." ]
Why did the potters discussed in the passage change the kind of pottery they made?
In the North American colonies, red ware, a simple pottery fired at low temperatures, and stone ware, a strong, impervious grey pottery fired at high temperatures, were produced from two different native clays. These kinds of pottery were produced to supplement imported European pottery. When the American Revolution (1775-1783) interrupted the flow of the superior European ware, there was incentive for American potters to replace the imports with comparable domestic goods. Stoneware, which had been simple, utilitarian kitchenware, grew increasingly ornate throughout the nineteenth century, and in addition to the earlier scratched and drawn designs, three-dimensional molded relief decoration became popular. Representational motifs largely replaced the earlier abstract decorations. Birds and flowers were particularly evident, but other subjects - lions, flags, and clipper ships - are found. Some figurines, mainly of dogs and lions, were made in this medium. Sometimes a name, usually that of the potter, was die-stamped onto a piece. As more and more large kilns were built to create the high-fired stoneware, experiments revealed that the same clay used to produce low-fired red ware could produce a stronger, paler pottery if fired at a hotter temperature. The result was yellow ware, used largely for serviceable items; but a further development was Rockingham ware - one of the most important American ceramics of the nineteenth century. (The name of the ware was probably derived from its resemblance to English brown-glazed earthenware made in South Yorkshire.) It was created by adding a brown glaze to the fired clay, usually giving the finished product a mottled appearance. Various methods of spattering or sponging the glaze onto the ware account for the extremely wide variations in color and add to the interest of collecting Rockingham. 
An advanced form of Rockingham was flint enamel, created by dusting metallic powders onto the Rockingham glaze to produce brilliant varicolored streaks. Articles for nearly every household activity and ornament could be bought in Rockingham ware: dishes and bowls, of course; also bedpans, foot warmers, cuspidors, lamp bases, doorknobs, molds, picture frames, even curtain tiebacks. All these items are highly collectible today and are eagerly sought. A few Rockingham specialties command particular affection among collectors and correspondingly high prices.
362.txt
1
[ "elaborate", "puzzling", "durable", "common" ]
The word "ornate" in line 7 is closest in meaning to
In the North American colonies, red ware, a simple pottery fired at low temperatures, and stone ware, a strong, impervious grey pottery fired at high temperatures, were produced from two different native clays. These kinds of pottery were produced to supplement imported European pottery. When the American Revolution (1775-1783) interrupted the flow of the superior European ware, there was incentive for American potters to replace the imports with comparable domestic goods. Stoneware, which had been simple, utilitarian kitchenware, grew increasingly ornate throughout the nineteenth century, and in addition to the earlier scratched and drawn designs, three-dimensional molded relief decoration became popular. Representational motifs largely replaced the earlier abstract decorations. Birds and flowers were particularly evident, but other subjects - lions, flags, and clipper ships - are found. Some figurines, mainly of dogs and lions, were made in this medium. Sometimes a name, usually that of the potter, was die-stamped onto a piece. As more and more large kilns were built to create the high-fired stoneware, experiments revealed that the same clay used to produce low-fired red ware could produce a stronger, paler pottery if fired at a hotter temperature. The result was yellow ware, used largely for serviceable items; but a further development was Rockingham ware - one of the most important American ceramics of the nineteenth century. (The name of the ware was probably derived from its resemblance to English brown-glazed earthenware made in South Yorkshire.) It was created by adding a brown glaze to the fired clay, usually giving the finished product a mottled appearance. Various methods of spattering or sponging the glaze onto the ware account for the extremely wide variations in color and add to the interest of collecting Rockingham. 
An advanced form of Rockingham was flint enamel, created by dusting metallic powders onto the Rockingham glaze to produce brilliant varicolored streaks. Articles for nearly every household activity and ornament could be bought in Rockingham ware: dishes and bowls, of course; also bedpans, foot warmers, cuspidors, lamp bases, doorknobs, molds, picture frames, even curtain tiebacks. All these items are highly collectible today and are eagerly sought. A few Rockingham specialties command particular affection among collectors and correspondingly high prices.
362.txt
0
[ "was decorated with simple, abstract designs", "used three-dimensional decorations", "was valued for its fancy decorations", "had no decoration" ]
The passage suggests that the earliest stoneware
In the North American colonies, red ware, a simple pottery fired at low temperatures, and stone ware, a strong, impervious grey pottery fired at high temperatures, were produced from two different native clays. These kinds of pottery were produced to supplement imported European pottery. When the American Revolution (1775-1783) interrupted the flow of the superior European ware, there was incentive for American potters to replace the imports with comparable domestic goods. Stoneware, which had been simple, utilitarian kitchenware, grew increasingly ornate throughout the nineteenth century, and in addition to the earlier scratched and drawn designs, three-dimensional molded relief decoration became popular. Representational motifs largely replaced the earlier abstract decorations. Birds and flowers were particularly evident, but other subjects - lions, flags, and clipper ships - are found. Some figurines, mainly of dogs and lions, were made in this medium. Sometimes a name, usually that of the potter, was die-stamped onto a piece. As more and more large kilns were built to create the high-fired stoneware, experiments revealed that the same clay used to produce low-fired red ware could produce a stronger, paler pottery if fired at a hotter temperature. The result was yellow ware, used largely for serviceable items; but a further development was Rockingham ware - one of the most important American ceramics of the nineteenth century. (The name of the ware was probably derived from its resemblance to English brown-glazed earthenware made in South Yorkshire.) It was created by adding a brown glaze to the fired clay, usually giving the finished product a mottled appearance. Various methods of spattering or sponging the glaze onto the ware account for the extremely wide variations in color and add to the interest of collecting Rockingham. 
An advanced form of Rockingham was flint enamel, created by dusting metallic powders onto the Rockingham glaze to produce brilliant varicolored streaks. Articles for nearly every household activity and ornament could be bought in Rockingham ware: dishes and bowls, of course; also bedpans, foot warmers, cuspidors, lamp bases, doorknobs, molds, picture frames, even curtain tiebacks. All these items are highly collectible today and are eagerly sought. A few Rockingham specialties command particular affection among collectors and correspondingly high prices.
362.txt
0
[ "by sponging on a glaze", "by dusting on metallic powders", "by brown-glazing", "by firing at a high temperature" ]
How did yellow ware achieve its distinctive color?
In the North American colonies, red ware, a simple pottery fired at low temperatures, and stone ware, a strong, impervious grey pottery fired at high temperatures, were produced from two different native clays. These kinds of pottery were produced to supplement imported European pottery. When the American Revolution (1775-1783) interrupted the flow of the superior European ware, there was incentive for American potters to replace the imports with comparable domestic goods. Stoneware, which had been simple, utilitarian kitchenware, grew increasingly ornate throughout the nineteenth century, and in addition to the earlier scratched and drawn designs, three-dimensional molded relief decoration became popular. Representational motifs largely replaced the earlier abstract decorations. Birds and flowers were particularly evident, but other subjects - lions, flags, and clipper ships - are found. Some figurines, mainly of dogs and lions, were made in this medium. Sometimes a name, usually that of the potter, was die-stamped onto a piece. As more and more large kilns were built to create the high-fired stoneware, experiments revealed that the same clay used to produce low-fired red ware could produce a stronger, paler pottery if fired at a hotter temperature. The result was yellow ware, used largely for serviceable items; but a further development was Rockingham ware - one of the most important American ceramics of the nineteenth century. (The name of the ware was probably derived from its resemblance to English brown-glazed earthenware made in South Yorkshire.) It was created by adding a brown glaze to the fired clay, usually giving the finished product a mottled appearance. Various methods of spattering or sponging the glaze onto the ware account for the extremely wide variations in color and add to the interest of collecting Rockingham. 
An advanced form of Rockingham was flint enamel, created by dusting metallic powders onto the Rockingham glaze to produce brilliant varicolored streaks. Articles for nearly every household activity and ornament could be bought in Rockingham ware: dishes and bowls, of course; also bedpans, foot warmers, cuspidors, lamp bases, doorknobs, molds, picture frames, even curtain tiebacks. All these items are highly collectible today and are eagerly sought. A few Rockingham specialties command particular affection among collectors and correspondingly high prices.
362.txt
3
[ "ruined by", "warned against", "based on", "sold by" ]
The phrase "derived from" in line 19 is closest in meaning to
In the North American colonies, red ware, a simple pottery fired at low temperatures, and stone ware, a strong, impervious grey pottery fired at high temperatures, were produced from two different native clays. These kinds of pottery were produced to supplement imported European pottery. When the American Revolution (1775-1783) interrupted the flow of the superior European ware, there was incentive for American potters to replace the imports with comparable domestic goods. Stoneware, which had been simple, utilitarian kitchenware, grew increasingly ornate throughout the nineteenth century, and in addition to the earlier scratched and drawn designs, three-dimensional molded relief decoration became popular. Representational motifs largely replaced the earlier abstract decorations. Birds and flowers were particularly evident, but other subjects - lions, flags, and clipper ships - are found. Some figurines, mainly of dogs and lions, were made in this medium. Sometimes a name, usually that of the potter, was die-stamped onto a piece. As more and more large kilns were built to create the high-fired stoneware, experiments revealed that the same clay used to produce low-fired red ware could produce a stronger, paler pottery if fired at a hotter temperature. The result was yellow ware, used largely for serviceable items; but a further development was Rockingham ware - one of the most important American ceramics of the nineteenth century. (The name of the ware was probably derived from its resemblance to English brown-glazed earthenware made in South Yorkshire.) It was created by adding a brown glaze to the fired clay, usually giving the finished product a mottled appearance. Various methods of spattering or sponging the glaze onto the ware account for the extremely wide variations in color and add to the interest of collecting Rockingham. 
An advanced form of Rockingham was flint enamel, created by dusting metallic powders onto the Rockingham glaze to produce brilliant varicolored streaks. Articles for nearly every household activity and ornament could be bought in Rockingham ware: dishes and bowls, of course; also bedpans, foot warmers, cuspidors, lamp bases, doorknobs, molds, picture frames, even curtain tiebacks. All these items are highly collectible today and are eagerly sought. A few Rockingham specialties command particular affection among collectors and correspondingly high prices.
362.txt
2
[ "red ware", "yellow ware", "Rockingham ware", "English brown-glazed earthenware" ]
The word "It" in line 20 refers to
In the North American colonies, red ware, a simple pottery fired at low temperatures, and stone ware, a strong, impervious grey pottery fired at high temperatures, were produced from two different native clays. These kinds of pottery were produced to supplement imported European pottery. When the American Revolution (1775-1783) interrupted the flow of the superior European ware, there was incentive for American potters to replace the imports with comparable domestic goods. Stoneware, which had been simple, utilitarian kitchenware, grew increasingly ornate throughout the nineteenth century, and in addition to the earlier scratched and drawn designs, three-dimensional molded relief decoration became popular. Representational motifs largely replaced the earlier abstract decorations. Birds and flowers were particularly evident, but other subjects - lions, flags, and clipper ships - are found. Some figurines, mainly of dogs and lions, were made in this medium. Sometimes a name, usually that of the potter, was die-stamped onto a piece. As more and more large kilns were built to create the high-fired stoneware, experiments revealed that the same clay used to produce low-fired red ware could produce a stronger, paler pottery if fired at a hotter temperature. The result was yellow ware, used largely for serviceable items; but a further development was Rockingham ware - one of the most important American ceramics of the nineteenth century. (The name of the ware was probably derived from its resemblance to English brown-glazed earthenware made in South Yorkshire.) It was created by adding a brown glaze to the fired clay, usually giving the finished product a mottled appearance. Various methods of spattering or sponging the glaze onto the ware account for the extremely wide variations in color and add to the interest of collecting Rockingham. 
An advanced form of Rockingham was flint enamel, created by dusting metallic powders onto the Rockingham glaze to produce brilliant varicolored streaks. Articles for nearly every household activity and ornament could be bought in Rockingham ware: dishes and bowls, of course; also bedpans, foot warmers, cuspidors, lamp bases, doorknobs, molds, picture frames, even curtain tiebacks. All these items are highly collectible today and are eagerly sought. A few Rockingham specialties command particular affection among collectors and correspondingly high prices.
362.txt
2
[ "complicated", "accepted", "careful", "different" ]
The word "Various" in line 21 is closest in meaning to
In the North American colonies, red ware, a simple pottery fired at low temperatures, and stone ware, a strong, impervious grey pottery fired at high temperatures, were produced from two different native clays. These kinds of pottery were produced to supplement imported European pottery. When the American Revolution (1775-1783) interrupted the flow of the superior European ware, there was incentive for American potters to replace the imports with comparable domestic goods. Stoneware, which had been simple, utilitarian kitchenware, grew increasingly ornate throughout the nineteenth century, and in addition to the earlier scratched and drawn designs, three-dimensional molded relief decoration became popular. Representational motifs largely replaced the earlier abstract decorations. Birds and flowers were particularly evident, but other subjects - lions, flags, and clipper ships - are found. Some figurines, mainly of dogs and lions, were made in this medium. Sometimes a name, usually that of the potter, was die-stamped onto a piece. As more and more large kilns were built to create the high-fired stoneware, experiments revealed that the same clay used to produce low-fired red ware could produce a stronger, paler pottery if fired at a hotter temperature. The result was yellow ware, used largely for serviceable items; but a further development was Rockingham ware - one of the most important American ceramics of the nineteenth century. (The name of the ware was probably derived from its resemblance to English brown-glazed earthenware made in South Yorkshire.) It was created by adding a brown glaze to the fired clay, usually giving the finished product a mottled appearance. Various methods of spattering or sponging the glaze onto the ware account for the extremely wide variations in color and add to the interest of collecting Rockingham. 
An advanced form of Rockingham was flint enamel, created by dusting metallic powders onto the Rockingham glaze to produce brilliant varicolored streaks. Articles for nearly every household activity and ornament could be bought in Rockingham ware: dishes and bowls, of course; also bedpans, foot warmers, cuspidors, lamp bases, doorknobs, molds, picture frames, even curtain tiebacks. All these items are highly collectible today and are eagerly sought. A few Rockingham specialties command particular affection among collectors and correspondingly high prices.
362.txt
3
[ "explain", "restrict", "finance", "supplement" ]
The phrase "account for" in line 22 is closest in meaning to
In the North American colonies, red ware, a simple pottery fired at low temperatures, and stone ware, a strong, impervious grey pottery fired at high temperatures, were produced from two different native clays. These kinds of pottery were produced to supplement imported European pottery. When the American Revolution (1775-1783) interrupted the flow of the superior European ware, there was incentive for American potters to replace the imports with comparable domestic goods. Stoneware, which had been simple, utilitarian kitchenware, grew increasingly ornate throughout the nineteenth century, and in addition to the earlier scratched and drawn designs, three-dimensional molded relief decoration became popular. Representational motifs largely replaced the earlier abstract decorations. Birds and flowers were particularly evident, but other subjects - lions, flags, and clipper ships - are found. Some figurines, mainly of dogs and lions, were made in this medium. Sometimes a name, usually that of the potter, was die-stamped onto a piece. As more and more large kilns were built to create the high-fired stoneware, experiments revealed that the same clay used to produce low-fired red ware could produce a stronger, paler pottery if fired at a hotter temperature. The result was yellow ware, used largely for serviceable items; but a further development was Rockingham ware - one of the most important American ceramics of the nineteenth century. (The name of the ware was probably derived from its resemblance to English brown-glazed earthenware made in South Yorkshire.) It was created by adding a brown glaze to the fired clay, usually giving the finished product a mottled appearance. Various methods of spattering or sponging the glaze onto the ware account for the extremely wide variations in color and add to the interest of collecting Rockingham. 
An advanced form of Rockingham was flint enamel, created by dusting metallic powders onto the Rockingham glaze to produce brilliant varicolored streaks. Articles for nearly every household activity and ornament could be bought in Rockingham ware: dishes and bowls, of course; also bedpans, foot warmers, cuspidors, lamp bases, doorknobs, molds, picture frames, even curtain tiebacks. All these items are highly collectible today and are eagerly sought. A few Rockingham specialties command particular affection among collectors and correspondingly high prices.
362.txt
0
[ "its even metallic shine", "its mottled appearance", "its spattered effect", "its varicolored streaks" ]
What was special about flint enamel?
In the North American colonies, red ware, a simple pottery fired at low temperatures, and stone ware, a strong, impervious grey pottery fired at high temperatures, were produced from two different native clays. These kinds of pottery were produced to supplement imported European pottery. When the American Revolution (1775-1783) interrupted the flow of the superior European ware, there was incentive for American potters to replace the imports with comparable domestic goods. Stoneware, which had been simple, utilitarian kitchenware, grew increasingly ornate throughout the nineteenth century, and in addition to the earlier scratched and drawn designs, three-dimensional molded relief decoration became popular. Representational motifs largely replaced the earlier abstract decorations. Birds and flowers were particularly evident, but other subjects - lions, flags, and clipper ships - are found. Some figurines, mainly of dogs and lions, were made in this medium. Sometimes a name, usually that of the potter, was die-stamped onto a piece. As more and more large kilns were built to create the high-fired stoneware, experiments revealed that the same clay used to produce low-fired red ware could produce a stronger, paler pottery if fired at a hotter temperature. The result was yellow ware, used largely for serviceable items; but a further development was Rockingham ware - one of the most important American ceramics of the nineteenth century. (The name of the ware was probably derived from its resemblance to English brown-glazed earthenware made in South Yorkshire.) It was created by adding a brown glaze to the fired clay, usually giving the finished product a mottled appearance. Various methods of spattering or sponging the glaze onto the ware account for the extremely wide variations in color and add to the interest of collecting Rockingham. 
An advanced form of Rockingham was flint enamel, created by dusting metallic powders onto the Rockingham glaze to produce brilliant varicolored streaks. Articles for nearly every household activity and ornament could be bought in Rockingham ware: dishes and bowls, of course; also bedpans, foot warmers, cuspidors, lamp bases, doorknobs, molds, picture frames, even curtain tiebacks. All these items are highly collectible today and are eagerly sought. A few Rockingham specialties command particular affection among collectors and correspondingly high prices.
362.txt
3
[ "picture frames", "dishes and bowls", "curtain tiebacks", "doorknobs" ]
Which of the following kinds of Rockingham ware were probably produced in the greatest quantity?
In the North American colonies, red ware, a simple pottery fired at low temperatures, and stone ware, a strong, impervious grey pottery fired at high temperatures, were produced from two different native clays. These kinds of pottery were produced to supplement imported European pottery. When the American Revolution (1775-1783) interrupted the flow of the superior European ware, there was incentive for American potters to replace the imports with comparable domestic goods. Stoneware, which had been simple, utilitarian kitchenware, grew increasingly ornate throughout the nineteenth century, and in addition to the earlier scratched and drawn designs, three-dimensional molded relief decoration became popular. Representational motifs largely replaced the earlier abstract decorations. Birds and flowers were particularly evident, but other subjects - lions, flags, and clipper ships - are found. Some figurines, mainly of dogs and lions, were made in this medium. Sometimes a name, usually that of the potter, was die-stamped onto a piece. As more and more large kilns were built to create the high-fired stoneware, experiments revealed that the same clay used to produce low-fired red ware could produce a stronger, paler pottery if fired at a hotter temperature. The result was yellow ware, used largely for serviceable items; but a further development was Rockingham ware - one of the most important American ceramics of the nineteenth century. (The name of the ware was probably derived from its resemblance to English brown-glazed earthenware made in South Yorkshire.) It was created by adding a brown glaze to the fired clay, usually giving the finished product a mottled appearance. Various methods of spattering or sponging the glaze onto the ware account for the extremely wide variations in color and add to the interest of collecting Rockingham. 
An advanced form of Rockingham was flint enamel, created by dusting metallic powders onto the Rockingham glaze to produce brilliant varicolored streaks. Articles for nearly every household activity and ornament could be bought in Rockingham ware: dishes and bowls, of course; also bedpans, foot warmers, cuspidors, lamp bases, doorknobs, molds, picture frames, even curtain tiebacks. All these items are highly collectible today and are eagerly sought. A few Rockingham specialties command particular affection among collectors and correspondingly high prices.
362.txt
1
[ "what bedpans, foot warmers, and cuspidors were used for", "well-known, modern-day potters who make Rockingham ware", "examples of Rockingham ware that collectors especially want", "pieces of Rockingham ware that are inexpensive in today's market" ]
The passage would most probably continue with a discussion of
In the North American colonies, red ware, a simple pottery fired at low temperatures, and stone ware, a strong, impervious grey pottery fired at high temperatures, were produced from two different native clays. These kinds of pottery were produced to supplement imported European pottery. When the American Revolution (1775-1783) interrupted the flow of the superior European ware, there was incentive for American potters to replace the imports with comparable domestic goods. Stoneware, which had been simple, utilitarian kitchenware, grew increasingly ornate throughout the nineteenth century, and in addition to the earlier scratched and drawn designs, three-dimensional molded relief decoration became popular. Representational motifs largely replaced the earlier abstract decorations. Birds and flowers were particularly evident, but other subjects - lions, flags, and clipper ships - are found. Some figurines, mainly of dogs and lions, were made in this medium. Sometimes a name, usually that of the potter, was die-stamped onto a piece. As more and more large kilns were built to create the high-fired stoneware, experiments revealed that the same clay used to produce low-fired red ware could produce a stronger, paler pottery if fired at a hotter temperature. The result was yellow ware, used largely for serviceable items; but a further development was Rockingham ware - one of the most important American ceramics of the nineteenth century. (The name of the ware was probably derived from its resemblance to English brown-glazed earthenware made in South Yorkshire.) It was created by adding a brown glaze to the fired clay, usually giving the finished product a mottled appearance. Various methods of spattering or sponging the glaze onto the ware account for the extremely wide variations in color and add to the interest of collecting Rockingham. 
An advanced form of Rockingham was flint enamel, created by dusting metallic powders onto the Rockingham glaze to produce brilliant varicolored streaks. Articles for nearly every household activity and ornament could be bought in Rockingham ware: dishes and bowls, of course; also bedpans, foot warmers, cuspidors, lamp bases, doorknobs, molds, picture frames, even curtain tiebacks. All these items are highly collectible today and are eagerly sought. A few Rockingham specialties command particular affection among collectors and correspondingly high prices.
362.txt
2
[ "In May 2016", "In May 2017", "In September 2016", "In September 2017" ]
When will the job start?
Basketball Statistician Help Wanted. The Athletic Department is looking for students to assist staff during the Fall 2016, Winter 2016-17 and Spring 2017 semesters. Students in this position will be keeping live statistics during basketball games. Students must meet all of the following requirements: good computer skills; available evenings and weekends; knowledge of basketball rules and statistics. Students interested in working for the Athletic Department should contact the Athletic Coordinator at their respective campuses. TP/SS Athletic Coordinator, Michael Simone, 240-567-1308; Rockville Athletic Coordinator, Jorge Zuniga, 240-567-7589; Springfield Athletic Coordinator, Gary Miller, 240-567-2273; Germantown Athletic Coordinator, Gavri Chavan, 240-567-6915.
556.txt
2
[ "Sam, English major, member of the college basketball team", "Judy, IT staff with night classes, children's basketball team coach", "Ted, computer major, basketball fan, free on evenings and weekends", "Molly, part-time programmer, high school basketball player, new mother" ]
Who is most likely to get the job?
Basketball Statistician Help Wanted. The Athletic Department is looking for students to assist staff during the Fall 2016, Winter 2016-17 and Spring 2017 semesters. Students in this position will be keeping live statistics during basketball games. Students must meet all of the following requirements: good computer skills; available evenings and weekends; knowledge of basketball rules and statistics. Students interested in working for the Athletic Department should contact the Athletic Coordinator at their respective campuses. TP/SS Athletic Coordinator, Michael Simone, 240-567-1308; Rockville Athletic Coordinator, Jorge Zuniga, 240-567-7589; Springfield Athletic Coordinator, Gary Miller, 240-567-2273; Germantown Athletic Coordinator, Gavri Chavan, 240-567-6915.
556.txt
2
[ "Michael", "Jorge", "Gavri", "Gary" ]
Whom should you contact if you want to apply for the job in Rockville?
Basketball Statistician Help Wanted. The Athletic Department is looking for students to assist staff during the Fall 2016, Winter 2016-17 and Spring 2017 semesters. Students in this position will be keeping live statistics during basketball games. Students must meet all of the following requirements: good computer skills; available evenings and weekends; knowledge of basketball rules and statistics. Students interested in working for the Athletic Department should contact the Athletic Coordinator at their respective campuses. TP/SS Athletic Coordinator, Michael Simone, 240-567-1308; Rockville Athletic Coordinator, Jorge Zuniga, 240-567-7589; Springfield Athletic Coordinator, Gary Miller, 240-567-2273; Germantown Athletic Coordinator, Gavri Chavan, 240-567-6915.
556.txt
1
[ "the number of PH.D. degree holders is declining", "the number of scientists and engineers is decreasing", "the number of 22-year-olds is declining", "scientists and engineers are not employed" ]
The U.S. will come to lose its leading place in technology probably because _ .
The United States is on the verge of losing its leading place in the world's technology. So says more than one study in recent years. One of the reasons for this decline is the parallel decline in the number of U.S. scientists and engineers. Since 1976, employment of scientists and engineers is up 85 percent. This trend is expected to continue. However, the trend shows that the number of 22-year-olds - the near-term source of future PH.D.s - is declining. Further adding to the problem is the increased competition for these candidates from other fields - law, medicine, business, etc. While the number of U.S. PH.D.s in science and engineering declines, the award of PH.D.s to foreign nationals is increasing rapidly. Our inability to motivate students to pursue science and engineering careers at the graduate level is compounded because of the intense demand industry has for bright Bachelor's and Master's degree holders. Too often, promising PH.D. candidates, confronting the cost and financial sacrifice of pursuing their education, find the attraction of industry irresistible.
1635.txt
1
[ "technology", "foreign nationals", "such fields as law, medicine and business", "postgraduates" ]
The field of science and engineering is facing a competition from _ .
The United States is on the verge of losing its leading place in the world's technology. So says more than one study in recent years. One of the reasons for this decline is the parallel decline in the number of U.S. scientists and engineers. Since 1976, employment of scientists and engineers is up 85 percent. This trend is expected to continue. However, the trend shows that the number of 22-year-olds - the near-term source of future PH.D.s - is declining. Further adding to the problem is the increased competition for these candidates from other fields - law, medicine, business, etc. While the number of U.S. PH.D.s in science and engineering declines, the award of PH.D.s to foreign nationals is increasing rapidly. Our inability to motivate students to pursue science and engineering careers at the graduate level is compounded because of the intense demand industry has for bright Bachelor's and Master's degree holders. Too often, promising PH.D. candidates, confronting the cost and financial sacrifice of pursuing their education, find the attraction of industry irresistible.
1635.txt
2
[ "bright graduates and postgraduates", "new inventions", "advanced technology", "engineers" ]
Large-scale enterprises now need _ .
The United States is on the verge of losing its leading place in the world's technology. So says more than one study in recent years. One of the reasons for this decline is the parallel decline in the number of U.S. scientists and engineers. Since 1976, employment of scientists and engineers is up 85 percent. This trend is expected to continue. However, the trend shows that the number of 22-year-olds - the near-term source of future PH.D.s - is declining. Further adding to the problem is the increased competition for these candidates from other fields - law, medicine, business, etc. While the number of U.S. PH.D.s in science and engineering declines, the award of PH.D.s to foreign nationals is increasing rapidly. Our inability to motivate students to pursue science and engineering careers at the graduate level is compounded because of the intense demand industry has for bright Bachelor's and Master's degree holders. Too often, promising PH.D. candidates, confronting the cost and financial sacrifice of pursuing their education, find the attraction of industry irresistible.
1635.txt
0
[ "they are not encouraged to be engaged in science", "industry does not require PH.D. holders", "they have financial difficulties", "they will spend much time and energy completing PH.D." ]
Many promising postgraduates are unwilling to pursue a PH.D. degree because _ .
The United States is on the verge of losing its leading place in the world's technology. So says more than one study in recent years. One of the reasons for this decline is the parallel decline in the number of U.S. scientists and engineers. Since 1976, employment of scientists and engineers is up 85 percent. This trend is expected to continue. However, the trend shows that the number of 22-year-olds - the near-term source of future PH.D.s - is declining. Further adding to the problem is the increased competition for these candidates from other fields - law, medicine, business, etc. While the number of U.S. PH.D.s in science and engineering declines, the award of PH.D.s to foreign nationals is increasing rapidly. Our inability to motivate students to pursue science and engineering careers at the graduate level is compounded because of the intense demand industry has for bright Bachelor's and Master's degree holders. Too often, promising PH.D. candidates, confronting the cost and financial sacrifice of pursuing their education, find the attraction of industry irresistible.
1635.txt
3
[ "they find industry is attracting more and more college students", "they don't think they can prevent themselves from working for industry", "they cannot resist any attraction from all sides", "they cannot work for industry any longer" ]
PH.D. candidates "find the attraction of industry irresistible" means that _ .
The United States is on the verge of losing its leading place in the world's technology. So says more than one study in recent years. One of the reasons for this decline is the parallel decline in the number of U.S. scientists and engineers. Since 1976, employment of scientists and engineers is up 85 percent. This trend is expected to continue. However, the trend shows that the number of 22-year-olds - the near-term source of future PH.D.s - is declining. Further adding to the problem is the increased competition for these candidates from other fields - law, medicine, business, etc. While the number of U.S. PH.D.s in science and engineering declines, the award of PH.D.s to foreign nationals is increasing rapidly. Our inability to motivate students to pursue science and engineering careers at the graduate level is compounded because of the intense demand industry has for bright Bachelor's and Master's degree holders. Too often, promising PH.D. candidates, confronting the cost and financial sacrifice of pursuing their education, find the attraction of industry irresistible.
1635.txt
1
[ "A punctual person does everything ahead of time while an unpunctual person does everything behind schedule.", "A punctual person does everything at the right time while an unpunctual person seldom does anything at the correct time.", "A punctual person has a lot of appointments while an unpunctual person has few appointments.", "A punctual person has much time to do everything while an unpunctual person has little time to do anything." ]
What does the author think is the main difference between a punctual person and an unpunctual person?
A punctual person is in the habit of doing a thing at the proper time and is never late in keeping an appointment. The unpunctual man, on the other hand, never does what he has to do at the proper time. He is always in a hurry and in the end loses both time and his good name. A lost thing may be found again, but lost time can never be regained. Time is more valuable than material things. In fact, time is life itself. The unpunctual man is forever wasting and mismanaging his most valuable asset as well as others'. The unpunctual person is always complaining that he finds no time to answer letters, return calls or keep appointments promptly. But the man who really has a great deal to do is very careful of his time and seldom complains of want of it. He knows that he cannot get through his huge amount of work unless he faithfully attends to every piece of work when it has to be attended to. Failure to be punctual in keeping one's appointments is a sign of disrespect towards others. If a person is invited to dinner and arrives later than the appointed time, he keeps all the other guests waiting for him. Usually this will be regarded as a great disrespect to the host and all the other guests present. Unpunctuality, moreover, is very harmful when it comes to doing one's duty, whether public or private. Imagine how it would be if those who are put in charge of important tasks failed to be at their proper places at the appointed time. A man who is known to be habitually unpunctual is never trusted by his friends or fellow men.
2889.txt
1
[ "he has more work to do than other people", "he is always in a hurry when he works", "he doesn't care much about time", "he always mismanages and wastes his time" ]
According to the passage, the main reason that a person is always unpunctual is that _ .
A punctual person is in the habit of doing a thing at the proper time and is never late in keeping an appointment. The unpunctual man, on the other hand, never does what he has to do at the proper time. He is always in a hurry and in the end loses both time and his good name. A lost thing may be found again, but lost time can never be regained. Time is more valuable than material things. In fact, time is life itself. The unpunctual man is forever wasting and mismanaging his most valuable asset as well as others'. The unpunctual person is always complaining that he finds no time to answer letters, or return calls or keep appointments promptly. But the man who really has a great deal to do is very careful of his time and seldom complains of want of it. He knows that he cannot get through his huge amount of work unless he faithfully does every piece of work when it has to be attended to. Failure to be punctual in keeping one's appointments is a sign of disrespect towards others. If a person is invited to dinner and arrives later than the appointed time, he keeps all the other guests waiting for him. Usually this will be regarded as a great disrespect to the host and all other guests present. Unpunctuality, moreover, is very harmful when it comes to doing one's duty, whether public or private. Imagine how it would be if those who are put in charge of important tasks failed to be at their proper place at the appointed time. A man who is known to be habitually unpunctual is never trusted by his friends or fellow men.
2889.txt
3
[ "after other guests have arrived", "before all other guests", "at the appointed time", "after the host has got things ready" ]
According to the third paragraph, when you are invited to dinner, you should arrive there _ .
A punctual person is in the habit of doing a thing at the proper time and is never late in keeping an appointment. The unpunctual man, on the other hand, never does what he has to do at the proper time. He is always in a hurry and in the end loses both time and his good name. A lost thing may be found again, but lost time can never be regained. Time is more valuable than material things. In fact, time is life itself. The unpunctual man is forever wasting and mismanaging his most valuable asset as well as others'. The unpunctual person is always complaining that he finds no time to answer letters, or return calls or keep appointments promptly. But the man who really has a great deal to do is very careful of his time and seldom complains of want of it. He knows that he cannot get through his huge amount of work unless he faithfully does every piece of work when it has to be attended to. Failure to be punctual in keeping one's appointments is a sign of disrespect towards others. If a person is invited to dinner and arrives later than the appointed time, he keeps all the other guests waiting for him. Usually this will be regarded as a great disrespect to the host and all other guests present. Unpunctuality, moreover, is very harmful when it comes to doing one's duty, whether public or private. Imagine how it would be if those who are put in charge of important tasks failed to be at their proper place at the appointed time. A man who is known to be habitually unpunctual is never trusted by his friends or fellow men.
2889.txt
2
[ "If you are an unpunctual person, you cannot be in charge of any important task.", "If your friends know that you are unpunctual, they may not see you again.", "Unpunctuality may bring about heavy losses for both public and private affairs.", "Unpunctuality may make you miss a lot of appointments and lose friends." ]
Which of the following statements best describes the harm of unpunctuality?
A punctual person is in the habit of doing a thing at the proper time and is never late in keeping an appointment. The unpunctual man, on the other hand, never does what he has to do at the proper time. He is always in a hurry and in the end loses both time and his good name. A lost thing may be found again, but lost time can never be regained. Time is more valuable than material things. In fact, time is life itself. The unpunctual man is forever wasting and mismanaging his most valuable asset as well as others'. The unpunctual person is always complaining that he finds no time to answer letters, or return calls or keep appointments promptly. But the man who really has a great deal to do is very careful of his time and seldom complains of want of it. He knows that he cannot get through his huge amount of work unless he faithfully does every piece of work when it has to be attended to. Failure to be punctual in keeping one's appointments is a sign of disrespect towards others. If a person is invited to dinner and arrives later than the appointed time, he keeps all the other guests waiting for him. Usually this will be regarded as a great disrespect to the host and all other guests present. Unpunctuality, moreover, is very harmful when it comes to doing one's duty, whether public or private. Imagine how it would be if those who are put in charge of important tasks failed to be at their proper place at the appointed time. A man who is known to be habitually unpunctual is never trusted by his friends or fellow men.
2889.txt
2
[ "The Queen Mother died on Easter Monday alone.", "The Queen Mother was an attractive person in her political life.", "The British people felt sorry for the death of the Queen Mother.", "The Queen Mother was suffering a lot when she was dying." ]
Which of the following statements is true according to the news?
LONDON - Britain awoke on Easter Monday to a period of mourning for the Queen Mother, who died over the weekend after a life spanning a century of noisy and evident change. The 101-year-old royal matriarch died in her sleep last Saturday with Queen Elizabeth, her elder and only surviving daughter, at her bedside. For a woman who was one of the best-known figures in Britain for more than 80 years - from the era of tinted portraits on tin biscuit boxes and cigarette cards to the age of the Internet - the Queen Mother remained an enigmatic and elusive figure. She achieved such respect through aeons of, first, fawning and, later, intrusive media fascination, by remaining almost entirely silent. Her private thoughts were never paraded in public. What the public saw was a charming and benign elderly lady, adept at winning the admiration of press photographers, whom she always favoured with a particular smile. CHINA's third unmanned spacecraft, Shenzhou Ⅲ, landed safely in central Inner Mongolia Autonomous Region Monday afternoon, after orbiting the earth 108 times in slightly less than a week. The craft, which lifted off from Jiuquan in Gansu Province last Monday night, landed after successfully conducting a chain of flight and scientific experiments over a period of 162 hours. A powerful earthquake jolted Taiwan, killing five construction workers, authorities said. Over 200 injuries were reported across the island, mostly minor, as a result of Sunday's 7.5-magnitude quake. The quake was centred off Hualien, 180 kilometres east of Taipei. It struck at 2:53 pm and lasted for nearly a minute.
2967.txt
2
[ "the craft landed in central Inner Mongolia unexpectedly", "it took the craft at least 2 hours to orbit the earth once", "the Chinese scientists did a lot of experiments in space", "China was successful in sending an unmanned spacecraft into space" ]
It can be inferred that _ .
LONDON - Britain awoke on Easter Monday to a period of mourning for the Queen Mother, who died over the weekend after a life spanning a century of noisy and evident change. The 101-year-old royal matriarch died in her sleep last Saturday with Queen Elizabeth, her elder and only surviving daughter, at her bedside. For a woman who was one of the best-known figures in Britain for more than 80 years - from the era of tinted portraits on tin biscuit boxes and cigarette cards to the age of the Internet - the Queen Mother remained an enigmatic and elusive figure. She achieved such respect through aeons of, first, fawning and, later, intrusive media fascination, by remaining almost entirely silent. Her private thoughts were never paraded in public. What the public saw was a charming and benign elderly lady, adept at winning the admiration of press photographers, whom she always favoured with a particular smile. CHINA's third unmanned spacecraft, Shenzhou Ⅲ, landed safely in central Inner Mongolia Autonomous Region Monday afternoon, after orbiting the earth 108 times in slightly less than a week. The craft, which lifted off from Jiuquan in Gansu Province last Monday night, landed after successfully conducting a chain of flight and scientific experiments over a period of 162 hours. A powerful earthquake jolted Taiwan, killing five construction workers, authorities said. Over 200 injuries were reported across the island, mostly minor, as a result of Sunday's 7.5-magnitude quake. The quake was centred off Hualien, 180 kilometres east of Taipei. It struck at 2:53 pm and lasted for nearly a minute.
2967.txt
3
[ "political matters", "social problems", "unexpected damage", "construction workers" ]
The third news mainly talks about the _ in Taiwan.
LONDON - Britain awoke on Easter Monday to a period of mourning for the Queen Mother, who died over the weekend after a life spanning a century of noisy and evident change. The 101-year-old royal matriarch died in her sleep last Saturday with Queen Elizabeth, her elder and only surviving daughter, at her bedside. For a woman who was one of the best-known figures in Britain for more than 80 years - from the era of tinted portraits on tin biscuit boxes and cigarette cards to the age of the Internet - the Queen Mother remained an enigmatic and elusive figure. She achieved such respect through aeons of, first, fawning and, later, intrusive media fascination, by remaining almost entirely silent. Her private thoughts were never paraded in public. What the public saw was a charming and benign elderly lady, adept at winning the admiration of press photographers, whom she always favoured with a particular smile. CHINA's third unmanned spacecraft, Shenzhou Ⅲ, landed safely in central Inner Mongolia Autonomous Region Monday afternoon, after orbiting the earth 108 times in slightly less than a week. The craft, which lifted off from Jiuquan in Gansu Province last Monday night, landed after successfully conducting a chain of flight and scientific experiments over a period of 162 hours. A powerful earthquake jolted Taiwan, killing five construction workers, authorities said. Over 200 injuries were reported across the island, mostly minor, as a result of Sunday's 7.5-magnitude quake. The quake was centred off Hualien, 180 kilometres east of Taipei. It struck at 2:53 pm and lasted for nearly a minute.
2967.txt
2
[ "try to rid himself of his world of illusion", "accept his words as being one of illusion", "apply the scientific method", "learn to acknowledge both" ]
The author suggests that in order to bridge the puzzling difference between scientific truth and the world of illusion, the reader should _ .
The table before which we sit may be, as the scientist maintains, composed of dancing atoms, but it does not reveal itself to us as anything of the kind, and it is not with dancing atoms but a solid and motionless object that we live. So remote is this "real" table--and most of the other "realities" with which science deals--that it cannot be discussed in terms which have any human value, and though it may receive our purely intellectual credence it cannot be woven into the pattern of life as it is led, in contradistinction to life as we attempt it. Vibrations in the ether are so totally unlike the color purple that the gulf between them cannot be bridged, and they are, to all intents and purposes, not one but two separate things of which the second and less "real" must be the most significant for us. And just as the sensation which has led us to attribute all objective reality to a non-existent thing which we called "purple" is more important for human life than the conception of vibrations of a certain frequency; so too the belief in God, however ill founded, has been more important in the life of man than the germ theory of disease, however true the latter may be. We may, if we like, speak, in consequence, as certain mystics love to do, of the different levels or orders of truth. We may adopt what is essentially a Platonistic trick of thought and insist upon postulating the existence of external realities which correspond to the needs and modes of human feeling and which, so we may insist, have their being in some part of the universe unreachable by science.
But to do so is to make an unwarrantable assumption and to be guilty of the metaphysical fallacy of failing to distinguish between a truth of feeling and that other sort of truth which is described as "truth of correspondence," and it is better, perhaps, at least for those of us who have grown up in thought, to steer clear of such confusions and to rest content with the admission that, though the universe with which science deals is the real universe, yet we do not and cannot have any but fleeting and imperfect contacts with it; that the most important part of our lives - our sensations, emotions, desires and aspirations - take place in a universe of illusions which science can attenuate or destroy, but which it is powerless to enrich.
763.txt
1
[ "a humanist", "a pantheist", "a nuclear physicist", "a doctor of medicine" ]
Judging from the ideas and tone of the selection, one may reasonably guess that the author is _ .
The table before which we sit may be, as the scientist maintains, composed of dancing atoms, but it does not reveal itself to us as anything of the kind, and it is not with dancing atoms but a solid and motionless object that we live. So remote is this "real" table--and most of the other "realities" with which science deals--that it cannot be discussed in terms which have any human value, and though it may receive our purely intellectual credence it cannot be woven into the pattern of life as it is led, in contradistinction to life as we attempt it. Vibrations in the ether are so totally unlike the color purple that the gulf between them cannot be bridged, and they are, to all intents and purposes, not one but two separate things of which the second and less "real" must be the most significant for us. And just as the sensation which has led us to attribute all objective reality to a non-existent thing which we called "purple" is more important for human life than the conception of vibrations of a certain frequency; so too the belief in God, however ill founded, has been more important in the life of man than the germ theory of disease, however true the latter may be. We may, if we like, speak, in consequence, as certain mystics love to do, of the different levels or orders of truth. We may adopt what is essentially a Platonistic trick of thought and insist upon postulating the existence of external realities which correspond to the needs and modes of human feeling and which, so we may insist, have their being in some part of the universe unreachable by science.
But to do so is to make an unwarrantable assumption and to be guilty of the metaphysical fallacy of failing to distinguish between a truth of feeling and that other sort of truth which is described as "truth of correspondence," and it is better, perhaps, at least for those of us who have grown up in thought, to steer clear of such confusions and to rest content with the admission that, though the universe with which science deals is the real universe, yet we do not and cannot have any but fleeting and imperfect contacts with it; that the most important part of our lives - our sensations, emotions, desires and aspirations - take place in a universe of illusions which science can attenuate or destroy, but which it is powerless to enrich.
763.txt
0
[ "a solid motionless object", "certain characteristic vibrations in \"ether\"", "a form fixed in space and time", "a mass of atoms in motion" ]
According to this passage, a scientist would conceive of a " table" as being _ .
The table before which we sit may be, as the scientist maintains, composed of dancing atoms, but it does not reveal itself to us as anything of the kind, and it is not with dancing atoms but a solid and motionless object that we live. So remote is this "real" table--and most of the other "realities" with which science deals--that it cannot be discussed in terms which have any human value, and though it may receive our purely intellectual credence it cannot be woven into the pattern of life as it is led, in contradistinction to life as we attempt it. Vibrations in the ether are so totally unlike the color purple that the gulf between them cannot be bridged, and they are, to all intents and purposes, not one but two separate things of which the second and less "real" must be the most significant for us. And just as the sensation which has led us to attribute all objective reality to a non-existent thing which we called "purple" is more important for human life than the conception of vibrations of a certain frequency; so too the belief in God, however ill founded, has been more important in the life of man than the germ theory of disease, however true the latter may be. We may, if we like, speak, in consequence, as certain mystics love to do, of the different levels or orders of truth. We may adopt what is essentially a Platonistic trick of thought and insist upon postulating the existence of external realities which correspond to the needs and modes of human feeling and which, so we may insist, have their being in some part of the universe unreachable by science.
But to do so is to make an unwarrantable assumption and to be guilty of the metaphysical fallacy of failing to distinguish between a truth of feeling and that other sort of truth which is described as "truth of correspondence," and it is better, perhaps, at least for those of us who have grown up in thought, to steer clear of such confusions and to rest content with the admission that, though the universe with which science deals is the real universe, yet we do not and cannot have any but fleeting and imperfect contacts with it; that the most important part of our lives - our sensations, emotions, desires and aspirations - take place in a universe of illusions which science can attenuate or destroy, but which it is powerless to enrich.
763.txt
3
[ "the distortion of reality by science", "the confusion caused by emotions", "Platonic and contemporary views of truth", "the place of scientific truth in our lives" ]
The topic of this selection is _ .
The table before which we sit may be, as the scientist maintains, composed of dancing atoms, but it does not reveal itself to us as anything of the kind, and it is not with dancing atoms but a solid and motionless object that we live. So remote is this "real" table--and most of the other "realities" with which science deals--that it cannot be discussed in terms which have any human value, and though it may receive our purely intellectual credence it cannot be woven into the pattern of life as it is led, in contradistinction to life as we attempt it. Vibrations in the ether are so totally unlike the color purple that the gulf between them cannot be bridged, and they are, to all intents and purposes, not one but two separate things of which the second and less "real" must be the most significant for us. And just as the sensation which has led us to attribute all objective reality to a non-existent thing which we called "purple" is more important for human life than the conception of vibrations of a certain frequency; so too the belief in God, however ill founded, has been more important in the life of man than the germ theory of disease, however true the latter may be. We may, if we like, speak, in consequence, as certain mystics love to do, of the different levels or orders of truth. We may adopt what is essentially a Platonistic trick of thought and insist upon postulating the existence of external realities which correspond to the needs and modes of human feeling and which, so we may insist, have their being in some part of the universe unreachable by science.
But to do so is to make an unwarrantable assumption and to be guilty of the metaphysical fallacy of failing to distinguish between a truth of feeling and that other sort of truth which is described as "truth of correspondence," and it is better, perhaps, at least for those of us who have grown up in thought, to steer clear of such confusions and to rest content with the admission that, though the universe with which science deals is the real universe, yet we do not and cannot have any but fleeting and imperfect contacts with it; that the most important part of our lives - our sensations, emotions, desires and aspirations - take place in a universe of illusions which science can attenuate or destroy, but which it is powerless to enrich.
763.txt
3
[ "scientific reality", "a symbolic existence", "the viewer's experience", "reality colored by emotion" ]
By "objective reality" (last line, Para. 1) the author means _ .
The table before which we sit may be, as the scientist maintains, composed of dancing atoms, but it does not reveal itself to us as anything of the kind, and it is not with dancing atoms but a solid and motionless object that we live. So remote is this "real" table--and most of the other "realities" with which science deals--that it cannot be discussed in terms which have any human value, and though it may receive our purely intellectual credence it cannot be woven into the pattern of life as it is led, in contradistinction to life as we attempt it. Vibrations in the ether are so totally unlike the color purple that the gulf between them cannot be bridged, and they are, to all intents and purposes, not one but two separate things of which the second and less "real" must be the most significant for us. And just as the sensation which has led us to attribute all objective reality to a non-existent thing which we called "purple" is more important for human life than the conception of vibrations of a certain frequency; so too the belief in God, however ill founded, has been more important in the life of man than the germ theory of disease, however true the latter may be. We may, if we like, speak, in consequence, as certain mystics love to do, of the different levels or orders of truth. We may adopt what is essentially a Platonistic trick of thought and insist upon postulating the existence of external realities which correspond to the needs and modes of human feeling and which, so we may insist, have their being in some part of the universe unreachable by science.
But to do so is to make an unwarrantable assumption and to be guilty of the metaphysical fallacy of failing to distinguish between a truth of feeling and that other sort of truth which is described as "truth of correspondence," and it is better, perhaps, at least for those of us who have grown up in thought, to steer clear of such confusions and to rest content with the admission that, though the universe with which science deals is the real universe, yet we do not and cannot have any but fleeting and imperfect contacts with it; that the most important part of our lives - our sensations, emotions, desires and aspirations - take place in a universe of illusions which science can attenuate or destroy, but which it is powerless to enrich.
763.txt
0
[ "241-2742.", "723-1182.", "381-3300.", "232-6220." ]
Which number should you call if you want to see an opera?
Music Opera at Music Hall: 1243 Elm Street. The season runs June through August, with additional performances in March and September. The Opera honors Enjoy the Arts membership discounts. Phone: 241-2742. http://www.cityopera.com. Chamber Orchestra: The Orchestra plays at Memorial Hall at 1406 Elm Street, which offers several concerts from March through June. Call 723-1182 for more information. http://www.chamberorch.com. Symphony Orchestra: At Music Hall and Riverbend. For ticket sales, call 381-3300. Regular season runs September through May at Music Hall, in summer at Riverbend. http://www.symphony.org/home.asp. College Conservatory of Music (CCM): Performances are on the main campus of the university, usually at Patricia Cobbett Theater. CCM organizes a variety of events, including performances by the well-known LaSalle Quartet, CCM's Philharmonic Orchestra, and various groups of musicians presenting Baroque through modern music. Students with I.D. cards can attend the events for free. A free schedule of events for each term is available by calling the box office at 556-4183. http://www.ccm.uc.edu/events/calendar. Riverbend Music Theater: 6295 Kellogg Ave. Large outdoor theater with the closest seats under cover (price difference). Big name shows all summer long! Phone: 232-6220. http://www.riverbendmusic.com.
4213.txt
0
[ "February.", "May.", "August.", "November." ]
When can you go to a concert by Chamber Orchestra?
Music Opera at Music Hall: 1243 Elm Street. The season runs June through August, with additional performances in March and September. The Opera honors Enjoy the Arts membership discounts. Phone: 241-2742. http://www.cityopera.com. Chamber Orchestra: The Orchestra plays at Memorial Hall at 1406 Elm Street, which offers several concerts from March through June. Call 723-1182 for more information. http://www.chamberorch.com. Symphony Orchestra: At Music Hall and Riverbend. For ticket sales, call 381-3300. Regular season runs September through May at Music Hall, in summer at Riverbend. http://www.symphony.org/home.asp. College Conservatory of Music (CCM): Performances are on the main campus of the university, usually at Patricia Cobbett Theater. CCM organizes a variety of events, including performances by the well-known LaSalle Quartet, CCM's Philharmonic Orchestra, and various groups of musicians presenting Baroque through modern music. Students with I.D. cards can attend the events for free. A free schedule of events for each term is available by calling the box office at 556-4183. http://www.ccm.uc.edu/events/calendar. Riverbend Music Theater: 6295 Kellogg Ave. Large outdoor theater with the closest seats under cover (price difference). Big name shows all summer long! Phone: 232-6220. http://www.riverbendmusic.com.
4213.txt
1
[ "Music Hall.", "Memorial Hall.", "Patricia Cobbett Theater.", "Riverbend Music Theater." ]
Where can students go for free performances with their I.D. cards?
Music Opera at Music Hall: 1243 Elm Street. The season runs June through August, with additional performances in March and September. The Opera honors Enjoy the Arts membership discounts. Phone: 241-2742. http://www.cityopera.com. Chamber Orchestra: The Orchestra plays at Memorial Hall at 1406 Elm Street, which offers several concerts from March through June. Call 723-1182 for more information. http://www.chamberorch.com. Symphony Orchestra: At Music Hall and Riverbend. For ticket sales, call 381-3300. Regular season runs September through May at Music Hall, in summer at Riverbend. http://www.symphony.org/home.asp. College Conservatory of Music (CCM): Performances are on the main campus of the university, usually at Patricia Cobbett Theater. CCM organizes a variety of events, including performances by the well-known LaSalle Quartet, CCM's Philharmonic Orchestra, and various groups of musicians presenting Baroque through modern music. Students with I.D. cards can attend the events for free. A free schedule of events for each term is available by calling the box office at 556-4183. http://www.ccm.uc.edu/events/calendar. Riverbend Music Theater: 6295 Kellogg Ave. Large outdoor theater with the closest seats under cover (price difference). Big name shows all summer long! Phone: 232-6220. http://www.riverbendmusic.com.
4213.txt
2
[ "It has seats in the open air.", "It gives shows all year round.", "It offers membership discounts.", "It presents famous musical works." ]
How is Riverbend Music Theater different from the other places?
Music Opera at Music Hall: 1243 Elm Street. The season runs June through August, with additional performances in March and September. The Opera honors Enjoy the Arts membership discounts. Phone: 241-2742. http://www.cityopera.com. Chamber Orchestra: The Orchestra plays at Memorial Hall at 1406 Elm Street, which offers several concerts from March through June. Call 723-1182 for more information. http://www.chamberorch.com. Symphony Orchestra: At Music Hall and Riverbend. For ticket sales, call 381-3300. Regular season runs September through May at Music Hall, in summer at Riverbend. http://www.symphony.org/home.asp. College Conservatory of Music (CCM): Performances are on the main campus of the university, usually at Patricia Cobbett Theater. CCM organizes a variety of events, including performances by the well-known LaSalle Quartet, CCM's Philharmonic Orchestra, and various groups of musicians presenting Baroque through modern music. Students with I.D. cards can attend the events for free. A free schedule of events for each term is available by calling the box office at 556-4183. http://www.ccm.uc.edu/events/calendar. Riverbend Music Theater: 6295 Kellogg Ave. Large outdoor theater with the closest seats under cover (price difference). Big name shows all summer long! Phone: 232-6220. http://www.riverbendmusic.com.
4213.txt
0
[ "exhausted unprecedented management efforts", "consumed a record-high percentage of budget", "severely damaged the ecology of western states", "caused a huge rise of infrastructure expenditure" ]
More frequent wildfires have become a national concern because in 2015 they ________.
Though often viewed as a problem for western states, the growing frequency of wildfires is a national concern because of its impact on federal tax dollars, says Professor Max Moritz, a specialist in fire ecology and management. In 2015, the US Forest Service for the first time spent more than half of its $5.5 billion annual budget fighting fires - nearly double the percentage it spent on such efforts 20 years ago. In effect, fewer federal funds today are going towards the agency's other work - such as forest conservation, watershed and cultural resources management, and infrastructure upkeep - that affect the lives of all Americans. Another nationwide concern is whether public funds from other agencies are going into construction in fire-prone districts. As Moritz puts it, how often are federal dollars building homes that are likely to be lost to a wildfire? "It's already a huge problem from a public expenditure perspective for the whole country," he says. "We need to take a magnifying glass to that. Like, 'Wait a minute, is this OK? Do we want instead to redirect those funds to concentrate on lower-hazard parts of the landscape?'" Such a view would require a corresponding shift in the way US society today views fire, researchers say. For one thing, conversations about wildfires need to be more inclusive. Over the past decade, the focus has been on climate change - how the warming of the Earth from greenhouse gases is leading to conditions that worsen fires. While climate is a key element, Moritz says, it shouldn't come at the expense of the rest of the equation. "The human systems and the landscapes we live on are linked, and the interactions go both ways," he says. Failing to recognize that, he notes, leads to "an overly simplified view of what the solutions might be. Our perception of the problem and of what the solution is becomes very limited."
At the same time, people continue to treat fire as an event that needs to be wholly controlled and unleashed only out of necessity, says Professor Balch at the University of Colorado. But acknowledging fire's inevitable presence in human life is an attitude crucial to developing the laws, policies, and practices that make it as safe as possible, she says. "We've disconnected ourselves from living with fire," Balch says. "It is really important to understand and try and tease out what is the human connection with fire today."
451.txt
1
[ "raise more funds for fire-prone areas", "avoid the redirection of federal money", "find wildfire-free parts of the landscape", "guarantee safer spending of public funds" ]
Moritz calls for the use of "a magnifying glass" to _______.
Though often viewed as a problem for western states, the growing frequency of wildfires is a national concern because of its impact on federal tax dollars, says Professor Max Moritz, a specialist in fire ecology and management. In 2015, the US Forest Service for the first time spent more than half of its $5.5 billion annual budget fighting fires-nearly double the percentage it spent on such efforts 20 years ago. In effect, fewer federal funds today are going towards the agency's other work-such as forest conservation, watershed and cultural resources management, and infrastructure upkeep-that affect the lives of all Americans. Another nationwide concern is whether public funds from other agencies are going into construction in fire-prone districts. As Moritz puts it, how often are federal dollars building homes that are likely to be lost to a wildfire? "It's already a huge problem from a public expenditure perspective for the whole country," he says. "We need to take a magnifying glass to that. Like, 'Wait a minute, is this OK? Do we want instead to redirect those funds to concentrate on lower-hazard parts of the landscape?'" Such a view would require a corresponding shift in the way US society today views fire, researchers say. For one thing, conversations about wildfires need to be more inclusive. Over the past decade, the focus has been on climate change-how the warming of the Earth from greenhouse gases is leading to conditions that worsen fires. While climate is a key element, Moritz says, it shouldn't come at the expense of the rest of the equation. "The human systems and the landscapes we live on are linked, and the interactions go both ways," he says. Failing to recognize that, he notes, leads to "an overly simplified view of what the solutions might be. Our perception of the problem and of what the solution is becomes very limited."
At the same time, people continue to treat fire as an event that needs to be wholly controlled and unleashed only out of necessity, says Professor Balch at the University of Colorado. But acknowledging fire's inevitable presence in human life is an attitude crucial to developing the laws, policies, and practices that make it as safe as possible, she says. "We've disconnected ourselves from living with fire," Balch says. "It is really important to understand and try and tease out what is the human connection with fire today."
451.txt
3
[ "public debates have not settled yet", "fire-fighting conditions are improving", "other factors should not be overlooked", "a shift in the view of fire has taken place" ]
While admitting that climate is a key element, Moritz notes that _______.
Though often viewed as a problem for western states, the growing frequency of wildfires is a national concern because of its impact on federal tax dollars, says Professor Max Moritz, a specialist in fire ecology and management. In 2015, the US Forest Service for the first time spent more than half of its $5.5 billion annual budget fighting fires-nearly double the percentage it spent on such efforts 20 years ago. In effect, fewer federal funds today are going towards the agency's other work-such as forest conservation, watershed and cultural resources management, and infrastructure upkeep-that affect the lives of all Americans. Another nationwide concern is whether public funds from other agencies are going into construction in fire-prone districts. As Moritz puts it, how often are federal dollars building homes that are likely to be lost to a wildfire? "It's already a huge problem from a public expenditure perspective for the whole country," he says. "We need to take a magnifying glass to that. Like, 'Wait a minute, is this OK? Do we want instead to redirect those funds to concentrate on lower-hazard parts of the landscape?'" Such a view would require a corresponding shift in the way US society today views fire, researchers say. For one thing, conversations about wildfires need to be more inclusive. Over the past decade, the focus has been on climate change-how the warming of the Earth from greenhouse gases is leading to conditions that worsen fires. While climate is a key element, Moritz says, it shouldn't come at the expense of the rest of the equation. "The human systems and the landscapes we live on are linked, and the interactions go both ways," he says. Failing to recognize that, he notes, leads to "an overly simplified view of what the solutions might be. Our perception of the problem and of what the solution is becomes very limited."
At the same time, people continue to treat fire as an event that needs to be wholly controlled and unleashed only out of necessity, says Professor Balch at the University of Colorado. But acknowledging fire's inevitable presence in human life is an attitude crucial to developing the laws, policies, and practices that make it as safe as possible, she says. "We've disconnected ourselves from living with fire," Balch says. "It is really important to understand and try and tease out what is the human connection with fire today."
451.txt
2
[ "discover the fundamental makeup of nature", "explore the mechanism of the human systems", "maximize the role of landscape in human life", "understand the interrelations of man and nature" ]
The overly simplified view Moritz mentions is a result of failing to _______.
Though often viewed as a problem for western states, the growing frequency of wildfires is a national concern because of its impact on federal tax dollars, says Professor Max Moritz, a specialist in fire ecology and management. In 2015, the US Forest Service for the first time spent more than half of its $5.5 billion annual budget fighting fires-nearly double the percentage it spent on such efforts 20 years ago. In effect, fewer federal funds today are going towards the agency's other work-such as forest conservation, watershed and cultural resources management, and infrastructure upkeep-that affect the lives of all Americans. Another nationwide concern is whether public funds from other agencies are going into construction in fire-prone districts. As Moritz puts it, how often are federal dollars building homes that are likely to be lost to a wildfire? "It's already a huge problem from a public expenditure perspective for the whole country," he says. "We need to take a magnifying glass to that. Like, 'Wait a minute, is this OK? Do we want instead to redirect those funds to concentrate on lower-hazard parts of the landscape?'" Such a view would require a corresponding shift in the way US society today views fire, researchers say. For one thing, conversations about wildfires need to be more inclusive. Over the past decade, the focus has been on climate change-how the warming of the Earth from greenhouse gases is leading to conditions that worsen fires. While climate is a key element, Moritz says, it shouldn't come at the expense of the rest of the equation. "The human systems and the landscapes we live on are linked, and the interactions go both ways," he says. Failing to recognize that, he notes, leads to "an overly simplified view of what the solutions might be. Our perception of the problem and of what the solution is becomes very limited."
At the same time, people continue to treat fire as an event that needs to be wholly controlled and unleashed only out of necessity, says Professor Balch at the University of Colorado. But acknowledging fire's inevitable presence in human life is an attitude crucial to developing the laws, policies, and practices that make it as safe as possible, she says. "We've disconnected ourselves from living with fire," Balch says. "It is really important to understand and try and tease out what is the human connection with fire today."
451.txt
3
[ "do away with", "come to terms with", "pay a price for", "keep away from" ]
Professor Balch points out that fire is something man should ________.
Though often viewed as a problem for western states, the growing frequency of wildfires is a national concern because of its impact on federal tax dollars, says Professor Max Moritz, a specialist in fire ecology and management. In 2015, the US Forest Service for the first time spent more than half of its $5.5 billion annual budget fighting fires-nearly double the percentage it spent on such efforts 20 years ago. In effect, fewer federal funds today are going towards the agency's other work-such as forest conservation, watershed and cultural resources management, and infrastructure upkeep-that affect the lives of all Americans. Another nationwide concern is whether public funds from other agencies are going into construction in fire-prone districts. As Moritz puts it, how often are federal dollars building homes that are likely to be lost to a wildfire? "It's already a huge problem from a public expenditure perspective for the whole country," he says. "We need to take a magnifying glass to that. Like, 'Wait a minute, is this OK? Do we want instead to redirect those funds to concentrate on lower-hazard parts of the landscape?'" Such a view would require a corresponding shift in the way US society today views fire, researchers say. For one thing, conversations about wildfires need to be more inclusive. Over the past decade, the focus has been on climate change-how the warming of the Earth from greenhouse gases is leading to conditions that worsen fires. While climate is a key element, Moritz says, it shouldn't come at the expense of the rest of the equation. "The human systems and the landscapes we live on are linked, and the interactions go both ways," he says. Failing to recognize that, he notes, leads to "an overly simplified view of what the solutions might be. Our perception of the problem and of what the solution is becomes very limited."
At the same time, people continue to treat fire as an event that needs to be wholly controlled and unleashed only out of necessity, says Professor Balch at the University of Colorado. But acknowledging fire's inevitable presence in human life is an attitude crucial to developing the laws, policies, and practices that make it as safe as possible, she says. "We've disconnected ourselves from living with fire," Balch says. "It is really important to understand and try and tease out what is the human connection with fire today."
451.txt
1
[ "unreasonable", "criminal", "harmful", "costly" ]
It is commonly accepted in American society that too much sleep is _ .
American society is not nap friendly. "In fact," says David Dinges, a sleep specialist at the University of Pennsylvania School of Medicine, "there's even a prohibition against admitting we need sleep." Nobody wants to be caught napping or found asleep at work. To quote a proverb: "Some sleep five hours, nature requires seven, laziness nine and wickedness eleven." Wrong. The way not to fall asleep at work is to take naps when you need them. "We have to totally change our attitude toward napping," says Dr. William Dement of Stanford University, the godfather of sleep research. Last year a national commission led by Dement identified an "American sleep debt" which one member said was as important as the national debt. The commission was concerned about the dangers of sleepiness: people causing industrial accidents or falling asleep while driving. This may be why we have a new sleep policy in the White House. According to recent reports, President Clinton is trying to take a half-hour snooze every afternoon. About 60 percent of American adults nap when given the opportunity. We seem to have "a midafternoon quiet phase," also called "a secondary sleep gate." Sleeping 15 minutes to two hours in the early afternoon can reduce stress and make us refreshed. Clearly, we were born to nap. We Superstars of Snooze don't nap to replace lost shut-eye or to prepare for a night shift. Rather, we "snack" on sleep, whenever, wherever and at whatever time we feel like it. I myself have napped in buses, cars, planes and on boats; on floors and beds; and in libraries, offices and museums.
3060.txt
0
[ "don't like to take naps", "are terribly worried about their national debt", "sleep less than is good for them", "have caused many industrial and traffic accidents" ]
The research done by the Dement commission shows that Americans _ .
American society is not nap friendly. "In fact," says David Dinges, a sleep specialist at the University of Pennsylvania School of Medicine, "there's even a prohibition against admitting we need sleep." Nobody wants to be caught napping or found asleep at work. To quote a proverb: "Some sleep five hours, nature requires seven, laziness nine and wickedness eleven." Wrong. The way not to fall asleep at work is to take naps when you need them. "We have to totally change our attitude toward napping," says Dr. William Dement of Stanford University, the godfather of sleep research. Last year a national commission led by Dement identified an "American sleep debt" which one member said was as important as the national debt. The commission was concerned about the dangers of sleepiness: people causing industrial accidents or falling asleep while driving. This may be why we have a new sleep policy in the White House. According to recent reports, President Clinton is trying to take a half-hour snooze every afternoon. About 60 percent of American adults nap when given the opportunity. We seem to have "a midafternoon quiet phase," also called "a secondary sleep gate." Sleeping 15 minutes to two hours in the early afternoon can reduce stress and make us refreshed. Clearly, we were born to nap. We Superstars of Snooze don't nap to replace lost shut-eye or to prepare for a night shift. Rather, we "snack" on sleep, whenever, wherever and at whatever time we feel like it. I myself have napped in buses, cars, planes and on boats; on floors and beds; and in libraries, offices and museums.
3060.txt
2
[ "warn us of the wickedness of napping", "explain the danger of sleepiness", "discuss the side effects of napping", "convince the reader of the necessity of napping" ]
The purpose of this article is to _ .
American society is not nap friendly. "In fact," says David Dinges, a sleep specialist at the University of Pennsylvania School of Medicine, "there's even a prohibition against admitting we need sleep." Nobody wants to be caught napping or found asleep at work. To quote a proverb: "Some sleep five hours, nature requires seven, laziness nine and wickedness eleven." Wrong. The way not to fall asleep at work is to take naps when you need them. "We have to totally change our attitude toward napping," says Dr. William Dement of Stanford University, the godfather of sleep research. Last year a national commission led by Dement identified an "American sleep debt" which one member said was as important as the national debt. The commission was concerned about the dangers of sleepiness: people causing industrial accidents or falling asleep while driving. This may be why we have a new sleep policy in the White House. According to recent reports, President Clinton is trying to take a half-hour snooze every afternoon. About 60 percent of American adults nap when given the opportunity. We seem to have "a midafternoon quiet phase," also called "a secondary sleep gate." Sleeping 15 minutes to two hours in the early afternoon can reduce stress and make us refreshed. Clearly, we were born to nap. We Superstars of Snooze don't nap to replace lost shut-eye or to prepare for a night shift. Rather, we "snack" on sleep, whenever, wherever and at whatever time we feel like it. I myself have napped in buses, cars, planes and on boats; on floors and beds; and in libraries, offices and museums.
3060.txt
3
[ "the traditional misconception the Americans have about sleep", "the new sleep policy of the Clinton Administration", "the rapid development of American industry", "the Americans' worry about the danger of sleepiness" ]
The "American sleep debt" (Line 1, Para. 3) is the result of _ .
American society is not nap friendly. "In fact," says David Dinges, a sleep specialist at the University of Pennsylvania School of Medicine, "there's even a prohibition against admitting we need sleep." Nobody wants to be caught napping or found asleep at work. To quote a proverb: "Some sleep five hours, nature requires seven, laziness nine and wickedness eleven." Wrong. The way not to fall asleep at work is to take naps when you need them. "We have to totally change our attitude toward napping," says Dr. William Dement of Stanford University, the godfather of sleep research. Last year a national commission led by Dement identified an "American sleep debt" which one member said was as important as the national debt. The commission was concerned about the dangers of sleepiness: people causing industrial accidents or falling asleep while driving. This may be why we have a new sleep policy in the White House. According to recent reports, President Clinton is trying to take a half-hour snooze every afternoon. About 60 percent of American adults nap when given the opportunity. We seem to have "a midafternoon quiet phase," also called "a secondary sleep gate." Sleeping 15 minutes to two hours in the early afternoon can reduce stress and make us refreshed. Clearly, we were born to nap. We Superstars of Snooze don't nap to replace lost shut-eye or to prepare for a night shift. Rather, we "snack" on sleep, whenever, wherever and at whatever time we feel like it. I myself have napped in buses, cars, planes and on boats; on floors and beds; and in libraries, offices and museums.
3060.txt
0
[ "preferable to have a sound sleep before a night shift", "good practice to eat something light before we go to bed", "essential to make up for lost sleep", "natural to take a nap whenever we feel the need for it" ]
The second sentence of the last paragraph tells us that it is _ .
American society is not nap friendly. "In fact," says David Dinges, a sleep specialist at the University of Pennsylvania School of Medicine, "there's even a prohibition against admitting we need sleep." Nobody wants to be caught napping or found asleep at work. To quote a proverb: "Some sleep five hours, nature requires seven, laziness nine and wickedness eleven." Wrong. The way not to fall asleep at work is to take naps when you need them. "We have to totally change our attitude toward napping," says Dr. William Dement of Stanford University, the godfather of sleep research. Last year a national commission led by Dement identified an "American sleep debt" which one member said was as important as the national debt. The commission was concerned about the dangers of sleepiness: people causing industrial accidents or falling asleep while driving. This may be why we have a new sleep policy in the White House. According to recent reports, President Clinton is trying to take a half-hour snooze every afternoon. About 60 percent of American adults nap when given the opportunity. We seem to have "a midafternoon quiet phase," also called "a secondary sleep gate." Sleeping 15 minutes to two hours in the early afternoon can reduce stress and make us refreshed. Clearly, we were born to nap. We Superstars of Snooze don't nap to replace lost shut-eye or to prepare for a night shift. Rather, we "snack" on sleep, whenever, wherever and at whatever time we feel like it. I myself have napped in buses, cars, planes and on boats; on floors and beds; and in libraries, offices and museums.
3060.txt
3
[ "the consequences of the current sorting mechanism.", "companies' financial loss due to immoral practices", "governmental ineffectiveness on moral issues.", "the wide misuse of integrity among institutions." ]
According to the first two paragraphs, Elisabeth was upset by
Two years ago, Rupert Murdoch's daughter, Elisabeth, spoke of the "unsettling dearth of integrity across so many of our institutions." Integrity had collapsed, she argued, because of a collective acceptance that the only "sorting mechanism" in society should be profit and the market. But "it's us, human beings, we the people who create the society we want, not profit." Driving her point home, she continued: "It's increasingly apparent that the absence of purpose, of a moral language within government, media or business, could become one of the most dangerous goals for capitalism and freedom." This same absence of moral purpose was wounding companies such as News International, she thought, making it more likely that it would lose its way as it had with widespread illegal telephone hacking. As the hacking trial concludes-finding guilty one ex-editor of the News of the World, Andy Coulson, for conspiring to hack phones, and finding his predecessor, Rebekah Brooks, innocent of the same charge-the wider issue of the dearth of integrity still stands. Journalists are known to have hacked the phones of up to 5,500 people. This is hacking on an industrial scale, as was acknowledged by Glenn Mulcaire, the man hired by the News of the World in 2001 to be the point person for phone hacking. Others await trial. This long story still unfolds. In many respects, the dearth of moral purpose frames not only the fact of such widespread phone hacking but the terms on which the trial took place. One of the astonishing revelations was how little Rebekah Brooks knew of what went on in her newsroom, how little she thought to ask and the fact that she never inquired how the stories arrived. The core of her successful defence was that she knew nothing. In today's world, it has become normal that well-paid executives should not be accountable for what happens in the organizations that they run. Perhaps we should not be so surprised. For a generation, the collective doctrine has been that the sorting mechanism of society should be profit. 
The words that have mattered are efficiency, flexibility, shareholder value, business-friendly, wealth generation, sales, impact and, in newspapers, circulation. Words degraded to the margin have been justice, fairness, tolerance, proportionality and accountability. The purpose of editing the News of the World was not to promote reader understanding, to be fair in what was written or to betray any common humanity. It was to ruin lives in the quest for circulation and impact. Ms Brooks may or may not have had suspicions about how her journalists got their stories, but she asked no questions, gave no instructions-nor received traceable, recorded answers.
3718.txt
0
[ "Glenn Mulcaire may deny phone hacking as a crime.", "more journalists may be found guilty of phone hacking.", "Andy Coulson should be held innocent of the charge.", "phone hacking will be accepted on certain occasions." ]
It can be inferred from Paragraph 3 that
Two years ago, Rupert Murdoch's daughter, Elisabeth, spoke of the "unsettling dearth of integrity across so many of our institutions." Integrity had collapsed, she argued, because of a collective acceptance that the only "sorting mechanism" in society should be profit and the market. But "it's us, human beings, we the people who create the society we want, not profit." Driving her point home, she continued: "It's increasingly apparent that the absence of purpose, of a moral language within government, media or business, could become one of the most dangerous goals for capitalism and freedom." This same absence of moral purpose was wounding companies such as News International, she thought, making it more likely that it would lose its way as it had with widespread illegal telephone hacking. As the hacking trial concludes-finding guilty one ex-editor of the News of the World, Andy Coulson, for conspiring to hack phones, and finding his predecessor, Rebekah Brooks, innocent of the same charge-the wider issue of the dearth of integrity still stands. Journalists are known to have hacked the phones of up to 5,500 people. This is hacking on an industrial scale, as was acknowledged by Glenn Mulcaire, the man hired by the News of the World in 2001 to be the point person for phone hacking. Others await trial. This long story still unfolds. In many respects, the dearth of moral purpose frames not only the fact of such widespread phone hacking but the terms on which the trial took place. One of the astonishing revelations was how little Rebekah Brooks knew of what went on in her newsroom, how little she thought to ask and the fact that she never inquired how the stories arrived. The core of her successful defence was that she knew nothing. In today's world, it has become normal that well-paid executives should not be accountable for what happens in the organizations that they run. Perhaps we should not be so surprised. For a generation, the collective doctrine has been that the sorting mechanism of society should be profit. 
The words that have mattered are efficiency, flexibility, shareholder value, business-friendly, wealth generation, sales, impact and, in newspapers, circulation. Words degraded to the margin have been justice, fairness, tolerance, proportionality and accountability. The purpose of editing the News of the World was not to promote reader understanding, to be fair in what was written or to betray any common humanity. It was to ruin lives in the quest for circulation and impact. Ms Brooks may or may not have had suspicions about how her journalists got their stories, but she asked no questions, gave no instructions-nor received traceable, recorded answers.
3718.txt
1
[ "revealed a cunning personality.", "centered on trivial issues.", "was hardly convincing.", "was part of a conspiracy." ]
The author believes that Rebekah Brooks's defence
Two years ago, Rupert Murdoch's daughter, Elisabeth, spoke of the "unsettling dearth of integrity across so many of our institutions." Integrity had collapsed, she argued, because of a collective acceptance that the only "sorting mechanism" in society should be profit and the market. But "it's us, human beings, we the people who create the society we want, not profit." Driving her point home, she continued: "It's increasingly apparent that the absence of purpose, of a moral language within government, media or business, could become one of the most dangerous goals for capitalism and freedom." This same absence of moral purpose was wounding companies such as News International, she thought, making it more likely that it would lose its way as it had with widespread illegal telephone hacking. As the hacking trial concludes-finding guilty one ex-editor of the News of the World, Andy Coulson, for conspiring to hack phones, and finding his predecessor, Rebekah Brooks, innocent of the same charge-the wider issue of the dearth of integrity still stands. Journalists are known to have hacked the phones of up to 5,500 people. This is hacking on an industrial scale, as was acknowledged by Glenn Mulcaire, the man hired by the News of the World in 2001 to be the point person for phone hacking. Others await trial. This long story still unfolds. In many respects, the dearth of moral purpose frames not only the fact of such widespread phone hacking but the terms on which the trial took place. One of the astonishing revelations was how little Rebekah Brooks knew of what went on in her newsroom, how little she thought to ask and the fact that she never inquired how the stories arrived. The core of her successful defence was that she knew nothing. In today's world, it has become normal that well-paid executives should not be accountable for what happens in the organizations that they run. Perhaps we should not be so surprised. For a generation, the collective doctrine has been that the sorting mechanism of society should be profit. 
The words that have mattered are efficiency, flexibility, shareholder value, business-friendly, wealth generation, sales, impact and, in newspapers, circulation. Words degraded to the margin have been justice, fairness, tolerance, proportionality and accountability. The purpose of editing the News of the World was not to promote reader understanding, to be fair in what was written or to betray any common humanity. It was to ruin lives in the quest for circulation and impact. Ms Brooks may or may not have had suspicions about how her journalists got their stories, but she asked no questions, gave no instructions-nor received traceable, recorded answers.
3718.txt
2
[ "generally distorted values.", "unfair wealth distribution.", "a marginalized lifestyle.", "a rigid moral code." ]
The author holds that the current collective doctrine shows
Two years ago, Rupert Murdoch's daughter, Elisabeth, spoke of the "unsettling dearth of integrity across so many of our institutions." Integrity had collapsed, she argued, because of a collective acceptance that the only "sorting mechanism" in society should be profit and the market. But "it's us, human beings, we the people who create the society we want, not profit." Driving her point home, she continued: "It's increasingly apparent that the absence of purpose, of a moral language within government, media or business, could become one of the most dangerous goals for capitalism and freedom." This same absence of moral purpose was wounding companies such as News International, she thought, making it more likely that it would lose its way as it had with widespread illegal telephone hacking. As the hacking trial concludes-finding guilty one ex-editor of the News of the World, Andy Coulson, for conspiring to hack phones, and finding his predecessor, Rebekah Brooks, innocent of the same charge-the wider issue of the dearth of integrity still stands. Journalists are known to have hacked the phones of up to 5,500 people. This is hacking on an industrial scale, as was acknowledged by Glenn Mulcaire, the man hired by the News of the World in 2001 to be the point person for phone hacking. Others await trial. This long story still unfolds. In many respects, the dearth of moral purpose frames not only the fact of such widespread phone hacking but the terms on which the trial took place. One of the astonishing revelations was how little Rebekah Brooks knew of what went on in her newsroom, how little she thought to ask and the fact that she never inquired how the stories arrived. The core of her successful defence was that she knew nothing. In today's world, it has become normal that well-paid executives should not be accountable for what happens in the organizations that they run. Perhaps we should not be so surprised. For a generation, the collective doctrine has been that the sorting mechanism of society should be profit. 
The words that have mattered are efficiency, flexibility, shareholder value, business-friendly, wealth generation, sales, impact and, in newspapers, circulation. Words degraded to the margin have been justice, fairness, tolerance, proportionality and accountability. The purpose of editing the News of the World was not to promote reader understanding, to be fair in what was written or to betray any common humanity. It was to ruin lives in the quest for circulation and impact. Ms Brooks may or may not have had suspicions about how her journalists got their stories, but she asked no questions, gave no instructions-nor received traceable, recorded answers.
3718.txt
0
[ "The quality of writings is of primary importance.", "Common humanity is central to news reporting.", "Moral awareness matters in editing a newspaper.", "Journalists need stricter industrial regulations." ]
Which of the following is suggested in the last paragraph?
Two years ago, Rupert Murdoch's daughter, Elisabeth, spoke of the "unsettling dearth of integrity across so many of our institutions." Integrity had collapsed, she argued, because of a collective acceptance that the only "sorting mechanism" in society should be profit and the market. But "it's us, human beings, we the people who create the society we want, not profit." Driving her point home, she continued: "It's increasingly apparent that the absence of purpose, of a moral language within government, media or business, could become one of the most dangerous goals for capitalism and freedom." This same absence of moral purpose was wounding companies such as News International, she thought, making it more likely that it would lose its way as it had with widespread illegal telephone hacking. As the hacking trial concludes-finding guilty one ex-editor of the News of the World, Andy Coulson, for conspiring to hack phones, and finding his predecessor, Rebekah Brooks, innocent of the same charge-the wider issue of the dearth of integrity still stands. Journalists are known to have hacked the phones of up to 5,500 people. This is hacking on an industrial scale, as was acknowledged by Glenn Mulcaire, the man hired by the News of the World in 2001 to be the point person for phone hacking. Others await trial. This long story still unfolds. In many respects, the dearth of moral purpose frames not only the fact of such widespread phone hacking but the terms on which the trial took place. One of the astonishing revelations was how little Rebekah Brooks knew of what went on in her newsroom, how little she thought to ask and the fact that she never inquired how the stories arrived. The core of her successful defence was that she knew nothing. In today's world, it has become normal that well-paid executives should not be accountable for what happens in the organizations that they run. Perhaps we should not be so surprised. For a generation, the collective doctrine has been that the sorting mechanism of society should be profit. 
The words that have mattered are efficiency, flexibility, shareholder value, business-friendly, wealth generation, sales, impact and, in newspapers, circulation. Words degraded to the margin have been justice, fairness, tolerance, proportionality and accountability. The purpose of editing the News of the World was not to promote reader understanding, to be fair in what was written or to betray any common humanity. It was to ruin lives in the quest for circulation and impact. Ms Brooks may or may not have had suspicions about how her journalists got their stories, but she asked no questions, gave no instructions-nor received traceable, recorded answers.
3718.txt
2
[ "the guarantee of lifetime employment", "the consequence of recessions and automation", "the effect of lifetime employment", "the prospects of capitalism" ]
The observers are divided with regard to their attitudes towards _ .
In Japan many workers for large corporations have a guarantee of lifetime employment. They will not be laid off during recessions or when the tasks they perform are taken over by robots. To some observers, this is capitalism at its best, because workers are treated as people not things. Others see it as necessarily inefficient and believe it cannot continue if Japan is to remain competitive with foreign corporations more concerned about profits and less concerned about people. Defenders of the system argue that those who call it inefficient do not understand how it really works. In the first place not every Japanese worker has the guarantee of a lifetime job. The lifetime employment system includes only "regular employees". Many employees do not fall into this category, including all women. All businesses have many part-time and temporary employees. These workers are hired and laid off during the course of the business cycle just as employees in the United States are. These "irregular workers" make up about 10 percent of the nonagricultural work force. Additionally, Japanese firms maintain some flexibility through the extensive use of subcontractors. This practice is much more common in Japan than in the United States. The use of both subcontractors and temporary workers has increased markedly in Japan since the 1974-1975 recession. All this leads some to argue that the Japanese system is not all that different from the American system. During recessions Japanese corporations lay off temporary workers and give less business to subcontractors. In the United States, corporations lay off those workers with the least seniority. The difference then is probably less than the term "lifetime employment" suggests, but there still is a difference. And this difference cannot be understood without looking at the values of Japanese society. The relationship between employer and employee cannot be explained in purely contractual terms. 
Firms hold on to the employees and employees stay with one firm. There are also practical reasons for not jumping from job to job. Most retirement benefits come from the employer. Changing jobs means losing these benefits. Also, teamwork is an essential part of Japanese production. Moving to a new firm means adapting to a different team and at least temporarily, lower productivity and lower pay.
793.txt
2
[ "defenders themselves do not appreciate the system", "about 90% of \"irregular workers\" are employed in agriculture", "the business cycle occurs more often in Japan than in the U.S.", "not all employees can benefit from the policy" ]
It is stated in the second paragraph that _ .
In Japan many workers for large corporations have a guarantee of lifetime employment. They will not be laid off during recessions or when the tasks they perform are taken over by robots. To some observers, this is capitalism at its best, because workers are treated as people not things. Others see it as necessarily inefficient and believe it cannot continue if Japan is to remain competitive with foreign corporations more concerned about profits and less concerned about people. Defenders of the system argue that those who call it inefficient do not understand how it really works. In the first place not every Japanese worker has the guarantee of a lifetime job. The lifetime employment system includes only "regular employees". Many employees do not fall into this category, including all women. All businesses have many part-time and temporary employees. These workers are hired and laid off during the course of the business cycle just as employees in the United States are. These "irregular workers" make up about 10 percent of the nonagricultural work force. Additionally, Japanese firms maintain some flexibility through the extensive use of subcontractors. This practice is much more common in Japan than in the United States. The use of both subcontractors and temporary workers has increased markedly in Japan since the 1974-1975 recession. All this leads some to argue that the Japanese system is not all that different from the American system. During recessions Japanese corporations lay off temporary workers and give less business to subcontractors. In the United States, corporations lay off those workers with the least seniority. The difference then is probably less than the term "lifetime employment" suggests, but there still is a difference. And this difference cannot be understood without looking at the values of Japanese society. The relationship between employer and employee cannot be explained in purely contractual terms. 
Firms hold on to the employees and employees stay with one firm. There are also practical reasons for not jumping from job to job. Most retirement benefits come from the employer. Changing jobs means losing these benefits. Also, teamwork is an essential part of Japanese production. Moving to a new firm means adapting to a different team and at least temporarily, lower productivity and lower pay.
793.txt
3
[ "regular employees", "part-time workers", "junior employees", "temporary workers" ]
During recessions those who are to be fired first in the U.S. corporations are _ .
In Japan many workers for large corporations have a guarantee of lifetime employment. They will not be laid off during recessions or when the tasks they perform are taken over by robots. To some observers, this is capitalism at its best, because workers are treated as people not things. Others see it as necessarily inefficient and believe it cannot continue if Japan is to remain competitive with foreign corporations more concerned about profits and less concerned about people. Defenders of the system argue that those who call it inefficient do not understand how it really works. In the first place not every Japanese worker has the guarantee of a lifetime job. The lifetime employment system includes only "regular employees". Many employees do not fall into this category, including all women. All businesses have many part-time and temporary employees. These workers are hired and laid off during the course of the business cycle just as employees in the United States are. These "irregular workers" make up about 10 percent of the nonagricultural work force. Additionally, Japanese firms maintain some flexibility through the extensive use of subcontractors. This practice is much more common in Japan than in the United States. The use of both subcontractors and temporary workers has increased markedly in Japan since the 1974-1975 recession. All this leads some to argue that the Japanese system is not all that different from the American system. During recessions Japanese corporations lay off temporary workers and give less business to subcontractors. In the United States, corporations lay off those workers with the least seniority. The difference then is probably less than the term "lifetime employment" suggests, but there still is a difference. And this difference cannot be understood without looking at the values of Japanese society. The relationship between employer and employee cannot be explained in purely contractual terms. 
Firms hold on to the employees and employees stay with one firm. There are also practical reasons for not jumping from job to job. Most retirement benefits come from the employer. Changing jobs means losing these benefits. Also, teamwork is an essential part of Japanese production. Moving to a new firm means adapting to a different team and at least temporarily, lower productivity and lower pay.
793.txt
2
[ "use subcontractors more extensively", "are less flexible in terms of lifetime employment", "hold on to the values of society", "are more efficient in competition than the latter" ]
According to the passage, Japanese firms differ strikingly from American firms in that the former _ .
In Japan many workers for large corporations have a guarantee of lifetime employment. They will not be laid off during recessions or when the tasks they perform are taken over by robots. To some observers, this is capitalism at its best, because workers are treated as people not things. Others see it as necessarily inefficient and believe it cannot continue if Japan is to remain competitive with foreign corporations more concerned about profits and less concerned about people. Defenders of the system argue that those who call it inefficient do not understand how it really works. In the first place not every Japanese worker has the guarantee of a lifetime job. The lifetime employment system includes only "regular employees". Many employees do not fall into this category, including all women. All businesses have many part-time and temporary employees. These workers are hired and laid off during the course of the business cycle just as employees in the United States are. These "irregular workers" make up about 10 percent of the nonagricultural work force. Additionally, Japanese firms maintain some flexibility through the extensive use of subcontractors. This practice is much more common in Japan than in the United States. The use of both subcontractors and temporary workers has increased markedly in Japan since the 1974-1975 recession. All this leads some to argue that the Japanese system is not all that different from the American system. During recessions Japanese corporations lay off temporary workers and give less business to subcontractors. In the United States, corporations lay off those workers with the least seniority. The difference then is probably less than the term "lifetime employment" suggests, but there still is a difference. And this difference cannot be understood without looking at the values of Japanese society. The relationship between employer and employee cannot be explained in purely contractual terms. 
Firms hold on to the employees and employees stay with one firm. There are also practical reasons for not jumping from job to job. Most retirement benefits come from the employer. Changing jobs means losing these benefits. Also, teamwork is an essential part of Japanese production. Moving to a new firm means adapting to a different team and at least temporarily, lower productivity and lower pay.
793.txt
0
[ "He will probably be underpaid.", "He will not be entitled to some job benefits.", "He has been accustomed to the teamwork.", "He will be looked down upon by his prospective employer." ]
Which of the following does NOT account for the fact that a Japanese worker is reluctant to change his job?
In Japan many workers for large corporations have a guarantee of lifetime employment. They will not be laid off during recessions or when the tasks they perform are taken over by robots. To some observers, this is capitalism at its best, because workers are treated as people not things. Others see it as necessarily inefficient and believe it cannot continue if Japan is to remain competitive with foreign corporations more concerned about profits and less concerned about people. Defenders of the system argue that those who call it inefficient do not understand how it really works. In the first place not every Japanese worker has the guarantee of a lifetime job. The lifetime employment system includes only "regular employees". Many employees do not fall into this category, including all women. All businesses have many part-time and temporary employees. These workers are hired and laid off during the course of the business cycle just as employees in the United States are. These "irregular workers" make up about 10 percent of the nonagricultural work force. Additionally, Japanese firms maintain some flexibility through the extensive use of subcontractors. This practice is much more common in Japan than in the United States. The use of both subcontractors and temporary workers has increased markedly in Japan since the 1974-1975 recession. All this leads some to argue that the Japanese system is not all that different from the American system. During recessions Japanese corporations lay off temporary workers and give less business to subcontractors. In the United States, corporations lay off those workers with the least seniority. The difference then is probably less than the term "lifetime employment" suggests, but there still is a difference. And this difference cannot be understood without looking at the values of Japanese society. The relationship between employer and employee cannot be explained in purely contractual terms. 
Firms hold on to the employees and employees stay with one firm. There are also practical reasons for not jumping from job to job. Most retirement benefits come from the employer. Changing jobs means losing these benefits. Also, teamwork is an essential part of Japanese production. Moving to a new firm means adapting to a different team and at least temporarily, lower productivity and lower pay.
793.txt
3
[ "to explain why things happen", "to explain how things happen", "to describe self-evident principles", "to support Aristotelian science" ]
The aim of controlled scientific experiments is .
In science the meaning of the word "explain" suffers with civilization's every step in search of reality. Science cannot really explain electricity, magnetism, and gravitation; their effects can be measured and predicted, but of their nature no more is known to the modern scientist than to Thales who first looked into the nature of the electrification of amber, a hard yellowish-brown gum. Most contemporary physicists reject the notion that man can ever discover what these mysterious forces "really" are. "Electricity," Bertrand Russell says, "is not a thing, like St. Paul's Cathedral; it is a way in which things behave. When we have told how things behave when they are electrified, and under what circumstances they are electrified, we have told all there is to tell." Until recently scientists would have disapproved of such an idea. Aristotle, for example, whose natural science dominated Western thought for two thousand years, believed that man could arrive at an understanding of reality by reasoning from self-evident principles. He felt, for example, that it is a self-evident principle that everything in the universe has its proper place, hence one can deduce that objects fall to the ground because that's where they belong, and smoke goes up because that's where it belongs. The goal of Aristotelian science was to explain why things happen. Modern science was born when Galileo began trying to explain how things happen and thus originated the method of controlled experiment which now forms the basis of scientific investigation.
3810.txt
1
[ "the speculations of Thales", "the forces of electricity, magnetism, and gravity", "Aristotle's natural science", "Galileo's discoveries" ]
What principles most influenced scientific thought for two thousand years?
In science the meaning of the word "explain" suffers with civilization's every step in search of reality. Science cannot really explain electricity, magnetism, and gravitation; their effects can be measured and predicted, but of their nature no more is known to the modern scientist than to Thales who first looked into the nature of the electrification of amber, a hard yellowish-brown gum. Most contemporary physicists reject the notion that man can ever discover what these mysterious forces "really" are. "Electricity," Bertrand Russell says, "is not a thing, like St. Paul's Cathedral; it is a way in which things behave. When we have told how things behave when they are electrified, and under what circumstances they are electrified, we have told all there is to tell." Until recently scientists would have disapproved of such an idea. Aristotle, for example, whose natural science dominated Western thought for two thousand years, believed that man could arrive at an understanding of reality by reasoning from self-evident principles. He felt, for example, that it is a self-evident principle that everything in the universe has its proper place, hence one can deduce that objects fall to the ground because that's where they belong, and smoke goes up because that's where it belongs. The goal of Aristotelian science was to explain why things happen. Modern science was born when Galileo began trying to explain how things happen and thus originated the method of controlled experiment which now forms the basis of scientific investigation.
3810.txt
2
[ "disapproved of by most modern scientists", "in agreement with Aristotle's theory of self-evident principles", "in agreement with scientific investigation directed toward \"how\" things happen", "in agreement with scientific investigation directed toward \"why\" things happen" ]
Bertrand Russell's notion about electricity is .
In science the meaning of the word "explain" suffers with civilization's every step in search of reality. Science cannot really explain electricity, magnetism, and gravitation; their effects can be measured and predicted, but of their nature no more is known to the modern scientist than to Thales who first looked into the nature of the electrification of amber, a hard yellowish-brown gum. Most contemporary physicists reject the notion that man can ever discover what these mysterious forces "really" are. "Electricity," Bertrand Russell says, "is not a thing, like St. Paul's Cathedral; it is a way in which things behave. When we have told how things behave when they are electrified, and under what circumstances they are electrified, we have told all there is to tell." Until recently scientists would have disapproved of such an idea. Aristotle, for example, whose natural science dominated Western thought for two thousand years, believed that man could arrive at an understanding of reality by reasoning from self-evident principles. He felt, for example, that it is a self-evident principle that everything in the universe has its proper place, hence one can deduce that objects fall to the ground because that's where they belong, and smoke goes up because that's where it belongs. The goal of Aristotelian science was to explain why things happen. Modern science was born when Galileo began trying to explain how things happen and thus originated the method of controlled experiment which now forms the basis of scientific investigation.
3810.txt
2
[ "that there are mysterious forces in the universe", "that man cannot discover what forces \"really\" are", "that there are self-evident principles", "that we can discover why things behave as they do" ]
The passage says that until recently scientists disagreed with the idea .
In science the meaning of the word "explain" suffers with civilization's every step in search of reality. Science cannot really explain electricity, magnetism, and gravitation; their effects can be measured and predicted, but of their nature no more is known to the modern scientist than to Thales who first looked into the nature of the electrification of amber, a hard yellowish-brown gum. Most contemporary physicists reject the notion that man can ever discover what these mysterious forces "really" are. "Electricity," Bertrand Russell says, "is not a thing, like St. Paul's Cathedral; it is a way in which things behave. When we have told how things behave when they are electrified, and under what circumstances they are electrified, we have told all there is to tell." Until recently scientists would have disapproved of such an idea. Aristotle, for example, whose natural science dominated Western thought for two thousand years, believed that man could arrive at an understanding of reality by reasoning from self-evident principles. He felt, for example, that it is a self-evident principle that everything in the universe has its proper place, hence one can deduce that objects fall to the ground because that's where they belong, and smoke goes up because that's where it belongs. The goal of Aristotelian science was to explain why things happen. Modern science was born when Galileo began trying to explain how things happen and thus originated the method of controlled experiment which now forms the basis of scientific investigation.
3810.txt
1
[ "when the method of controlled experiment was first introduced", "when Galileo succeeded in explaining how things happen", "when Aristotelian scientists tried to explain why things happen", "when scientists were able to acquire an understanding of reality by reasoning" ]
Modern science came into being .
In science the meaning of the word "explain" suffers with civilization's every step in search of reality. Science cannot really explain electricity, magnetism, and gravitation; their effects can be measured and predicted, but of their nature no more is known to the modern scientist than to Thales who first looked into the nature of the electrification of amber, a hard yellowish-brown gum. Most contemporary physicists reject the notion that man can ever discover what these mysterious forces "really" are. "Electricity," Bertrand Russell says, "is not a thing, like St. Paul's Cathedral; it is a way in which things behave. When we have told how things behave when they are electrified, and under what circumstances they are electrified, we have told all there is to tell." Until recently scientists would have disapproved of such an idea. Aristotle, for example, whose natural science dominated Western thought for two thousand years, believed that man could arrive at an understanding of reality by reasoning from self-evident principles. He felt, for example, that it is a self-evident principle that everything in the universe has its proper place, hence one can deduce that objects fall to the ground because that's where they belong, and smoke goes up because that's where it belongs. The goal of Aristotelian science was to explain why things happen. Modern science was born when Galileo began trying to explain how things happen and thus originated the method of controlled experiment which now forms the basis of scientific investigation.
3810.txt
0
[ "illustrate the change of height of NBA players.", "show the popularity of NBA players in the U.S..", "compare different generations of NBA players.", "assess the achievements of famous NBA players." ]
Wilt Chamberlain is cited as an example to
In the early 1960s Wilt Chamberlain was one of only three players in the National Basketball Association (NBA) listed at over seven feet. If he had played last season, however, he would have been one of 42. The bodies playing major professional sports have changed dramatically over the years, and managers have been more than willing to adjust team uniforms to fit the growing numbers of bigger, longer frames. The trend in sports, though, may be obscuring an unrecognized reality: Americans have generally stopped growing. Though typically about two inches taller now than 140 years ago, today's people - especially those born to families who have lived in the U.S. for many generations - apparently reached their limit in the early 1960s. And they aren't likely to get any taller. "In the general population today, at this genetic, environmental level, we've pretty much gone as far as we can go," says anthropologist William Cameron Chumlea of Wright State University. In the case of NBA players, their increase in height appears to result from the increasingly common practice of recruiting players from all over the world. Growth, which rarely continues beyond the age of 20, demands calories and nutrients -notably, protein - to feed expanding tissues. At the start of the 20th century, under-nutrition and childhood infections got in the way. But as diet and health improved, children and adolescents have, on average, increased in height by about an inch and a half every 20 years, a pattern known as the secular trend in height. Yet according to the Centers for Disease Control and Prevention, average height - 5′9″ for men, 5′4″ for women - hasn't really changed since 1960. Genetically speaking, there are advantages to avoiding substantial height. During childbirth, larger babies have more difficulty passing through the birth canal. 
Moreover, even though humans have been upright for millions of years, our feet and back continue to struggle with bipedal posture and cannot easily withstand repeated strain imposed by oversize limbs. "There are some real constraints that are set by the genetic architecture of the individual organism," says anthropologist William Leonard of Northwestern University. Genetic maximums can change, but don't expect this to happen soon. Claire C. Gordon, senior anthropologist at the Army Research Center in Natick, Mass., ensures that 90 percent of the uniforms and workstations fit recruits without alteration. She says that, unlike those for basketball, the length of military uniforms has not changed for some time. And if you need to predict human height in the near future to design a piece of equipment, Gordon says that by and large, "you could use today's data and feel fairly confident."
2642.txt
0
[ "Genetic modification.", "Natural environment.", "Living standards.", "Daily exercise." ]
Which of the following plays a key role in body growth according to the text?
In the early 1960s Wilt Chamberlain was one of only three players in the National Basketball Association (NBA) listed at over seven feet. If he had played last season, however, he would have been one of 42. The bodies playing major professional sports have changed dramatically over the years, and managers have been more than willing to adjust team uniforms to fit the growing numbers of bigger, longer frames. The trend in sports, though, may be obscuring an unrecognized reality: Americans have generally stopped growing. Though typically about two inches taller now than 140 years ago, today's people - especially those born to families who have lived in the U.S. for many generations - apparently reached their limit in the early 1960s. And they aren't likely to get any taller. "In the general population today, at this genetic, environmental level, we've pretty much gone as far as we can go," says anthropologist William Cameron Chumlea of Wright State University. In the case of NBA players, their increase in height appears to result from the increasingly common practice of recruiting players from all over the world. Growth, which rarely continues beyond the age of 20, demands calories and nutrients -notably, protein - to feed expanding tissues. At the start of the 20th century, under-nutrition and childhood infections got in the way. But as diet and health improved, children and adolescents have, on average, increased in height by about an inch and a half every 20 years, a pattern known as the secular trend in height. Yet according to the Centers for Disease Control and Prevention, average height - 5′9″ for men, 5′4″ for women - hasn't really changed since 1960. Genetically speaking, there are advantages to avoiding substantial height. During childbirth, larger babies have more difficulty passing through the birth canal. 
Moreover, even though humans have been upright for millions of years, our feet and back continue to struggle with bipedal posture and cannot easily withstand repeated strain imposed by oversize limbs. "There are some real constraints that are set by the genetic architecture of the individual organism," says anthropologist William Leonard of Northwestern University. Genetic maximums can change, but don't expect this to happen soon. Claire C. Gordon, senior anthropologist at the Army Research Center in Natick, Mass., ensures that 90 percent of the uniforms and workstations fit recruits without alteration. She says that, unlike those for basketball, the length of military uniforms has not changed for some time. And if you need to predict human height in the near future to design a piece of equipment, Gordon says that by and large, "you could use today's data and feel fairly confident."
2642.txt
2
[ "Non-Americans add to the average height of the nation.", "Human height is conditioned by the upright posture.", "Americans are the tallest on average in the world.", "Larger babies tend to become taller in adulthood." ]
On which of the following statements would the author most probably agree?
In the early 1960s Wilt Chamberlain was one of only three players in the National Basketball Association (NBA) listed at over seven feet. If he had played last season, however, he would have been one of 42. The bodies playing major professional sports have changed dramatically over the years, and managers have been more than willing to adjust team uniforms to fit the growing numbers of bigger, longer frames. The trend in sports, though, may be obscuring an unrecognized reality: Americans have generally stopped growing. Though typically about two inches taller now than 140 years ago, today's people - especially those born to families who have lived in the U.S. for many generations - apparently reached their limit in the early 1960s. And they aren't likely to get any taller. "In the general population today, at this genetic, environmental level, we've pretty much gone as far as we can go," says anthropologist William Cameron Chumlea of Wright State University. In the case of NBA players, their increase in height appears to result from the increasingly common practice of recruiting players from all over the world. Growth, which rarely continues beyond the age of 20, demands calories and nutrients -notably, protein - to feed expanding tissues. At the start of the 20th century, under-nutrition and childhood infections got in the way. But as diet and health improved, children and adolescents have, on average, increased in height by about an inch and a half every 20 years, a pattern known as the secular trend in height. Yet according to the Centers for Disease Control and Prevention, average height - 5′9″ for men, 5′4″ for women - hasn't really changed since 1960. Genetically speaking, there are advantages to avoiding substantial height. During childbirth, larger babies have more difficulty passing through the birth canal. 
Moreover, even though humans have been upright for millions of years, our feet and back continue to struggle with bipedal posture and cannot easily withstand repeated strain imposed by oversize limbs. "There are some real constraints that are set by the genetic architecture of the individual organism," says anthropologist William Leonard of Northwestern University. Genetic maximums can change, but don't expect this to happen soon. Claire C. Gordon, senior anthropologist at the Army Research Center in Natick, Mass., ensures that 90 percent of the uniforms and workstations fit recruits without alteration. She says that, unlike those for basketball, the length of military uniforms has not changed for some time. And if you need to predict human height in the near future to design a piece of equipment, Gordon says that by and large, "you could use today's data and feel fairly confident."
2642.txt
1
[ "the garment industry will reconsider the uniform size.", "the design of military uniforms will remain unchanged.", "genetic testing will be employed in selecting sportsmen.", "the existing data of human height will still be applicable." ]
We learn from the last paragraph that in the near future
In the early 1960s Wilt Chamberlain was one of only three players in the National Basketball Association (NBA) listed at over seven feet. If he had played last season, however, he would have been one of 42. The bodies playing major professional sports have changed dramatically over the years, and managers have been more than willing to adjust team uniforms to fit the growing numbers of bigger, longer frames. The trend in sports, though, may be obscuring an unrecognized reality: Americans have generally stopped growing. Though typically about two inches taller now than 140 years ago, today's people - especially those born to families who have lived in the U.S. for many generations - apparently reached their limit in the early 1960s. And they aren't likely to get any taller. "In the general population today, at this genetic, environmental level, we've pretty much gone as far as we can go," says anthropologist William Cameron Chumlea of Wright State University. In the case of NBA players, their increase in height appears to result from the increasingly common practice of recruiting players from all over the world. Growth, which rarely continues beyond the age of 20, demands calories and nutrients - notably, protein - to feed expanding tissues. At the start of the 20th century, under-nutrition and childhood infections got in the way. But as diet and health improved, children and adolescents have, on average, increased in height by about an inch and a half every 20 years, a pattern known as the secular trend in height. Yet according to the Centers for Disease Control and Prevention, average height - 5′9″ for men, 5′4″ for women - hasn't really changed since 1960. Genetically speaking, there are advantages to avoiding substantial height. During childbirth, larger babies have more difficulty passing through the birth canal.
Moreover, even though humans have been upright for millions of years, our feet and back continue to struggle with bipedal posture and cannot easily withstand repeated strain imposed by oversize limbs. "There are some real constraints that are set by the genetic architecture of the individual organism," says anthropologist William Leonard of Northwestern University. Genetic maximums can change, but don't expect this to happen soon. Claire C. Gordon, senior anthropologist at the Army Research Center in Natick, Mass., ensures that 90 percent of the uniforms and workstations fit recruits without alteration. She says that, unlike those for basketball, the length of military uniforms has not changed for some time. And if you need to predict human height in the near future to design a piece of equipment, Gordon says that by and large, "you could use today's data and feel fairly confident."
2642.txt
3
[ "the change of human height follows a cyclic pattern.", "human height is becoming even more predictable.", "Americans have reached their genetic growth limit.", "the genetic pattern of Americans has altered." ]
The text intends to tell us that
In the early 1960s Wilt Chamberlain was one of only three players in the National Basketball Association (NBA) listed at over seven feet. If he had played last season, however, he would have been one of 42. The bodies playing major professional sports have changed dramatically over the years, and managers have been more than willing to adjust team uniforms to fit the growing numbers of bigger, longer frames. The trend in sports, though, may be obscuring an unrecognized reality: Americans have generally stopped growing. Though typically about two inches taller now than 140 years ago, today's people - especially those born to families who have lived in the U.S. for many generations - apparently reached their limit in the early 1960s. And they aren't likely to get any taller. "In the general population today, at this genetic, environmental level, we've pretty much gone as far as we can go," says anthropologist William Cameron Chumlea of Wright State University. In the case of NBA players, their increase in height appears to result from the increasingly common practice of recruiting players from all over the world. Growth, which rarely continues beyond the age of 20, demands calories and nutrients - notably, protein - to feed expanding tissues. At the start of the 20th century, under-nutrition and childhood infections got in the way. But as diet and health improved, children and adolescents have, on average, increased in height by about an inch and a half every 20 years, a pattern known as the secular trend in height. Yet according to the Centers for Disease Control and Prevention, average height - 5′9″ for men, 5′4″ for women - hasn't really changed since 1960. Genetically speaking, there are advantages to avoiding substantial height. During childbirth, larger babies have more difficulty passing through the birth canal.
Moreover, even though humans have been upright for millions of years, our feet and back continue to struggle with bipedal posture and cannot easily withstand repeated strain imposed by oversize limbs. "There are some real constraints that are set by the genetic architecture of the individual organism," says anthropologist William Leonard of Northwestern University. Genetic maximums can change, but don't expect this to happen soon. Claire C. Gordon, senior anthropologist at the Army Research Center in Natick, Mass., ensures that 90 percent of the uniforms and workstations fit recruits without alteration. She says that, unlike those for basketball, the length of military uniforms has not changed for some time. And if you need to predict human height in the near future to design a piece of equipment, Gordon says that by and large, "you could use today's data and feel fairly confident."
2642.txt
2
[ "adds some new sections to most chapters", "introduces more difficulties to students", "remains almost unchanged", "has considerably improved the balance between theory and practice" ]
Compared with the old edition, this new edition _ .
The reading passages in Lower English Course were designed to introduce a large number of points associated with vocabulary, grammar and construction that often cause difficulties to students. These have been retained within the structure of this edition together with the practice exercises which formed part of most chapters. With the introduction of a considerably modified examination, however, a large part of the earlier book has been rewritten so as to ensure adequate preparation for all sections of the new syllabus. The notes to each chapter have been condensed to essentials to allow additional material to be included together with guidance in pronunciation, special grammatical points and use of prepositions, related in each of these cases to the foregoing comprehension passage. New sections have been added to most chapters, consisting of multiple choice questions based on the comprehension passage, a number of questions of the type that will appear on the Use of English paper and finally considerable practice in spoken English, including conventional usage, conversations, situations to be dealt with orally and discussion topics. The Chapter on spoken English has been adapted to the new "Interview" and "play extract" reading passage with advice and examples included. Other sections deal with the new type of summary and with composition writing, though each of the proposed types of composition is presented separately as part of a chapter. The student undertaking this course should already have a good elementary knowledge of English. When classes have as many as ten weekly lessons most of the material can be dealt with in class, but students in groups which meet for not more than 4-5 hours weekly have to do a good deal of preparation at home with class guidance and checking. A key is available separately and the material is presented clearly enough to enable a student working alone to derive considerable benefit from it.
792.txt
0
[ "because a new syllabus had been introduced", "so that the notes could be included", "because a number of multiple choice questions had to be dealt with orally", "and advice and examples have been condensed" ]
According to the author, this new edition has been adapted _ .
The reading passages in Lower English Course were designed to introduce a large number of points associated with vocabulary, grammar and construction that often cause difficulties to students. These have been retained within the structure of this edition together with the practice exercises which formed part of most chapters. With the introduction of a considerably modified examination, however, a large part of the earlier book has been rewritten so as to ensure adequate preparation for all sections of the new syllabus. The notes to each chapter have been condensed to essentials to allow additional material to be included together with guidance in pronunciation, special grammatical points and use of prepositions, related in each of these cases to the foregoing comprehension passage. New sections have been added to most chapters, consisting of multiple choice questions based on the comprehension passage, a number of questions of the type that will appear on the Use of English paper and finally considerable practice in spoken English, including conventional usage, conversations, situations to be dealt with orally and discussion topics. The Chapter on spoken English has been adapted to the new "Interview" and "play extract" reading passage with advice and examples included. Other sections deal with the new type of summary and with composition writing, though each of the proposed types of composition is presented separately as part of a chapter. The student undertaking this course should already have a good elementary knowledge of English. When classes have as many as ten weekly lessons most of the material can be dealt with in class, but students in groups which meet for not more than 4-5 hours weekly have to do a good deal of preparation at home with class guidance and checking. A key is available separately and the material is presented clearly enough to enable a student working alone to derive considerable benefit from it.
792.txt
0
[ "additional materials", "practice exercises", "comprehension passages", "writing compositions" ]
It can be inferred from the passage that the major component of the textbook is _ .
The reading passages in Lower English Course were designed to introduce a large number of points associated with vocabulary, grammar and construction that often cause difficulties to students. These have been retained within the structure of this edition together with the practice exercises which formed part of most chapters. With the introduction of a considerably modified examination, however, a large part of the earlier book has been rewritten so as to ensure adequate preparation for all sections of the new syllabus. The notes to each chapter have been condensed to essentials to allow additional material to be included together with guidance in pronunciation, special grammatical points and use of prepositions, related in each of these cases to the foregoing comprehension passage. New sections have been added to most chapters, consisting of multiple choice questions based on the comprehension passage, a number of questions of the type that will appear on the Use of English paper and finally considerable practice in spoken English, including conventional usage, conversations, situations to be dealt with orally and discussion topics. The Chapter on spoken English has been adapted to the new "Interview" and "play extract" reading passage with advice and examples included. Other sections deal with the new type of summary and with composition writing, though each of the proposed types of composition is presented separately as part of a chapter. The student undertaking this course should already have a good elementary knowledge of English. When classes have as many as ten weekly lessons most of the material can be dealt with in class, but students in groups which meet for not more than 4-5 hours weekly have to do a good deal of preparation at home with class guidance and checking. A key is available separately and the material is presented clearly enough to enable a student working alone to derive considerable benefit from it.
792.txt
2
[ "its users may be beginners of the English language", "it can be used for different course arrangements", "only advanced learners can benefit from it", "its learners must spend at least 10 hours on it per week" ]
One of the features of this textbook as mentioned in the last paragraph is that _ .
The reading passages in Lower English Course were designed to introduce a large number of points associated with vocabulary, grammar and construction that often cause difficulties to students. These have been retained within the structure of this edition together with the practice exercises which formed part of most chapters. With the introduction of a considerably modified examination, however, a large part of the earlier book has been rewritten so as to ensure adequate preparation for all sections of the new syllabus. The notes to each chapter have been condensed to essentials to allow additional material to be included together with guidance in pronunciation, special grammatical points and use of prepositions, related in each of these cases to the foregoing comprehension passage. New sections have been added to most chapters, consisting of multiple choice questions based on the comprehension passage, a number of questions of the type that will appear on the Use of English paper and finally considerable practice in spoken English, including conventional usage, conversations, situations to be dealt with orally and discussion topics. The Chapter on spoken English has been adapted to the new "Interview" and "play extract" reading passage with advice and examples included. Other sections deal with the new type of summary and with composition writing, though each of the proposed types of composition is presented separately as part of a chapter. The student undertaking this course should already have a good elementary knowledge of English. When classes have as many as ten weekly lessons most of the material can be dealt with in class, but students in groups which meet for not more than 4-5 hours weekly have to do a good deal of preparation at home with class guidance and checking. A key is available separately and the material is presented clearly enough to enable a student working alone to derive considerable benefit from it.
792.txt
1
[ "a scientific paper", "a preface", "an interview", "a news article" ]
The passage is most likely a part of _ .
The reading passages in Lower English Course were designed to introduce a large number of points associated with vocabulary, grammar and construction that often cause difficulties to students. These have been retained within the structure of this edition together with the practice exercises which formed part of most chapters. With the introduction of a considerably modified examination, however, a large part of the earlier book has been rewritten so as to ensure adequate preparation for all sections of the new syllabus. The notes to each chapter have been condensed to essentials to allow additional material to be included together with guidance in pronunciation, special grammatical points and use of prepositions, related in each of these cases to the foregoing comprehension passage. New sections have been added to most chapters, consisting of multiple choice questions based on the comprehension passage, a number of questions of the type that will appear on the Use of English paper and finally considerable practice in spoken English, including conventional usage, conversations, situations to be dealt with orally and discussion topics. The Chapter on spoken English has been adapted to the new "Interview" and "play extract" reading passage with advice and examples included. Other sections deal with the new type of summary and with composition writing, though each of the proposed types of composition is presented separately as part of a chapter. The student undertaking this course should already have a good elementary knowledge of English. When classes have as many as ten weekly lessons most of the material can be dealt with in class, but students in groups which meet for not more than 4-5 hours weekly have to do a good deal of preparation at home with class guidance and checking. A key is available separately and the material is presented clearly enough to enable a student working alone to derive considerable benefit from it.
792.txt
1
[ "In 1971.", "In 1976.", "In 1966.", "In 1900." ]
When did Walt Disney die?
Disney World, Florida, is the biggest amusement resort in the world. It covers 24.4 thousand acres, and is twice the size of Manhattan. It was opened on October 1, 1971, five years after Walt Disney's death, and it is a larger, slightly more ambitious version of Disneyland near Los Angeles. Foreigners tend to associate Walt Disney with Snow White and the Seven Dwarfs, and with his other famous cartoon characters, Mickey Mouse, Donald Duck and Pluto, or with his nature films, whose superb photography is spoiled, in the opinion of some, by the vulgarity of the commentary and musical background. There is very little that could be called vulgar in Disney World. It attracts people of most tastes and most income groups, and people of all ages, from toddlers to adults. But the central attraction of the resort is the Magic Kingdom. Between the huge parking lots and the Magic Kingdom lies a broad artificial lake. In the distance rise the towers of Cinderella's Castle, which like every other building in the Kingdom is built of solid materials. Even getting to the Magic Kingdom is quite an adventure. You have a choice of transportation. You can either cross the lake on a replica of a Mississippi paddle-wheeler, or you can glide around the shore in a streamlined monorail train. When you reach the terminal, you walk straight into a little square which faces Main Street. Main Street is late 19th century. There are modern shops inside the buildings, but all the facades are of the period. There are hanging baskets full of red and white flowers, and there is no traffic except a horse-drawn streetcar and an ancient double-decker bus. Yet as you walk through the Magic Kingdom, you are actually walking on top of a network of underground roads. This is how the shops, restaurants and all the other material needs of the Magic Kingdom are invisibly supplied.
1474.txt
2
[ "the Seven Dwarfs", "Mickey Mouse", "Donald Duck", "the Magic Kingdom" ]
The main attraction of Disney World is _ .
Disney World, Florida, is the biggest amusement resort in the world. It covers 24.4 thousand acres, and is twice the size of Manhattan. It was opened on October 1, 1971, five years after Walt Disney's death, and it is a larger, slightly more ambitious version of Disneyland near Los Angeles. Foreigners tend to associate Walt Disney with Snow White and the Seven Dwarfs, and with his other famous cartoon characters, Mickey Mouse, Donald Duck and Pluto, or with his nature films, whose superb photography is spoiled, in the opinion of some, by the vulgarity of the commentary and musical background. There is very little that could be called vulgar in Disney World. It attracts people of most tastes and most income groups, and people of all ages, from toddlers to adults. But the central attraction of the resort is the Magic Kingdom. Between the huge parking lots and the Magic Kingdom lies a broad artificial lake. In the distance rise the towers of Cinderella's Castle, which like every other building in the Kingdom is built of solid materials. Even getting to the Magic Kingdom is quite an adventure. You have a choice of transportation. You can either cross the lake on a replica of a Mississippi paddle-wheeler, or you can glide around the shore in a streamlined monorail train. When you reach the terminal, you walk straight into a little square which faces Main Street. Main Street is late 19th century. There are modern shops inside the buildings, but all the facades are of the period. There are hanging baskets full of red and white flowers, and there is no traffic except a horse-drawn streetcar and an ancient double-decker bus. Yet as you walk through the Magic Kingdom, you are actually walking on top of a network of underground roads. This is how the shops, restaurants and all the other material needs of the Magic Kingdom are invisibly supplied.
1474.txt
3
[ "adventurous", "dangerous", "difficult", "easy" ]
Reaching the Magic Kingdom is _ .
Disney World, Florida, is the biggest amusement resort in the world. It covers 24.4 thousand acres, and is twice the size of Manhattan. It was opened on October 1, 1971, five years after Walt Disney's death, and it is a larger, slightly more ambitious version of Disneyland near Los Angeles. Foreigners tend to associate Walt Disney with Snow White and the Seven Dwarfs, and with his other famous cartoon characters, Mickey Mouse, Donald Duck and Pluto, or with his nature films, whose superb photography is spoiled, in the opinion of some, by the vulgarity of the commentary and musical background. There is very little that could be called vulgar in Disney World. It attracts people of most tastes and most income groups, and people of all ages, from toddlers to adults. But the central attraction of the resort is the Magic Kingdom. Between the huge parking lots and the Magic Kingdom lies a broad artificial lake. In the distance rise the towers of Cinderella's Castle, which like every other building in the Kingdom is built of solid materials. Even getting to the Magic Kingdom is quite an adventure. You have a choice of transportation. You can either cross the lake on a replica of a Mississippi paddle-wheeler, or you can glide around the shore in a streamlined monorail train. When you reach the terminal, you walk straight into a little square which faces Main Street. Main Street is late 19th century. There are modern shops inside the buildings, but all the facades are of the period. There are hanging baskets full of red and white flowers, and there is no traffic except a horse-drawn streetcar and an ancient double-decker bus. Yet as you walk through the Magic Kingdom, you are actually walking on top of a network of underground roads. This is how the shops, restaurants and all the other material needs of the Magic Kingdom are invisibly supplied.
1474.txt
0
[ "it is relatively amusing", "it is very expensive", "it just wastes his time", "it is vulgar" ]
When one visits this biggest amusement park in the world, one will find _ .
Disney World, Florida, is the biggest amusement resort in the world. It covers 24.4 thousand acres, and is twice the size of Manhattan. It was opened on October 1, 1971, five years after Walt Disney's death, and it is a larger, slightly more ambitious version of Disneyland near Los Angeles. Foreigners tend to associate Walt Disney with Snow White and the Seven Dwarfs, and with his other famous cartoon characters, Mickey Mouse, Donald Duck and Pluto, or with his nature films, whose superb photography is spoiled, in the opinion of some, by the vulgarity of the commentary and musical background. There is very little that could be called vulgar in Disney World. It attracts people of most tastes and most income groups, and people of all ages, from toddlers to adults. But the central attraction of the resort is the Magic Kingdom. Between the huge parking lots and the Magic Kingdom lies a broad artificial lake. In the distance rise the towers of Cinderella's Castle, which like every other building in the Kingdom is built of solid materials. Even getting to the Magic Kingdom is quite an adventure. You have a choice of transportation. You can either cross the lake on a replica of a Mississippi paddle-wheeler, or you can glide around the shore in a streamlined monorail train. When you reach the terminal, you walk straight into a little square which faces Main Street. Main Street is late 19th century. There are modern shops inside the buildings, but all the facades are of the period. There are hanging baskets full of red and white flowers, and there is no traffic except a horse-drawn streetcar and an ancient double-decker bus. Yet as you walk through the Magic Kingdom, you are actually walking on top of a network of underground roads. This is how the shops, restaurants and all the other material needs of the Magic Kingdom are invisibly supplied.
1474.txt
0
[ "It is funny.", "It is interesting.", "It is the biggest one.", "It is the most expensive." ]
Why is Disney World the most famous amusement resort?
Disney World, Florida, is the biggest amusement resort in the world. It covers 24.4 thousand acres, and is twice the size of Manhattan. It was opened on October 1, 1971, five years after Walt Disney's death, and it is a larger, slightly more ambitious version of Disneyland near Los Angeles. Foreigners tend to associate Walt Disney with Snow White and the Seven Dwarfs, and with his other famous cartoon characters, Mickey Mouse, Donald Duck and Pluto, or with his nature films, whose superb photography is spoiled, in the opinion of some, by the vulgarity of the commentary and musical background. There is very little that could be called vulgar in Disney World. It attracts people of most tastes and most income groups, and people of all ages, from toddlers to adults. But the central attraction of the resort is the Magic Kingdom. Between the huge parking lots and the Magic Kingdom lies a broad artificial lake. In the distance rise the towers of Cinderella's Castle, which like every other building in the Kingdom is built of solid materials. Even getting to the Magic Kingdom is quite an adventure. You have a choice of transportation. You can either cross the lake on a replica of a Mississippi paddle-wheeler, or you can glide around the shore in a streamlined monorail train. When you reach the terminal, you walk straight into a little square which faces Main Street. Main Street is late 19th century. There are modern shops inside the buildings, but all the facades are of the period. There are hanging baskets full of red and white flowers, and there is no traffic except a horse-drawn streetcar and an ancient double-decker bus. Yet as you walk through the Magic Kingdom, you are actually walking on top of a network of underground roads. This is how the shops, restaurants and all the other material needs of the Magic Kingdom are invisibly supplied.
1474.txt
2
[ "the civilisations of the empire were short of universalism or cultural breadth.", "William Lecky, who is the leading figure of the Byzantium's study, depreciated the culture of Byzantium greatly.", "criticisms against the Byzantium in the history biased people.", "Byzantium's culture was completely devastated in the 18th and 19th centuries." ]
Modern observers and defenders do not highly praise Byzantium's culture because _
Not all modern observers of Byzantium have been so willing to associate the city on the Bosphorus with universalism or cultural breadth. While Byzantium's rating has risen recently, it has not entirely shaken off the criticisms dished out in the 18th and 19th centuries, including the devastating verdict of William Lecky, an Irish historian, who in 1869 described the Byzantine empire as "the most thoroughly base and despicable form that civilisation has yet assumed." Even Byzantium's modern defenders have tended to set out their case in qualified terms, stressing the empire's relationship to other historical developments. Some see it as a connecting line between classical antiquity and the modern world; others, particularly those who think that civilisations are doomed perpetually to clash, stress the empire's role as a bulwark against Islam, without which Europe as a whole would have turned Muslim. Others again see it as a catalyst for the European Renaissance, especially after Hellenic talent was freed from Byzantine dogmatism. Judith Herrin, a professor at King's College London, sets out to show that there are far better reasons to study and admire the civilisation that flourished for more than a millennium before the conquest of Constantinople in 1453, and whose legacy is still discernible all over south-east Europe and the Levant. She presents Byzantium as a vibrant, dynamic, cosmopolitan reality which somehow escaped the constraints of its official ideology. For example, despite the anti-Semitism of the empire's public discourse and theology, its complex, diversified economy could hardly have functioned without the 30-plus Jewish communities that Benjamin of Tudela, a 12th-century rabbi, described. Ms Herrin also shows that there was a fluid and perpetually evolving relationship between the competing influences of classical Greek learning, Greek Christianity and popular Byzantine culture. 
She pays particular attention to the powerful female voices that emerged from Byzantium: not just pious ladies who wrote saints' lives and hymns (including one breathtaking piece of sensual, almost erotic religious poetry) but the sophisticated political history that was penned by Anna Komnene, a frustrated would-be empress of the 12th century. Ms Herrin will certainly win over some sceptics. But it will remain the case that more people are drawn to Byzantine civilisation through its dazzling art and architecture than by its literature. In August 2006, for example, more than 1,000 academic specialists on Byzantium converged on London for a week-long conference. The success of the quinquennial event was a sign that Byzantine studies are flourishing in almost every corner of the world. But it is a reasonable bet that, whatever they ultimately studied, these scholars were first drawn to the Byzantine world by gazing in wonder at an icon or a frescoed church rather than by perusing the pages of Anna Komnene. The brilliance of Byzantine art is proof enough that something extraordinary happened on the Bosphorus. And this brilliance remained undimmed even when the empire's geopolitical fortunes were collapsing. Snobbish Western classicists who called Byzantium a poor substitute for ancient Greece may have missed the point. True, the Byzantine world was weighed down by deference to classical Greek models. But that charge could also be laid against the pedagogues who used to dominate the study of the humanities in the Western world. Right now, Byzantine history is in vogue at many universities while old-fashioned classical studies are struggling to hold their own.
3533.txt
2
[ "the civilisation of Byzantium is worth of studying and admiring.", "Byzantium' civilization had flourished a much longer time than people usually perceive it and such influence has been neglected.", "To some extent, it is not necessary to relate Byzantium the city with the whole empire's public discourse and theology.", "The Jewish communities actually had made great contribution to Byzantium, which is against the conventional view of the study." ]
The example given by Herrin in the third paragraph may prove that _
Not all modern observers of Byzantium have been so willing to associate the city on the Bosphorus with universalism or cultural breadth. While Byzantium's rating has risen recently, it has not entirely shaken off the criticisms dished out in the 18th and 19th centuries, including the devastating verdict of William Lecky, an Irish historian, who in 1869 described the Byzantine empire as "the most thoroughly base and despicable form that civilisation has yet assumed." Even Byzantium's modern defenders have tended to set out their case in qualified terms, stressing the empire's relationship to other historical developments. Some see it as a connecting line between classical antiquity and the modern world; others, particularly those who think that civilisations are doomed perpetually to clash, stress the empire's role as a bulwark against Islam, without which Europe as a whole would have turned Muslim. Others again see it as a catalyst for the European Renaissance, especially after Hellenic talent was freed from Byzantine dogmatism. Judith Herrin, a professor at King's College London, sets out to show that there are far better reasons to study and admire the civilisation that flourished for more than a millennium before the conquest of Constantinople in 1453, and whose legacy is still discernible all over south-east Europe and the Levant. She presents Byzantium as a vibrant, dynamic, cosmopolitan reality which somehow escaped the constraints of its official ideology. For example, despite the anti-Semitism of the empire's public discourse and theology, its complex, diversified economy could hardly have functioned without the 30-plus Jewish communities that Benjamin of Tudela, a 12th-century rabbi, described. Ms Herrin also shows that there was a fluid and perpetually evolving relationship between the competing influences of classical Greek learning, Greek Christianity and popular Byzantine culture. 
She pays particular attention to the powerful female voices that emerged from Byzantium: not just pious ladies who wrote saints' lives and hymns (including one breathtaking piece of sensual, almost erotic religious poetry) but the sophisticated political history that was penned by Anna Komnene, a frustrated would-be empress of the 12th century. Ms Herrin will certainly win over some sceptics. But it will remain the case that more people are drawn to Byzantine civilisation through its dazzling art and architecture than by its literature. In August 2006, for example, more than 1,000 academic specialists on Byzantium converged on London for a week-long conference. The success of the quinquennial event was a sign that Byzantine studies are flourishing in almost every corner of the world. But it is a reasonable bet that, whatever they ultimately studied, these scholars were first drawn to the Byzantine world by gazing in wonder at an icon or a frescoed church rather than by perusing the pages of Anna Komnene. The brilliance of Byzantine art is proof enough that something extraordinary happened on the Bosphorus. And this brilliance remained undimmed even when the empire's geopolitical fortunes were collapsing. Snobbish Western classicists who called Byzantium a poor substitute for ancient Greece may have missed the point. True, the Byzantine world was weighed down by deference to classical Greek models. But that charge could also be laid against the pedagogues who used to dominate the study of the humanities in the Western world. Right now, Byzantine history is in vogue at many universities while old-fashioned classical studies are struggling to hold their own.
3533.txt
2
[ "The scholars were only interested in studying icons or frescoes in Byzantium.", "The success of this conference proves the study on Byzantium is in vogue.", "Scholars were drawn to Byzantium civilisation by its art at the very beginning .", "Scholars showed less interest in the literature of Byzantium." ]
Which one of the following statements is NOT true of the academic conference on Byzantium?
Not all modern observers of Byzantium have been so willing to associate the city on the Bosphorus with universalism or cultural breadth. While Byzantium's rating has risen recently, it has not entirely shaken off the criticisms dished out in the 18th and 19th centuries, including the devastating verdict of William Lecky, an Irish historian, who in 1869 described the Byzantine empire as "the most thoroughly base and despicable form that civilisation has yet assumed." Even Byzantium's modern defenders have tended to set out their case in qualified terms, stressing the empire's relationship to other historical developments. Some see it as a connecting line between classical antiquity and the modern world; others, particularly those who think that civilisations are doomed perpetually to clash, stress the empire's role as a bulwark against Islam, without which Europe as a whole would have turned Muslim. Others again see it as a catalyst for the European Renaissance, especially after Hellenic talent was freed from Byzantine dogmatism. Judith Herrin, a professor at King's College London, sets out to show that there are far better reasons to study and admire the civilisation that flourished for more than a millennium before the conquest of Constantinople in 1453, and whose legacy is still discernible all over south-east Europe and the Levant. She presents Byzantium as a vibrant, dynamic, cosmopolitan reality which somehow escaped the constraints of its official ideology. For example, despite the anti-Semitism of the empire's public discourse and theology, its complex, diversified economy could hardly have functioned without the 30-plus Jewish communities that Benjamin of Tudela, a 12th-century rabbi, described. Ms Herrin also shows that there was a fluid and perpetually evolving relationship between the competing influences of classical Greek learning, Greek Christianity and popular Byzantine culture. 
She pays particular attention to the powerful female voices that emerged from Byzantium: not just pious ladies who wrote saints' lives and hymns (including one breathtaking piece of sensual, almost erotic religious poetry) but the sophisticated political history that was penned by Anna Komnene, a frustrated would-be empress of the 12th century. Ms Herrin will certainly win over some sceptics. But it will remain the case that more people are drawn to Byzantine civilisation through its dazzling art and architecture than by its literature. In August 2006, for example, more than 1,000 academic specialists on Byzantium converged on London for a week-long conference. The success of the quinquennial event was a sign that Byzantine studies are flourishing in almost every corner of the world. But it is a reasonable bet that, whatever they ultimately studied, these scholars were first drawn to the Byzantine world by gazing in wonder at an icon or a frescoed church rather than by perusing the pages of Anna Komnene. The brilliance of Byzantine art is proof enough that something extraordinary happened on the Bosphorus. And this brilliance remained undimmed even when the empire's geopolitical fortunes were collapsing. Snobbish Western classicists who called Byzantium a poor substitute for ancient Greece may have missed the point. True, the Byzantine world was weighed down by deference to classical Greek models. But that charge could also be laid against the pedagogues who used to dominate the study of the humanities in the Western world. Right now, Byzantine history is in vogue at many universities while old-fashioned classical studies are struggling to hold their own.
3533.txt
0