List of top Verbal Ability & Reading Comprehension (VARC) Questions asked in CAT

Read the passage and answer the following question.
Founded at the dawn of the modern industrial era, the nearly forgotten Women’s Trade Union League (WTUL) played an instrumental role in advancing the cause of working women through the early part of the twentieth century. In the face of considerable adversity, the WTUL made a contribution far greater than did most historical footnotes.
The organization’s successes did not come easily; conflict beset the WTUL in many forms. During those early days of American unions, organized labour was aggressively opposed by both industry and government. The WTUL, which represented a largely unskilled labour force, had little leverage against these powerful opponents. Also, because of the skill level of its workers as well as inherent societal gender bias, the WTUL had great difficulty finding allies among other unions. Even the large and powerful American Federation of Labour (AFL), which nominally took the WTUL under its wing, kept it at a distance. Because the AFL’s power stemmed from its highly skilled labour force, the organization saw little economic benefit in working with the WTUL. The affiliation provided the AFL with political cover, allowing it to claim support for women workers; in return, the WTUL gained a potent but largely absent ally.
The WTUL also had to overcome internal discord. While the majority of the group’s members were working women, a sizeable and powerful minority consisted of middle- and upper-class social reformers whose goals extended beyond labour reform. While workers argued that the WTUL should focus its efforts on collective bargaining and working conditions, the reformers looked beyond the workplace, seeking state and national legislation aimed at education reform and urban poverty relief as well as workplace issues.
Despite these obstacles, the WTUL accomplished a great deal. The organization was instrumental in the passage of state laws mandating an eight-hour workday, a minimum wage for women, and a ban on child labour. It provided seed money to women who organized workers in specific plants and industries, and also established strike funds and soup kitchens to support striking unionists. After the tragic Triangle Shirtwaist Company fire of 1911, the WTUL launched a four-year investigation whose conclusions formed the basis of much subsequent workplace safety legislation. The organization also offered a political base for all reform-minded women, and thus helped develop the next generation of American leaders. Eleanor Roosevelt was one of many prominent figures to emerge from the WTUL.
The organization began a slow death in the late 1920s, when the Great Depression choked off its funding. The organization limped through the 1940s; the death knell eventually rang in 1950, at the onset of the McCarthy era. A turn-of-the-century labour organization dedicated to social reform, one that during its heyday was regarded by many as “radical,” stood little chance of weathering that storm. This humble ending, however, does nothing to diminish the accomplishments of an organization that is yet to receive its historical due.
Read the passage and answer the following question.
History has shaped academic medical centers (AMCs) to perform 3 functions: patient care, research, and teaching. These 3 missions are now fraught with problems because the attempt to combine them has led to such inefficiencies as duplication of activities and personnel, inpatient procedures that could and should have been out-patient procedures, and unwieldy administrative bureaucracies.
One source of inefficiency derives from mixed lines of authority. Clinical chiefs and practitioners in AMCs are typically responsible to the hospital for practice issues but to the medical school for promotion, marketing, membership in a faculty practice plan, and educational accreditation. Community physicians with privileges at a university hospital add more complications. They have no official affiliation with the AMC’s medical school, but their cooperation with faculty members is essential for proper patient treatment. The fragmented accountability is heightened by the fact that 3 different groups often vie for the loyalty of physicians who receive research grants. The medical school may wish to capitalize on the research for its educational value to students; the hospital may desire the state-of-the-art treatment methods resulting from the research; and the grant administrators may focus on the researchers’ humanitarian motives. Communication among these groups is rarely coordinated, and the physicians may serve whichever group promises the best perks and ignore the rest — which inevitably strains relationships.
Another source of inefficiency is the fact that physicians have obligations to many different groups: patients, students, faculty members, referring physicians, third-party payers, and staff members, all of whom have varied expectations. Satisfying the interests of one group may alienate others. Patient care provides a common example. For the benefit of medical students, physicians may order too many tests, prolong patient visits, or encourage experimental studies of a patient. If AMC faculty physicians were more aware of how much treatments of specific illnesses cost, and of how other institutions treat patient conditions, they would be better practitioners, and the educational and clinical care missions of AMCs would both be better served.
A bias toward specialization adds yet more inefficiency. AMCs are viewed as institutions serving the gravest cases in need of the most advanced treatments. The high number of specialty residents and the presence of burn units, blood banks, and transplant centers validate this belief. Also present at AMCs, though less conspicuous, are facilities for ordinary primary care patients. In fact, many patients choose to visit an AMC for primary care because they realize that any necessary follow-up can occur almost instantaneously. While AMCs have emphasized cutting-edge specialty medicine, their more routine medical services need development and enhancement.
A final contribution to inefficiency is organizational complacency. Until recently, most academic medical centers drew the public merely by existing. The rising presence, however, of tertiary hospitals with patient care as their only goal has immersed AMCs in a very competitive market. It is only in the past several years that AMCs have started to recognize and develop strategies to address competition.

A remarkable aspect of art of the present century is the range of concepts and ideologies which it embodies. It is almost tempting to see a pattern emerging within the art field — or alternatively under an a posteriori umbrella — similar to that which exists under the umbrella of science where the general term covers a whole range of separate, though interconnecting, activities. Any parallelism is, however, in this instance at least — misleading. A scientific discipline develops systematically once its bare tenets have been established, named and categorized as conventions. Many of the concepts of modern art, by contrast, have resulted from the almost accidental meetings of groups of talented individuals at certain times and certain places. The ideas generated by these chance meetings had twofold consequences. Firstly, a corpus of work would be produced which, in great part, remains as a concrete record of the events. Secondly, the ideas would themselves be disseminated through many different channels of communication — seeds that often bore fruit in contexts far removed from their generation. Not all movements were exclusively concerned with innovation. Surrealism, for instance, claimed to embody a kind of insight which can be present in the art of any period. This claim has been generally accepted so that a sixteenth century painting by Spranger or a mysterious photograph by Atget can legitimately be discussed in surrealist terms. Briefly, then, the concepts of modern art are of many different (often fundamentally different) kinds and resulted from the exposures of painters, sculptors and thinkers to the more complex phenomena of the twentieth century, including our ever increasing knowledge of the thought and products of earlier centuries. Different groups of artists would collaborate in trying to make sense of a rapidly changing world of visual and spiritual experience. We should hardly be surprised if no one group succeeded completely, but achievements, though relative, have been considerable. Landmarks have been established — concrete statements of position which give a pattern to a situation which could easily have degenerated into total chaos. Beyond this, new language tools have been created for those who follow — semantic systems which can provide a springboard for further explorations.

The codifying of art is often criticized. Certainly one can understand that artists are wary of being pigeon-holed since they are apt to think of themselves as individuals — sometimes with good reason. The notion of self-expression, however, no longer carries quite the weight it once did; objectivity has its defenders. There is good reason to accept the ideas codified by artists and critics, over the past sixty years or so, as having attained the status of independent existence — an independence which is not without its own value. The time factor is important here. As an art movement slips into temporal perspective, it ceases to be a living organism — becoming, rather, a fossil. This is not to say that it becomes useless or uninteresting. Just as a scientist can reconstruct the life of a prehistoric environment from the messages codified into the structure of a fossil, so can an artist decipher whole webs of intellectual and creative possibility from the recorded structure of a “dead” art movement. The artist can match the creative patterns crystallized into this structure against the potentials and possibilities of his own time. As T.S. Eliot observed, no one starts anything from scratch; however consciously you may try to live in the present, you are still involved with a nexus of behaviour patterns bequeathed from the past. The original and creative person is not someone who ignores these patterns, but someone who is able to translate and develop them so that they conform more exactly to his — and our — present needs.

To summarize the Classic Maya collapse, we can tentatively identify five strands. I acknowledge, however, that Maya archaeologists still disagree vigorously among themselves, in part because the different strands evidently varied in importance among different parts of the Maya realm; because detailed archaeological studies are available for only some Maya sites; and because it remains puzzling why most of the Maya heartland remained nearly empty of population and failed to recover after the collapse and after re-growth of forests.

With those caveats, it appears to me that one strand consisted of population growth outstripping available resources: a dilemma similar to the one foreseen by Thomas Malthus in 1798 and being played out today in Rwanda, Haiti and elsewhere. As the archaeologist David Webster succinctly puts it, "Too many farmers grew too many crops on too much of the landscape." Compounding that mismatch between population and resources was the second strand: the effects of deforestation and hillside erosion, which caused a decrease in the amount of useable farmland at a time when more rather than less farmland was needed, and possibly exacerbated by an anthropogenic drought resulting from deforestation, by soil nutrient depletion and other soil problems, and by the struggle to prevent bracken ferns from overrunning the fields.

The third strand consisted of increased fighting, as more and more people fought over fewer resources. Maya warfare, already endemic, peaked just before the collapse. That is not surprising when one reflects that at least five million people, perhaps many more, were crammed into an area smaller than the US state of Colorado (104,000 square miles). That warfare would have decreased further the amount of land available for agriculture, by creating no-man’s lands between principalities where it was now unsafe to farm. Bringing matters to a head was the strand of climate change. The drought at the time of the Classic collapse was not the first drought that the Maya had lived through, but it was the most severe. At the time of previous droughts, there were still uninhabited parts of the Maya landscape, and people at a site affected by drought could save themselves by moving to another site. However, by the time of the Classic collapse the landscape was now full, there was no useful unoccupied land in the vicinity on which to begin anew, and the whole population could not be accommodated in the few areas that continued to have reliable water supplies.

As our fifth strand, we have to wonder why the kings and nobles failed to recognize and solve these seemingly obvious problems undermining their society. Their attention was evidently focused on their short-term concerns of enriching themselves, waging wars, erecting monuments, competing with each other, and extracting enough food from the peasants to support all those activities. Like most leaders throughout human history, the Maya kings and nobles did not heed long-term problems, insofar as they perceived them.

Finally, while we still have some other past societies to consider before we switch our attention to the modern world, we must already be struck by some parallels between the Maya and the past societies. As on Mangareva, the Maya environmental and population problems led to increasing warfare and civil strife. Similarly, on Easter Island and at Chaco Canyon, the Maya peak population numbers were followed swiftly by political and social collapse. Paralleling the eventual extension of agriculture from Easter Island’s coastal lowlands to its uplands, and from the Mimbres floodplain to the hills, Copan’s inhabitants also expanded from the floodplain to the more fragile hill slopes, leaving them with a larger population to feed when the agricultural boom in the hills went bust. Like Easter Island chiefs erecting ever larger statues, eventually crowned by pukao, and like Anasazi elite treating themselves to necklaces of 2,000 turquoise beads, Maya kings sought to outdo each other with more and more impressive temples, covered with thicker and thicker plaster — reminiscent in turn of the extravagant conspicuous consumption by modern American CEOs. The passivity of Easter chiefs and Maya kings in the face of the real big threats to their societies completes our list of disquieting parallels.

Language is not a cultural artifact that we learn the way we learn to tell time or how the federal government works. Instead, it is a distinct piece of the biological makeup of our brains. Language is a complex, specialized skill, which develops in the child spontaneously, without conscious effort or formal instruction, is deployed without awareness of its underlying logic, is qualitatively the same in every individual, and is distinct from more general abilities to process information or behave intelligently. For these reasons some cognitive scientists have described language as a psychological faculty, a mental organ, a neural system, and a computational module. But I prefer the admittedly quaint term “instinct”. It conveys the idea that people know how to talk in more or less the sense that spiders know how to spin webs. Web-spinning was not invented by some unsung spider genius and does not depend on having had the right education or on having an aptitude for architecture or the construction trades. Rather, spiders spin spider webs because they have spider brains, which give them the urge to spin and the competence to succeed. Although there are differences between webs and words, I will encourage you to see language in this way, for it helps to make sense of the phenomena we will explore.

Thinking of language as an instinct inverts the popular wisdom, especially as it has been passed down in the canon of the humanities and social sciences. Language is no more a cultural invention than is upright posture. It is not a manifestation of a general capacity to use symbols: a three-year-old, we shall see, is a grammatical genius, but is quite incompetent at the visual arts, religious iconography, traffic signs, and the other staples of the semiotics curriculum. Though language is a magnificent ability unique to Homo sapiens among living species, it does not call for sequestering the study of humans from the domain of biology, for a magnificent ability unique to a particular living species is far from unique in the animal kingdom. Some kinds of bats home in on flying insects using Doppler sonar. Some kinds of migratory birds navigate thousands of miles by calibrating the positions of the constellations against the time of day and year. In nature’s talent show, we are simply a species of primate with our own act, a knack for communicating information about who did what to whom by modulating the sounds we make when we exhale.

Once you begin to look at language not as the ineffable essence of human uniqueness but as a biological adaptation to communicate information, it is no longer as tempting to see language as an insidious shaper of thought, and, we shall see, it is not. Moreover, seeing language as one of nature’s engineering marvels — an organ with “that perfection of structure and co-adaptation which justly excites our admiration,” in Darwin’s words — gives us a new respect for your ordinary Joe and the much-maligned English language (or any language). The complexity of language, from the scientist’s point of view, is part of our biological birthright; it is not something that parents teach their children or something that must be elaborated in school — as Oscar Wilde said, “Education is an admirable thing, but it is well to remember from time to time that nothing that is worth knowing can be taught.” A preschooler’s tacit knowledge of grammar is more sophisticated than the thickest style manual or the most state-of-the-art computer language system, and the same applies to all healthy human beings, even the notorious syntax-fracturing professional athlete and the, you know, like, inarticulate teenage skateboarder. Finally, since language is the product of a well-engineered biological instinct, we shall see that it is not the nutty barrel of monkeys that entertainer columnists make it out to be.

When I was little, children were bought two kinds of ice cream, sold from those white wagons with canopies made of silvery metal: either the two-cent cone or the four-cent ice-cream pie. The two-cent cone was very small, in fact it could fit comfortably into a child’s hand, and it was made by taking the ice cream from its container with a special scoop and piling it on the cone. Granny always suggested I eat only a part of the cone, then throw away the pointed end, because it had been touched by the vendor’s hand (though that was the best part, nice and crunchy, and it was regularly eaten in secret, after a pretence of discarding it).

The four-cent pie was made by a special little machine, also silvery, which pressed two disks of sweet biscuit against a cylindrical section of ice cream. First you had to thrust your tongue into the gap between the biscuits until it touched the central nucleus of ice cream; then, gradually, you ate the whole thing, the biscuit surfaces softening as they became soaked in creamy nectar. Granny had no advice to give here: in theory the pies had been touched only by the machine; in practice, the vendor had held them in his hand while giving them to us, but it was impossible to isolate the contaminated area.

I was fascinated, however, by some of my peers, whose parents bought them not a four-cent pie but two two-cent cones. These privileged children advanced proudly with one cone in their right hand and one in their left; and expertly moving their head from side to side, they licked first one, then the other. This liturgy seemed to me so sumptuously enviable, that many times I asked to be allowed to celebrate it. In vain. My elders were inflexible: a four-cent ice, yes; but two two-cent ones, absolutely no.

As anyone can see, neither mathematics nor economy nor dietetics justified this refusal. Nor did hygiene, assuming that in due course the tips of both cones were discarded. The pathetic, and obviously mendacious, justification was that a boy concerned with turning his eyes from one cone to the other was more inclined to stumble over stones, steps, or cracks in the pavement. I dimly sensed that there was another secret justification, cruelly pedagogical, but I was unable to grasp it.

Today, citizen and victim of a consumer society, a civilization of excess and waste (which the society of the thirties was not), I realize that those dear and now departed elders were right. Two two-cent cones instead of one at four cents did not signify squandering, economically speaking, but symbolically they surely did. It was for this precise reason that I yearned for them: because two ice creams suggested excess. And this was precisely why they were denied to me: because they looked indecent, an insult to poverty, a display of fictitious privilege, a boast of wealth. Only spoiled children ate two cones at once, those children who in fairy tales were rightly punished, as Pinocchio was when he rejected the skin and the stalk. And parents who encouraged this weakness, appropriate to little parvenus, were bringing up their children in the false theatre of “I’d like to but I can’t.” They were preparing them to turn up at tourist-class check-in with a fake Gucci bag bought from a street peddler on the beach at Rimini.

Nowadays the moralist risks seeming at odds with morality, in a world where the consumer civilization now wants even adults to be spoiled, and promises them always something more, from the wristwatch in the box of detergent to the bonus bangle sheathed, with the magazine it accompanies, in a plastic envelope. Like the parents of those ambidextrous gluttons I so envied, the consumer civilization pretends to give more, but actually gives, for four cents, what is worth four cents. You will throw away the old transistor radio to purchase the new one, which boasts an alarm clock as well, but some inexplicable defect in the mechanism will guarantee that the radio lasts only a year. The new cheap car will have leather seats, double side mirrors adjustable from inside, and a panelled dashboard, but it will not last nearly so long as the glorious old Fiat 500, which, even when it broke down, could be started again with a kick.

The morality of the old days made Spartans of us all, while today’s morality wants all of us to be Sybarites.