List of top Verbal Ability & Reading Comprehension (VARC) Questions asked in CAT

Understanding where you are in the world is a basic survival skill, which is why we, like most species, come hard-wired with specialized brain areas to create cognitive maps of our surroundings. Where humans are unique, though, with the possible exception of honeybees, is that we try to communicate this understanding of the world with others. We have a long history of doing this by drawing maps – the earliest versions yet discovered were scrawled on cave walls 14,000 years ago. Human cultures have been drawing them on stone tablets, papyrus, paper and now computer screens ever since.
Given such a long history of human map-making, it is perhaps surprising that it is only within the last few hundred years that north has been consistently considered to be at the top. In fact, for much of human history, north almost never appeared at the top, according to Jerry Brotton, a map historian… “North was rarely put at the top for the simple fact that north is where darkness comes from,” he says. “West is also very unlikely to be put at the top because west is where the sun disappears.”
Confusingly, early Chinese maps seem to buck this trend. But, Brotton says, even though they did have compasses at the time, that isn’t the reason that they placed north at the top. Early Chinese compasses were actually oriented to point south, which was considered to be more desirable than deepest darkest north. But in Chinese maps, the emperor, who lived in the north of the country, was always put at the top of the map, with everyone else, his loyal subjects, looking up towards him. “In Chinese culture the Emperor looks south because it’s where the winds come from, it’s a good direction. North is not very good but you are in a position of subjection to the emperor, so you look up to him,” says Brotton.
Given that each culture has a very different idea of who, or what, they should look up to, it’s perhaps not surprising that there is very little consistency in which way early maps pointed. In ancient Egyptian times the top of the world was east, the position of sunrise. Early Islamic maps favoured south at the top because most of the early Muslim cultures were north of Mecca, so they imagined looking up (south) towards it. Christian maps from the same era (called Mappa Mundi) put east at the top, towards the Garden of Eden and with Jerusalem in the centre.
So when did everyone get together and decide that north was the top? It’s tempting to put it down to European explorers like Christopher Columbus and Ferdinand Magellan, who were navigating by the North Star. But Brotton argues that these early explorers didn’t think of the world like that at all. “When Columbus describes the world it is in accordance with east being at the top,” he says. “Columbus says he is going towards paradise, so his mentality is from a medieval mappa mundi.” We’ve got to remember, adds Brotton, that at the time, “no one knows what they are doing and where they are going.”
I used a smartphone GPS to find my way through the cobblestoned maze of Geneva's Old Town, in search of a handmade machine that changed the world more than any other invention. Near a 13th-century cathedral in this Swiss city on the shores of a lovely lake, I found what I was looking for: a Gutenberg printing press. "This was the Internet of its day — at least as influential as the iPhone," said Gabriel de Montmollin, the director of the Museum of the Reformation, toying with the replica of Johann Gutenberg's great invention. [Before the invention of the printing press] it used to take four monks...up to a year to produce a single book. With the advance of movable type in 15th-century Europe, one press could crank out 3,000 pages a day. Before long, average people could travel to places that used to be unknown to them — with maps! Medical information passed more freely and quickly, diminishing the sway of quacks...The printing press offered the prospect that tyrants would never be able to kill a book or suppress an idea. Gutenberg's brainchild broke the monopoly that clerics had on scripture. And later, stirred by pamphlets from a version of that same press, the American colonies rose up against a king and gave birth to a nation.
So, a question in the summer of this 10th anniversary of the iPhone: has the device that is perhaps the most revolutionary of all time given us a single magnificent idea? Nearly every advancement of the written word through new technology has also advanced humankind. Sure, you can say the iPhone changed everything. By putting the world's recorded knowledge in the palm of a hand, it revolutionized work, dining, travel and socializing. It made us more narcissistic — here's more of me doing cool stuff! — and it unleashed an army of awful trolls. We no longer have the patience to sit through a baseball game without that reach to the pocket. And one more casualty of Apple selling more than a billion phones in a decade's time: daydreaming has become a lost art.
For all of that, I'm still waiting to see if the iPhone can do what the printing press did for religion and democracy...the Geneva museum makes a strong case that the printing press opened more minds than anything else...it's hard to imagine the French or American revolutions without those enlightened voices in print...
Not long after Steve Jobs introduced his iPhone, he said the bound book was probably headed for history's attic. Not so fast. After a period of rapid growth in e-books, something closer to the medium for Chaucer's volumes has made a great comeback. The hope of the iPhone, and the Internet in general, was that it would free people in closed societies. But the failure of the Arab Spring, and the continued suppression of ideas in North Korea, China and Iran, has not borne that out...
The iPhone is still young. It has certainly been "one of the most important, world-changing and successful products in history," as Apple CEO Tim Cook said. But I'm not sure if the world changed for the better with the iPhone — as it did with the printing press — or merely changed.
This year alone, more than 8,600 stores could close, according to industry estimates, many of them the brand-name anchor outlets that real estate developers once tripped over themselves to court. Already there have been 5,300 retail closings this year... Sears Holdings—which owns Kmart—said in March that there's "substantial doubt" it can stay in business altogether, and will close 300 stores this year. So far this year, nine national retail chains have filed for bankruptcy.
Local jobs are a major casualty of what analysts are calling, with only a hint of hyperbole, the retail apocalypse. Since 2002, department stores have lost 448,000 jobs, a 25% decline, while the number of store closures this year is on pace to surpass the worst depths of the Great Recession. The growth of online retailers, meanwhile, has failed to offset those losses, with the e-commerce sector adding just 178,000 jobs over the past 15 years. Some of those jobs can be found in the massive distribution centers Amazon has opened across the country, often not too far from malls the company helped shutter. 
But those are workplaces, not gathering places. The mall is both. And in the 61 years since the first enclosed one opened in suburban Minneapolis, the shopping mall has been where a huge swath of middle-class America went for far more than shopping. It was the home of first jobs and blind dates, the place for family photos and ear piercings, where goths and grandmothers could somehow walk through the same doors and find something they all liked. Sure, the food was lousy for you and the oceans of parking lots encouraged car-heavy development, something now scorned by contemporary planners. But for better or worse, the mall has been America's public square for the last 60 years. 
So what happens when it disappears? 
Think of your mall. Or think of the one you went to as a kid. Think of the perfume clouds in the department stores. The fountains splashing below the skylights. The cinnamon wafting from the food court. As far back as ancient Greece, societies have congregated around a central marketplace. In medieval Europe, they were outside cathedrals. For half of the 20th century and almost 20 years into the new one, much of America has found its agora on the terrazzo between Orange Julius and Sbarro, Waldenbooks and the Gap, Sunglass Hut and Hot Topic.
That mall was an ecosystem unto itself, a combination of community and commercialism peddling everything you needed and everything you didn't: Magic Eye posters, wind catchers, Air Jordans...
A growing number of Americans, however, don't see the need to go to any Macy's at all. Our digital lives are frictionless and ruthlessly efficient, with retail and romance available at a click. Malls were designed for leisure, abundance, ambling. You parked and planned to spend some time. Today, much of that time has been given over to busier lives and second jobs and apps that let you swipe right instead of haunt the food court. Malls, says Harvard business professor Leonard Schlesinger, "were built for patterns of social interaction that increasingly don't exist."
Scientists have long recognised the incredible diversity within a species. But they thought it reflected evolutionary changes that unfolded imperceptibly, over millions of years. That divergence between populations within a species was enforced, according to Ernst Mayr, the great evolutionary biologist of the 1940s, when a population was separated from the rest of the species by a mountain range or a desert, preventing breeding across the divide over geologic scales of time. Without the separation, gene flow was relentless. But as the separation persisted, the isolated population grew apart and speciation occurred.
In the mid-1960s, the biologist Paul Ehrlich - author of The Population Bomb (1968) - and his Stanford University colleague Peter Raven challenged Mayr's ideas about speciation. They had studied checkerspot butterflies living in the Jasper Ridge Biological Preserve in California, and it soon became clear that they were not examining a single population. Through years of capturing, marking and then recapturing the butterflies, they were able to prove that within the population, spread over just 50 acres of suitable checkerspot habitat, there were three groups that rarely interacted despite their very close proximity. 
Among other ideas, Ehrlich and Raven argued in a now classic paper from 1969 that gene flow was not as predictable and ubiquitous as Mayr and his cohort maintained, and thus evolutionary divergence between neighboring groups in a population was probably common. They also asserted that isolation and gene flow were less important to evolutionary divergence than natural selection (when factors such as mate choice, weather, disease or predation cause better-adapted individuals to survive and pass on their successful genetic traits). For example, Ehrlich and Raven suggested that, without the force of natural selection, an isolated population would remain unchanged and that, in other scenarios, natural selection could be strong enough to overpower gene flow...
Do sports mega events like the summer Olympic Games benefit the host city economically? It depends, but the prospects are less than rosy. The trick is converting...several billion dollars in operating costs during the 17-day fiesta of the Games into a basis for long-term economic returns. These days, the summer Olympic Games themselves generate total revenue of $4 billion to $5 billion, but the lion's share of this goes to the International Olympics Committee, the National Olympics Committees and the International Sports Federations. Any economic benefit would have to flow from the value of the Games as an advertisement for the city, the new transportation and communications infrastructure that was created for the Games, or the ongoing use of the new facilities.
Evidence suggests that the advertising effect is far from certain. The infrastructure benefit depends on the initial condition of the city and the effectiveness of the planning. The facilities benefit is dubious at best for buildings such as velodromes or natatoriums and problematic for 100,000-seat Olympic stadiums. The latter require a conversion plan for future use; the former are usually doomed to near vacancy. Hosting the summer Games generally requires 30-plus sports venues and dozens of training centers. Today, the Bird's Nest in Beijing sits virtually empty, while the Olympic Stadium in Sydney costs some $30 million a year to operate.
Part of the problem is that Olympics planning takes place in a frenzied and time-pressured atmosphere of intense competition with the other prospective host cities — not optimal conditions for contemplating the future shape of an urban landscape. Another part of the problem is that urban land is generally scarce and growing scarcer. The new facilities often stand for decades or longer. Even if they have future use, are they the best use of precious urban real estate? 
Further, cities must consider the human cost. Residential areas often are razed and citizens relocated (without adequate preparation or compensation). Life is made more hectic and congested. There are, after all, other productive uses that can be made of vanishing fiscal resources.
Creativity is at once our most precious resource and our most inexhaustible one. As anyone who has ever spent any time with children knows, every single human being is born creative; every human being is innately endowed with the ability to combine and recombine data, perceptions, materials and ideas, and devise new ways of thinking and doing. What fosters creativity? More than anything else: the presence of other creative people. The big myth is that creativity is the province of great individual geniuses. In fact creativity is a social process. Our biggest creative breakthroughs come when people learn from, compete with, and collaborate with other people.
Cities are the true fonts of creativity... With their diverse populations, dense social networks, and public spaces where people can meet spontaneously and serendipitously, they spark and catalyze new ideas. With their infrastructure for finance, organization and trade, they allow those ideas to be swiftly actualized.
As for what staunches creativity, that's easy, if ironic. It's the very institutions that we build to manage, exploit and perpetuate the fruits of creativity — our big bureaucracies, and sad to say, too many of our schools. Creativity is disruptive; schools and organizations are regimented, standardized and stultifying.
The education expert Sir Ken Robinson points to a 1968 study reporting on a group of 1,600 children who were tested over time for their ability to think in out-of-the-box ways. When the children were between 3 and 5 years old, 98 percent achieved positive scores. When they were 8 to 10, only 32 percent passed the same test, and only 10 percent at 13 to 15. When 280,000 25-year-olds took the test, just 2 percent passed. By the time we are adults, our creativity has been wrung out of us.
I once asked the great urbanist Jane Jacobs what makes some places more creative than others. She said, essentially, that the question was an easy one. All cities, she said, were filled with creative people; that's our default state as people. But some cities had more than their shares of leaders, people and institutions that blocked out that creativity. She called them "squelchers."
Creativity (or the lack of it) follows the same general contours of the great socio-economic divide - our rising inequality - that plagues us. According to my own estimates, roughly a third of us across the United States, and perhaps as much as half of us in our most creative cities - are able to do work which engages our creative faculties to some extent, whether as artists, musicians, writers, techies, innovators, entrepreneurs, doctors, lawyers, journalists or educators - those of us who work with our minds. That leaves a group that I term "the other 66 percent," who toil in low-wage rote and rotten jobs — if they have jobs at all — in which their creativity is subjugated, ignored or wasted.
Creativity itself is not in danger. Its flourishing is all around us - in science and technology, arts and culture, in our rapidly revitalizing cities. But we still have a long way to go if we want to build a truly creative society that supports and rewards the creativity of each and every one of us.
During the frigid season...it's often necessary to nestle under a blanket to try to stay warm. The temperature difference between the blanket and the air outside is so palpable that we often have trouble leaving our warm refuge. Many plants and animals similarly hunker down, relying on snow cover for safety from winter's harsh conditions. The small area between the snowpack and the ground, called the subnivium...might be the most important ecosystem that you have never heard of.
The subnivium is so well-insulated and stable that its temperature holds steady at around 32 degrees Fahrenheit (0 degrees Celsius). Although that might still sound cold, a constant temperature of 32 degrees Fahrenheit can often be 30 to 40 degrees warmer than the air temperature during the peak of winter. Because of this large temperature difference, a wide variety of species...depend on the subnivium for winter protection.
For many organisms living in temperate and Arctic regions, the difference between being under the snow or outside it is a matter of life and death. Consequently, disruptions to the subnivium brought about by climate change will affect everything from population dynamics to nutrient cycling through the ecosystem.
The formation and stability of the subnivium requires more than a few flurries. Winter ecologists have suggested that eight inches of snow is necessary to develop a stable layer of insulation. Depth is not the only factor, however. More accurately, the stability of the subnivium depends on the interaction between snow depth and snow density. Imagine being under a stack of blankets that are all flattened and pressed together. When compressed, the blankets essentially form one compacted layer. In contrast, when they are lightly placed on top of one another, their insulative capacity increases because the air pockets between them trap heat. Greater depths of low-density snow are therefore better at insulating the ground.
Both depth and density of snow are sensitive to temperature. Scientists are now beginning to explore how climate change will affect the subnivium, as well as the species that depend on it. At first glance, warmer winters seem beneficial for species that have difficulty surviving subzero temperatures; however, as with most ecological phenomena, the consequences are not so straightforward. Research has shown that the snow season (the period when snow is more likely than rain) has become shorter since 1970. When rain falls on snow, it increases the density of the snow and reduces its insulative capacity. Therefore, even though winters are expected to become warmer overall from future climate change, the subnivium will tend to become colder and more variable with less protection from the above-ground temperatures.
The effects of a colder subnivium are complex...For example, shrubs such as crowberry and alpine azalea that grow along the forest floor tend to block the wind and so retain higher depths of snow around them. This captured snow helps to keep soils insulated and in turn increases plant decomposition and nutrient release. In field experiments, researchers removed a portion of the snow cover to investigate the importance of the subnivium's insulation. They found that soil frost in the snow-free area resulted in damage to plant roots and sometimes even the death of the plant.
The purpose of this passage is to
A) introduce readers to a relatively unknown ecosystem: the subnivium
B) explain how the subnivium works to provide shelter and food to several species.
C) outline the effects of climate change on the subnivium.
D) draw an analogy between the effect of blankets on humans and of snow cover on species living in the subnivium.
The end of the age of the internal combustion engine is in sight. There are small signs everywhere: the shift to hybrid vehicles is already under way among manufacturers. Volvo has announced it will make no purely petrol-engined cars after 2019...and Tesla has just started selling its first electric car aimed squarely at the middle classes: the Tesla 3 sells for $35,000 in the US, and 400,000 people have put down a small, refundable deposit towards one. Several thousand have already taken delivery, and the company hopes to sell half a million more next year. This is a remarkable figure for a machine with a fairly short range and a very limited number of specialised charging stations.
Some of it reflects the remarkable abilities of Elon Musk, the company's founder, as a salesman, engineer, and a man able to get the most out of his factory workers and the governments he deals with...Mr Musk is selling a dream that the world wants to believe in.
This last may be the most important factor in the story. The private car is...a device of immense practical help and economic significance, but at the same time a theatre for myths of unattainable self-fulfilment. The one thing you will never see in a car advertisement is traffic, even though that is the element in which drivers spend their lives. Every single driver in a traffic jam is trying to escape from it, yet it is the inevitable consequence of mass car ownership.
The sleek and swift electric car is at one level merely the most contemporary fantasy of autonomy and power. But it might also disrupt our exterior landscapes nearly as much as the fossil fuel-engined car did in the last century. Electrical cars would of course pollute far less than fossil fuel-driven ones; instead of oil reserves, the rarest materials for batteries would make undeserving despots and their dynasties fantastically rich. Petrol stations would disappear. The air in cities would once more be breathable and their streets as quiet as those of Venice. This isn't an unmixed good. Cars that were as silent as bicycles would still be as dangerous as they are now to anyone they hit without audible warning.
The dream goes further than that. The electric cars of the future will be so thoroughly equipped with sensors and reaction mechanisms that they will never hit anyone. Just as brakes don't let you skid today, the steering wheel of tomorrow will swerve you away from danger before you have even noticed it...
This is where the fantasy of autonomy comes full circle. The logical outcome of cars which need no driver is that they will become cars which need no owner either. Instead, they will work as taxis do, summoned at will but only for the journeys we actually need. This is the future towards which Uber...is working. The ultimate development of the private car will be to reinvent public transport. Traffic jams will be abolished only when the private car becomes a public utility. What then will happen to our fantasies of independence? We'll all have to take to electrically powered bicycles.
Despite their fierce reputation, Vikings may not have always been the plunderers and pillagers popular culture imagines them to be. In fact, they got their start trading in northern European markets, researchers suggest.
Combs carved from animal antlers, as well as comb manufacturing waste and raw antler material, have turned up at three archaeological sites in Denmark, including a medieval marketplace in the city of Ribe. A team of researchers from Denmark and the U.K. hoped to identify the species of animal to which the antlers once belonged by analyzing collagen proteins in the samples and comparing them across the animal kingdom, Laura Geggel reports for LiveScience. Somewhat surprisingly, molecular analysis of the artifacts revealed that some combs and other material had been carved from reindeer antlers.... Given that reindeer (Rangifer tarandus) don't live in Denmark, the researchers posit that the antler material arrived on Viking ships from Norway. Antler craftsmanship, in the form of decorative combs, was part of Viking culture. Such combs served as symbols of good health, Geggel writes. The fact that the animals shed their antlers also made them easy to collect from the large herds that inhabited Norway.
Since the artifacts were found in marketplace areas at each site, it's more likely that the Norsemen came to trade rather than pillage. Most of the artifacts also date to the 780s, but some are as old as 725. That predates the beginning of Viking raids on Great Britain by about 70 years. (Traditionally, the so-called "Viking Age" began with these raids in 793 and ended with the Norman conquest of Great Britain in 1066.) Archaeologists had suspected that the Vikings had experience with long maritime voyages [that] might have preceded their raiding days. Beyond Norway, comb-making would have been a popular industry in Scandinavia as well. It's possible that the antler combs represent a larger trade network, where the Norsemen supplied raw material to craftsmen in Denmark and elsewhere.

A conservation problem equally as important as that of soil erosion is the loss of soil fertility. Most agriculture was originally supported by the natural fertility of the soil; and, in areas in which soils were deep and rich in minerals, farming could be carried on for many years without the return of any nutrients to the soil other than those supplied through the natural breakdown of plant and animal wastes. In river basins, such as that of the Nile, annual flooding deposited a rich layer of silt over the soil, thus restoring its fertility. In areas of active volcanism, such as Hawaii, soil fertility has been renewed by the periodic deposition of volcanic ash. In other areas, however, natural fertility has been quickly exhausted. This is true of most forest soils, particularly those in the humid tropics. Because continued cropping in such areas caused a rapid decline in fertility and therefore in crop yields, fertility could be restored only by abandoning the areas and allowing the natural forest vegetation to return. Over a period of time, the soil surface would be rejuvenated by parent materials, new circulation channels would form deep in the soil, and the deposition of forest debris would restore minerals to the topsoil. Primitive agriculture in such forests was of a shifting nature: areas were cleared of trees and the woody material burned to add ash to the soil; after a few years of farming, the plots would be abandoned and new sites cleared. As long as populations were sparse in relation to the area of forestland, such agricultural methods did little harm. They could not, however, support dense populations or produce large quantities of surplus foods.
Starting with the most easily depleted soils, which were also the easiest to farm, the practice of using various fertilizers was developed. The earliest fertilizers were organic manures, but later, larger yields were obtained by adding balanced combinations of those nutrients (e.g. potassium, nitrogen, phosphorus and calcium) that crop plants require in greatest quantity. Because high yields are essential, most modern agriculture depends upon the continued addition of chemical fertilizers to the soil. Usually these substances are added in mineral form, but nitrogen is often added as urea, an organic compound. 
Early in agricultural history, it was found that the practice of growing the same crop year after year in a particular plot of ground not only caused undesirable changes in the physical structure of the soil, but also drained the soil of its nutrients. The practice of crop rotation was discovered to be a useful way to maintain the condition of the soil, and also to prevent the buildup of those insects and other plant pests that are attracted to a particular kind of crop. In rotation systems, a grain crop is often grown the first year, followed by a leafy vegetable crop in the second year, and a pasture crop in the third. The last usually contains legumes (e.g. clover, alfalfa), because such plants can restore nitrogen to the soil through the action of bacteria that live in nodules on their roots.
In irrigation agriculture, in which water is brought in to supply the needs of crops in an area with insufficient rainfall, a particular soil-management problem that develops is the salinization (concentration of salts) of the surface soil. This most commonly results from inadequate drainage of the irrigated land; because the water cannot flow freely, it evaporates, and the salts dissolved in the water are left on the surface of the soil. Even though the water does not contain a large concentration of dissolved salts, the accumulation over the years can be significant enough to make the soil unsuitable for crop production. Effective drainage solves the problem; in many cases, drainage canals must be constructed, and drainage tiles must be laid beneath the surface of the soil. Drainage also requires the availability of an excess of water to flush the salts from the surface soil. In certain heavy soils with poor drainage, this problem can be quite severe; for example, large areas of formerly irrigated land in the Indus basin, in the Tigris-Euphrates region, in the Nile Basin, and in the Western United States, have been seriously damaged by salinization.