List of top Verbal and Logical Ability Questions asked in XAT

Read the passage given below and answer the questions that follow it:
Elevation has always existed but has just moved out of the realm of philosophy and religion and been recognized as a distinct emotional state and a subject for psychological study. Psychology has long focused on what goes wrong, but in the past decade there has been an explosion of interest in “positive psychology”—what makes us feel good and why. University of Virginia moral psychologist Jonathan Haidt, who coined the term elevation, writes, “Powerful moments of elevation sometimes seem to push a mental ‘reset button,’ wiping out feelings of cynicism and replacing them with feelings of hope, love, and optimism, and a sense of moral inspiration.”
Haidt quotes first-century Greek philosopher Longinus on great oratory: “The effect of elevated language upon an audience is not persuasion but transport.” Such feeling was once a part of our public discourse. After hearing Abraham Lincoln’s second inaugural address, former slave Frederick Douglass said it was a “sacred effort.” But uplifting rhetoric came to sound anachronistic, except as practiced by the occasional master like Martin Luther King Jr.
It was while looking through the letters of Thomas Jefferson that Haidt first found a description of elevation. Jefferson wrote of the physical sensation that comes from witnessing goodness in others: It is to “dilate [the] breast and elevate [the] sentiments ... and privately covenant to copy the fair example.” Haidt took this description as a mandate.
Elevation often gives us chills or a tingling feeling in the chest. This noticeable physiological response is important. In fact, this physical reaction is what can tell us most surely that we have been moved. This reaction, and the prosocial inclinations it seems to inspire, has been linked with the vagus nerve, which works with oxytocin, the hormone of connection. The nerve’s activities can only be studied indirectly.
Elevation is part of a family of self-transcending emotions. One is awe, that sense of the vastness of the universe and smallness of self that is often invoked by nature; another is admiration, that goose-bump-making thrill that comes from seeing exceptional skill in action. While there is very little lab work on the elevating emotions, there is quite a bit on their counterpart, disgust. It started as a survival strategy: Early humans needed to figure out when food was spoiled by contact with bacteria or parasites. From there disgust expanded to the social realm—people became repelled by the idea of contact with the defiled or by behaviors that seemed to belong to lower people. “Disgust is probably the most powerful emotion that separates your group from other groups.” Haidt says disgust is the bottom floor of a vertical continuum of emotion; hit the up button, and you arrive at elevation. Another response to something extraordinary in another person can be envy, with all its downsides. Envy is unlikely, however, when the extraordinary aspect of another person is a moral virtue (such as acting in a just way, bravery and self-sacrifice, and caring for others).
Read the passage given below and answer the questions that follow it:
There are no Commandments in art and no easy axioms for art appreciation. “Do I like this?” is the question anyone should ask themselves at the moment of confrontation with the picture. But if “yes,” why “yes”? and if “no,” why “no”? The obvious direct emotional response is never simple, and ninety-nine times out of a hundred, the “yes” or “no” has nothing at all to do with the picture in its own right. “I don’t understand this poem” and “I don’t like this picture” are statements that tell us something about the speaker. That should be obvious, but in fact, such statements are offered as criticisms of art, as evidence against, not least because the ignorant, the lazy, or the plain confused are not likely to want to admit themselves as such. We hear a lot about the arrogance of the artist but nothing about the arrogance of the audience. The audience, who have given no thought to the medium or the method, will glance up, flick through, chatter over the opening chords, then snap their fingers and walk away like some monstrous Roman tyrant. This is not arrogance; of course, they can absorb in a few moments, and without any effort, the sum of the artist and the art.
Admire me is the sub-text of so much of our looking; the demand put on art that it should reflect the reality of the viewer. The true painting, in its stubborn independence, cannot do this, except coincidentally. Its reality is imaginative not mundane.
When the thick curtain of protection is taken away; protection of prejudice, protection of authority, protection of trivia, even the most familiar of paintings can begin to work its power. There are very few people who could manage an hour alone with the Mona Lisa. Our poor art-lover in his aesthetic laboratory has not succeeded in freeing himself from the protection of assumption. What he has found is that the painting objects to his lack of concentration; his failure to meet intensity with intensity. He still has not discovered anything about the painting, but the painting has discovered a lot about him. He is inadequate, and the painting has told him so.
When you say “This work is boring/pointless/silly/obscure/élitist etc.,” you might be right, because you are looking at a fad, or you might be wrong because the work falls so outside of the safety of your own experience that in order to keep your own world intact, you must deny the other world of the painting. This denial of imaginative experience happens at a deeper level than our affirmation of our daily world. Every day, in countless ways, you and I convince ourselves about ourselves. True art, when it happens to us, challenges the “I” that we are, and so we say, “This work has nothing to do with me.”
Art is not a little bit of evolution that late-twentieth-century city dwellers can safely do without. Strictly, art does not belong to our evolutionary pattern at all. It has no biological necessity. Time taken up with it was time lost to hunting, gathering, mating, exploring, building, surviving, thriving. We say we have no time for art. If we say that art, all art, is no longer relevant to our lives, then we might at least risk the question “What has happened to our lives?” The usual question, “What has happened to art?” is too easy an escape route.
Read the passage given below and answer the questions that follow it:
Lately it seems everyone’s got an opinion about women’s speech. Everybody has been getting his two cents in about vocal fry, up-speak, and women’s allegedly over-liberal use of apologies. The ways women live and move in the world are subject to relentless scrutiny, and their modes of speech are assessed against a (usually) masculine standard. This is increasingly true as women have entered previously male-dominated fields like industry and politics.
In his essay “On Speech and Public Release,” Joshua Gunn highlights the field of public address as an important arena where social roles and norms are contested, reshaped, and upheld. Gunn argues that in this symbolic arena we harbor an “[ideological] bias against the feminine voice,” a bias that is rooted in positive primal associations with masculinity (and the corresponding devaluation of femininity, the voice that constrains and nags—the mother, the droning Charlie Brown schoolteacher, the wife).
Gunn contends that masculine speech is the cultural standard. It’s what we value and respect. The low pitch and assertive demeanor that characterize the adult male voice signify reason, control, and authority, suitable for the public domain. Women’s voices are higher pitched, like those of immature boys, and their characteristic speech patterns have a distinctive cadence that exhibits a wider range of emotional expression. In Western cultures, this is bad because it comes across as uncontrolled. We associate uncontrolled speech - “the cry, the grunt, the scream, and the yawp” - with things that happen in the private, domestic spheres (both coded as feminine). Men are expected to repress passionate, emotional speech, Gunn explains, precisely because it threatens norms of masculine control and order.
The notion of control also relates to the cultural ideal of eloquence. Language ideologies in the U.S. are complex and highly prescriptive, but not formal or explicit. They are internalized by osmosis, from early observations of adult language use, criticism from teachers (e.g., telling little girls not to “be so bossy” and boys to “act like gentlemen”), and sanctions imposed by peers. These norms become most obvious when they are violated. When men fall off the “control and reason” wagon, they suffer for it. Gunn recalls Howard Dean’s infamous 2004 “I Have a Scream” speech, in which Dean emitted a spontaneous high-pitched screech of joy after he rattled off a list of planned campaign stops. The rest, as they say, is history. Women face a different dilemma—how to please like a woman and impress like a man. Women in the public sphere have, historically, been expected to “perform” femininity, and they usually do this by adopting a personal tone, giving anecdotal evidence, using domestic metaphors, and making emotional appeals to ideals of wifely virtue and motherhood.
Gunn arrives at the conclusion that “eloquence” is, essentially, code for values associated with masculinity, saying, “Performances of femininity are principally vocal and related, not to arguments, but to tone; not to appearance, but to speech; not to good reasons, but to sound. This implies that the ideology of sexism is much more insidious, much more deeply ingrained than many might suppose.”
Read the passage given below and answer the questions that follow it:
Does having a mood disorder make you more creative? That’s the most frequent question I hear about the relationship. But because we cannot control the presence of a mood disorder (that is, we can’t turn it on and off, and measure that person’s creativity under both conditions), the question should really be: Do individuals with a mood disorder exhibit greater creativity than those without? Studies that attempt to answer this question by comparing the creativity of individuals with a mood disorder against those without have been, well, mixed.
Studies that ask participants to complete surveys of creative personality, behavior or accomplishment, or to complete divergent thinking measures (where they are asked to generate lots of ideas) often find that individuals with mood disorders do not differ from those without. However, studies using “creative occupation” as an indicator of creativity (based on the assumption that those employed in these occupations are relatively more creative than others) have found that people with bipolar disorders are overrepresented in these occupations. These studies do not measure the creativity of participants directly, rather they use external records (such as censuses and medical registries) to tally the number of people with a history of mood disorders (compared with those without) who report being employed in a creative occupation at some time. These studies incorporate an enormous number of people and provide solid evidence that people who have sought treatment for mood disorders are engaged in creative occupations to a greater extent than those who have not. But can creative occupations serve as a proxy for creative ability?
The creative occupations considered in these studies are overwhelmingly in the arts, which frequently provide greater autonomy and less rigid structure than the average nine-to-five job. This makes these jobs more conducive to the success of individuals who struggle with performance consistency as the result of a mood disorder. The American psychiatrist Arnold Ludwig has suggested that the level of emotional expressiveness required to be successful in various occupations creates an occupational drift, and has demonstrated that the pattern of expressive occupations being associated with a greater incidence of psychopathology is a self-repeating pattern. For example, professions in the creative arts are associated with greater psychopathology than professions in the sciences whereas, within creative arts professions, architects exhibit a lower lifetime prevalence rate of psychopathology than visual artists and, within the visual arts, abstract artists exhibit lower rates of psychopathology than expressive artists. Therefore, it is possible that many people who suffer from mood disorders gravitate towards these types of professions, regardless of creative ability or inclination.
Please read the passage below and answer the questions that follow:
Labor and capital are the opposite poles of capitalist society. This polarity begins in each enterprise and is realized on a national and even international scale as a giant duality of classes which dominates the social structure. And yet this polarity is incorporated in a necessary identity between the two. Whatever its form, whether as money or commodities or means of production, capital is labor: it is labor that has been performed in the past, the objectified product of preceding phases of the cycle of production which becomes capital only through appropriation by the capitalist and its use in the accumulation of more capital. At the same time, as living labor which is purchased by the capitalist to set the production process into motion, labor is capital. That portion of money capital which is set aside for the payment of labor, the portion which in each cycle is converted into living labor power, is the portion of capital which stands for and corresponds to the working population, and upon which the latter subsists. Before it is anything else, therefore, the working class is the animate part of capital, the part which will set in motion the process that yields to the total capital its increment of surplus value. As such, the working class is, first of all, raw material for exploitation. This working class lives a social and political existence of its own, outside the direct grip of capital. It protests and submits, rebels or is integrated into bourgeois society, sees itself as a class or loses sight of its own existence, in accordance with the forces that act upon it and the moods, conjunctures, and conflicts of social and political life. But since, in its permanent existence, it is the living part of capital, its occupational structure, modes of work, and distribution through the industries of society are determined by the ongoing processes of the accumulation of capital. It is seized, released, flung into various parts of the social machinery and expelled by others, not in accord with its own will or self-activity, but in accord with the movement of capital.
Please read the passage below and answer the questions that follow:
It is sometimes said that consciousness is a mystery in the sense that we have no idea what it is. This is clearly not true. What could be better known to us than our own feelings and experiences? The mystery of consciousness is not what consciousness is, but why it is.
Modern brain imaging techniques have provided us with a rich body of correlations between physical processes in the brain and the experiences had by the person whose brain it is. We know, for example, that a person undergoing stimulation in her or his ventromedial hypothalamus feels hunger. The problem is that no one knows why these correlations hold. It seems perfectly conceivable that ventromedial hypothalamus stimulation could do its job in the brain without giving rise to any kind of feeling at all. No one has even the beginnings of an explanation of why some physical systems, such as the human brain, have experiences. This is the difficulty David Chalmers famously called ‘the hard problem of consciousness’.
Materialists hope that we will one day be able to explain consciousness in purely physical terms. But this project now has a long history of failure. The problem with materialist approaches to the hard problem is that they always end up avoiding the issue by redefining what we mean by ‘consciousness’. They start off by declaring that they are going to solve the hard problem, to explain experience; but somewhere along the way they start using the word ‘consciousness’ to refer not to experience but to some complex behavioural functioning associated with experience, such as the ability of a person to monitor their internal states or to process information about the environment. Explaining complex behaviours is an important scientific endeavour. But the hard problem of consciousness cannot be solved by changing the subject.
In spite of these difficulties, many scientists and philosophers maintain optimism that materialism will prevail, pointing to the long track record of physical science in explaining more and more of our world. At every point in this glorious history, it is claimed, philosophers have declared that certain phenomena are too special to be explained by physical science - light, chemistry, life - only to be subsequently proven wrong by the relentless march of scientific progress.
Before Galileo it was generally assumed that matter had sensory qualities: tomatoes were red, paprika was spicy, flowers were sweet smelling. How could an equation capture the taste of spicy paprika? And if sensory qualities can’t be captured in a mathematical vocabulary, it seemed to follow that a mathematical vocabulary could never capture the complete nature of matter. Galileo’s solution was to strip matter of its sensory qualities and put them in the soul (as we might put it, in the mind). The sweet smell isn’t really in the flowers, but in the soul (mind) of the person smelling them ... Even colours for Galileo aren’t on the surfaces of the objects themselves, but in the soul of the person observing them. And if matter in itself has no sensory qualities, then it’s possible in principle to describe the material world in the purely quantitative vocabulary of mathematics. This was the birth of mathematical physics. 
But of course Galileo didn’t deny the existence of the sensory qualities. If Galileo were to time travel to the present day and be told that scientific materialists are having a problem explaining consciousness in purely physical terms, he would no doubt reply, “Of course they do, I created physical science by taking consciousness out of the physical world!”
Please read the passage below and answer the questions that follow:
Rene Descartes’ assertion that ideas may be held true with certainty if they are “clear and distinct” provides the context for Peirce’s title, “How to Make Our Ideas Clear.” Peirce argued that an idea may seem clear if it is familiar. Distinctness depends on having good definitions, and while definitions are desirable they do not yield any new knowledge or certainty of the truth of empirical propositions. Peirce argues that thought needs more than a sense of clarity; it also needs a method for making ideas clear. Once we have made an idea clear, then we can begin the task of determining its truth. The method that Peirce offers came to be known as the pragmatic method, and the epistemology on which it depends is pragmatism. Peirce rejected Descartes’ method of doubt. We cannot doubt something, for the sake of method, that we do not doubt in fact. In a later essay, he would state as his rule “Dismiss make-believes.” This refers to Descartes’ method of doubting, in the safety of his study, such things as the existence of the material world, which he did not doubt when he went out on the street. Peirce proposed that a philosophical investigation can begin from only one state of mind, namely, the state of mind in which we find ourselves when we begin. If any of us examines our state of mind, we find two kinds of thoughts: beliefs and doubts. Peirce had presented the interaction of doubt and belief in an earlier essay, “The Fixation of Belief”.
Beliefs and doubts are distinct. Beliefs consist of states of mind in which we would make a statement; doubts are states in which we would ask a question. We experience a doubt as a sense of uneasiness and hesitation. Doubt serves as an irritant that causes us to appease it by answering a question and thereby fixing a belief and putting the mind to rest on that issue. A common example of a doubt would be arriving in an unfamiliar city and not being sure of the location of our destination address in relation to our present location. We overcome this doubt and fix a belief by getting the directions. Once we achieve a belief, we can take the necessary action to reach our destination. Peirce defines a belief subjectively as something of which we are aware and which appeases the doubt. Objectively, a belief is a rule of action. The whole purpose of thought consists in overcoming a doubt and attaining a belief. Peirce acknowledges that some people like to think about things or argue about them without caring to find a true belief, but he asserts that such dilettantism does not constitute thought. The beliefs that we hold determine how we will act. If we believe, rightly or wrongly, that the building that we are trying to reach sits one block to our north, we will walk in that direction. We have beliefs about matters of fact, near and far. For example, we believe in the real objects in front of us and we believe generally accepted historical statements. We also believe in relations of ideas such as that seven and five equal twelve. In addition to these we have many beliefs about science, politics, economics, religion and so on. Some of our beliefs may be false since we are capable of error. To believe something means to think that it is true.
Please read the passage below and answer the questions that follow:
If history doesn’t follow any stable rules, and if we cannot predict its future course, why study it? It often seems that the chief aim of science is to predict the future - meteorologists are expected to forecast whether tomorrow will bring rain or sunshine; economists should know whether devaluing the currency will avert or precipitate an economic crisis; good doctors foresee whether chemotherapy or radiation therapy will be more successful in curing lung cancer. Similarly, historians are asked to examine the actions of our ancestors so that we can repeat their wise decisions and avoid their mistakes. But it never works like that because the present is just too different from the past. It is a waste of time to study Hannibal’s tactics in the Second Punic War so as to copy them in the Third World War. What worked well in cavalry battles will not necessarily be of much benefit in cyber warfare.
Science is not just about predicting the future, though. Scholars in all fields often seek to broaden our horizons, thereby opening before us new and unknown futures. This is especially true of history. Though historians occasionally try their hand at prophecy (without notable success), the study of history aims above all to make us aware of possibilities we don’t normally consider. Historians study the past not in order to repeat it, but in order to be liberated from it.
Each and every one of us has been born into a given historical reality, ruled by particular norms and values, and managed by a unique economic and political system. We take this reality for granted, thinking it is natural, inevitable and immutable. We forget that our world was created by an accidental chain of events, and that history shaped not only our technology, politics and society, but also our thoughts, fears and dreams. The cold hand of the past emerges from the grave of our ancestors, grips us by the neck and directs our gaze towards a single future. We have felt that grip from the moment we were born, so we assume that it is a natural and inescapable part of who we are. Therefore we seldom try to shake ourselves free and envision alternative futures. Studying history aims to loosen the grip of the past. It enables us to turn our head this way and that, and begin to notice possibilities that our ancestors could not imagine, or didn’t want us to imagine. By observing the accidental chain of events that led us here, we realise how our very thoughts and dreams took shape - and we can begin to think and dream differently. Studying history will not tell us what to choose, but at least it gives us more options.
Please read the passage below and answer the questions that follow:
Writing is both my vocation and my avocation: that’s all I do. 
You may wonder why I should write a genealogy. Well, to begin with, my story is interesting. And, next, I am a mystery - more so than a tree or a sunset or even a flash of lightning. But, sadly, I am taken for granted by those who use me, as if I were a mere incident and without background. This supercilious attitude relegates me to the level of the commonplace. This is a species of the grievous error in which mankind cannot too long persist without peril. For, as a wise man, G. K. Chesterton, observed, “We are perishing for want of wonder, not for want of wonders.”
I, simple though I appear to be, merit your wonder and awe, a claim I shall attempt to prove. In fact, if you can understand me - no, that’s too much to ask of anyone - if you can become aware of the miraculousness that I symbolize, you can help save the freedom mankind is so unhappily losing. I have a profound lesson to teach. And I can teach this lesson better than an automobile or an airplane or a mechanical dishwasher because - well, because I am seemingly so simple.
Simple? Yet, not a single person on the face of this earth knows how to make me. This sounds fantastic, doesn’t it? Especially when you realize that there are about one and one-half billion of my kind produced in the U.S. each year.
Pick me up and look me over. What do you see? Not much meets the eye - there’s some wood, lacquer, the printed labeling, graphite lead, a bit of metal, and an eraser.
Please read the passage below and answer the questions that follow:
It’s taken me 60 years, but I had an epiphany recently: Everything, without exception, requires additional energy and order to maintain itself. I knew this in the abstract as the famous second law of thermodynamics, which states that everything is falling apart slowly. This realization is not just the lament of a person getting older. Long ago I learnt that even the most inanimate things we know of - stone, iron columns, copper pipes, gravel roads, a piece of paper - won’t last very long without attention and fixing and the loan of additional order. Existence, it seems, is chiefly maintenance.
What has surprised me recently is how unstable even the intangible is. Keeping a website or a software program afloat is like keeping a yacht afloat. It is a black hole for attention. I can understand why a mechanical device like a pump would break down after a while - moisture rusts metal, or the air oxidizes membranes, or lubricants evaporate, all of which require repair. But I wasn’t thinking that the nonmaterial world of bits would also degrade. What’s to break? Apparently, everything.
Brand-new computers will ossify. Apps weaken with use. Code corrodes. Fresh software just released will immediately begin to fray. On their own - nothing you did. The more complex the gear, the more (not less) attention it will require. The natural inclination toward change is inescapable, even for the most abstract entities we know of: bits.
And then there is the assault of the changing digital landscape. When everything around you is upgrading, this puts pressure on your digital system and necessitates maintenance. You may not want to upgrade, but you must because everyone else is. It’s an upgrade arms race.
I used to upgrade my gear begrudgingly (Why upgrade if it still works?) and at the last possible moment. You know how it goes: Upgrade this and suddenly you need to upgrade that, which triggers upgrades everywhere. I would put it off for years because I had the experience of one “tiny” upgrade of a minor part disrupting my entire working life. But as our personal technology is becoming more complex, more codependent upon peripherals, more like a living ecosystem, delaying upgrading is even more disruptive. If you neglect ongoing minor upgrades, the change backs up so much that the eventual big upgrade reaches traumatic proportions. So I now see upgrading as a type of hygiene: You do it regularly to keep your tech healthy. Continual upgrades are so critical for technological systems that they are now automatic for the major personal computer operating systems and some software apps. Behind the scenes, the machines will upgrade themselves, slowly changing their features over time. This happens gradually, so we don’t notice they are “becoming.” We take this evolution as normal.
Technological life in the future will be a series of endless upgrades. And the rate of graduations is accelerating. Features shift, defaults disappear, menus morph. I’ll open up a software package I don’t use every day expecting certain choices, and whole menus will have disappeared.
No matter how long you have been using a tool, endless upgrades make you into a newbie - the new user often seen as clueless. In this era of “becoming” everyone becomes a newbie. Worse, we will be newbies forever. That should keep us humble.
That bears repeating. All of us - every one of us - will be endless newbies in the future simply trying to keep up. Here’s why: First, most of the important technologies that will dominate life 30 years from now have not yet been invented, so naturally you’ll be a newbie to them. Second, because the new technology requires endless upgrades, you will remain in the newbie state. Third, because the cycle of obsolescence is accelerating (the average lifespan of a phone app is a mere 30 days!), you won’t have time to master anything before it is displaced, so you will remain in the newbie mode forever. Endless Newbie is the new default for everyone, no matter your age or experience.