List of Practice Questions

Economists have spent most of the 20th century ignoring psychology, positive or otherwise. But today there is a great deal of emphasis on how happiness can shape global economies, or — on a smaller scale — successful business practice. This is driven, in part, by a trend in "measuring" positive emotions, mostly so they can be optimized. Neuroscientists, for example, claim to be able to locate specific emotions, such as happiness or disappointment, in particular areas of the brain. Wearable technologies, such as Spire, offer data-driven advice on how to reduce stress. We are no longer just dealing with "happiness" in a philosophical or romantic sense — it has become something that can be monitored and measured, including by our behavior, use of social media and bodily indicators such as pulse rate and facial expressions.
There is nothing automatically sinister about this trend. But it is disquieting that the businesses and experts driving the quantification of happiness claim to have our best interests at heart, often concealing their own agendas in the process. In the workplace, happy workers are viewed as a "win-win." Work becomes more pleasant, and employees, more productive. But this is now being pursued through the use of performance-evaluating wearable technology, such as Humanyze or Virgin Pulse, both of which monitor physical signs of stress and activity toward the goal of increasing productivity.
Cities such as Dubai, which has pledged to become the "happiest city in the world," dream up ever-more elaborate and intrusive ways of collecting data on well-being — to the point where there is now talk of using CCTV cameras to monitor facial expressions in public spaces. New ways of detecting emotions are hitting the market all the time: One company, Beyond Verbal, aims to calculate moods conveyed in a phone conversation, potentially without the knowledge of at least one of the participants. And Facebook [has] demonstrated . . . that it could influence our emotions through tweaking our news feeds — opening the door to ever-more targeted manipulation in advertising and influence.
As the science grows more sophisticated and technologies become more intimate with our thoughts and bodies, a clear trend is emerging. Where happiness indicators were once used as a basis to reform society, challenging the obsession with money that G.D.P. measurement entrenches, they are increasingly used as a basis to transform or discipline individuals.
Happiness becomes a personal project that each of us must now work on, like going to the gym. Since the 1970s, depression has come to be viewed as a cognitive or neurological defect in the individual, rather than a consequence of circumstances. All of this simply escalates the sense of responsibility each of us feels for our own feelings, and with it, the sense of failure when things go badly. A society that deliberately removed certain sources of misery, such as precarious and exploitative employment, may well be a happier one. But we won't get there by making this single, often fleeting emotion the over-arching goal.
When researchers at Emory University in Atlanta trained mice to fear the smell of almonds (by pairing it with electric shocks), they found, to their consternation, that both the children and grandchildren of these mice were spontaneously afraid of the same smell. That is not supposed to happen. Generations of schoolchildren have been taught that the inheritance of acquired characteristics is impossible. A mouse should not be born with something its parents have learned during their lifetimes, any more than a mouse that loses its tail in an accident should give birth to tailless mice. . . .
Modern evolutionary biology dates back to a synthesis that emerged around the 1940s-60s, which married Charles Darwin’s mechanism of natural selection with Gregor Mendel’s discoveries of how genes are inherited. The traditional, and still dominant, view is that adaptations – from the human brain to the peacock’s tail – are fully and satisfactorily explained by natural selection (and subsequent inheritance). Yet [new evidence] from genomics, epigenetics and developmental biology [indicates] that evolution is more complex than we once assumed. . . .
In his book On Human Nature (1978), the evolutionary biologist Edward O Wilson claimed that human culture is held on a genetic leash. The metaphor [needs revision]. . . . Imagine a dog-walker (the genes) struggling to retain control of a brawny mastiff (human culture). The pair’s trajectory (the pathway of evolution) reflects the outcome of the struggle. Now imagine the same dog-walker struggling with multiple dogs, on leashes of varied lengths, with each dog tugging in different directions. All these tugs represent the influence of developmental factors, including epigenetics, antibodies and hormones passed on by parents, as well as the ecological legacies and culture they bequeath. . . .
The received wisdom is that parental experiences can’t affect the characters of their offspring. Except they do. The way that genes are expressed to produce an organism’s phenotype – the actual characteristics it ends up with – is affected by chemicals that attach to them. Everything from diet to air pollution to parental behaviour can influence the addition or removal of these chemical marks, which switches genes on or off. Usually these so-called ‘epigenetic’ attachments are removed during the production of sperm and egg cells, but it turns out that some escape the resetting process and are passed on to the next generation, along with the genes. This is known as ‘epigenetic inheritance’, and more and more studies are confirming that it really happens. Let’s return to the almond-fearing mice. The inheritance of an epigenetic mark transmitted in the sperm is what led the mice’s offspring to acquire an inherited fear. . . .
Epigenetics is only part of the story. Through culture and society, [humans and other animals] inherit knowledge and skills acquired by [their] parents. . . . All this complexity . . . points to an evolutionary process in which genomes (over hundreds to thousands of generations), epigenetic modifications and inherited cultural factors (over several, perhaps tens or hundreds of generations), and parental effects (over single-generation timespans) collectively inform how organisms adapt. These extra-genetic kinds of inheritance give organisms the flexibility to make rapid adjustments to environmental challenges, dragging genetic change in their wake – much like a rowdy pack of dogs.

The Indian government [has] announced an international competition to design a National War Memorial in New Delhi, to honour all of the Indian soldiers who served in the various wars and counter-insurgency campaigns from 1947 onwards. The terms of the competition also specified that the new structure would be built adjacent to the India Gate – a memorial to the Indian soldiers who died in the First World War. Between the old imperialist memorial and the proposed nationalist one, India’s contribution to the Second World War is airbrushed out of existence.
The Indian government’s conception of the war memorial was not merely absent-minded. Rather, it accurately reflected the fact that both academic history and popular memory have yet to come to terms with India’s Second World War, which continues to be seen as little more than mood music in the drama of India’s advance towards independence and partition in 1947. Further, the political trajectory of the postwar subcontinent has militated against popular remembrance of the war. With partition and the onset of the India-Pakistan rivalry, both of the new nations needed fresh stories for self-legitimisation rather than focusing on shared wartime experiences.
However, the Second World War played a crucial role in both the independence and partition of India. . . . The Indian army recruited, trained and deployed some 2.5 million men, almost 90,000 of whom were killed and many more injured. Even at the time, it was recognised as the largest volunteer force in the war. . . .
India’s material and financial contribution to the war was equally significant. India emerged as a major military-industrial and logistical base for Allied operations in south-east Asia and the Middle East. This led the United States to take considerable interest in the country’s future, and ensured that this was no longer the preserve of the British government.
Other wartime developments pointed in the direction of India’s independence. In a stunning reversal of its long-standing financial relationship with Britain, India finished the war as one of the largest creditors to the imperial power. 
Such extraordinary mobilisation for war was achieved at great human cost, with the Bengal famine the most extreme manifestation of widespread wartime deprivation. The costs on India’s home front must be counted in millions of lives.
Indians signed up to serve on the war and home fronts for a variety of reasons. . . . [M]any were convinced that their contribution would open the doors to India’s freedom. . . . The political and social churn triggered by the war was evident in the massive waves of popular protest and unrest that washed over rural and urban India in the aftermath of the conflict. This turmoil was crucial in persuading the Attlee government to rid itself of the incubus of ruling India. . . .
Seventy years on, it is time that India engaged with the complex legacies of the Second World War. Bringing the war into the ambit of the new national memorial would be a fitting – albeit overdue – recognition that this was India’s War.

The complexity of modern problems often precludes any one person from fully understanding them. Factors contributing to rising obesity levels, for example, include transportation systems and infrastructure, media, convenience foods, changing social norms, human biology and psychological factors. . . . The multidimensional or layered character of complex problems also undermines the principle of meritocracy: the idea that the ‘best person’ should be hired. There is no best person. When putting together an oncological research team, a biotech company such as Gilead or Genentech would not construct a multiple-choice test and hire the top scorers, or hire people whose resumes score highest according to some performance criteria. Instead, they would seek diversity. They would build a team of people who bring diverse knowledge bases, tools and analytic skills. . . .
Believers in a meritocracy might grant that teams ought to be diverse but then argue that meritocratic principles should apply within each category. Thus the team should consist of the ‘best’ mathematicians, the ‘best’ oncologists, and the ‘best’ biostatisticians from within the pool. That position suffers from a similar flaw. Even within a knowledge domain, no test or criteria applied to individuals will produce the best team. Each of these domains possesses such depth and breadth that no test can exist. Consider the field of neuroscience. Upwards of 50,000 papers were published last year covering various techniques, domains of enquiry and levels of analysis, ranging from molecules and synapses up through networks of neurons. Given that complexity, any attempt to rank a collection of neuroscientists from best to worst, as if they were competitors in the 50-metre butterfly, must fail. What could be true is that given a specific task and the composition of a particular team, one scientist would be more likely to contribute than another. Optimal hiring depends on context. Optimal teams will be diverse.
Evidence for this claim can be seen in the way that papers and patents that combine diverse ideas tend to rank as high-impact. It can also be found in the structure of the so-called random decision forest, a state-of-the-art machine-learning algorithm. Random forests consist of ensembles of decision trees. If classifying pictures, each tree makes a vote: is that a picture of a fox or a dog? A weighted majority rules. Random forests can serve many ends. They can identify bank fraud and diseases, recommend ceiling fans and predict online dating behaviour. When building a forest, you do not select the best trees, as they tend to make similar classifications. You want diversity. Programmers achieve that diversity by training each tree on different data, a technique known as bagging. They also boost the forest ‘cognitively’ by training trees on the hardest cases – those that the current forest gets wrong. This ensures even more diversity, and more accurate forests. Yet the fallacy of meritocracy persists. Corporations, non-profits, governments, universities and even preschools test, score and hire the ‘best’. This all but guarantees not creating the best team. Ranking people by common criteria produces homogeneity. . . . That’s not likely to lead to breakthroughs.
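The bagging idea the passage describes can be sketched in a few lines of Python. This is a minimal illustration, not any library's implementation: the toy one-dimensional dataset, the decision "stumps" standing in for full trees, and all function names here are assumptions made for the example.

```python
import random

def train_stump(data):
    # data: list of (x, label) pairs. Choose the threshold t (drawn from
    # the x values present) that minimises errors when predicting 1 for x >= t.
    best_t, best_err = None, float("inf")
    for t, _ in data:
        err = sum((x >= t) != bool(y) for x, y in data)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def predict_stump(t, x):
    return 1 if x >= t else 0

def bagged_forest(data, n_trees, seed=0):
    # Bagging: train each stump on a bootstrap resample (drawn with
    # replacement) of the data, so the ensemble is a diverse collection
    # of slightly different trees rather than n copies of the same one.
    rng = random.Random(seed)
    return [train_stump([rng.choice(data) for _ in data])
            for _ in range(n_trees)]

def predict_forest(forest, x):
    # Majority vote across the ensemble.
    votes = sum(predict_stump(t, x) for t in forest)
    return 1 if votes * 2 >= len(forest) else 0

# Toy usage: points labelled 1 when x >= 5. Individual bootstrap stumps
# pick slightly different thresholds, but the vote recovers the split.
data = [(x, 1 if x >= 5 else 0) for x in range(10)]
forest = bagged_forest(data, n_trees=25)
```

Because each stump sees a different resample, the trees disagree at the margins yet vote correctly in aggregate, which is exactly the "diversity beats selecting the best individual trees" point the passage makes.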
Grove snails as a whole are distributed all over Europe, but a specific variety of the snail, with a distinctive white-lipped shell, is found exclusively in Ireland and in the Pyrenees mountains that lie on the border between France and Spain. The researchers sampled a total of 423 snail specimens from 36 sites distributed across Europe, with an emphasis on gathering large numbers of the white-lipped variety. When they sequenced genes from the mitochondrial DNA of each of these snails and used algorithms to analyze the genetic diversity between them, they found that. . . a distinct lineage (the snails with the white-lipped shells) was indeed endemic to the two very specific and distant places in question.
Explaining this is tricky. Previously, some had speculated that the strange distributions of creatures such as the white-lipped grove snails could be explained by convergent evolution—in which two populations evolve the same trait by coincidence—but the underlying genetic similarities between the two groups rule that out. Alternately, some scientists had suggested that the white-lipped variety had simply spread over the whole continent, then been wiped out everywhere besides Ireland and the Pyrenees, but the researchers say their sampling and subsequent DNA analysis eliminate that possibility too. “If the snails naturally colonized Ireland, you would expect to find some of the same genetic type in other areas of Europe, especially Britain. We just don’t find them,” Davidson, the lead author, said in a press statement.
Moreover, if they’d gradually spread across the continent, there would be some genetic variation within the white-lipped type, because evolution would introduce variety over the thousands of years it would have taken them to spread from the Pyrenees to Ireland. That variation doesn’t exist, at least in the genes sampled. This means that rather than the organism gradually expanding its range, large populations instead were somehow moved en masse to the other location within the space of a few dozen generations, ensuring a lack of genetic variety.
“There is a very clear pattern, which is difficult to explain except by involving humans,” Davidson said. Humans, after all, colonized Ireland roughly 9,000 years ago, and the oldest fossil evidence of grove snails in Ireland dates to roughly the same era. Additionally, there is archaeological evidence of early sea trade between the ancient peoples of Spain and Ireland via the Atlantic and even evidence that humans routinely ate these types of snails before the advent of agriculture, as their burnt shells have been found in Stone Age trash heaps.
The simplest explanation, then? Boats. These snails may have inadvertently traveled on the floor of the small, coast-hugging skiffs these early humans used for travel, or they may have been intentionally carried to Ireland by the seafarers as a food source. “The highways of the past were rivers and the ocean–as the river that flanks the Pyrenees was an ancient trade route to the Atlantic, what we’re actually seeing might be the long-lasting legacy of snails that hitched a ride…as humans travelled from the South of France to Ireland 8,000 years ago,” Davidson said.
More and more companies, government agencies, educational institutions and philanthropic organisations are today in the grip of a new phenomenon: ‘metric fixation’. The key components of metric fixation are the belief that it is possible – and desirable – to replace professional judgment (acquired through personal experience and talent) with numerical indicators of comparative performance based upon standardised data (metrics); and that the best way to motivate people within these organisations is by attaching rewards and penalties to their measured performance.
The rewards can be monetary, in the form of pay for performance, say, or reputational, in the form of college rankings, hospital ratings, surgical report cards and so on. But the most dramatic negative effect of metric fixation is its propensity to incentivise gaming: that is, encouraging professionals to maximise the metrics in ways that are at odds with the larger purpose of the organisation. If the rate of major crimes in a district becomes the metric according to which police officers are promoted, then some officers will respond by simply not recording crimes or downgrading them from major offences to misdemeanours. Or take the case of surgeons. When the metrics of success and failure are made public – affecting their reputation and income – some surgeons will improve their metric scores by refusing to operate on patients with more complex problems, whose surgical outcomes are more likely to be negative. Who suffers? The patients who don’t get operated upon.
When reward is tied to measured performance, metric fixation invites just this sort of gaming. But metric fixation also leads to a variety of more subtle unintended negative consequences. These include goal displacement, which comes in many varieties: when performance is judged by a few measures, and the stakes are high (keeping one’s job, getting a pay rise or raising the stock price at the time that stock options are vested), people focus on satisfying those measures – often at the expense of other, more important organisational goals that are not measured. The best-known example is ‘teaching to the test’, a widespread phenomenon that has distorted primary and secondary education in the United States since the adoption of the No Child Left Behind Act of 2001.
Short-termism is another negative. Measured performance encourages what the US sociologist Robert K Merton in 1936 called ‘the imperious immediacy of interests … where the actor’s paramount concern with the foreseen immediate consequences excludes consideration of further or other consequences’. In short, advancing short-term goals at the expense of long-range considerations. This problem is endemic to publicly traded corporations that sacrifice long-term research and development, and the development of their staff, to the perceived imperatives of the quarterly report.
To the debit side of the ledger must also be added the transactional costs of metrics: the expenditure of employee time by those tasked with compiling and processing the metrics in the first place – not to mention the time required to actually read them. . . .
Will a day come when India’s poor can access government services as easily as drawing cash from an ATM? . . . [N]o country in the world has made accessing education or health or policing or dispute resolution as easy as an ATM, because the nature of these activities requires individuals to use their discretion in a positive way. Technology can certainly facilitate this in a variety of ways if it is seen as one part of an overall approach, but the evidence so far in education, for instance, is that just adding computers doesn’t make education any better. . .
The dangerous illusion of technology is that it can create stronger, top-down accountability of service providers in implementation-intensive services within existing public sector organisations. One notion is that electronic management information systems (EMIS) keep better track of inputs and those aspects of personnel that are ‘EMIS visible’, and can thereby lead to better services. A recent study examined attempts to increase attendance of Auxiliary Nurse Midwives (ANMs) at clinics in Rajasthan, which involved high-tech time clocks to monitor attendance. The study’s title says it all: Band-Aids on a Corpse . . . e-governance can be just as bad as any other governance when the real issue is people and their motivation.
For services to improve, the people providing the services have to want to do a better job with the skills they have. A study of medical care in Delhi found that even though providers in the public sector had much better skills than private sector providers, their provision of care in actual practice was much worse.
In implementation-intensive services the key to success is face-to-face interactions between a teacher, a nurse, a policeman, an extension agent and a citizen. This relationship is about power. Amartya Sen’s . . . report on education in West Bengal had a supremely telling anecdote in which the villagers forced the teacher to attend school, but then, when the parents went off to work, the teacher did not teach, but forced the children to massage his feet. . . . As long as the system empowers providers over citizens, technology is irrelevant.
The answer to successfully providing basic services is to create systems that provide both autonomy and accountability. In basic education for instance, the answer to poor teaching is not controlling teachers more . . . The key . . . is to hire teachers who want to teach and let them teach, expressing their professionalism and vocation as a teacher through autonomy in the classroom. This autonomy has to be matched with accountability for results—not just narrowly measured through test scores, but broadly for the quality of the education they provide.
A recent study in Uttar Pradesh showed that if, somehow, all civil service teachers could be replaced with contract teachers, the state could save a billion dollars a year in revenue and double student learning. Just the additional autonomy and accountability of contracts through local groups—even without complementary system changes in information and empowerment— led to that much improvement. The first step to being part of the solution is to create performance information accessible to those outside of the government. . . .