DIRECTIONS for questions: The passage given below is accompanied by a set of six questions. Choose the best answer to each question.
Immortality has gone secular. It’s now the subject of serious investment - both intellectual and financial - by philosophers, scientists and Silicon Valley. But if we treat death as a problem, what are the ethical implications of the highly speculative ‘solutions’ being mooted?
Of course, we don’t currently have the means of achieving human immortality, nor is it clear that we ever will. But two hypothetical options have attracted the most attention: rejuvenation technology, and mind uploading.
Rejuvenation promises to remove and reverse the damage of ageing at the cellular level. Gerontologists argue that growing old is a disease that we can circumvent by having our cells replaced or repaired at regular intervals. Practically speaking, this might mean that every few years, you would visit a rejuvenation clinic. Doctors would not only remove infected, cancerous or otherwise unhealthy cells, but also induce healthy ones to regenerate more effectively and remove accumulated waste products. This deep makeover would ‘turn back the clock’ on your body, leaving you physiologically younger than your actual age. You would, however, remain just as vulnerable to death from acute trauma - that is, from injury and poisoning, whether accidental or not - as you were before.
The other option would be mind uploading, in which your brain is digitally scanned and copied onto a computer. This method presupposes that consciousness is akin to software running on some kind of organic hard-disk - that what makes you ‘you’ is the sum total of the information stored in the brain’s operations, and therefore it should be possible to migrate the self onto a different physical substrate or platform. This remains a highly controversial stance. However, let’s leave aside for now the question of where ‘you’ really reside, and play with the idea that it might be possible to replicate the brain in digital form one day.
Unlike rejuvenation, mind uploading could actually offer something tantalisingly close to true immortality. Just as we currently back up files on external drives and cloud storage, your uploaded mind could be copied innumerable times and backed up in secure locations.
Despite this advantage, mind uploading presents some difficult ethical issues. Some philosophers think there is a possibility that your upload would appear functionally identical to your old self without having any conscious experience of the world. You’d be more of a zombie than a person, let alone you. Others have argued that since you are reducible to the processes and content of your brain, a functionally identical copy of it - no matter the substrate on which it runs - could not possibly yield anything other than you.
What if the whole process is so qualitatively different from biological existence as to make you utterly terrified or even catatonic? And what if, in that state, you can’t communicate with outsiders or switch yourself off? In this case, your immortality would amount to more of a curse than a blessing. Death might not be so bad after all, but unfortunately it might no longer be an option.
Which option is more ethically fraught? In our view, ‘mere’ rejuvenation would probably be a less problematic choice. Yes, vanquishing death for the entire human species would greatly exacerbate our existing problems of overpopulation and inequality - but the problems would at least be reasonably familiar. We can be pretty certain, for instance, that rejuvenation would widen the gap between the rich and poor, and would eventually force us to make decisive calls about resource use, whether to limit the rate of growth of the population, and so forth. On the other hand, mind uploading would open up a plethora of completely new and unfamiliar ethical quandaries.
Q. The mind-uploading technique depends on the fundamental premise that
Q. The rejuvenation method of achieving immortality is based on the understanding that
Q. The author feels that the rejuvenation method is a less problematic choice because
Q. All of the following are ethical issues presented by mind-uploading EXCEPT:
Q. Which of the following is not a negative consequence of rejuvenation technology?
I. Those who undergo rejuvenation become vulnerable to injury, poisoning and trauma.
II. Rejuvenation could widen the gap between the rich and the poor.
III. Rejuvenation could increase the population burden.
IV. Rejuvenation is still a hypothesis, and not practically feasible.
Q. Which of the following best summarises the nature of the content presented in the sixth para, “Despite this advantage…yield anything other than you”?
DIRECTIONS for questions: The passage given below is accompanied by a set of six questions. Choose the best answer to each question.
Do art critics have a point anymore? Can they contribute anything to the development of art? For a long time, I've ducked this question. If you'd asked me any time over the past few years, I'd have replied that criticism does not seriously influence art. It has its own justification, however, as literature. If literature seems a pompous word, let's say entertainment. The appetite to read about art is almost as insatiable as the need to look at it; the critic provides a service that gives a chance to talk, think and tell stories about art and artists. Maybe it doesn't have any impact on art but it does occupy a place in the culture. That's what I would have said, until recently.
But that's a weak defence of criticism. The truth is that critics have been in retreat for a long time. In British art, they faced a cataclysmic loss of standing just before I came on the scene. When I was a student, the art critic whose books I bought was Peter Fuller, founder of the magazine Modern Painters and a savage critic of most trends in contemporary art. I enjoyed the provocative seriousness of his essays. I also loved the writing of Robert Hughes, another critic whose eloquence was - and is - very much at the expense of current art.
Not much newspaper criticism comes near their mark, but what critics did share, in the late 1980s, was a similar scepticism about new fashions, a "seriousness" defined by suspicion. And of course, history played a joke on these critics - even on Fuller and Hughes. While high moral disdain for shallow modern art was pouring from the printing presses, a generation of British artists led by Damien Hirst were getting away with anything they wanted - again and again and again. Words were crushed by images. Critics were reduced to the status of promoters. They had no other role.
Today I think there is an opportunity for critics again - and a need. The sheer volume and range of art that we're fed in a culture obsessed with galleries is so vast and confusing that a critic can get stuck in and make a difference. It really is time to stand up for what is good against what is meretricious. And it really is possible to find examples of excellence as well as stupidity. In other words, this is a great time to be a critic - to try to show people what really matters.
Yes, there's a staggering volume of mediocre art being talked up by fools. But there are real talents and real ideas too. The critic's task is to identify what is good and defend it come hell or high water - and to honestly denounce the bad. Art history can help in this task by enriching your perspective. Writing can give you flexibility in how and when you want to engage.
But engage we must. Engage we will.
Q. The author justifies art criticism in the first para of the passage by saying that art criticism
Q. Why does the author think there is an opportunity and need for critics again?
Q. Which of the following has been dubbed a weak defence of art criticism by the author in the first sentence of the second para?
Q. The author uses the metaphorical expression ‘words were crushed by images’ to describe how
Q. The joke history played on critics like Fuller and Hughes is that
Q. The role of the art critic, according to the author, does not include
DIRECTIONS for questions: The passage given below is accompanied by a set of six questions. Choose the best answer to each question.
Western politics has, it is argued, become more tribal. Tribes are distinguished from other human groups by their relatively clear social boundaries, often defined by kinship and demarcated territory. It’s clear that our political groups are increasingly based on single aspects of common identity with unambiguous boundaries, such as race and educational status.
Equally undeniable, however, is that most commentators vastly misunderstand the nature of tribes. The mistaken view of tribes as primitive, violent, and insular is already having pernicious effects on our response to this new era of politics. If we hope to live productively in this new political era, it helps to understand what tribes actually are - and how, rather than simply being the cause of our political problems, tribalism can also contribute to the solution.
Our colloquial evocation of tribalism mostly reflects outmoded anthropology. Scientists once believed that tribes were defined by their rigid, coercive social structures; tribes were thought to be able to integrate their individual members only through the stultifying and imposed repetition of social customs.
But, years of empirical studies of actual tribes show that even as they are defined by relatively narrow identities, they are also characterized by porous boundaries. Tribes continually sample one another’s practices and social forms. Speaking about American Indians, James Boon, a Princeton anthropologist, noted that “each tribal population appears almost to toy with patterns that are fundamental to its neighbours.” Tribes also frequently adopt outsiders. Among certain tribes in North Africa, members could voluntarily leave their own tribe and join another.
Reciprocity, too, is a central part of traditional tribal life. Moral or material indebtedness, they know, can serve as the foundation of a strong relationship. It is common amongst the Berbers of North Africa, for example, for leaders to be chosen or ratified by the group’s opponents, on the theory that one’s current enemy may later be an ally.
Many tribes - among them the Mae Enga of Papua New Guinea and the Lozi of Central Africa - also share the common practice of marrying members of enemy tribes to reduce the likelihood of internecine warfare. As a result of intermarriage and trading relations, a high proportion of tribes are multilingual.
Nor are tribes inherently authoritarian. Tribes often do not like too much power in too few hands for too long a period of time, and hence, employ a wide variety of practices that redistribute power.
This might sound quite distant from the partisan tribes of our present politics, which seem mostly to be characterized by their pugnaciousness. But the point is that, anthropologically, narrow identity groups such as tribes aren’t defined by exclusionary traits. The existence of narrow group identities doesn’t imply hostility among such groups.
Indeed, there is a reason that tribes historically have not embraced the rigid structural identities and institutions evident in our politics today. Excluding immigrants or cultural outsiders in the name of social solidarity comes at a price. Actual tribes know that social isolation limits their flexibility. But, we can only sustainably avoid paying such costs when we understand that resorting to defensive boundaries, even when we have gone “tribal,” is not our natural default position.
If politicians and ordinary citizens insist on using tribal metaphors to define our present identity politics, we need a more apt metaphor to understand tribes themselves. We could do worse than to think of tribes as amoebas, entities whose very shape adapts to fit changing circumstances.
Q. Which fundamental distinction between tribalism and present politics does the author allude to in the first sentence of the eighth para: ‘This might sound … present politics’?
Q. The author has a bone to pick with ‘our colloquial evocation of tribalism’. Which of the following best captures it?
Q. The attribute of tribalism that the author demonstrates by citing James Boon is
Q. The author mentions the Berbers of North Africa to highlight that
Q. Which of the following best captures the essence of the author’s exhortation in the last para of the passage, ‘We could do worse … circumstances’?
Q. What does the author recommend for those who hope to live productively in this new political era?
DIRECTIONS for questions: The passage given below is accompanied by a set of three questions. Choose the best answer to each question.
More than 7,000 years ago, people living in the Middle East discovered that they could ferment grapes to make wine. As with wine, the processing of coffee beans and cacao, used to make chocolate, also requires some fermentation. Cacao originated in the Amazon and was widely cultivated in Central America before Hernán Cortés brought it to the Old World in 1530. From Ethiopia, coffee was disseminated throughout the Middle East by Arab traders during the 6th century and it ultimately arrived in the New World during the 17th century. Over the next three centuries, other trading nations completed coffee’s worldwide dissemination and set it up as a mainstay crop of many of the world’s poorest economies. Cacao was treated in much the same way and is now grown in 33 tropical countries.
Given this history, Aimée Dudley and Justin Fay of the University of Washington wondered if the yeasts associated with cacao and coffee followed these plants from their places of origin just as yeasts had followed wine from the Middle East.
They collected unroasted cacao beans from 13 countries, including Haiti, Colombia, Ghana, Madagascar and Papua New Guinea, and unroasted coffee beans from 14 locations, including Ethiopia, Hawaii, Honduras, Indonesia and Yemen. They then set about studying the yeast found on the beans. As a control, the team also studied the yeasts on grapes from diverse locations.
As they report in Current Biology, although all vineyard-yeast strains are extremely similar genetically, there is tremendous diversity among the yeast strains associated with cacao and coffee. Further, all cacao beans collected from Venezuela carried closely related strains of yeast that were distinct from those found on Nigerian and Ecuadorian beans. The same was true for the yeasts found on coffee. The use of a starter yeast culture is very rare in the processing of cacao and coffee, where growers tend to rely upon the species of yeast found locally.
This greater diversity of cacao and coffee yeasts means there is the potential to create new flavours by using a strain from one location in another. No one knows what the resulting coffee and chocolate might taste like, but if Dr Dudley and her colleagues are correct in their hunch, there will be many new flavours for coffee lovers and chocoholics to savour.
Q. Which of the following, if true, would strengthen the researchers’ finding that “all vineyard-yeast strains are extremely similar genetically”?
Q. Which of the following best summarises the content of the passage?
Q. Which of the following cannot be understood from the passage?
DIRECTIONS for questions: The passage given below is accompanied by a set of three questions. Choose the best answer to each question.
Biotechnology proponents have argued repeatedly that GM seeds are crucial to feed the world, using the same flawed reasoning that was advanced for decades by the proponents of the Green Revolution. Conventional food production, they maintain, will not keep pace with the growing world population. Monsanto's ads proclaimed in 1998: “Worrying about starving future generations won't feed them. Food biotechnology will.” As agroecologists Miguel Altieri and Peter Rosset point out, this argument is based on two erroneous assumptions. The first is that world hunger is caused by a global shortage of food; the second is that genetic engineering is the only way to increase food production.
In their classic study, World Hunger: Twelve Myths, development specialists Frances Moore Lappé and her colleagues at the Institute for Food and Development Policy gave a detailed account of world food production that surprised many readers.
They showed that abundance, not scarcity, best describes the food supply in today's world. During the past three decades, increases in global food production have outstripped world population growth by 16 per cent. During that time, mountains of surplus grain have pushed prices strongly downward on world markets. Increases in food supplies have kept ahead of population growth in every region except Africa during the past fifty years. A 1997 study found that in the developing world, 78 per cent of all malnourished children under five live in countries with food surpluses. Many of these countries, in which hunger is rampant, export more agricultural goods than they import.
The root causes of hunger around the world are unrelated to food production. They are poverty, inequality and lack of access to food and land. People go hungry because the means to produce and distribute food are controlled by the rich and powerful: world hunger is not a technological but a political problem. Miguel Altieri points out that we cannot ignore the social and political realities. ‘If the root causes are not addressed,’ he retorts, ‘hunger will persist no matter what technologies are used.’
Q. Which of the following best represents the flaws in the argument that food biotechnology will feed the starving future generations?
Q. The argument that world hunger is caused by food shortage is weakened by which of the following?
Q. Which of the following has not been suggested by the author in the third para of the passage?
DIRECTIONS for questions: The sentences given in each of the following questions, when properly sequenced, form a coherent paragraph. Each sentence is labelled with a number. Decide on the proper order for the five sentences and key in the sequence of five numbers as your answer in the input box given below the question.
1. It was not like pistons and wheels and gears all moving at once, massive and coordinated.
2. General illumination of that target he hit seems to be left for me.
3. Phaedrus did not try to use his brilliance for general illumination but he sought one specific distant target and aimed for it and hit it, and that was all.
4. Phaedrus was systematic as an individual, but to say that he thought and acted like a machine would be to misunderstand the nature of his thought.
5. The image of a laser beam comes to mind instead; a single pencil of light of such terrific energy in such concentration that it can be shot at the moon and its reflection seen back on earth.