The inclusion or exclusion of refugees in contemporary Europe is a contentious topic of debate and is high on the international agenda. With faster deportation procedures and tighter border controls, the attitude of Western governments towards asylum applications appears increasingly restrictive.
Economic and educational inclusion are often given precedence over other areas, but a new project is focusing on social and cultural inclusion. Started in Berlin, ‘Multaka: Museums as Meeting Point’ aims to encourage the sharing of historical and national knowledge. The concept is to train Syrian and Iraqi refugees to become museum guides so that they can lead tours in their mother tongues.
The course is currently offered to young adults and teenagers, but will be extended to older applicants as the programme develops. The tours play on the relationship between the host country and the guides’ home countries. Those involved in organising the project hope that it will help to improve native residents’ knowledge of other cultures and give refugees a sense of pride and involvement in their new community.
At present, four German museums are involved in Multaka, but more are soon to be added, with the Pitt Rivers Museum and the Museum of the History of Science in Oxford developing their own version of the scheme. There have also been talks with the Louvre and MoMA. Of the four museums currently taking part, two focus on Syrian and Iraqi artefacts and two on the connections between Islam, Judaism and Christianity.
Students aspiring to study Philosophy and Theology could prepare for their interviews by examining how different religions and religious artefacts are perceived from different cultural standpoints. Art Historians might like to consider how museums can be regarded as a meeting point (multaka) for our common past. Those aiming to study HSPS could investigate the social and political impact of a programme such as this.
Fasting has a long history and is central to many of the world’s religions: Yom Kippur is a fast day in Judaism, Ramadan is a month of fasting in Islam, and in Christianity Lent is a 40-day fast observed by Roman Catholics.
Uses of fasting can similarly be traced back to primitive cultures, with examples as far-ranging as coming-of-age rites, the appeasement of violent deities, rituals to avert catastrophe, and preparation for war. Fasting has even been used as a form of political protest, with the Suffragettes, the Irish Republicans and Gandhi all using hunger strikes to convey their message.
Paracelsus described fasting as the ‘physician within’, and its health benefits, as well as its spiritual ones, have long been extolled. The ‘nature cure’ became popular in the 1920s, with fasting being used to treat everything from heart disease to headaches. In recent years, the rise of the 5:2 diet (where only 500 calories are consumed on two days a week) and the 16:8 diet (where eating is restricted to an 8-hour window) has refreshed the concept of fasting in the public eye.
Students applying for History could investigate past beliefs around fasting and how charlatans have used extremely restrictive diets as a con. Those applying for Theology might like to investigate other examples of abstinence in religion.
Section 377 of the Indian Penal Code was introduced in 1861, during British rule of India. It criminalised sexual activities that were considered “against the order of nature”. In September 2018 this law was struck down in relation to the activities of the LGBT community, a huge step forward for LGBT rights in the nation, with many now calling for the legalisation of same-sex marriage. Indian society is broadly considered to hold strongly conservative values, particularly among the lower-income population of this very class-based society. Yet the traditional religions of India can be seen as far more aligned with queer rights than those of the West.
An article published on Quartz identifies how karma-based faiths contain ideas that support queer rights. These include the lack of a judgement day (and therefore of the possibility of eternal damnation), and the belief that the body, personality and sexuality are outcomes of karmic burden, making them natural. God and nature are one and infinite (ananta). Applicants for Theology and Oriental Studies can consider how the juxtaposition between these concepts and the colonial language of Section 377 highlights the impact that religious foundations have on differences in contemporary values.
Manil Suri, writing in the New York Times, points out that he has experienced very little homophobia when visiting his family in Delhi. Bollywood is arguably ahead of its Western counterparts, increasingly depicting gay characters as much more than mere caricatures. Importantly, India can serve as a model for other non-Western societies to follow, because it can be viewed without the imperialist, Western authority that is often seen as patronising.
Applicants for Asian and Middle Eastern Studies should question whether the historical and religious foundations of Indian culture will result in LGBT rights becoming more accepted within its modern society. Are other countries likely to follow suit in expanding judicial liberties, or will they regard these constitutional reforms as Western influence in India?
Intelligent design, often used as an argument for the existence of God, posits that the natural world shows signs of having been designed by some form of intelligence rather than being the result of an undirected process such as natural selection. Proponents often cite the harmony and complexity of various elements of the human body. Intelligent design itself has received very little scientific support. But such issues have recently come to the fore since astronaut Tim Peake stated, “I’m not religious [but] it doesn’t necessarily mean that I don’t seriously consider that the universe could have been created from intelligent design”. Beyond arguing that intelligent design has no evidential basis, many opponents take issue with the very idea that the human body, and nature in general, show signs of perfection (and therefore of creative intelligence). Clearly, all human systems have flaws that allow them to malfunction. Our cells can develop cancer, our immune system can attack us, our eyes fail us. Evolutionary biologist Matan Shelomi asks, “who designed these faulty things? The answer can’t be a God, because a God so incompetent in designing vision sensors isn’t worth worshiping.”
It must be said that such critics have somewhat misrepresented the argument about the human eye and the main thrust of the intelligent design theory, which is less about perfection and more about complexity and codependence of elements. ‘Irreducible complexity’ describes a system in which all the parts work together to achieve a certain function, and which would not work if any one small part were omitted; advocates for intelligent design propose that many biological mechanisms can be described in this way, and therefore that natural selection could not bring about these mechanisms. Darwin himself conceded, “if it could be demonstrated that any complex organ existed, which could not possibly have been formed by numerous, successive, slight modifications, my theory would absolutely break down”; however, he added, “I can find out no such case.” In fact, in the 20th century Hermann Muller contemplated a sort of irreducible complexity—but rather than seeing it as an obstacle to evolution he described it as the expected result of evolution by natural selection: “being thus finally woven, as it were, into the most intimate fabric of the organism, the once novel character can no longer be withdrawn with impunity, and may have become vitally necessary”. Moreover, the irreducible complexity argument ignores the phenomenon of exaptation, whereby an already existing trait may change function during the course of evolution.
Applicants for Biology or Natural Sciences (B) should be familiar with both historical and contemporary research on evolution. Those interested in Theology or Philosophy may wish to look into intelligent design as well as arguments from teleology.
Can you picture yourself as a cyborg? Do you yearn to transcend the limitations of feeble flesh? Then you might want to join the transhumanist movement. In his award-winning book To Be a Machine, Mark O’Connell describes the core transhumanist beliefs: “that we can and should eradicate ageing as a cause of death; that we can and should use technology to augment our bodies and our minds; that we can and should merge with machines, remaking ourselves, finally, in the image of our own higher ideals.”
Of course, we do already in some sense “augment” our natural bodies with the use of such things as contact lenses and hearing aids. As technology progresses, more and more people are being fitted with bionic limbs. But so far, these are used as a plan B—an attempt at a replacement for the loss of a natural limb, and certainly not preferable to it. Those who advocate for transhumanism, on the other hand, dream of a deliberate merging of man and machine, and see these artificial body parts as superior. By embracing technology and applying it to our own bodies, they hope to create humans with increased senses, intelligence, strength, and life expectancy.
If this all strikes you as rather dystopian, you’re not alone; many scientists have raised ethical concerns. Blay Whitby, artificial intelligence expert at Sussex University, uses the example of athletes without legs who run on carbon-fibre blades. It is not unlikely, he says, that such athletes will be able to outperform able-bodied runners; would it then be ethical for athletes to deliberately have their legs removed and replaced with artificial ones in order to beat world records? For Whitby, the idea is repulsive. But others do not see the issue. Cybernetics expert Kevin Warwick protests, “what is wrong with replacing imperfect bits of your body with artificial parts that will allow you to perform better?” Warwick himself has already put his money where his mouth is and implanted electronic devices into his own body. It doesn’t take an expert to jump on the trend, however; several people have had the chip from their contactless card inserted under the skin of their hand in order to go about their day unburdened by a wallet.
Others still are pinning their hopes on the future by handing over their bodies to be cryogenically preserved in liquid nitrogen after death, hoping to be thawed and awakened at some point when technology has advanced enough to resurrect and enhance them.
Applicants for Philosophy or Theology might like to consider the ethics of this movement; what would be the implications of this new race of superhumans? Is it right to tamper with the natural world (or ‘creation’) in this way? Medics may also want to think about it from the standpoint of medical ethics. Does the natural body have an inherent value, such that it is always wrong to remove a healthy body part?
Can the so-called Golden Rule be considered a universal, stand-alone ethical code, or is morality rather more complicated? This well-known maxim crops up again and again in the writings of different religious traditions: in Confucianism, “what you do not wish for yourself, do not do to others”; in Judaism, “what is hateful to you, do not do to your fellow man”; in Christianity, “do unto others as you would have them do unto you”; in Islam, “no one of you is a believer until he desires for his brother that which he desires for himself”; and so on. At face value, the Golden Rule seems eminently intuitive; one does not need to be a theologian or moral philosopher in order to grasp this principle, and to be judged according to one’s enactment of it. Whilst many aspects of morality are still up for debate and may vary across cultures, belief in mutuality seems to be more or less universal.
But can it indeed be applied to everyone? Immanuel Kant’s famous Categorical Imperative states, “act only in accordance with that maxim through which you can at the same time will that it become a universal law.” Whilst this seems very similar to the Golden Rule, Kant claimed that it was in fact superior, partly because the Golden Rule remains hypothetical rather than categorical: “If you want X done to you, then do X to others.” Kant claimed that “if you want X done to you” remains open to subjectivity and dispute. For example, you may well be willing to reject the help of others if it means you don’t have a duty to help them. On the other hand, because of its strict, universalising nature, Kant’s Categorical Imperative rides roughshod over some of the subtleties of moral decision-making: sometimes an option strikes us as the most ethical and loving in a given scenario even though we would not command it in every case.
Some contemporary thinkers argue that the Golden Rule is less useful in our modern age, because easy access to information, as well as globalisation and immigration, makes us increasingly aware of the differences between cultures in terms of ethics and lifestyle. Diverse societies mean a diversity of values, which no single universal ethical maxim can do justice to.
Applicants for Philosophy or Theology may wish to scrutinise the Golden Rule, so often taken for granted, and consider whether it can or should be applied universally. Do you think morality is objective? If you had to come up with one ethical rule that everyone had to live by, what would it be?
In The Descent of Man, Darwin wrote: “of all the differences between man and the lower animals, the moral sense or conscience is by far the most important.” Since Darwin’s time, researchers have been looking into the possible origins of our morality to determine whether it is a trait that evolved—and if so, how and why.
Darwin was puzzled by the fact that human beings voluntarily go to war and die for their larger groups, as this doesn’t fit with the idea of natural selection being driven by individuals acting in their own self-interest. He proposed the idea of group selection, according to which a group with more altruists would have more survivors in a war or crisis, thereby passing on the altruistic genes. But the frequency of such events and the force of group selection would have to be enormous for it to override selection between individuals, making this theory unlikely.
Evolutionary anthropologist Christopher Boehm argues that human morality emerged when hunter-gatherers formed groups to hunt big game (about a quarter of a million years ago), and cooperation became necessary for survival. In this type of society where the food source has to be actively shared, alpha male tendencies would have been suppressed and hierarchies eliminated in order to share food evenly. Those who tried to take more than their fair share of meat would have been killed, and hence self-control and the willingness to share would have become evolutionarily successful traits.
Michael Tomasello, co-director of the Max Planck Institute for Evolutionary Anthropology in Leipzig, has spent years conducting experiments on chimpanzees and human children to compare their social behaviour and cognitive abilities. He argues that human morality is a consequence of our tendency to collaborate more than other apes do. Chimpanzees may be said to have a social nature, with individuals sometimes working together; but according to Tomasello, only humans are “ultra-social”, having developed an enhanced predisposition to cooperation. This is borne out by experiments which show that human toddlers are much more likely than chimpanzees to choose cooperation and are more willing to share rewards. Like Boehm, Tomasello subscribes to the collaborative hunting theory, but adds that this new food source not only encouraged sharing but led people to view themselves as part of a larger unit—a perspective which he calls “shared intentionality”, and which is behind all human collective projects and cultural institutions. This perspective, he believes, is the root of morality.
Applicants for Anthropology or Biology might want to familiarise themselves with Darwin’s thoughts on social evolution, including the evolution of morality, and the subsequent research on it. Do you find the collaborative hunting theory convincing? Students wishing to study Theology or Philosophy may wish to think about the implications of these theories for our understanding of ethics more generally. If altruism is merely the result of certain genetic traits producing reproductive success for the individuals possessing them in a specific context, is it objectively and universally required of us? In our current society, is altruism still an evolutionarily strong trait, or do the ruthless come out on top?
If you thought that medieval illustrators drew a line in their borderline insanity with little drawings of antagonistic snails being battered by knights, you were sadly mistaken. While the snails were getting beaten to a slightly shell-y pulp, rabbits were on the other end of the spectrum; they were having a world of fun brutalising humans in the most bizarre ways. We have rabbits beating men with their bare hands (paws?), we have a rabbit openly beheading a man, and then, in what has to be one of the most ridiculous historical pictures around, a rabbit mid-leap, giant axe in hand, about to violently strike down what looks like an old wizard-king.
Why do we see this recurring theme? And why rabbits specifically? It is thought that these drolleries (the name given to such marginal illustrations) were an instance of irony: an opportunity to reverse the classical ideas about certain animals in a humorously violent way. Hares, according to the bestiaries of the time, were considered timid and fast runners. Since they were meek, and arguably at the bottom of the ladder in the classic ‘who would win in a fight’ games that people would play, it was all the funnier to show a pair of them slap a man around with a stick whilst simultaneously sawing his foot off. Illustrators were apparently amused by the notion of this innocent, passive creature exacting its revenge on unsuspecting humans. This particular image has persisted through the ages, with arguably its most famous cultural reimagining being the Killer Rabbit scene in Monty Python and the Holy Grail.
Students interested in studying History of Art should look at how images persist through the ages, and the importance of visual iconography in centuries past. Students thinking about Theology should look at how the meanings applied to images set a precedent in our cultural understanding of aspects of human nature and God, while those thinking of studying HSPS should look at the anthropology of man’s fascination with drawing.
Religious icons may at first glance appear foreign to the modern artistic sensibility: objects of a bygone era to be gazed at in museums. Far from being fossilised, however, this Christian devotional tradition is being kept alive by many artists who use the medium not only to convey religious meaning but to comment on modern society in new and arresting ways.
Nikola Sarik is one such artist. His best-known piece to date is The Holy Martyrs of Libya, a tribute to the 21 Christians executed by ISIS in February 2015. The contrast between ancient and modern is striking; Christ, depicted in a traditional manner, embraces the murdered men dressed in orange jumpsuits, while their masked captors stand behind them. The work has the visual effect of dragging the religious dimension into the world of our current events; in turn, ancient depictions of saints and martyrs are brought more clearly into their own contemporary context as we recognise them as living, breathing people. Sarik’s deliberately flat, almost childlike style is in part inspired by 20th-century artists such as Klimt and Matisse, but is also reminiscent of the cartoonish style of Romanesque art.
Other artists use iconography to make poignant statements about suffering and oppression much closer to home. In 2016, an icon by Mark Dukes was unveiled called Our Lady, Mother of Ferguson, a reference to those killed by law enforcement officers in the US. The icon seamlessly blends ancient religious symbolism with contemporary imagery. It depicts a black madonna and the dark silhouette of a smaller figure, whose cruciform halo is also the crosshairs of a gun. Both figures have their arms raised in the traditional orans (praying) position, but the modern context gives this gesture a different meaning: hands up, don’t shoot.
Perhaps more striking still is Maxwell Lawton’s work, Man of Sorrows—Christ with AIDS, painted in the midst of the AIDS crisis. The traditional Man of Sorrows theme depicts a dejected Christ, crowned with thorns and showing his wounds. Lawton’s piece has Christ hooked up to an IV drip and covered in the cancerous lesions typical of many AIDS sufferers. In the background, Jesus’s words from Matthew 25:40 are quoted in three languages: “The King will reply, ‘truly I tell you, whatever you did for one of the least of these brothers of mine, you did for me.’ ” Lawton’s work emphasises the fundamental theological concept of Christ identifying with social outcasts and those who suffer, and confronts the audience with their duty to do the same in a social atmosphere of shame, fear-mongering, and ostracism.
Applicants for History of Art and Theology might wish to consider whether religious art has a place in contemporary life, and how artists can harness centuries-old symbolism to comment on contemporary issues in unique ways. They should think about how Christian art has traditionally used varied imagery to convey information or make an emotional impact, and may want to assess how the above examples fit into this tradition and whether they are successful.
Did the atheist scientist Stephen Hawking unwittingly strengthen the argument for the existence of God? The question hinges on the so-called ‘fine-tuning’ of the laws of physics to produce life: if the fundamental numbers in physics (such as the strength of gravity) were even slightly different from what they are, life would be impossible. To some, this is a strong point in favour of the existence of a creator—it would be too much of a coincidence if all the values underpinning the universe had by chance aligned perfectly to sustain life. However, in his book The Grand Design, Hawking argued for a naturalistic explanation based on the multiverse hypothesis, an idea he went on to develop with his collaborator Thomas Hertog. This depends on the theory of ‘cosmic inflation’, which proposes that the big bang was followed by a rapid expansion, and then by a rapid deceleration, which may have created a number of ‘pocket universes’.
Originally, Hawking and Hertog suggested that the physical laws of these pockets are radically different from one another. But such wide variation is unhelpful in explaining why we happen to live in this perfectly balanced universe. Among pockets with no matter at all, pockets completely full of matter, and pockets that are very short-lived (to name just a few possibilities), the likelihood of our pocket being the way it is remains tiny. Hence in their final paper together, Hawking and Hertog imposed strict rules on the kind of pockets that could exist, limiting the variety. But this poses a problem once again: if all the pockets have identical or almost identical laws, we still have to explain why these laws are fine-tuned for the development of life. Hertog believes that their multiverse theory may ultimately be scientifically testable, and may help to shed light on the origins of the universe. However, as long as the ‘fine-tuning’ question remains open, the idea remains popular that the laws of the universe show that it must have been designed by an intelligent creator. As Hertog comments, “Stephen would say that, theoretically, it’s almost like the universe had to be like this”.
Applicants for Physics and Natural Sciences may wish to familiarise themselves with the work of Hawking and Hertog, and to get to grips with the various theories about the beginning of our universe. Students wishing to study Theology or Philosophy should be aware of the traditional arguments for the existence of God, and may want to consider whether modern scientific research has dispelled or strengthened these arguments.