Sunday, 20 July 2014

Assisted Dying Bill Does Not Go Far Enough

The Assisted Dying Bill had its second reading in the House of Lords last Friday, resulting in a 10-hour debate on the matter. While the former Archbishop of Canterbury, George Carey, has supported the Bill, saying it is compatible with being a Christian, many have warned about the dangers of legalising assisted suicide.

Some believe that if assisted suicide is legalised, it will pave the way to euthanasia. Critics of the Bill are worried that the British medical profession will suffer from the same kind of abuses that Holland – a country which has legalised voluntary euthanasia – is alleged to have experienced. Holland’s Termination of Life on Request and Assisted Suicide Act (2002) sets out the criteria under which a physician can carry out euthanasia or assisted suicide.

In spite of the scaremongering stories you may read about assisted suicide leading to a slippery slope towards abuse and medical malpractice, this kind of thing has not happened in Holland. Holland, remember, has legalised not just assisted suicide, where the physician gives the patient the means to end their own life, but voluntary euthanasia, where the physician carries out the lethal act with the patient’s consent. Yet two decades of research into euthanasia in the Netherlands reveal a viable model of legalised euthanasia, in which abuse of the system is rare.

The authors of the study found that “no slippery slope seems to have occurred” – the frequency of the ending of life without explicit patient request has not increased and there is no evidence for a higher frequency of euthanasia. In fact, by some accounts, the ending of life without explicit patient request has decreased, most likely due to the greater control, safeguards and transparency that comes with legalisation of the practice.

Lib Dem Care Minister Norman Lamb and religious leaders like Lord Carey and Desmond Tutu are in favour of assisted dying, while others such as Baroness Campbell and the oncologist Professor Karol Sikora have opposed the Bill. Those who oppose it claim that to entrust doctors with the power to end a patient’s life is extremely dangerous. In the words of Professor Sikora, the implementation of the Bill could lead to “death squads”.

David Cameron has also said that he is “not convinced” by the Bill and feels that people “might be being pushed into things that they don’t actually want for themselves”. However, Lord Falconer, who put forward this private member’s bill, insists that the final decision will always be left to the patient, with safeguards in place to prevent abuse. The proposed legislation would allow doctors to prescribe a lethal dose to some terminally ill and mentally competent patients who have less than 6 months to live. Falconer criticised the current situation, which allows the rich to travel to Switzerland to die in peace and dignity, while others are left “in despair” to suffer a “lonely and cruel death”.

In addition, doctors who want to carry out the procedure in the name of compassion are criminalised. Since most doctors will not even consider carrying out assisted suicide or euthanasia, many patients are forced to carry out the act themselves. Falconer argues that many patients feared “implicating their loved ones in a criminal enterprise” by asking them for help to die. Without assistance, patients might end their life by overdosing on pills or by suffocating themselves. If these methods fail, the patient may be left permanently brain damaged. Falconer maintains that passing the Bill would spare many people from having to live out their final months in unnecessary pain and suffering.

Considerations of welfare certainly justify this Bill; however, in order to apply these considerations fairly, we would need to legalise voluntary euthanasia as well. This Bill would not allow everyone to ‘die well’ – only a limited number of people. These restrictions may seem sensible at first glance, but what they really do is discriminate against other people who should have the right to die. For example, terminally ill patients with less than 6 months to live who cannot administer the lethal dose themselves would be denied relief. Furthermore, patients who experience immense suffering, but who are not terminally ill, would be denied relief.

The Bill would not have benefited people like Tony Nicklinson, who was paralysed by a stroke in 2005 and lived a life characterised by suffering until his death in 2012. Because he was neither terminally ill nor physically able to end his own life, Falconer’s Bill would not have applied to him. This is extremely unfair and cruel - someone who is suffering but not terminally ill (like Mr Nicklinson) might face a longer period of suffering than someone who is terminally ill. This is why the Dutch system is fairer: it does not limit its scope of compassion. Dutch law offers assisted suicide and euthanasia not just to the terminally ill, but also to those experiencing unbearable suffering which physicians agree is incurable.

Many are concerned that any form of assisted death could lead to terminally ill patients requesting death not because they want to die, but because they do not want to be a financial burden on their loved ones. A survey conducted in Oregon, where assisted suicide is legal, found that 66% of respondents said they wanted the ‘right to die’ so that they would not become a financial burden on their loved ones. On the other hand, safeguards can still be put in place to ensure that the decision to die is voluntary. Also, it would be far more beneficial to eliminate this fear of being a financial burden – through free and effective healthcare – than to leave patients without a means to alleviate their suffering. In conclusion, the Assisted Dying Bill is a sign of great progress, but it certainly does not go far enough in minimising suffering and protecting people’s freedom to die in dignity.

Tuesday, 15 July 2014

New Surveillance Bill Puts All of Our Privacy at Risk

David Cameron and Home Secretary Theresa May have assured us that the Data Retention and Investigatory Powers Bill (DRIP) does not give the government new surveillance powers. Yet despite these assurances, privacy campaigners, civil liberties groups and lawyers argue that this new Bill – also known as the ‘snoopers’ charter’ – gives the government exactly these powers.

Cameron unveiled this ‘emergency’ surveillance legislation last week, after the European Court of Justice (ECJ) ruled that the EU’s existing mass data retention regime was unlawful. So in order to carry on these now-illegal activities – and broaden them, for that matter – Cameron decided that the UK should have its own mass surveillance legislation.

Back in April, the ECJ said EU legislation on mass surveillance conflicts with the fundamental rights of privacy and protection of personal data. The Court stated that the legislation “applies even to persons for whom there is no evidence capable of suggesting that their conduct might have a link, even an indirect or remote one, with serious crime.” The Court also pointed out that there was no way to limit the access that national authorities had to personal data.

Even if it were true that this new surveillance legislation merely reinstated existing surveillance legislation, it would still infringe on our civil liberties. Cameron is essentially trying to legalise blanket data retention, which the ECJ recently ruled is illegal. Critics of the Bill stress that the DRIP Bill puts all of our privacy at risk. The Bill requires that internet service providers (ISPs) and mobile operators store customer data for 12 months for purposes of law enforcement. In addition, the clauses in the Bill expand the government’s powers to directly intercept phone calls, emails, texts and social media traffic.

Cameron defended the Bill, saying, “We face real and credible threats to our security from serious organized crime, from the activity of pedophiles, from the collapse of Syria and the growth of ISIS in Iraq, and al-Shabaab in East Africa. I'm simply not prepared to be a prime minister who has to address the people after a terrorist incident and explain that I could have done more to prevent it.”

The whistle-blower Edward Snowden said that this law closely mirrors the Protect America Act 2007, which George Bush likewise introduced to deal with the dangers of terrorism and crime. Through this legislation, the US government was able to engage in mass surveillance and information gathering activities. Snowden remarked that the “NSA could have written” the draft of the DRIP Bill.

David Allen Green, head of media practice at Preiskel & Co, said, “They’re trying to make out this is clarificatory – but if you read through the Bill these are substantial amendments. They are creating things legally that weren’t there before.” Graham Smith, a private practice lawyer, voiced similar concerns, saying that these new laws could allow the interception and retention of data from consumer services like iCloud and Google Drive. Isabella Sankey, policy director of the civil liberties group Liberty, told MPs: “This fast-track legislation contains sweeping surveillance powers that will affect every man, woman and child in the UK.” Clause 4 of the Bill contains new powers for the UK to require overseas companies to comply with data interception and acquisition requests.

This Bill was brought in with no public debate or parliamentary scrutiny; instead, it was agreed behind closed doors by the three party leaders. Like the Bush administration, the UK government believes that national security will always trump civil liberties. In any case, it is highly questionable whether we need sweeping surveillance in order to catch suspected paedophiles and terrorists. It is unjust and illiberal for the government to monitor everyone’s information under the guise of national security and crime prevention.

The idea that this Bill was brought in as a matter of “emergency” is extremely misleading, if not an outright lie. The ECJ’s decision against the Data Retention Directive was made in April, a decision which was not surprising, given that criticisms had been levelled against the Directive for years. If this Bill really were needed as a matter of emergency, it would have been introduced much earlier. There is no emergency. We are not under threat from ISIS. Cameron is trying to justify this attack on our civil liberties through scaremongering. And so we will lose our civil liberties, not suddenly, but through slow and gradual erosion.

Sunday, 13 July 2014

3 Things Schools Should Teach from an Early Age

Computer coding

While youth unemployment has gone down – shrinking from that unnerving figure of 1 million – it is still high: 853,000 young people aged 16-24 were unemployed in February to April 2014. Young people are still finding it difficult to land a decent job upon completing their A-Levels or undergraduate degree. This is because students are leaving school or university without the skills and experience that would make them suitable for a position in the UK’s growing sectors.

One of the fastest growing areas of employment in recent years relates to computer technology. This means that computer coders are needed more than ever. Many leaders and CEOs have championed the skills of coding in this increasingly digital era that we live in. As Steve Ballmer, the recently retired CEO of Microsoft, put it, “Computer programming is the single best professional opportunity in the world.”

It is essential that we teach children the basics of coding from an early age and then continue to offer more complex courses in coding at GCSE and A Level. By the time pupils leave school, they will have the necessary coding skills to create and develop software programs. Code Club are doing a great job of offering volunteer-led after-school coding classes for primary school children – they have over 1,900 clubs in the UK so far. Here kids learn how to make computer games, animations and websites - all extremely useful skills for any company.

John Naughton wrote in The Guardian about how coding needs to be a part of the national curriculum for kids. Already in Estonia, public schools are teaching first graders web and mobile application development. The UK really needs to follow suit. In addition, businesses of all types are looking for people who can create, develop and maintain security software. And jobs involving computer programming pay pretty well too – a graduate computer programmer can start on a salary of £20-25K per annum.
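To give a flavour of what ‘the basics of coding’ can look like at this level, here is a minimal sketch of the kind of first project a beginners’ class might build – a simple number-guessing game in Python. This is my own illustration, not something taken from Code Club’s or Estonia’s actual curriculum.

```python
# A simple number-guessing game - the sort of first project a beginner
# coding class might build (illustrative example only).
import random

secret = random.randint(1, 20)   # the computer picks a number from 1 to 20
guess = None

while guess != secret:
    guess = int(input("Guess the number (1-20): "))
    if guess < secret:
        print("Too low - try again!")
    elif guess > secret:
        print("Too high - try again!")

print("Well done, you guessed it!")
```

Even a toy program like this introduces variables, loops, conditionals and user input – the building blocks that later coding qualifications and software jobs depend on.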

Nutrition & health

A recent poll by Populus found that 66% of respondents supported a ban on sugary drinks in all UK schools. A can of Coke contains 35 grams of sugar, so I can sympathise with the aim of preventing children consuming this amount of sugar (or more) on a daily basis. Obesity affects 1 in 5 children aged 10-11 in the UK and part of this health problem can be attributed to the high sugar content of fizzy drinks.

On the other hand, banning sugary drinks in schools is a limited and short-sighted solution to this problem. Even if kids can’t buy a can of Coke in school, they can still buy one after school or drink them at home. Moreover, while banning sugary drinks in schools might reduce a pupil’s intake of sugar, there is no reason to suppose that this will inhibit them from drinking sugary drinks later in life.

Don't get me wrong, I support keeping unhealthy food out of school meals and out of vending machines and tuck shops in schools. And school meals should not consist of pizza, hamburgers, chicken nuggets and chips, but should focus on daily servings of grains, fruits, vegetables and healthy sources of protein. This is something which Jamie Oliver has tirelessly tried to achieve in schools, with quite a lot of success to be fair. Where healthier meals have been introduced into schools, there have been improvements in children’s behaviour, concentration and exam performance. That said, as with banning sugary drinks in schools, offering healthy school meals is only part of the solution.

In order to ensure that children avoid unhealthy foods outside school, at home and in later life, it is vital that they understand the importance of nutrition and health. If children are taught good nutrition from an early age, they will be equipped to make sensible and informed dietary choices outside of school and when they grow up. Moreover, if children are taught about the vast health benefits of eating healthily – such as disease prevention, increased longevity, improved performance at sports, improved mood, concentration, memory, learning and other cognitive abilities – then they will be motivated to avoid unhealthy eating habits and adopt healthy ones.

We grow up not caring about what ingredients and additives are in our food because we have not been taught to care. Children should be taught about the consequences of what they choose to put in their body, so that they can become more informed consumers as they grow up. 

Critical thinking & ethics

Critical thinking is a skills-based subject, as opposed to a content-based subject. The majority of what kids learn at school is content-based and little (if anything) is taught about how to think critically. You have to wait until AS/A Level before critical thinking becomes a subject option, and even then only a handful of pupils will study it. Despite people’s assumptions that studying philosophy at university is terrible for your career prospects, many employers recognise the benefits of the degree: it teaches students how to interpret, analyse and evaluate ideas and arguments.

These skills are transferable and are valued in many professions: law, policy, research, the civil service, journalism, business & finance, marketing & advertising, social work and teaching. What employers value is not the graduate’s knowledge of Plato’s metaphysics or Kant’s moral philosophy, but the skills of critical thinking that are gained by studying these topics. Being able to think in a ‘critical’ way means that you think carefully about what you read and write, judging what information and evidence is credible and reliable. You are able to question what other people have written and figure out if their ideas are justified or not. You can compare and synthesise information in order to support an argument. And you are able to notice and challenge biases, distorted views, prejudice and illogical arguments – not only in the work of others, but in your own work as well.

Being trained in analytical thinking will give you a better chance of academic success in higher education, where taking a critical approach to your studies is valued very highly. Simply memorising and explaining concepts will not impress your professors and examiners, despite the fact that this way of learning got you good grades in your GCSE and A Level essays and exams. And as mentioned before, critical thinking will impress potential employers – the ability to reason well can be applied to oral and written communication, strategic planning, trouble-shooting, problem solving and the critical evaluation of projects.

Learning critical thinking is not only important for academic and professional success, but for personal development as well. Being able to think clearly makes you less prone to take things for granted, less likely to accept ignorance, prejudice or poor reasoning, and more likely to form a worldview which is carefully considered and rational. However, it is difficult to create a generation who are academically successful or employable or rational if critical thinking is only taught as an optional AS/A Level subject. Children need to learn the basics of critical thinking, so that they can be prepared to study critical thinking in more depth at GCSE level, at A Level and then at university.

Alongside the teaching of critical thinking, it is important for children at an early age to learn about the nuances of ethics, on both a theoretical and practical basis. Studying a range of ethical theories involves critical thinking skills as well, which makes the study of ethics very useful. Children could be taught about the importance of ethical values such as empathy, honesty, generosity and tolerance, while comparing these to unethical traits such as hatred, dishonesty, selfishness and intolerance. By considering thought experiments and ethical dilemmas, children would be able to learn about the intentions and consequences of their actions. This will prepare them to deal with real life situations in childhood, adolescence and adulthood. If children were taught to think carefully about the intentions and the consequences of their actions, then they might become less prone to lie, steal, cheat, break promises, be selfish and bully others.

There is increasing concern about ethics in various professions and in business. Ethics is central to the medical profession, so learning about ethical principles at an early age will give you transferable skills for a career in medicine. Furthermore, the skills of critical thinking that are gained through studying ethics will make you far more adept at dealing with the ethical dilemmas and balancing of values that are involved in the medical profession. For example, concern for patient autonomy can easily clash with the doctor’s view of the patient’s best interests. More and more businesses are becoming socially responsible, taking responsibility for how their activities impact the environment, consumers, employees, communities and stakeholders. Businesses would value someone who was skilled at weighing up benefits and risks.

Professor Simon Robinson has argued that learning ethics increases students’ employability. Robinson makes these arguments in relation to university students, but I think they apply equally well to primary school pupils. If these skills are taught from primary school onwards, not only does every child get access to them, but by the time pupils leave school at 18 they will be highly trained in ethical and critical thinking. This means it might not be necessary for an 18-year-old to study for a further 3 years, accumulating massive debt, in the hope of being employable after graduation. We need to ensure that young people are able to find work without feeling pressured and expected to go to university.

Tuesday, 8 July 2014

Why Do Humans Cry?

Crying as a response to some emotional state has always been seen as something uniquely human. There is, however, some debate as to whether elephants display the same kind of behaviour. We know that elephants do shed tears, but whether they cry in response to loss, grief or sadness is a more contentious question. Based on the behaviour of elephants – which is all we have to go by – there are indications that they can experience joy, grief, rage, stress and compassion. Marc Bekoff is one evolutionary biologist who believes that examples of elephants crying can be associated with feelings of grief, as he writes in Psychology Today. This view was also expressed in the controversial book, When Elephants Weep, by Jeffrey Masson and Susan McCarthy.

It is likely that elephants cry for some of the same reasons we do and possibly different reasons as well. Regardless of whether elephants cry as an emotional response, what we do know is that humans can cry as a response to a wide range of emotions: rage, grief, sadness, joy and ecstasy. But what is the purpose of crying? Is there some sort of adaptive or survival value to it?

There are competing theories which attempt to explain the strange behaviour of crying as an emotional response. One theory simply states that emotional tears are a response to joy, sadness, distress or physical pain – the tears themselves have no function; they are merely a by-product of an emotional state. However, if this were true, then why is it only humans (and possibly elephants) who weep in response to sadness, distress or physical pain? We know that other non-human animals can experience these emotional states, so why do they not also weep?

Another theory suggests that emotional crying has a cathartic effect; that crying simply makes us feel better in the face of a negative emotional state. The first proper study investigating the cathartic effect was carried out by Lauren Bylsma, who published her results in a paper entitled When is Crying Cathartic? (2008). What Bylsma found was that crying is only cathartic in particular social contexts; having social support, for example, can facilitate a feeling of relief and release of inner tension. In other words, emotional crying does not always make people feel better. That said, the majority of participants in the study did report being in a better mood after crying.

Further research has suggested why we feel better after crying. Studies have shown that emotional tears contain higher levels of stress hormones than 'reflex tears' (those produced to flush out the eye when it is irritated) and 'basal tears' (those produced to protect the eye and keep it moist). Whether a person is experiencing a victory or a crisis, the body produces more stress hormones. Emotional tears balance our stress levels by releasing excess stress hormones, such as cortisol and corticotropin. Emotional tears also contain more manganese than reflex or basal tears, an element which affects our temper; releasing it can improve our mood.

While Bylsma’s study suggests that crying can be cathartic in certain social contexts and for most people, some scientists have proposed a more over-arching and complex theory which seeks to explain, in evolutionary terms, why humans cry. An evolutionary explanation of emotional tears says that they are a form of non-verbal communication. They are messages that contain a request for help and humans have evolved the ability to cry emotionally because these messages have proved to be effective at eliciting altruistic behaviour from others.

A more detailed version of this theory is based on the assumption that in many species of animals, the appearance and behaviour of a newborn can elicit parental care. Since newborns have no verbal means of communication, they have to find other ways to elicit care from their parents. Parents who respond in the appropriate way to this non-verbal communication will increase the fitness of the child, increasing the likelihood of this newborn behaviour (and the parental response to it) being favoured by natural selection. In some species, older offspring have been observed imitating the behaviour of newborns in order to elicit the same response from their parents, thereby increasing their chances of survival. Thus, the crying behaviour of adult humans is favoured by natural selection because it resembles the crying behaviour of newborns and so induces helping behaviour.

The resemblances between adult crying and newborn crying are pretty clear: the wetting of the face with tears, the jerking respiration, screaming, the closed and wrinkled eyes, and the open mouth. Humans evolved to display these behaviours and changes in physical appearance in order to receive assistance from others. A different evolutionary theory of emotional crying has emerged, although in some ways it is similar to the one just described. Oren Hasson, an evolutionary biologist from Tel Aviv University, says that by blurring vision, tears lower defences and reliably function as a signal of submission. They can also function as a cry for help or to encourage group cohesion.

Hasson argues that by blurring your vision with tears, an attacker will recognise that you are expressing submissiveness and therefore be more likely to show mercy towards you. Moreover, this display of vulnerability could attract sympathy from others and perhaps gain their strategic assistance. And if people cry in a group, this allows us to bond over the fact that we have all lowered our defences, fostering a sense of mutual trust and a recognition of our shared emotions. Hasson does point out, however, that this evolutionary behaviour is only effective in certain contexts – it is unlikely to be effective at work, for example, where one’s emotions should be hidden. This is a very original theory which postulates that crying evolved in order to handicap the individual. This handicap then elicits mercy from an attacker or sympathy from bystanders witnessing the attack, or it can increase group cohesion. Emotional tears most likely evolved as an advantageous form of non-verbal communication. 

Monday, 30 June 2014

New Study Finds a Vegan Diet Can Significantly Reduce Your Carbon Footprint

Livestock are a major contributor to greenhouse gas (GHG) emissions – a 2010 report by the United Nations Food and Agriculture Organisation (FAO) found that methane accounts for most of the GHGs emitted due to modern agriculture. And the largest source of this methane is cattle. Livestock contribute to GHG emissions in other ways as well. Manure deposited and left on pastures is a major source of nitrous oxide emissions – emissions from manure in Asia, Africa and South America account for 81% of global nitrous oxide emissions.

Deforestation in South America is a leading cause of carbon dioxide emissions. The Amazon rainforest and other forests of South America are being cleared in order to grow soybeans, not to meet the demand for tofu and soy milk, but to feed livestock. Only a small portion of soy is consumed directly by humans – most of it ends up as feed for pigs, chickens, cows and even industrially farmed fish. Most people are therefore consuming soy indirectly. However, carbon dioxide emissions are only part of the problem of deforestation. In order to meet the global demand for animal products, 4 million hectares of South American forest are destroyed every year, and eliminating fragile ecosystems at this rate threatens wildlife, biodiversity, indigenous people, water reserves and soil quality.

A study by WWF Germany (you will need to translate the page into English) found that the average German’s annual meat consumption requires 1,030 square metres of land to meet this demand. Germans eat roughly the same amount of potatoes as they do meat, yet only 15 square metres of land is needed to meet the demand for potatoes. With the rising global population, it is simply not sustainable to meet these demands for animal products.
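To put those WWF figures in perspective, here is a quick back-of-the-envelope calculation using only the numbers quoted above (a rough illustration of the arithmetic, not part of the WWF study itself):

```python
# Land needed per German consumer per year, using the WWF Germany
# figures quoted above (illustrative arithmetic only).
meat_land_m2 = 1030    # square metres of land for a year's meat
potato_land_m2 = 15    # square metres of land for a year's potatoes

ratio = meat_land_m2 / potato_land_m2
print(f"Meat needs roughly {ratio:.0f} times more land than potatoes.")
# -> Meat needs roughly 69 times more land than potatoes.
```

In other words, meeting one person’s demand for meat takes roughly seventy times as much land as meeting their demand for potatoes.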

The United Nations and WWF advise people to eat less meat in order to curb global warming. This would be an effective and reliable way to significantly reduce your carbon footprint; however, given the massive difference in GHG emissions and ecological destruction between livestock and plant-based foods, surely it would be better to cut out animal products completely? Indeed, this is one of the main arguments in favour of adopting a vegan diet, though the more popular motivations are ethical or health-related.

A new study published in the journal Climatic Change has confirmed this reasoning behind the vegan diet. The authors of the study compared GHG emissions attributable to more than 55,000 meat-eaters, fish-eaters, vegetarians and vegans in the UK. The researchers found that the dietary GHG emissions of meat-eaters were twice as high as those of vegans.

These high emissions are caused not only by methane from cows, nitrous oxide from manure and carbon dioxide from deforestation, but also from the production, transportation and storage of animal products. There are high carbon dioxide emissions related to fossil fuels used to power factory farm machinery, for example. The report concluded that "reducing the intake of meat and other animal based products can make a valuable contribution to climate change mitigation," although, as was pointed out earlier, eliminating your intake of animal-based products is the most significant dietary way to reduce your carbon footprint. As much as bags for life should be encouraged, people should really pay closer attention to the food that goes in that bag. Your carbon footprint would be much less if you used plastic bags all the time, yet never bought animal products.

This new study adds to the body of evidence suggesting that a plant-based diet is the most eco-friendly diet. Another recent study found that “…reducing meat and dairy consumption is key to bringing agricultural climate pollution down to safe levels,” according to Fredrik Hedenus, one of the authors of the study. It logically follows from this that eliminating meat and dairy products will allow us to reach these safe levels much more quickly.

The co-author of the study, Stefan Wirsenius, says emissions can be somewhat reduced by more efficient meat and dairy production, but this reduction is relatively small compared to what could be achieved by dietary change. As the global population soars, more efficient agricultural methods will simply not be able to resolve the climate pollution associated with the high demand for meat and dairy products. It is estimated that by 2050 beef and lamb will account for 50% of global GHG emissions, whilst providing only 3% of our calorie intake. It is highly inefficient to use all of this energy and food to raise livestock so that we can eat them later, rather than using these resources to feed ourselves.