2016 Voiceless People’s Choice Awards: Animal Advocacy Training Workshop Series

I am happy to say that my application to the grants scheme run by Voiceless: the animal protection institute has been shortlisted for the 2016 Voiceless People’s Choice Awards. The application outlined a project to develop a series of ‘Animal Advocacy Training’ workshops to help people become confident and comfortable animal advocates.


Lots of people would like to advocate for animals but don’t know where to start or don’t feel confident or comfortable being an advocate. This is a problem because each one of these people could be helping create a better world for animals. To build a stronger movement working for animals it’s important that there are opportunities for these people to develop the skills and knowledge necessary to become active animal advocates.

Working with others, I want to create and deliver a series of six workshops that will:

  1. Support people to become active animal advocates and to improve their advocacy through the sharing of skills & knowledge and by providing them with tools & experiences that make them comfortable with, and confident about, conducting their own advocacy.
  2. Build connections within the animal advocacy movement and create a strong network of people who continue to share advocacy skills, knowledge and experiences.
  3. Increase the number of individuals who are actively working for the freedom of animals.

The workshop series will use an active and self-directed learning approach to ensure attendees are engaged and work on tasks that are directly related to their passions and animal advocacy goals. The workshops will centre on the development of a self-directed ‘animal advocacy project’. Attendees will work on their project during the course of the workshop series and will be encouraged to practise what they are learning between workshops. By the end of the workshop series attendees will have developed an animal advocacy project that they have the tools, confidence and network to implement. Workshop themes will include:

  1. Identifying my place in animal advocacy.
  2. Approaches to advocacy.
  3. Animal advocacy and the Law.
  4. Strategy and Planning.
  5. Connecting movements: Animal, environmental and social advocacy.
  6. Wellbeing and self-care as an advocate.

Open & Free

The face-to-face workshop series will initially run in Melbourne and be used to create and refine a set of learning resources and activities to help people become animal advocates. All materials will then be turned into a freely available online course to help people anywhere in the world become active animal advocates.

Research

During the workshops I will conduct educational research to identify which learning opportunities are most useful. This research will help to evaluate the effectiveness of the workshop series and inform improvements to future advocate training.

Please Vote

Please consider voting for my project in the 2016 Voiceless People’s Choice Awards. That said, I am sure all five shortlisted projects are excellent, so vote for whichever one you think will have the greatest benefit for the animals! Please remember you only have one vote, so make it count. Voting is open from 12pm 19th September to 12pm 30th September 2016.

Until all are free!

Why the masks? A reflection on Anonymous for the Voiceless

Anonymous for the Voiceless is a crew of animal activists in Melbourne raising awareness about the ‘meat’, dairy and egg industries. Every Saturday for several months they have been in Melbourne’s CBD showing people what happens to individuals in the animal agriculture industry. To do this they use video footage largely shot on Australian farms (courtesy of Aussie Farms, PETA, and other groups).

In animal activism the use of footage showing industry practices is pretty standard. Anonymous for the Voiceless have added a unique aesthetic style and meaning to this method through the use of Anonymous symbolism. I think it’s an interesting form of awareness raising and have joined the group several times to help out.

Somewhat surprisingly, I have heard the method and the aesthetic criticised by people both outside and within the movement. The criticisms commonly include:

  • Showing violent video footage in the streets is possibly damaging to potential viewers, unnecessary and too aggressive; it’s nothing but a shock tactic;
  • Using the Anonymous masks seems cowardly, like the activists are hiding behind them; and,
  • It’s irresponsible to show the footage in such a way because kids can walk past and see it.

As someone who has spent time with the Anonymous for the Voiceless crew, I’d like to offer my perspective on these criticisms. I share my thoughts not in an attempt to shut down criticism (I think constructive critique of our movement is vital in order to do a better job for animals!), but rather to provide a personal account of what I find good about this method of activism.

Critique 1: Showing violent video footage in the streets is too aggressive.

Response

  • Many people have no idea what happens to others to satisfy their food choices, and the Anonymous for the Voiceless demonstration helps create awareness. It is an opportunity for people to be exposed to footage that they would otherwise never see. There is no cajoling or goading to make people stop and watch; they are free to do as they please. They have the choice to watch or walk away. Many simply walk past without a second glance; others stop and watch for 5-10 minutes.

Critique 2: Why wear the masks?

Response

  • There are people handling public engagement and outreach who are not wearing masks. The masks are symbolic and aesthetic. They represent the thousands of anonymous people who are standing against the violence and oppression of non-human animals in factory farming. We stand there not only as ourselves but as a movement against the atrocities perpetrated against non-human animals.
  • The masks make it more comfortable for the public to stay and watch the videos for longer periods of time. They also hide the activists’ facial reactions, and any judgement viewers might otherwise feel. The activist becomes an impersonal symbol and prop, rather than someone judging others.
  • Many vegans are intimidated out of involvement in advocacy by the aggression and negativity others express when they speak about the abuse of animals. The mask provides a barrier that protects volunteers from this negativity and helps them become comfortable and confident in their advocacy. Some people in the group who had never done advocacy before started behind an Anonymous mask and now feel comfortable interacting with the public without it. That is a win in my books!
  • When 8-12 people are all standing in a square with masks on it creates a fantastic aesthetic and spectacle. Many people come from a long way down the street to see what’s going on only to stay, watch and engage with the footage.
  • Another wonderful benefit of the Anonymous symbolism is that it has helped create a connection with the wider Anonymous community. A group of Anonymous activists who support many justice issues have been introduced to animal rights and now include animal advocacy in their repertoire of causes.

Critique 3: It’s irresponsible to show kids this footage.

Response

  • Anonymous for the Voiceless demonstrates in public space and it is up to parents to monitor what their children see. Many families come up and discuss the footage with their kids; others walk straight past. It is their choice. No one is ever forced to watch the videos. That said, my observations suggest many kids get the message better than adults do. They often ask their parents questions about what they see, and I will always remember the gentle words of a small boy who said ‘sorry pigs’ as he was moved along by his dad.
  • It is arrogant to think that younger people are not capable of thinking about and comprehending this issue. The abuse being shown in the footage is what most children will be contributing to because of their parents’ choices. Many parents are forcing their children to become complicit in the mass abuse, torture, oppression and killing of other individuals. If a child sees something they disagree with and then makes a choice not to be involved in it, we should respect that choice. I know many vegans who, as children, learnt where animal flesh comes from and decided to stop eating it, some from as young as five!

From behind the mask I have seen many sharp intakes of breath, half-turned heads and pained looks of conflict as people recognised that the footage they were viewing is something they contribute to. Most encouraging is when someone makes the decision to go vegan after seeing the demonstration. That’s what makes it worthwhile for me.


Presenting at the International Critical Animal Studies Oceania Conference 2016

On the 1st of October 2016, I will present at the ICAS Oceania Conference being held at the University of Canberra, Australia. My accepted abstract was titled ‘Diversity in animal activism: Preparing for impact opportunities for the next 10 years’.

During the presentation I hope to begin a discussion about the idea of Vegan Futures. Futures Studies is part art, part science, and is concerned with postulating possible futures and planning for desirable futures. Vegan Futures would be concerned with postulating and planning for a vegan future.

In particular, I will discuss the role of animal activism in the context of Vegan Futures. For instance, if we take a Vegan Futures approach can we highlight gaps in our current repertoire of vegan activist methods and strategies?

You can check out the abstract below.

“The near future promises a range of technologies that could be highly disruptive to established animal industries, for example Perfect Day milk and SuperMeat. With the right support from animal activists this disruption could accelerate the end of animal industries and potentially save millions of lives.  Unfortunately, the lack of diversity of advocacy strategies utilised by the movement means that this potential is unlikely to be taken advantage of. While there are a wide range of strategies employed by the movement, all levels spend a large proportion of time on educational outreach explicitly intended for individuals. Educating individuals is necessary but it largely fails to address social and structural factors that reinforce the use of non-human animals. This strategy has also become the standard form of animal activism, arguably to the detriment of other strategies. Recent examples from the New South Wales greyhound industry and Australian dairy industry exemplify the complexity of achieving industry change, with exposures of abuse in both industries being received differently by the public and each industry experiencing vastly different outcomes. By critically reflecting on these cases it was possible to find gaps in current animal advocacy approaches and identify additional strategies for reducing barriers to change. Identified strategies for the Australian animal activism context include, a targeted approach to educational outreach, community integrated support networks that facilitate transition, and research identifying alternatives to animal industries for dependent communities. Increasing the diversity of strategies employed by the movement will make it capable of adapting to opportunities for impact. If the movement wants to take advantage of the coming disruptions it must forecast what is required and begin to build the capacity now.”

ICAS Oceania Conference: http://www.criticalanimalstudies.org/oceania-conference/

 

A great outcome for a plant-based patient with Parkinsonism

A VeganSci Post 


First off, ‘Parkinsonism’ is a set of symptoms of which Parkinson’s disease is one cause but not the only cause (you learn something new every day!). So when recounting this interesting little story to your friends please make sure you’ve got the distinction right. Remember, accuracy is important if you don’t want to come across as an over-reaching, uninformed dingus who can get refuted really easily. Now that that is out of the way…

A very interesting case study reported that a 64-year-old man had an incredibly positive improvement in Parkinsonism after adopting a plant-based diet. He was diagnosed with Parkinsonism at age 55, having developed bradykinesia (slowness of movement), bilateral rigidity, start hesitation, and sudden transient freezing. Over time he experienced constipation, anxiety, orthostatic hypotension, and gait freezing, all of which were difficult to treat with drugs because of a complex medical history.

The man started a protein-restricted diet, which produced slight improvements in symptoms. Two months later he switched to a plant-based diet and quickly saw significant improvements in his gait and motor symptoms. The man now enjoys running and ice skating, which were basically impossible before (and at the age of 64 certainly puts me to shame).

The authors point to a few possible reasons for the improvements made while on a plant-based diet, including its protein-sparing nature, its fibre richness, and possible improved bioavailability of the levodopa drug treatment due to better bowel motility, among other things. They also suggest that a plant-based diet may be beneficial for people diagnosed with Parkinson’s disease or related conditions.

Now, before you get all uppity about the sample size of one I want to strongly acknowledge that we can’t draw any conclusions about Parkinsonism and plant-based diets from this case study. However, case studies can provide hints at interesting areas for research. Plus, I thought this was a pretty cool story and a great outcome.

There also seem to be a few other case studies out there that point to similar results, see:

  1. Diets, food and Idiopathic Parkinson’s disease
  2. Pilot dietary study with normoproteic protein-redistributed plant-food diet and motor performance in patients with Parkinson’s disease

It’s important to note that these case studies suggest some positive benefit of a plant-based diet for treating symptoms that some patients with Parkinsonism experience. They DO NOT suggest that a plant-based diet helps prevent Parkinsonism or Parkinson’s disease. So please don’t make that leap when you’re trying to convince your friend how awesome veganism is. There is also a lot of research left to do in this space before we draw any conclusions.

Finally, DON’T take this as medical advice! Don’t convince your friends or family members to go vegan to treat their Parkinsonism/Parkinson’s disease. Convince them to go vegan for the animals, and to seek advice from a health professional regarding their Parkinsonism.

 

NB: You may notice that the case study uses the term ‘vegan diet’ while I use ‘plant-based diet’. I believe the latter is more accurate because veganism involves more than just a dietary change; it is a lived ethic.

 

Title: Dramatic response of parkinsonism to a vegan diet: Case report
Authors: Roger Kurlan, Rajesh Kumari and Ivana Ganihong
Journal: Journal of Parkinson’s disease & Alzheimer’s disease

——————————–

If you can’t access the paper try emailing the corresponding author directly and asking nicely for a copy. Most people will be more than happy to share their research with you. Alternatively, get in touch with me and I can help you out.

Teaching Diary #1: Preparing for effective learning

During the 13th and 14th of Jan 2016, I demonstrated in two practical (laboratory) classes for a first year biology unit, Cells and Genes. We were working through some examples of ‘Patterns of Inheritance’, which can be tricky stuff when first being introduced to the ideas. I was feeling pretty comfortable with my ability to teach the prac because I had done it several times over the last few years. However, there were a few differences that helped me learn some valuable lessons.


Multi-coloured corn kernels representing different genotypes and phenotypes.

The most important lesson was that for effective learning to occur ‘both teacher and students need to be prepared for the lesson’.

For me (the teacher), I was filling in for a friend who was away, and because I had taken the prac before I thought it would be a walk in the park. It wasn’t! I didn’t receive the prac notes, or the questions that the students needed to answer, until a couple of hours before the class started, and never got a version with the answers. While not having an answer sheet wasn’t a major problem for the questions based on theory, the prac required students to count flies and record their traits so that they could look at inheritance patterns over three generations. The numbers they got in the first and second generations affected all the following questions, so they needed to be pretty accurate. The problem was that recording the correct fly traits was difficult.

After my first prac I recognised that much of the difficulty students were having was because they lacked some fundamental knowledge about the science we were doing. This included knowing the following:

  • What chromosomes, genes and alleles are and what happens during meiosis.
  • Terminology such as genotype and phenotype.

Many of the students also seemed to have real difficulty making the conceptual link between a representation like ‘Rr’ and what it actually means biologically, i.e. a dominant purple allele paired with a recessive yellow allele. Some students found it very hard to understand that this is a representation of an individual’s alleles and lets you predict what colour they will be.

Some students also seemed to lack the confidence to attempt an answer even if they knew what the answer was.

I found that after my first session I wanted to try and address a few of these issues, so during the second prac I tried to introduce the concepts of chromosomes, genes and alleles and how they relate to each other. I also tried to relate the parents of a Punnett square to the actual individuals that we were theoretically breeding.
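As a quick aside, the mechanics of a Punnett square are simple enough to sketch in a few lines of R. This is my own illustration, not part of the prac material: it just pairs every allele from one parent with every allele from the other.

# Cross two hypothetical heterozygous parents (Rr x Rr).
parent1 <- c("R", "r")  # R = dominant purple allele, r = recessive yellow
parent2 <- c("R", "r")
punnett <- outer(parent1, parent2, paste0)  # every pairwise allele combination
dimnames(punnett) <- list(parent1, parent2)
punnett
##   R    r
## R "RR" "Rr"
## r "rR" "rr"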

I also recognised that students needed to want to engage with what we were discussing rather than simply going through the motions and waiting to be given an answer. Those students who were really engaging with what we were doing grasped the concepts better than those that were disengaged.

If I ever taught this topic again in the same setting I would probably include more of an introduction to the fundamental ideas behind inheritance, and make the ‘breeding’ more explicitly about crossing actual individuals rather than abstract representations of individuals.

In future practical and tutorial classes I will begin by trying to establish an understanding of students’ incoming knowledge and work from where each student is beginning.

Is Abolitionist Bioethics Oppressive?

I recently came across an interesting talk by philosopher David Pearce at the Effective Altruism (EA) Global Melbourne 2015 conference. I couldn’t go past the great name ‘Abolitionist Bioethics’.

David discusses The Hedonistic Imperative which, in a nutshell, is the idea that we can use biotechnology to end suffering and promote happiness; in particular, that we can use genetic engineering to remove the biological capacity to suffer.

This idea isn’t confined to ending suffering in humans; it extends to all living individuals that have the capacity to suffer. It’s an interesting idea, but I couldn’t help thinking that it is also problematic because it assumes that humans know what is best for other species and are justified in interfering with their lives.

It seems to be based in human-centric thinking that rationalises the imposition of human will on others. Non-human animals wouldn’t be given a choice to have their biology interfered with; it would just be assumed that we are doing the right thing. To me this seems oppressive.

In an attempt to reduce suffering we would be violating others’ right to self-determination, and potentially causing a form of suffering that is not physical or emotional but based in the understanding that rights have been violated.*

The talk really highlights for me how ingrained speciesism is within our culture. A philosopher who spends a lot of time thinking about how to improve the lives of all animals also assumes that it is OK to impose human ideology upon other individuals. It perpetuates the idea of human superiority, where humans can dominate nature and decide the fate of others.

Don’t get me wrong, I am definitely for reducing suffering but think that humans should focus on ending the immense amount of suffering we cause and leave others to live their lives without interference.


Caveat: These thoughts are based solely on the talk ‘Abolitionist Bioethics’. I intend to look further into the ideas outlined by David and hope to find some answers to the questions I have. I just wanted to write this stuff down before I forgot it!

A few other thoughts I had while watching the talk were:

  • If given the choice, Abolitionist Bioethics could be viable for humans.
  • Our understanding of genetics, gene expression and epigenetics is currently so limited that this approach becoming viable is likely to be a long way off.
  • Not all suffering is bad; in fact, some suffering makes individuals more resilient.

A few questions I had were:

  • Does this idea only promote the removal of the ability to suffer, rather than ending events that are harmful? For instance, a rabbit’s ability to experience pain is removed but the fox will still hunt the rabbit. So the rabbit dies but doesn’t suffer when dying.
  • Is the idea of ending all harmful events simply another form of human superiority, where we get to dominate the world?
  • What unintended consequences could the removal of the function of suffering have metabolically, socially and ecologically?
  • *Can an individual who doesn’t have the capacity to suffer physically or emotionally still recognise injustice?

R Fun! – Text Mining to Create Vocabulary Lists

Use R to scrape and mine text from the web to create personalised discipline specific vocabulary lists!

I love playing with R and I have recently learnt how to scrape and text mine websites. I am going to provide a short tutorial on how to do this using an example I hope you find useful.

Learning the jargon of a new topic that you're interested in can significantly increase your comprehension of the subject matter, so it can be important to spend some time getting to know the lingo. But how can you work out the most important words in the area? You could find lists of key words, but these may only identify words that people within the field think you need to learn. Another way is to create a vocabulary list by identifying the most common words across several texts on the topic. This is what we will be doing.

First of all you will need a topic. I will be using the topic of nutrigenomics because Jess (my wife) has recently become interested in learning about the interaction between nutrition and the genome. Now that we have a topic, we will follow this process to create our vocabulary list:

  • Find the documents that you will use to build your vocabulary list.
  • Scrape the text from the website.
  • Clean up the text to get rid of useless information.
  • Identify the most common words across the texts.

Finding the Documents

I am going to use PLOS ONE to find papers on nutrigenomics because it is open access and I will be able to retrieve the information I want. I start by searching PLOS ONE for 'nutrigenomics', which finds 192 matches as of 22/08/2015. Each match is listed by the paper name, which contains a hyperlink to the URL for the full paper with the text we are interested in. In R we will use URLs to find each page we are interested in and scrape its text. In order to scrape the text from every paper we will need to retrieve the corresponding URL for each paper. To do this we will use the magic of the rvest package, which allows you to specify particular elements of a website to scrape; in this case we will be scraping the URL links associated with the heading of each paper returned in our PLOS ONE search. So let's get started!


First take note of the URL from your PLOS ONE search. In my case it is: http://journals.plos.org/plosone/search?q=nutrigenomics&filterJournals=PLoSONE. As I mentioned earlier there are 192 results associated with this search, but they don't all show up on the same page. However, if I go to the bottom of the page and select 30 results per page, the URL changes to specify the number of results per page. We can use this to our advantage and change the number from 30 to 192, which gets the whole list of papers, and more importantly all their associated URLs, on one page, e.g. http://journals.plos.org/plosone/search?q=nutrigenomics&sortOrder=RELEVANCE&filterJournals=PLoSONE&resultsPerPage=192. We are going to use this URL to find the URLs for all of our papers.
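As a small convenience (my own addition, not part of the original workflow), the same URL can be built from its parts in R, which makes the query and page size easy to change later:

# Build the PLOS ONE search URL from its query parameters.
searchTerm <- "nutrigenomics"
searchURL  <- paste0("http://journals.plos.org/plosone/search",
                     "?q=", searchTerm,
                     "&sortOrder=RELEVANCE",
                     "&filterJournals=PLoSONE",
                     "&resultsPerPage=192")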

First we will open R and install and load the packages that we require to create our vocabulary list. The workhorse here is rvest.

install.packages("rvest")
install.packages("tm")
install.packages("SnowballC")
install.packages("stringr")
install.packages("wordcloud")
library(stringr) 
library(tm) 
library(SnowballC) 
library(rvest)

Now we can create a vector which contains the HTML for the PLOS ONE nutrigenomics search, with all returned papers on the same page. This literally pulls down the HTML code from the web address that you pass to the html() function. (Note: in newer versions of rvest, html() has been superseded by read_html().)

paperList <- html("http://journals.plos.org/plosone/search?q=nutrigenomics&sortOrder=RELEVANCE&filterJournals=PLoSONE&resultsPerPage=192")

Using this HTML code, we can now locate the URLs associated with each paper title with the special rvest function html_nodes(). This function uses CSS or XPath syntax to identify specific locations within the structure of an HTML document. So to pull out the URLs we are after, we will need to determine the path to them. This can easily be done in the Google Chrome web browser using the 'inspect element' functionality (other browsers have similar tools).

In Google Chrome, go to the list of papers in the PLOS ONE search page, right click on one of the paper titles and select 'inspect element'. This will split your window and show you the HTML for the webpage. In the HTML viewer the code for the specific element that you clicked on will be highlighted; this is what you want. You can right click this highlighted section and select 'copy css path' or 'copy xpath' to get the specific location of that node for use in html_nodes(). However, we want to specify every URL associated with a paper title in the document, so we need a path that contains common elements for every location we are interested in. Luckily, CSS path and XPath syntax can specify multiple locations if they contain the same identifying elements. By looking at the HTML with Chrome's inspect element we can see that the URLs we are interested in are identified by class="search-results-title" and contained in the href attribute of the enclosed link. These two elements are common to each of our papers but will not match href attributes for links elsewhere on the page.

[Screenshot: Chrome's inspect element view highlighting the class="search-results-title" node]

The code to retrieve the URLs occurs in three parts: first we take our parsed HTML document, then we specify the locations we are interested in with html_nodes(), and finally we indicate what we want to retrieve. In this case we will be retrieving an HTML attribute using the function html_attr().

paperURLs <- paperList %>%
             html_nodes(xpath="//*[@class='search-results-title']/a") %>%
             html_attr("href")

This returns a list of 192 URLs that specify the locations of the papers we are interested in.

head(paperURLs)
## [1] "/plosone/article?id=10.1371/journal.pone.0001681"
## [2] "/plosone/article?id=10.1371/journal.pone.0082825"
## [3] "/plosone/article?id=10.1371/journal.pone.0060881"
## [4] "/plosone/article?id=10.1371/journal.pone.0026669"
## [5] "/plosone/article?id=10.1371/journal.pone.0110614"
## [6] "/plosone/article?id=10.1371/journal.pone.0112665"
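As an aside, if you prefer CSS selectors to XPath, the same extraction should (I believe, assuming the class structure above) work with an equivalent selector:

# Equivalent CSS selector form of the XPath query.
paperURLs <- paperList %>%
             html_nodes(".search-results-title a") %>%
             html_attr("href")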

If you look closely you will notice that the URLs are missing the beginning of a proper web address, so using them as-is will result in a retrieval error. To fix this we will add the start of the address with paste(). Here we are simply pasting the string http://journals.plos.org onto the beginning of each of our paperURLs, separating the two strings with nothing (sep = "").

paperURLs <- paste("http://journals.plos.org", paperURLs, sep = "")

# Check it out
head(paperURLs)
## [1] "http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0001681"
## [2] "http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0082825"
## [3] "http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0060881"
## [4] "http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0026669"
## [5] "http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0110614"
## [6] "http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0112665"

As you can see, we now have complete URLs. Try copy/pasting one into your browser to make sure it works.

Scraping the Text

We can scrape the text from these papers, using the URLs we have just extracted. We will do this by pulling down each paper in its HTML format.

Using the URLs we extracted in the previous step, we will pull down the HTML file for each of the 192 papers. We will use sapply() to do this, which is a looping function that allows us to run html() on every item within a list. This step pulls a large amount of information from the web, so it might take a few minutes to run.

paper_html <- sapply(1:length(paperURLs),
                     function(x) html(paperURLs[x]))
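If you are worried about hitting the PLOS servers too quickly, a gentler variant (my own suggestion, not in the original post) pauses briefly between requests:

# Download each paper with a one-second pause between requests.
paper_html <- lapply(paperURLs, function(u) {
  Sys.sleep(1)  # be polite to the server
  html(u)
})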

Now we can extract the text from all of these HTML files. Using the inspect element functionality of Google Chrome, we have determined that the content of the articles is found within class="article-content". We use html_text() to extract only text from the HTML documents and trim off any white space with the stringr function str_trim(). Because we have a list of 192 HTML documents, we will iterate over each document using the awesome sapply() function, where 1:length(paper_html) simply says to run the function for x equals 1 through 192.

paperText <- sapply(1:length(paper_html), function(x) paper_html[[x]] %>%  # [[x]] picks out the x-th paper, not the first one each time
                     html_nodes(xpath="//*[@class='article-content']") %>%
                     html_text() %>%
                     str_trim(.))

This results in a very large vector containing the text for each of the 192 papers we are interested in.

Cleaning the Text

Now that we have all of the text we are interested in, we can transform it into a format used for text mining and start to clean it up.

First we need the tm and SnowballC packages (loaded above). tm is used for text mining and SnowballC has some useful functions that will be explained later.

Now we will transform the text into a document corpus using the tm function Corpus(), specifying that the text comes from a VectorSource().

paperCorp <- Corpus(VectorSource(paperText))

Now we will remove any text elements that are not useful to us. This includes punctuation, numbers, and common words such as 'a', 'is' and 'the'.

First we will remove any special characters that we might find in the document. To determine what these are, take some time to look at one of the paperText elements.

# Check it out by running the following code.
paperText[[1]]

Now that we have identified the special characters that we want to get rid of, we can remove them using the following loop.

for (j in seq(paperCorp)) {
  # Replace colons, newlines and hyphens with spaces.
  # (In tm 0.6+ this direct assignment no longer works; wrap the
  # transformation in content_transformer() instead.)
  paperCorp[[j]] <- gsub(":", " ", paperCorp[[j]])
  paperCorp[[j]] <- gsub("\n", " ", paperCorp[[j]])
  paperCorp[[j]] <- gsub("-", " ", paperCorp[[j]])
}

The tm package has several built-in functions to remove common elements from text, which are rather self-explanatory given their names.

paperCorp <- tm_map(paperCorp, removePunctuation)
paperCorp <- tm_map(paperCorp, removeNumbers)

It is really important to run tolower in tm_map(), which converts all characters to lower case. (NOTE: I didn't do this in the beginning and it caused me trouble when I tried to remove specific words in later steps. Thanks to phiver on stackoverflow for helping fix this problem for me!). We will also remove commonly used English words, using removeWords with stopwords("english").

paperCorp <- tm_map(paperCorp, tolower)
paperCorp <- tm_map(paperCorp, removeWords, stopwords("english"))
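(A note if you are running this with tm version 0.6 or later: plain functions like tolower need to be wrapped in content_transformer(), otherwise the corpus structure gets broken.)

# tm 0.6+ equivalent of the tolower step above.
paperCorp <- tm_map(paperCorp, content_transformer(tolower))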

We also want to remove the common endings of English words, such as 'ing', 'es' and 's'. This is referred to as 'stemming' and is done with a function from the SnowballC package.

paperCorp <- tm_map(paperCorp, stemDocument)
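To see what stemming actually does to individual words, you can call SnowballC's wordStem() directly (my own illustration):

# Porter stemming chops words back to a common root.
wordStem(c("running", "genes", "studies"))
## [1] "run"   "gene"  "studi"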

To make sure none of our filtering has left any annoying white space, we will strip it out.

paperCorp <- tm_map(paperCorp, stripWhitespace)

If you have a look at the document now, you can see that it is very different from when you started.

paperCorp[[1]]

Now we tell R to treat the processed documents as text documents.

paperCorpPTD <- tm_map(paperCorp, PlainTextDocument)

Finally we use this plain text document to create a document term matrix. This is a large matrix that records how often each term occurs in each document, and it is what we will use to look at the details of our documents.

dtm <- DocumentTermMatrix(paperCorpPTD)
dtm
## <<DocumentTermMatrix (documents: 192, terms: 1684)>>
## Non-/sparse entries: 323328/0
## Sparsity           : 0%
## Maximal term length: 27
## Weighting          : term frequency (tf)
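If you want to peek inside the matrix itself, tm's inspect() will print a slice (the row and column ranges below are arbitrary):

# Show the counts for the first three documents and first five terms.
inspect(dtm[1:3, 1:5])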

We are close, but there's still one cleaning step to do. There will be words that occur commonly in our documents that we aren't interested in. We will want to remove these words, but first we need to identify what they are. To do this we will find the frequent terms in the document term matrix: we can calculate the frequency of each term and then create a data.frame ordered from most frequent to least frequent. We can look through the most common terms in the data.frame and remove those that we aren't interested in. First we will calculate the frequency of each term.

termFreq <- colSums(as.matrix(dtm))

# Have a look at it.
head(termFreq)
##       able  abolished    absence absorption   abstract       acad 
##        192        192        192       1344        192        960
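(tm also provides findFreqTerms() as a quick shortcut for this kind of check; the cutoff of 1000 below is arbitrary.)

# List every term that appears at least 1000 times across the corpus.
findFreqTerms(dtm, lowfreq = 1000)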

Now we will create a dataframe and order it by term frequency.

tf <- data.frame(term = names(termFreq), freq = termFreq)
tf <- tf[order(-tf[,2]),]

# Have a look at it.
head(tf)
##            term  freq
## fatty     fatty 29568
## pparα     pparα 23232
## acids     acids 22848
## gene       gene 15360
## dietary dietary 12864
## article article 12288

As we can see, there are a number of terms that are simply a product of the text being scraped from a website, e.g. 'google', 'article', etc. Now go through the list and make note of all the terms that aren't important to you. Once you have a list, remove those words from the paperCorp document.

paperCorp <- tm_map(paperCorp, removeWords, c("also", "article", "analysis",
                                      "download", "google", "figure",
                                      "fig", "groups", "however",
                                      "high", "human", "levels",
                                      "larger", "may", "number",
                                      "shown", "study", "studies", "this",
                                      "using", "two", "the", "scholar",
                                      "pubmedncbi", "view", "the", "biol",
                                      "via", "image", "doi", "one"
                                      ))

There will also be particular terms that should occur together but which end up being split apart in the term matrix. We will replace these terms so they occur together.

for (j in seq(paperCorp)) {
  # Join multi-word terms with an underscore so they count as one token.
  paperCorp[[j]] <- gsub("fatty acid", "fatty_acid", paperCorp[[j]])
}

Now we have to recreate our document term matrix.

paperCorp <- tm_map(paperCorp, stripWhitespace)
paperCorpPTD <- tm_map(paperCorp, PlainTextDocument)
dtm <- DocumentTermMatrix(paperCorpPTD)
termFreq <- colSums(as.matrix(dtm))
tf <- data.frame(term = names(termFreq), freq = termFreq)
tf <- tf[order(-tf[,2]),]
head(tf)
##                    term  freq
## pparα             pparα 23232
## fatty_acids fatty_acids 22272
## gene               gene 15360
## dietary         dietary 12864
## expression   expression 10752
## genes             genes  9408

From this dataset we will create a word cloud of the most frequent terms, using the wordcloud package. The number of words displayed is determined by 'max.words'.

require(wordcloud)
wordcloud(tf$term, tf$freq, max.words = 100, rot.per = 0.2, colors = brewer.pal(5, "Dark2"))

[Figure: word cloud of the 100 most frequent terms in the nutrigenomics corpus]

You can use the tf data.frame to find the common terms that occur in your field and build a vocabulary list.
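For example, you could save the top 50 terms to a CSV file as a starter study list (the file name here is just a suggestion):

# Write the 50 most frequent terms to a CSV for later study.
write.csv(head(tf, 50), "nutrigenomics_vocab.csv", row.names = FALSE)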

By changing your search term in PLoS ONE you can create a vocabulary list for any scientific field you like.

That's it, have fun!!


If anyone has suggested changes to the code, questions or comments, please leave a reply below.

Compassionate Behaviour: a challenge to researchers of animal behaviour

It’s time for researchers of animal behaviour to develop ethical methods for studying animal behaviour, and move towards a framework of ‘Compassionate Behaviour’.

Behavioural research is perhaps one of the most fascinating areas of science because it explores our presence and interaction in the world. From communication to cognition and personality to pain, behavioural research captures our imagination and increasingly shows us that non-human species have behaviours that are as complex as our own.

Behaviour is studied in various ways, including relatively innocuous observational methods, where an investigator sits and watches a study subject for hours, days, weeks and sometimes years; think Jane Goodall watching chimpanzees. If investigators want to test a particular aspect of behaviour they might manipulate the natural environment and see how the study subject responds (e.g. a study showing bees have map-like spatial memory). However, there is a darker side to behavioural research that is not often discussed.


The Problem

There are invasive techniques in behavioural research which have a significant impact on the individuals being studied, including methods that manipulate the life history of individuals in the wild, as is done in brood-size manipulations, or that trap wild individuals and bring them into a laboratory. Of the individuals brought into laboratory conditions, many will be killed once the research is completed.

Finally, some behavioural studies will breed individuals for the sole purpose of conducting behavioural research, in which case they will live and die in the laboratory.

Ironically, behavioural research has helped us understand the incredible nature of non-human animals and has fostered movements that promote greater moral consideration for them (see 1, 2). We now recognise that many species have incredibly complex social systems, excellent cognitive skills and individual personalities, and can form friendships; that they are individuals who have an interest in self-determination and freedom from interference.


Unfortunately, our understanding of animals has not stopped us from purposefully subjecting them to tormenting conditions. In our pursuit of knowledge we have ignored the rights of non-human individuals to be free from unnecessary use.

This was particularly evident at Behaviour 2015, the 34th International Ethological Conference, held in Cairns (10th – 14th August 2015), where over 800 delegates from all over the world presented, discussed and planned all things behavioural science.

I was lucky enough to attend the conference and see dozens of talks on current animal behaviour research. A large number of talks involved studies that used captive individuals, and many ended with the study subjects being killed. Despite the considerable cost to the individuals being studied, I didn’t hear anyone question the ethics of what was being done. In fact, there seemed to be a generally accepted assumption in animal behaviour research that it’s OK to use and kill animals in the pursuit of knowledge.

I would like to challenge this assumption, and argue that it is unethical to use invasive methods during the course of animal behaviour research.

Our Faulty Logic for Using Animals

Animals are used in experimental research simply because they are not human. This is evident from the extensive limits placed on human research but not on non-human animal research. The distinction between human and non-human animals is based on species membership, whereby being a Homo sapiens grants one greater moral consideration. However, species membership is a morally irrelevant characteristic because it is based on a difference in physiology, and physiology shouldn’t matter when deciding how to treat an individual. For instance, whether someone is male or female, Brazilian or Chinese does not change how they should be treated morally; the randomness of being born a female in Zimbabwe does not make you any less worthy of ethical treatment than any other person in the world. Biology does not matter when deciding what is morally acceptable; therefore, taking away the freedoms of individuals based on their species is unethical (see 3 for a more extensive discussion of this idea).

Some people argue that certain human traits make humans superior to other species and justify treating non-human animals as we like. This thinking is problematic for several reasons. First, it is incredibly biased toward a human view of the world, because superiority is judged on exceptional human traits, e.g. intellectual capacity. If superiority were judged on the ability to fly, or swim, humans would be considered inferior to many other species. Second, behavioural research is finding that humans are not as exceptional as we once thought! An argument from superiority is illogical and does not give us cause to use non-human animals however we want.

The Challenge

I suggest that behavioural scientists look at the emergence of Compassionate Conservation as a guide to how ethics concerning the individual can be used in scientific research. Compassionate conservation seeks to promote the protection of wildlife as individuals and takes individual interests into consideration when planning management options. This is a huge change for a field where the wholesale killing of millions of individuals is considered OK just because they ‘don’t belong’ in an ecosystem, even when such interventions ultimately have little to no benefit.

We need a new field of Compassionate Behaviour which considers the interests of individuals in the pursuit of knowledge.