
A shameful confession … about Google Scholar

I have a confession to make. A shameful confession at that. I check myself out on Google Scholar on an almost daily basis.

Having got past the stage in my academic career where I worried about getting published, I’m now worried about whether anyone is actually reading anything I’ve written. Which is where Google Scholar comes in.

I set up my own citation tracking account a while ago now, something which many other academics seem to be doing as well (see picture below, including nice cheesy headshot). It can be quite helpful: it lets you see who is citing your work, what debates you are actually contributing to, who is writing on what topics, etc. It does, however, lead to a pernicious self-disciplining attitude to scholarship – as I have mentioned before – or maybe it’s just me. Worryingly, I find myself judging both my own worth and that of others on the basis of my and their citations – which are now all too easily accessible. The latter is, of course, the most egregious of these activities. I routinely, for example, Google Scholar (to use a non-existent verb) people I meet in order to see how I measure up against them; am I ahead or behind? That sort of thing.

Others have discussed the benefits of Google Scholar, and there are examples of similar kinds of software out there like Harzing’s Publish or Perish – a more apt name considering academia’s increasingly competitive nature. To me, what citation-counting practices entail are forms of positioning and validation which are sometimes difficult to ascertain early on in an academic career; the fact that this is derived from a dodgy and distorted metric is by the by. On the one hand, for example, considering that an H-Index of 5 or above in the social sciences seems pretty reasonable, this form of self-validation can be gratifying if you pass muster. On the other hand, I readily recognize citation-counting as a form of self-disciplining behaviour redolent of Foucault at his best. I am, by checking my citation count daily, buying into the idea that the only way my work has value is by judging it against a metric that only grows in importance because I put such weight in it – just like the invisible jailor in the Panopticon. There are clear links here between such self-disciplining behaviour and the new Impact Agenda in the UK – which I have previously made snarky comments about and which others have made much more coherent and insightful critiques of – as well as practices in other national contexts which I am currently co-writing about with several colleagues.
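For anyone unfamiliar with the metric itself: your H-Index is the largest number h such that h of your papers have been cited at least h times each. A minimal sketch in Python, using made-up citation counts purely for illustration (these are not my real numbers):

    def h_index(citations):
        # Sort citation counts from highest to lowest, then find the largest
        # rank h at which the h-th paper still has at least h citations.
        counts = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Ten hypothetical papers: five of them have 5 or more citations, so h = 5
    print(h_index([52, 20, 15, 9, 7, 4, 3, 1, 0, 0]))  # prints 5

So an H-Index of 5 simply means five papers cited at least five times each.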

But what has motivated my confession now? Well, I received an email (see picture below) from the academic publisher SAGE following the publication of an article (shameless self-promotion alert!).

SAGE emailed me to make several helpful suggestions about how to increase the “usage and citation of your article”, which included hints about blogging, Twitter, Facebook, social networking, etc., etc. All very helpful, but also very much reinforcing my citation-counting habits. To be reflective about this for one minute, and one minute only: I already do much of this anyway. Increasingly I think strategically about how to disseminate my work more widely so that it might actually get read by someone (e.g. I now routinely email people my articles when they come out). Why do I do this? Well, I tend to think that academia (in the humanities and social sciences at least) is a rather self-fulfilling environment in which the ideas of the moment are not necessarily the ‘best’ ideas (whatever that might mean), but are the ideas of the most networked, connected and already influential figures in a field. And I’ve bought into the idea that I need to be one of those influential figures.

Now, in order to try and come to some sort of rationale for this post – rather than a simple ramble through my psyche – I want to come back to how all of this might still be useful. For me, it is when we consider the influence of our work – e.g. who reads it, what debates it contributes to, and so on. Even though I think that scholars must do and write relevant research, I still do not know whether any of my work is actually relevant. This is where Google Scholar comes in, even if it is a very blunt tool. It enables me to look at my research and to see whether anyone has found it useful – that is, whether it has been cited. What citation data reveals in stark numbers is the difference between my own judgements of my work and the judgements of others.
To make sense of what I’m talking about, I am going to go through my journal articles and highlight the difference between my expectations and the actual citation counts. I’ve classified them according to their ‘performance’, for want of a better word, and will comment on each range rather than going through the articles individually.
Done well:
  • MacKinnon, D., Cumbers, A., Pike, A., Birch, K. and McMaster, R. (2009) Economic Geography 85(2): 129-150 [Symposium, Evolutionary Economic Geography].
This article has been cited more than any other of my publications, largely because it is the work of several more senior scholars and myself as a very junior contributor. A lot of early career academics seem to have something like this – i.e. a publication that is done with supervisors, bosses, etc. that receives a significant citation count (total and yearly) because it is associated with more senior scholars. It’s a very helpful way to get noticed and a reason why it is important for supervisors, bosses, etc. to co-author with early career academics.
Done reasonably well:
  • Birch, K., MacKinnon, D. and Cumbers, A. (2010) Regional Studies 44(1): 35-53. 
  • Birch, K. and Mykhnenko, V. (2009) Journal of Economic Geography 9(3): 355-380. 
  • Birch, K. and Whittam, G. (2008) Regional Studies 42(3): 437-450. 
  • Birch, K. (2008) Economic Geography 84(1): 83-103.
  • Birch, K. (2006) Genomics, Society and Policy 2(3): 1-15. 
  • Birch, K. (2005) Bioethics 19(1): 12-28.
These articles have garnered modest citations of between 3 and 15 per year, although not every year. In my book this represents a reasonable ‘return’ for the work of an early-career social scientist. This post, for example, reveals that even for senior social scientists, the modal average citation count is 0 – and that refers to total citations, NOT citations per year. So, anything better than 0 per year is fine by me! What is revealing about checking more senior scholars on Google Scholar is that many of their papers are actually only cited about 10 times a year. However, over a lifetime this adds up.
Done poorly (understandable):

1. Recent
  • Birch, K. and Tyfield, D. (2013) Science, Technology and Human Values 38(3): 299-327.
  • Levidow, L., Birch, K. and Papaioannou, T. (2013) Science, Technology and Human Values 38(1): 94-125.
  • Birch, K. (2012) Science as Culture 21(3): 415-419.
  • The SIGJ2 Writing Collective (2012) Antipode 44(4): 1055-1058.
  • Birch, K. (2012) New Genetics and Society 31(2): 183-201.
  • Levidow, L., Birch, K. and Papaioannou, T. (2012) Critical Policy Studies 6(1): 40-66.

2. Less well-known

  • Birch, K., Levidow, L. and Papaioannou, T. (2010) Sustainability 2(9): 2898-2918. 
  • Birch, K. (2009) Area 41(3): 273-284.
  • Birch, K. (2008) Genomics, Society and Policy 4(2): 1-10. 
  • Birch, K. (2007) Distinktion 8(1): 83-99.
  • Birch, K. (2007) Geography Compass 1(5): 1097-1117.
  • Birch, K. and Cumbers, A. (2007) Scottish Affairs 58: 36-56. 
  • Birch, K. (2007) Totalitarian Movements and Political Religions 8(1): 153-161. 
  • Birch, K. (2006) Science as Culture 15(3): 173-181 [Guest editor, Biofutures/Biopresents].
This is by far the largest category, as it probably is for most scholars when you look at citation patterns. Most of these articles have not garnered many (if any) citations, for two reasons: (1) they are fairly recent (e.g. 2012-2013) and have not been “picked up” yet, if they ever will be; and (2) they were published in less well-known journals and when I was less well-known. It is therefore not surprising that they have received only between 1 and 10 citations each in total (and many of these are self-citations). One suggestion for improving the dissemination of research is publishing in open access forums, which is probably going to become more prevalent over the next few years. I’ve had mixed experiences with this, in terms of citations, but think it is a great way to reach a wider audience.
Done poorly (disappointing):
  • Birch, K. (2011) Growth and Change 42(1): 71-96. 
  • Birch, K. and Cumbers, A. (2010) Environment and Planning A 42(11): 2581-2601.
The final category is probably the most perplexing for people. It’s those articles that you are proud of but that have just not been read or cited. For me this includes two articles from an ESRC-funded project which I thought produced some really interesting empirical findings that pushed forward theoretical debates in the area. Neither has been cited more than once by anyone other than myself (and self-citations don’t count!). I really can’t tell why, though. At a best guess, it’s because I’ve not tied them into any specific debates in the field and so they don’t make sense to anyone out there.