Actually I don't really want to know. Listing h-indices here would be akin to having a pissing contest. Puerile, messy and smelly...
In recent years I've noticed an increase in the number of people who list the number of times their publications have been cited in their c.v.'s and/or biosketches.* Many have started listing their h-indices as well. At first I viewed this as a form of bragging. But now I've seen it enough that I'm beginning to pay attention. We could argue ad infinitum as to whether number of citations and/or h-indices are useful measures of a person's productivity and standing in their field.** That's akin to the interminable Mac vs. PC arguments*** and I'm really not interested in that kind of semi-religious "discussion". What I am interested in is people's opinions as to whether or not this is a reasonable practice. Do any of my readers list citation statistics in their c.v.'s or biosketches? What are your thoughts on this? Do your respective departments take such things into account come annual review time (which is now for many of us)?
I don't list citation statistics in my c.v./biosketch (but might do so in the future if it seems advantageous). And my department doesn't formally consider such things, although I suspect some of the senior faculty spend some time on ISI's Web of Knowledge prior to annual reviews figuring out the stats on the more junior faculty.
Finally, I wouldn't recommend that grad students and junior postdocs list such stats unless they have a really highly cited paper or two. I have seen senior postdocs applying for TT positions list their stats. In some cases it helps. In others, not so much. It's a good idea to poke around and see how you stack up against your peers before making the decision.
* Yes, in biosketches in grant proposals. At least in NSF and private foundation proposals.
** Personally I find number of publications, average number of citations per publication, plus h-index the most informative combination (a quick sketch of how to compute these appears after these footnotes), but recognize even that has flaws.
*** Mac.
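For anyone who wants to check their own numbers, here's a minimal sketch of how the three quantities mentioned in the second footnote can be computed from a list of per-paper citation counts. The counts below are made up for illustration, and the function is my own, not something pulled from Web of Knowledge or any other tool.

```python
# Minimal sketch: publication count, average citations per paper, and h-index
# from a list of per-paper citation counts (hypothetical numbers, not a real record).

def h_index(citations):
    """h = the largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

citations = [52, 31, 18, 9, 7, 4, 2, 0]  # made-up per-paper citation counts
print("papers:", len(citations))
print("avg cites/paper:", round(sum(citations) / len(citations), 1))
print("h-index:", h_index(citations))
```

For this made-up record the script reports 8 papers, an average of 15.4 citations per paper, and an h-index of 5 (five papers with at least five citations each).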
13 comments:
I've never included anything about citations in any document until my annual review this year, and I only did that because I was following a template. I had to look up all my papers for it because it had never really crossed my mind before.
I don't put citations or anything like that anywhere on my CV, but I do need to put the total number of citations across all of my papers for the previous year in my annual review.
And I wasn't aware there was a Mac vs. PC argument. It's a no-brainer.
I think it's douchetastic. Anybody that gives a fuck can easily find out for themselves.
The rise of the h-index is another fucking disaster for young investigators. The primary driver of the h-index, up until it gets near your total number of papers, is how old you are.
Us young 'uns need to focus on productivity (and impact factor) relative to opportunity. Any bullshit citation counts that don't adjust for seniority, like the h-index, just fuck us more.
Unless you had a paper in the last few years that's being miraculously cited out the arse...
antipodean
Antipodean,
It's certainly true the h-index is an unfair measure if you're comparing apples to oranges (e.g. a newly minted assist. prof to a full prof), but not so much if comparing apples to apples.
And I would assert that impact factors are as bullshit as h-indices...
I dunno Odyssey.
I completely agree that the impact factor is bollocks. But to paraphrase Churchill, it's the worst system ever invented except for all the other ones we've tried.
And I've yet to see any sort of age-weighted h-index actually presented.
-antipodean
Yes, I do take the h-index seriously, and I do not think it is bollocks in the least. I am fairly convinced by Hirsch's contention that it correlates reasonably well with overall scientific success and productivity, and as long as one does not take it as the one and only way to judge the body of work of a scientist -- but then again, any single criterion is fallible -- it is one that at least is numerical and does not depend on someone's subjective assessment. I think that the habit of relying on what "prominent scientists" say in order to make up our mind about someone else is far more pernicious.
I have written more about it here
Massimo,
Nice post. I'm sorry I missed it when you first wrote it. I think the keys here are how these indices are used and in combination with what other information.
I started including it after I was turned down for promotion to full professor. I was the first woman in 100+ years to come up for full professor in my department; the all-male committee thought I needed "more time to develop" and wondered if I could "play with the big boys"... Six weeks later a list of the top 1000 most cited people in my field came out. I was on it. No one else in my department was.
It has the advantage of being utterly objective, and is IMHO a bit better than a mere paper count, which was the time-honored way of doing it.
Like anything else, if you don't have to give it, and it's nothing to shake a stick at, don't. If it's good, wave it around...
and ComradePhysioProf is right, it's easy enough these days to look up (and I do for every tenure packet that crosses my desk...)
Except that the papers cited most are not necessarily those with the most interesting ideas. I'm a sceptic.
kiwi
My h-index would suck should I bother to calculate it. Why? Because my PI failed to cite my awesome papers in two of the papers that followed (why? I can only guess two reasons: favoritism for my co-grad student and citation limits), and because everyone outside of our lab cites based on his cites instead of doing a proper literature review, I never get cited thereafter. Point being, people are lazy, citations are self-reinforcing, good papers get overlooked, and getting cited can be highly influenced by giving invited talks. Since I left my grad field, I have no opportunity to push my former work. Hence, poor citation record despite outstanding work in top journals.
Anon. on 11/12:
Such blatant gender bias makes my blood boil. I hope you've rubbed their faces in your h-index, top 1000 listing and anything else you could come up with.
Kiwi:
While it's certainly true that some papers do not get the cites they deserve, if you are consistently publishing "interesting" papers that do not get cited, perhaps you should think more in terms of "important." Of course you won't find much consensus in terms of defining what is interesting or important...
Anon. on 11/16:
What you describe cannot be overcome by any attempt at an objective measure, h-index or otherwise. Having publications that are in good journals but do not garner citations would raise a lot of questions among those reviewing your c.v. Unless those people have some knowledge of your personal situation, you're hosed.
You bet the next time I sent in a package that put all of that front and center.
Anon 11/16: That experience has made me much more attentive. What I try to do now is get a broader sense of the person whose tenure packet is in front of me. It's not just cites, it's not just numbers, it's not just invited talks; I really try hard to read between the lines in various ways and wonder how bias might have skewed what I'm looking at... so look for people who have that kind of view and make sure that they have the opportunity to write for you!! Fight back where you can...
--Anon 11/12