I don’t think it could have been brought home more clearly than it was. We were sitting in a faculty meeting attempting to sort out which of our recent visitors should get offers to join our faculty. Apparently one of my colleagues felt that his man was not going to make the cut, so he pulled out a sheet of paper from which he read the number of times this person’s work had been cited by others in the literature over the past few years. His argument was essentially the same as that used to rank-order Web sites on Google: the best sites are those with the most links from other sites. So, despite the fact that the gentleman had visited the School of Physics for two days, had good references, impressed those he talked to, and gave a good colloquium presentation, my colleague fell back on this statistic as his sole argument for making him an offer. That was, to my knowledge, the first time that had happened in the 33 years I had been attending faculty meetings in the School of Physics. I was appalled and objected, a bit frumpily, I suppose. But the incident raised a point regarding how we evaluate the work of our colleagues, whether for employment, awards, or promotion.
Archimedes claimed that if he were given a lever long enough he could move the earth. Modern humans seem to have a similar claim: give us a number that is related to some complex phenomenon and we will misinterpret it. Optical engineering provides several good examples. One is the M² factor applied to Gaussian beams. George Lawrence has noted that a Gaussian beam with a shape ‘to die for’ but with broad, low-power wings will give a large M² value that does not reflect the quality of the beam profile. Then there is the use of single-frequency contrast values to characterize the MTF of an optical system. It’s a sort of “unimania”: the substitution of single numbers for complex relations.
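Lawrence’s complaint is easy to see from the definition itself. Under the standard second-moment convention of ISO 11146 (a definition I am supplying here for context; the column does not spell it out), the beam width that enters M² is a second moment of the irradiance profile:

```latex
% Second-moment beam width (ISO 11146 convention):
% irradiance at large x is weighted by x^2, so faint wings
% far from the axis dominate the integral.
\sigma_x^2 \;=\;
  \frac{\displaystyle\iint x^2\, I(x,y)\, dx\, dy}
       {\displaystyle\iint I(x,y)\, dx\, dy}

% M^2 compares the measured beam parameter product (waist radius
% w_0 = 2\sigma_0 times far-field divergence half-angle \theta)
% to the diffraction limit; an ideal TEM_{00} beam gives M^2 = 1.
M^2 \;=\; \frac{\pi\, w_0\, \theta}{\lambda}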
The source of the statistics on the citations of our prospective faculty member is a service called Web of Science, offered by ISI, formerly the Institute for Scientific Information. Most engineers and scientists are familiar with Current Contents, which was an early form of alerting service, and the Science Citation Index, which is the forerunner of Web of Science. On the ISI web site, Web of Science is described as a service that provides seamless access to the Science Citation Index Expanded®, Social Sciences Citation Index®, and Arts & Humanities Citation Index™. It enables users to search current and retrospective multidisciplinary information from approximately 8,500 of the most prestigious, high-impact research journals in the world. ISI Web of Science also provides a unique search method, cited reference searching. With it, users can navigate forward and backward through the literature, searching all disciplines and time spans to uncover all the information relevant to their research. Users can also navigate to electronic full-text journal articles.
Sounds good. There are certainly aspects of this service that can provide insight into what other researchers are doing and how others are using your research.
But just as “web” has a positive connotation of interconnectedness, it also has a negative meaning of entrapment and sinister doings. And I am inclined to harbor the second interpretation, despite the glorious description by ISI. Why?
It has become a blunt instrument of promotion, tenure, and grant funding. Deans are using the numbers despite the fact that, in many cases, the searches are incomplete. For example, a search was done on a faculty member using his name and location (Georgia), but because he had arrived at Tech only a few years earlier, none of his prior work was found. But that’s modest compared with the policy established by an evaluation panel for one of the government funding agencies: only citations from the last five years will be used in assessing a researcher’s ability. Yet many important papers are not immediately recognized; sometimes the rest of the field has to catch up with the significance of the work a paper describes. My colleague in the next office, Brian Kennedy, told me that one of the reviewers of his last grant used citation data on his current work to critique his proposal. Although this approach is not completely without merit, I believe we may see the rise of professional backscratching: “You cite my papers and I’ll cite yours.”
Certainly there are valid uses for these citations. They can guide the researcher to those he would never otherwise know are interested in his ideas and results. They can help those who are trying to understand the impact of another’s research by examining the citing papers and discerning the contribution the cited work makes to its field. But to assess such evidence takes time, effort, and thought. Nah, that’s too difficult. Let’s just go to the Web of Science and count cites.