HOW DO YOU MEASURE RESEARCH PRODUCTIVITY?
(VERY CAREFULLY)
By Andy Boynton, John and Linda Powers Family Dean
As deans, each of us has the task of not only building and improving a research culture but also measuring it. Needless to say, it’s a challenge. Just how do you put a number on intellectual output and the creation of knowledge that makes a difference?
We all have our ways. In the spirit of exchanging best practices, here’s my take on an eight-year experiment we’ve undertaken at the Carroll School of Management.
Interspersed with my comments will be pages from what we simply call “Research Reports.” These are three-page summaries compiled every spring, capturing the quantity and quality of a faculty member’s research. The reports give us a transparent way of evaluating the research, one that lends itself to comparison across positions and departments.
We’ve found these annual summaries to be very useful. They don’t explain everything about a professor’s research, and they shouldn’t, but they do offer a starting point for discussion. They’re one measure: a good first approximation of research productivity and impact. Naturally, the data points come into play when considering tenure and promotion, but in my mind the real value is that they begin a conversation.
In the pages displayed here, you’ll see the standard references to “Impact Factor” and “Article Influence.” As you know, both of these refer to the quality of journals in which our professors are published. “Impact” relates to how often the average article in these journals is cited; “influence” gives greater weight to citations in higher quality journals.
You’ll see references to “Financial Times,” specifically the top 50 journals used by the Financial Times in compiling its rankings of business schools from the standpoint of research output. Our measures also take account of how often a given journal’s articles are cited in Web of Science, and how often articles written specifically by our faculty are cited in Google Scholar.
In addition, metrics are influenced by whether a faculty member’s article is coauthored, and if so, how many collaborators there are (more about how we don’t necessarily penalize coauthorship in a moment).
Here is the first page of the Research Report that every faculty member receives. It shows how their statistics are calculated:
|   | Research Metrics | Description |
| --- | --- | --- |
| 1 | Impact Factor x publications divided by co-author | Impact factor divided by the total number of authors, summed over all publications from 2009 onwards. Impact factor is the 5-year impact factor from the Web of Science (WOS). |
| 2 | Article Influence Score x publications divided by co-author | Article Influence Score divided by the total number of authors, summed over all publications from 2009 onwards. |
| 3 | Financial Times Ranked x publications divided by co-author | A publication in a Financial Times journal is assigned a score of 1, 0 otherwise. The aggregated score is divided by the total number of authors and summed over all publications from 2009 onwards. |
| 4 | Impact Factor x publications | Metric (1) with no division for number of authors. |
| 5 | Article Influence Score x publications | Metric (2) with no division for number of authors. |
| 6 | Financial Times Ranked x publications | Metric (3) with no division for number of authors. |
| 7 | Web of Science citations divided by co-author | The number of Web of Science citations for each publication, divided by the total number of authors, summed over all career publications. |
| 8 | Google Scholar citations divided by co-author | The number of Google Scholar citations for each publication, divided by the total number of authors, summed over all career publications. |
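To make these definitions concrete, here is a minimal sketch, in Python, of how metrics (1) and (4) could be computed from a publication list. The journal names, impact factors, and author counts are hypothetical, and this is only an illustration of the formulas above, not our actual reporting code.

```python
# Hypothetical publication list: (journal, 5-year impact factor, total authors, year)
publications = [
    ("Journal A", 6.0, 3, 2012),
    ("Journal B", 4.5, 1, 2015),
]

# Metric (1): impact factor divided by the total number of authors,
# summed over all publications from 2009 onwards.
metric_1 = sum(impact / authors
               for _, impact, authors, year in publications
               if year >= 2009)

# Metric (4): the same sum with no division for number of authors.
metric_4 = sum(impact
               for _, impact, _, year in publications
               if year >= 2009)

print(metric_1)  # 6.0/3 + 4.5/1 = 6.5
print(metric_4)  # 6.0 + 4.5 = 10.5
```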
On the second page, the faculty member sees his or her research statistics and how they compare with colleagues in the department and school. I have the honor here of using as a sample the May 2016 Research Report for Jeffrey Pontiff, the James F. Cleary Chair in Finance.
As you’ll read in this edition of Carroll Capital, Jeff recently won the Journal of Finance’s 2016 Amundi Smith Breeden First Prize for his paper, “Does Academic Research Destroy Stock Return Predictability?” He collaborated with our former doctoral student R. David McLean, now at Georgetown University’s McDonough School of Business.
|   | Research Metrics | Score | Comparison with department and school colleagues |   |   |   |
| --- | --- | --- | --- | --- | --- | --- |
| 1 | Impact Factor x publications divided by co-author | 16.42 | 90 | 66 | 66 | 69 |
| 2 | Article Influence Score x publications divided by co-author | 17.84 | 90 | 96 | 84 | 69 |
| 3 | Financial Times Ranked x publications divided by co-author | 2.58 | 84 | 78 | 62 | 66 |
| 4 | Impact Factor x publications | 37.90 | 99 | 74 | 70 | 79 |
| 5 | Article Influence Score x publications | 41.16 | 99 | 96 | 81 | 79 |
| 6 | Financial Times Ranked x publications | 6.00 | 98 | 84 | 66 | 77 |
| 7 | Web of Science citations divided by co-author | 506.92 | 99 | 92 | 63 | 88 |
| 8 | Google Scholar citations divided by co-author | NA | NA | NA | NA | NA |
I alluded earlier to the issue of coauthorship. I say “issue” because there’s more than one way to account for articles coauthored by our professors. On the one hand, we want to encourage collaboration; on the other, we want to see signs of independent authorship as well. The upshot: we include some measures that (in effect) subtract points for coauthorship and others that don’t.
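To illustrate with hypothetical numbers: a paper with two co-authors (three authors in total), appearing in a journal with a five-year Impact Factor of 6.0, adds 2.0 to metric (1), which divides by the number of authors, but the full 6.0 to metric (4), which does not.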
Finally, I return to the broader question of quantifying the value of knowledge. I strongly believe that everyone’s research is nuanced, and to understand it well, you have to get beneath the data. Still, the data can tell you a lot.
All of us have goals in this regard—at the Carroll School, we are focused heavily on producing high-quality, impactful research. We need a measuring system compatible with this goal, not necessarily the whole picture of scholarly output, but at least a snapshot. We keep tinkering every year with our measuring system to make sure we’re getting the clearest possible picture.
I would be thrilled to share more details about our approach with readers of Carroll Capital, and hear about your efforts to measure research accomplishments. Please feel free to send me an e-mail.
I leave you now with the third and final page of Jeff Pontiff’s Research Report. “WOS” refers to Web of Science, and the 50 journals ranked by the Financial Times are assigned a value of 1 (all others carry a value of 0).
|   | Publication | Year | Source | Financial Times Journal | Co-authors | Times Cited (WOS) | 5-year Impact Factor | Article Influence Score |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | Reversions of excess pension assets after takeovers | 1990 | Rand Journal of Economics | 1 | 2 | 38 | 2.29 | 3.02 |
| 2 | Private benefits from block ownership and discounts on closed-end funds | 1993 | Journal of Financial Economics | 1 | 2 | 61 | 5.88 | 6.02 |
| 3 | Closed-end fund premia and returns – Implications for financial market equilibrium | 1995 | Journal of Financial Economics | 1 | 0 | 37 | 5.88 | 6.02 |
| 4 | Costly arbitrage: Evidence from closed-end funds | 1996 | Quarterly Journal of Economics | 1 | 0 | 157 | 9.79 | 16.06 |
| 5 | Excess volatility and closed-end funds | 1997 | American Economic Review | 1 | 0 | 40 | 4.95 | 7.04 |
| 6 | Book-to-market as a predictor of market returns | 1998 | Journal of Financial Economics | 1 | 1 | 101 | 5.88 | 6.02 |
| 7 | How are derivatives used? Evidence from the mutual fund industry | 1999 | Journal of Finance | 1 | 1 | 84 | 7.55 | 9.86 |
| 8 | Market valuation of tax-timing options: Evidence from capital gains distributions | 2006 | Journal of Finance | 1 | 2 | 9 | 7.55 | 9.86 |
| 9 | Costly arbitrage and the myth of idiosyncratic risk | 2006 | Journal of Accounting & Economics | 1 | 0 | 62 | 4.68 | 3 |
| 10 | Share Issuance and Cross-Sectional Returns | 2008 | Journal of Finance | 1 | 1 | 82 | 7.55 | 9.86 |
| 11 | Share Issuance and Cross-Sectional Returns: International Evidence | 2009 | Journal of Financial Economics | 1 | 2 | 23 | 5.88 | 6.02 |
| 12 | Idiosyncratic return volatility, cash flows, and product market competition | 2009 | Review of Financial Studies | 1 | 1 | 63 | 6.19 | 6.94 |
| 13 | Investment Taxation and Portfolio Performance | 2012 | Journal of Public Economics | 0 | 1 | 3 | 2.81 | 2.71 |
| 14 | Hierarchies and the Survival of POWs during WWII | 2012 | Management Science | 1 | 1 | 0 | 3.4 | 2.67 |
| 15 | The Year-End Trading Activities of Institutional Investors: Evidence from Daily Trades | 2014 | Review of Financial Studies | 1 | 3 | 3 | 6.19 | 6.94 |
| 16 | Shareholder Nonparticipation in Valuable Rights Offerings: New Findings for an Old Puzzle | 2016 | Journal of Financial Economics | 1 | 1 | NA | 5.88 | 6.02 |
| 17 | Does Academic Research Destroy Stock Return Predictability? | 2016 | Journal of Finance | 1 | 1 | NA | 7.55 | 9.86 |