Social Science Ahead of the (Shallow) Curve on Altmetric Acceptance


[Image: metrics word cloud]

A new survey of university faculty finds that the idea of altmetrics, using measures other than journal citations to gauge scholarly impact, has made less headway among faculty than the hoopla surrounding it might suggest. These newer measures are most familiar in the social science community (if only barely) and least familiar in the arts and humanities (dramatically so).

The survey, titled “How Faculty Demonstrate Impact,” was presented at the biennial meeting of the Association of College & Research Libraries. Fittingly for that audience, the paper begins by noting that while bibliometrics “has been an evolving part of the academic landscape for decades,” other sorts of scholarly metrics are a more recent phenomenon. And given that the library and information science, or LIS, community has long labored with these various measuring tools, there was an underlying assumption that the rest of academe was equally conversant with them.

That was not the case.

“Our study also found that faculty are not nearly as familiar with altmetrics as those in the LIS field might have assumed, given the attention in LIS literature and many libraries’ outreach efforts to faculty,” wrote the authors, led by Caitlin Bakker, a biomedical research services librarian at the University of Minnesota Health Sciences Libraries. “Many more faculty reported being ‘not at all’ or ‘marginally’ familiar with altmetrics than reported being ‘familiar’ or ‘extremely familiar.’

“This finding serves as an important reminder that while altmetrics may no longer be a new concept in our field, it remains an unfamiliar concept to most faculty.”

One hint that the unfamiliarity might have been expected, the authors said, is that “very little” has been written about faculty uptake of altmetrics even as the literature on impact metrics themselves has grown. Their study, they wrote, is the first multi-campus attempt to address how faculty members interact with these measures.

The survey behind the study was open to faculty at four large American universities: The University of Minnesota, the Ohio State University, Valparaiso University and the University of Vermont. The authors categorized the 1,202 responses across four broad disciplinary areas: health sciences (n=444), sciences (n=343), social sciences (n=256), and arts and humanities (n=158). They then looked at the broad trends in opinion through that four-part disciplinary prism.

What they found in how faculty perceived and trusted traditional impact metrics and altmetrics was that the social sciences, physical sciences and health sciences tended to answer in lockstep, while the humanities were invariably less aware and less trusting of metrics of any stripe. And in general, although not by a huge margin, social scientists were more familiar with altmetrics, with “23.4% of respondents being either familiar or extremely familiar with altmetrics, when compared with Sciences (16.5%) and Health Sciences (16.2%).”

This humanities-versus-everyone-else result was echoed in a pair of questions about whether departments encourage or require the use of impact metrics at all. While most departments encourage the inclusion of some sort of impact measurement in their promotion and tenure processes, only about two-fifths require their use. The exception is the humanities, where only a quarter of respondents reckoned their department encouraged their use and an “overwhelming” number reported no requirement at all to use metrics.

Lastly, and perhaps not surprisingly, the more established an academic was, the less value they placed on metrics.

Assistant Professors placed the greatest amount of importance on impact metrics, while Full Professors placed the least amount of importance. Certainly, Assistant Professors preparing for the tenure process are seeking out ways to demonstrate the impact of their work and are often under the impression they should include statistical measures of impact as part of their tenure materials. More well-established Full Professors may not feel as much pressure to engage with statistical representations of their work’s impact given that they are no longer assessed for promotion.

The authors also charitably suggested that, since full professors have already used metrics to assess others climbing the ladder, they may simply “understand the limitations of statistical impact measures” better than their junior colleagues.

The full author team for “How Faculty Demonstrate Impact,” alongside Bakker, comprised Jonathan Bull, scholarly communications librarian, Valparaiso University; Nancy Courtney, research impact librarian, Ohio State University Libraries; Dan DeSanto, instruction librarian, University of Vermont; Allison Langham-Putrow, scholarly communications librarian, University of Minnesota–Twin Cities; Jenny McBurney, research services coordinator and social sciences librarian, University of Minnesota–Twin Cities; and Aaron Nichols, access/media services librarian, University of Vermont.

For a more detailed breakdown of the results, please see the full report HERE.
