Bibliometrics and Research Evaluation: Uses and Abuses

By Yves Gingras
  • Number of pages: 136
  • Format: ePub
  • ISBN: 978-0-262-33766-3
  • EAN: 9780262337663
  • Publication date: September 30, 2016
  • Digital protection: Adobe DRM
  • Size: 2 MB
  • Publisher: The MIT Press

Summary

Why bibliometrics is useful for understanding the global dynamics of science but generates perverse effects when applied inappropriately in research evaluation and university rankings.
The research evaluation market is booming. "Ranking," "metrics," "h-index," and "impact factor" are reigning buzzwords. Governments and research administrators want to evaluate everything (teachers, professors, training programs, universities) using quantitative indicators. Among the tools used to measure "research excellence," bibliometrics, the aggregate data on publications and citations, has become dominant. Bibliometrics is hailed as an "objective" measure of research quality, a quantitative measure more useful than "subjective" and intuitive evaluation methods such as peer review, which has been used since scientific papers were first published in the seventeenth century.
In this book, Yves Gingras offers a spirited argument against an unquestioning reliance on bibliometrics as an indicator of research quality. Gingras shows that bibliometric rankings have no real scientific validity, rarely measuring what they purport to measure. Although the study of publication and citation patterns, at the proper scales, can yield insights into the global dynamics of science over time, ill-defined quantitative indicators often have perverse and unintended effects on the direction of research.
Moreover, abuse of bibliometrics occurs when data are manipulated to boost rankings. Gingras looks at the politics of evaluation and argues that using numbers can be a way to control scientists and diminish their autonomy in the evaluation process. Proposing precise criteria for establishing the validity of indicators at a given scale of analysis, Gingras questions why universities are so eager to let invalid indicators influence their research strategy.