Bibliometrics and Research Evaluation: Uses and Abuses (History and Foundations of Information Science),Used

In Stock
SKU: SONG026203512X
Brand: MIT Press
Regular price$25.23

Processing time: 1-3 days

US Orders Ship in: 3-5 days

International Orders Ship in: 8-12 days

Return Policy: 15-day returns on defective items

Payment Methods

Help

If you have any questions, you are always welcome to contact us. We'll get back to you as soon as possible, within 24 hours on weekdays.

Customer service

All questions about your orders, returns, and delivery must be sent to our customer service team by e-mail at yourstore@yourdomain.com

Sales & Press

If you are interested in selling our products, need more information about our brand, or wish to collaborate, please contact us at press@yourdomain.com

Why bibliometrics is useful for understanding the global dynamics of science but generates perverse effects when applied inappropriately in research evaluation and university rankings.

The research evaluation market is booming. Rankings, metrics, the h-index, and impact factors are reigning buzzwords. Government and research administrators want to evaluate everything (teachers, professors, training programs, universities) using quantitative indicators. Among the tools used to measure research excellence, bibliometrics, the aggregate data on publications and citations, has become dominant. Bibliometrics is hailed as an objective measure of research quality, a quantitative measure more useful than subjective and intuitive evaluation methods such as peer review, which has been used since scientific papers were first published in the seventeenth century.

In this book, Yves Gingras offers a spirited argument against an unquestioning reliance on bibliometrics as an indicator of research quality. Gingras shows that bibliometric rankings have no real scientific validity, rarely measuring what they pretend to. Although the study of publication and citation patterns, at the proper scales, can yield insights into the global dynamics of science over time, ill-defined quantitative indicators often generate perverse and unintended effects on the direction of research. Moreover, abuse of bibliometrics occurs when data are manipulated to boost rankings. Gingras looks at the politics of evaluation and argues that using numbers can be a way to control scientists and diminish their autonomy in the evaluation process. Proposing precise criteria for establishing the validity of indicators at a given scale of analysis, Gingras questions why universities are so eager to let invalid indicators influence their research strategy.

⚠️ WARNING (California Proposition 65):

This product may contain chemicals known to the State of California to cause cancer, birth defects, or other reproductive harm.

For more information, please visit www.P65Warnings.ca.gov.
