Presentations

Here are the list of participants in the STI 2017 Conference and the STI 2017 Book of Abstracts.

Below you can find the presentations from all sessions.

Introduction to the conference

Research & Innovation in Africa

Altmetrics

Careers & Mobility

Location-based Approaches

Gender perspective

Special Session: Developing an Evaluation Framework for Promoting Gender Equality in R&I (EFFORTI)

Innovation Dynamics

Issues in Scientometrics

New Methodologies

Open Access

Special track: Multiplying methods in research evaluation

(organized by Sarah de Rijcke with Gemma Derrick, Thomas Franssen, Jordi Molas-Gallart, Ismael Rafols, Jesper Schneider, Inge van der Weijden, Paul Wouters and Julia Heuritsch)

Research Systems Performance

Session on RISIS

See RISIS Website, CORTEXT Manager (v2)

SSH session

Transnational Research

Higher Education in Europe

Best Poster Award

Two posters received the Best Poster Award:

Marianne Noel and Frédérique Bordignon

The pitfalls of signature. Questioning affiliation in French research and higher education institutions

With the emergence of the citation cycle, scientific activities have entered an accountability regime (Wouters, 1999). Publications in prestigious journals account for the activities of labs and departments, contribute to their visibility, allow them to access funding (sometimes through a performance-based system) and are used to attract attention and students to the laboratory (Rushforth & de Rijcke, 2015). In this context, affiliation is a topic (or a variable) of growing interest in science policy, as illustrated by the occurrence of the term (more than 70 times) in the proceedings of the 2016 STI conference. It is mainly, though not exclusively, addressed in technical terms (for instance, how to clean addresses). With the development of social media, being easily identified and recognizable has become an important issue for individuals in academia; it is also a challenge for institutions.

Affiliation reflects multiple socialities: it can mean joining or being part of a group, or even, at times, excluding oneself from it. Scientific activities take place in multiple collectives arranged around instruments, projects, domains or specialties, funding schemes… with professionals belonging to various organizations, institutions, etc. As emphasized by Pontille (2004), the history of the scientific signature is closely tied to how research is organized in a field. The signature is also a sign that should be understood within a larger agency connected to the genesis of acts, their making and their archiving (Fraenkel, 2008).

In universities, various services are in charge of analysing and managing research outputs to inform the institution's strategy: the library, the research office, knowledge transfer structures, etc. In this study, we focus on how doing bibliometrics affects their role within the larger framework of the university organization. Job profiles in libraries are moving towards "data librarian" profiles because of the need for technical skills to retrieve information from databases, particularly for bibliometric analyses (Astrom & Hansson, 2013). In this paper, we propose to investigate the relation between information professionals and high-level management as one that reveals tensions (between monitoring and building reliability). Adopting a data infrastructure perspective, we would like to question the intersection between collection, classification and evaluation practices.

The goal of the paper is also to share information about the situation in France. For the past ten years, the continuous creation of new funding schemes and structures has led to a landscape in which research organizations and universities largely overlap. The French research and innovation system is often characterized as a "millefeuille", a multi-layered pastry. Recommendations on affiliations have been issued at the national level (Dassa et al., 2015), but they are considered an unrealistic ideal. Our case study deals with a prestigious school of engineering set up on a large university campus. We will present the context and stages of development of a tracking tool that includes (1) the tagging of the multiple versions of the lab's name (up to 1,200 versions were found over the study period) and (2) an attempt to identify articles containing the school's name in the main databases (WoS, Scopus and HAL), work that has been carried out in interaction with the main commercial providers.

We will describe actors being pulled back and forth between conflicting and uncoordinated demands, and emphasize the intended effects of these tagging/tracking activities. At the very least, this work aims to examine the lessons to be drawn from the development of multiple affiliations as a possible consequence of growing interdisciplinarity, and to question the pertinent scale of analysis. Would the solution be, as suggested by ORCID, an Organization ID? (Pos20) See the poster stored in the HAL repository.
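The abstract does not describe how the tracking tool is implemented. As a purely illustrative sketch (not the authors' tool), tagging bibliographic records against a curated list of institution-name variants might look like the following; the variant list, record structure and helper names are invented for the example:

```python
import re
import unicodedata

# Hypothetical subset of the many name variants recorded for one institution.
KNOWN_VARIANTS = {
    "example school of engineering",
    "ecole d ingenieurs exemple",
    "ese campus est",
}

def normalize(affiliation: str) -> str:
    """Lower-case, strip accents and punctuation, collapse whitespace."""
    text = unicodedata.normalize("NFKD", affiliation)
    text = "".join(c for c in text if not unicodedata.combining(c))
    text = re.sub(r"[^a-z0-9 ]", " ", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def tag_record(affiliations: list[str]) -> bool:
    """Return True if any affiliation string contains a known name variant."""
    return any(
        variant in normalize(aff)
        for aff in affiliations
        for variant in KNOWN_VARIANTS
    )

# Invented example record, as might be exported from a bibliographic database.
record = {"affiliations": ["École d'Ingénieurs Exemple, Cité Descartes, France"]}
print(tag_record(record["affiliations"]))  # True
```

In practice such lookup lists have to be maintained by hand and combined with database-specific identifiers, which is precisely the kind of curation work the poster discusses.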

Richard Heidler, William Dinkel and Anke Reinhardt

Broadening the reviewer pool of the German Research Foundation: Drivers, effects and perspectives

Today, the scientific system uses peer review extensively as an assessment and decision-making instrument. This leads to a perceived burden (and therefore to fears of negative effects) on the science system itself and on individual reviewers. So far, very little is known about the characteristics of this increasing demand, and even less about its consequences for the decision-making processes that rely on peer review. Our contribution will provide empirical insights into the characteristics of the German Research Foundation's (DFG) review system and reviewers. Based on a dataset of the DFG's proposals and reviews, we develop a set of indicators to measure the peer review load both at the individual and at the system level. (Pos11)