Towards Indicators for ‘Opening Up’ Science and Technology Policy


SLIDE 1

Towards Indicators for ‘Opening Up’ Science and Technology Policy

Ismael Rafols12 Tommaso Ciarli1 Paddy van Zwanenberg1 Andy Stirling1

1SPRU – Science and Technology Policy Research, University of Sussex; 2INGENIO (CSIC-UPV), Universitat Politècnica de València

n.surname@sussex.ac.uk

Internet, Politics, Policy 2012: Big Data, Big Challenges? Oxford Internet Institute, University of Oxford

20-21 September 2012

Rafols, Ciarli, van Zwanenberg & Stirling () ‘Opening up’ S&T Policy IPP 2012 0 / 25

SLIDE 2
  • 1. Introduction
  • 1. The problem

The ‘problematic’ use of conventional S&T indicators

Closes down policy options (as well as technologies, in particular those closely associated with power, e.g. nuclear):

  • Narrow inputs, e.g. publications, citations, patents
  • Scalar outputs, e.g. rankings based on averages

◮ Aggregated solutions: missing within-group variation

  • Opaque selections of inputs, outputs and classifications (privately owned databases)
  • Some quantitative assumptions are debatable

◮ Impact Factor of journals: only 2 years, ambiguity in document types
◮ Average number of citations when the data are power-law distributed: small organisations penalised (Leydesdorff and Bornmann, 2011)

SLIDE 3
  • 1. Introduction
  • 1. The problem

The ‘political’ use of conventional S&T indicators

Why have S&T indicators been so “narrow”?

  • S&T indicators are simple: suitable for the policy maker
  • S&T indicators have a performative role: they don’t just measure, they signal to stakeholders what is important
  • For example, scientometric tools

◮ do not ‘just happen to be used’ in science policy (neutral)
◮ are part of the incumbents’ power (loaded): e.g. evaluation of research by policy makers, the scientific community, the job market (firms)

  • Scientific disciplines/communities and techniques such as statistics are a crucial ‘part of the technology of power in a modern state’ (Hacking, 1991, p. 181)
  • Institutions use these techniques to articulate framings, goals and narratives, and to get people to accept them
  • Ideas grounded in Foucault: “knowledge and power are inseparable”

SLIDE 4
  • 1. Introduction
  • 2. The proposal

Claims of the presentation

  • Need for more inputs (variables) to build indicators: ‘broadening out’

◮ Already happening

  • Need for multiple outputs, based on alternative assumptions, allowing for policy evaluation of the diverse options in building the indicator: ‘opening up’
  • How? Which tools?

SLIDE 5
  • 1. Introduction
  • 2. The proposal

Improving the use of tools for measuring S&T

Conventional indicators use narrow inputs. Complexity-science tools and new science-mapping tools can openly compare multiple outputs, making explicit the underlying concepts and enabling heuristic tools to facilitate exploration:

◮ More inputs: publications, but also news, webs (altmetrics), etc.
◮ Multidimensional outputs: interactive maps
◮ Multiple solutions for one indicator, depending on assumptions:

  • defining disciplinary areas when they are not comparable
  • different levels of aggregation
  • more inclusive and contrasting classifications

◮ Analysis of distributions / variance

SLIDE 6
  • 1. Introduction
  • 2. The proposal

Outline of the presentation

  • 1. Intro and motivations
  • 2. Background: policy use of S&T indicators
  • 3. Framework: breadth and openness
  • 4. Examples

◮ Opening up using broad inputs
◮ Opening up using narrow inputs: academic performance
◮ Opening up using new tools: interdisciplinarity

  • 5. Discussion and work in progress

SLIDE 7
  • 2. Background

Policy use of S&T indicators: Appraisal

Appraisal (Policy Dynamics framework): “the ensemble of processes through which knowledges are gathered and produced in order to inform decision-making and wider institutional commitments” (Leach et al., 2010)

Example: allocation of resources based on research “excellence”

Breadth (gathering): extent to which appraisal covers diverse dimensions of knowledge

◮ Narrow: citations/paper
◮ Broad: citations, peer interviews, stakeholders, altmetrics, ...

Openness (producing): degree to which outputs provide an array of options for policies

◮ Closed: fixed composite measure of variables → unitary and prescriptive advice
◮ Open: consideration of various dimensions → plural and conditional advice

SLIDE 8
  • 3. Framework

Appraisal methods: broad vs. narrow & close vs. open

[Figure: 2×2 diagram with a horizontal axis ‘range of appraisal inputs (issues, perspectives, scenarios, methods)’ running from narrow to broad, and a vertical axis ‘effect of appraisal outputs on decision-making’ running from closing-down to opening-up.]

Source: Leach et al. (2010)

SLIDE 9
  • 3. Framework

Appraisal methods: broad vs. narrow & close vs. open

[Figure: the same narrow–broad / closing-down–opening-up diagram, with appraisal methods positioned in it: cost-benefit analysis, open hearings, consensus conferences, scenario workshops, citizens’ juries, multi-criteria mapping, Q-method, sensitivity analysis, narrative-based participant observation, decision analysis, risk assessment, structured interviews.]

Source: Leach et al. (2010)

SLIDE 10
  • 3. Framework

Broadening out

Appraisal methods: broadening out

[Figure: the narrow–broad / closing-down–opening-up appraisal diagram. Source: Leach et al. (2010)]

Conventional scientometrics and S&T indicators?

  • Multiple indicators
  • Incorporation of plural analytical dimensions (global & local networks, hybrid lexical–actor nets, etc.)
  • New analytical inputs: media, blogosphere

BUT: unitary measures that are opaque and exclusive, tend to favour the established perspectives, and are easily translated into prescription

SLIDE 11
  • 3. Framework

Opening up

Appraisal methods: opening up

[Figure: the narrow–broad / closing-down–opening-up appraisal diagram, with ‘indicators for opening up’ placed towards the opening-up end.]

Indicators for opening up: making explicit the underlying conceptualisations and creating heuristic tools to facilitate exploration

  • NOT about the uniquely best method
  • Nor about the unitary best explanation
  • Nor the single best prediction

There are different ways of opening up while remaining narrow (i.e. with narrow inputs, as in scientometrics)

SLIDE 12
  • 4. Examples
  • 1. Opening using broad inputs

Broadening-out → Opening-up

[Figure: the narrow–broad / closing-down–opening-up appraisal diagram, showing conventional S&T indicators moving first towards broadening out, then towards opening up.]

First broaden, without collapsing the variables into one indicator

SLIDE 13
  • 4. Examples
  • 1. Opening using broad inputs

EU Innovation Scoreboard: composite indicator

(a) Country rankings (b) Sensitivity analysis

Source: (Grupp and Schubert, 2010)

Broad but narrow S&T indicator: the ranking (1a) is highly dependent on the variable weightings (Grupp and Schubert, 2010); the sensitivity analysis (1b) shows that, when adopting different weights, almost every country could be ranked at any position
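This weight sensitivity is easy to reproduce in miniature. The sketch below (with invented scores for four hypothetical countries on three sub-indicators, not the actual Scoreboard data) samples many weight vectors from the simplex and records every rank each country can attain under some weighting:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical normalised scores on three sub-indicators
# (illustrative numbers only, not the EU Innovation Scoreboard data).
scores = {
    "A": np.array([0.9, 0.3, 0.5]),
    "B": np.array([0.5, 0.8, 0.4]),
    "C": np.array([0.4, 0.6, 0.9]),
    "D": np.array([0.7, 0.5, 0.6]),
}

def ranking(weights):
    """Rank countries (best first) by the weighted composite score."""
    composite = {c: float(weights @ s) for c, s in scores.items()}
    return sorted(composite, key=composite.get, reverse=True)

# Sample weight vectors and record the positions each country reaches.
positions = {c: set() for c in scores}
for _ in range(5000):
    w = rng.dirichlet(np.ones(3))          # random weights summing to 1
    for pos, country in enumerate(ranking(w), start=1):
        positions[country].add(pos)

for country in sorted(positions):
    print(country, sorted(positions[country]))
```

Even with only three sub-indicators, each country’s rank moves across several positions as the weights change, which is the qualitative point of panel (1b).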

SLIDE 14
  • 4. Examples
  • 1. Opening using broad inputs

EU Innovation Scoreboard: opening the indicator

Source: (Grupp and Schubert, 2010)

Opening: consider the variables of the indicator simultaneously but separately

SLIDE 15
  • 4. Examples
  • 1. Opening using broad inputs

University ranking: opening the indicator

[Screenshot: the U-Map Finder & Viewer comparing three anonymised institutions (‘University AP’, ‘University BC’, ‘University BM’) along six dimensions: student profile, teaching and learning, research involvement, knowledge exchange, international orientation, and regional engagement.]

Source: http://www.u-map.eu/finder.shtml

“U-Map offers you tools to enhance transparency” “A list of higher education institutions (HEIs) that are comparable on the characteristics you selected”

SLIDE 16
  • 4. Examples
  • 2. Opening using narrow inputs: Academic performance

Differences in rankings (Innovation Studies vs Business Schools) when changing the normalisation

We review a comparison of the performance of six academic organisations using a bibliometric measure under different normalisations. Measure: average number of citations per publication (Rafols et al., 2012)

(a) Number of citations per publication
(b) Number of citations weighted by the average citations in the journal of publication
(c) Number of citations weighted by the average citations in the field of publication, e.g. condensed matter, computational biology, atomic physics, business, management, economics & finance, etc.
(d) Number of citations weighted by the number of references in the citing article
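A minimal sketch of normalisations (a)–(c), using invented citation counts and baselines (Rafols et al. (2012) work from Web of Science data; normalisation (d) would additionally require the reference counts of each citing article, so it is only noted in a comment):

```python
import numpy as np

# Hypothetical papers for one organisation:
# (citations, mean citations in its journal, mean citations in its field).
# All numbers are invented for illustration.
papers = [
    (12, 6.0, 4.0),
    (3, 2.0, 4.0),
    (30, 15.0, 10.0),
    (0, 1.0, 4.0),
]

cites = np.array([p[0] for p in papers], dtype=float)
journal_mean = np.array([p[1] for p in papers])
field_mean = np.array([p[2] for p in papers])

raw = cites.mean()                            # (a) citations per publication
journal_norm = (cites / journal_mean).mean()  # (b) relative to the journal
field_norm = (cites / field_mean).mean()      # (c) relative to the field
# (d) citing-side normalisation would divide each received citation by the
#     number of references in the citing article before averaging.

print(f"raw={raw:.2f}  journal={journal_norm:.3f}  field={field_norm:.4f}")
```

The three averages need not agree on which organisation performs better, which is exactly why the choice of normalisation matters.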

SLIDE 17

Differences in rankings (Innovation Studies vs Business Schools) when changing the normalisation

[Figure: four bar charts of citations per publication for ISSTI, SPRU, MIoIR, Imperial, WBS and LBS under each normalisation: (a) raw citations, (b) weighted by journal, (c) weighted by field, (d) weighted by references (citing-side normalised). The ordering of the six organisations changes across panels.]

Source: Rafols et al. (2012)

SLIDE 18
  • 4. Examples
  • 3. Opening using new tools: Interdisciplinarity

Heuristics of diversity

Variety · Balance · Disparity

Simpson–Herfindahl: $1 - \sum_i p_i^2$
Shannon (entropy): $-\sum_i p_i \ln p_i$
Dissimilarity: $\sum_{ij} d_{ij}$
Generalised diversity (Stirling): $\sum_{i \neq j} (p_i p_j)^\alpha (d_{ij})^\beta$

where $d_{ij}$ is the distance between categories $i$ and $j$, and $p_i$ the share of category $i$. Source: Stirling (2007)

◮ Variety: number of distinctive categories
◮ Balance: evenness of the distribution
◮ Disparity: degree to which the categories are different
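Stirling’s generalised formula can be sketched directly; setting β = 0 makes the distances drop out and collapses it to the Simpson–Herfindahl index, which shows concretely how the special cases nest inside it (shares and distances below are invented toy values):

```python
import numpy as np

def stirling_diversity(p, d, alpha=1.0, beta=1.0):
    """Generalised diversity (Stirling, 2007):
    D = sum over i != j of (p_i * p_j)**alpha * d_ij**beta."""
    p = np.asarray(p, dtype=float)
    d = np.asarray(d, dtype=float)
    total = 0.0
    for i in range(len(p)):
        for j in range(len(p)):
            if i != j:
                total += (p[i] * p[j]) ** alpha * d[i, j] ** beta
    return total

# Toy example: three categories with shares p and pairwise distances d.
p = [0.5, 0.3, 0.2]
d = np.array([[0.0, 0.4, 0.9],
              [0.4, 0.0, 0.6],
              [0.9, 0.6, 0.0]])

rao_stirling = stirling_diversity(p, d)  # alpha = beta = 1
# With beta = 0 the formula reduces to sum_{i != j} p_i p_j,
# i.e. the Simpson-Herfindahl index 1 - sum_i p_i**2.
simpson = stirling_diversity(p, d, alpha=1.0, beta=0.0)
print(rao_stirling, simpson)
```

Here `simpson` equals 1 − (0.5² + 0.3² + 0.2²) = 0.62, confirming the reduction.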

SLIDE 19
  • 4. Examples
  • 3. Opening using new tools: Interdisciplinarity

Interdisciplinarity as diversity

Bibliometric comparison of interdisciplinarity in different academic organisations using overlay maps (Rafols et al., 2012)

Indicators: journal attributes, publications and references. Distinguish different measures of diversity:

◮ Variety: number of disciplines, $n$
◮ Balance: evenness of discipline sizes, $-\frac{1}{\ln n} \sum_i p_i \ln p_i$
◮ Disparity: mean distance between categories, computed using the Global Map of Science: $\frac{1}{n(n-1)} \sum_{i \neq j} d_{ij}$
◮ Shannon entropy: $-\sum_i p_i \ln p_i$
◮ Rao–Stirling diversity: $\sum_{i \neq j} p_i p_j d_{ij}$

where $d_{ij} = 1 - s_{ij}$, $s_{ij}$ is the cosine similarity between categories $i$ and $j$, and $p_i$ is the proportion of elements in category $i$. Different measures of diversity are uncorrelated (Yegros et al., 2010)

SLIDE 20

ISSTI Edinburgh – Disciplines of publication

Source: Rafols et al. (2012)

[Overlay on the Global Map of Science] Extremely diverse: social sciences, from sociology to political science and economics; health services; biological sciences; environmental sciences; and computer sciences

SLIDE 21

London BS – Disciplines of publication

Source: Rafols et al. (2012)

[Overlay on the Global Map of Science] Four disciplines: Management, Business, Economics and Finance (some Psychology and Operations Research)

SLIDE 22
  • 4. Examples
  • 3. Opening using new tools: Interdisciplinarity

ISSTI and LBS compared

(a) ISSTI (b) LBS

Source: Rafols et al. (2012)

Using a graphic visualisation, we can study the different measures of diversity in one figure, without having to compromise as with a composite indicator

SLIDE 23
  • 4. Examples
  • 3. Opening using new tools: Interdisciplinarity

MIoIR and WBS compared

(a) MIoIR Manchester (b) Warwick BS

Source: Rafols et al. (2012)

Which one is more interdisciplinary?

SLIDE 24
  • 4. Examples
  • 3. Opening using new tools: Interdisciplinarity

Comparing diversities

              ISSTI   MIoIR   WBS     LBS
Variety       28      19      20      9
Balance       0.653   0.543   0.46    0.37
Disparity     0.832   0.817   0.77    0.768
Entropy       3.558   2.966   3.078   2.343
Rao–Stirling  0.81    0.726   0.68    0.603

Source: Rafols et al. (2012)

Which measure of diversity should we use to assess interdisciplinarity? (and relate it to performance)

SLIDE 25
  • 5. Conclusions

Strategies for opening up

Work in progress...

  • Presenting contrasting perspectives
  • Simultaneous visualisation of multiple properties / dimensions

◮ Allowing viewers/policy makers to take their own perspective
◮ Unveiling the assumptions and the properties of the indicators and variables (distributions?)

  • Interactivity

◮ Allowing viewers to give their own weights to criteria / factors
◮ Allowing viewers to manipulate the visualisation

SLIDE 26
  • 5. Conclusions

Closing thoughts

Keep it complex (Stirling, 2010). Is ‘opening up’ worth the effort?

  • Conventional indicators tend to favour incumbents

◮ Incumbents have the power and the incentive to influence the choice of indicators

  • Important to support diversity in the S&T system

◮ Manage diverse portfolios to hedge against uncertainty in research
◮ Systemic (‘ecological’) understanding of the S&T system
◮ Evolutionary understanding of excellence and relevance
◮ Open the possibility for S&T to work for the disenfranchised

◮ There aren’t neglected diseases. There are neglected populations.

SLIDE 27

Conventional Policy Dynamics

[Figure: conventional policy dynamics. Complex, dynamic, inter-coupled and mutually reinforcing socio-technical configurations in science pass through SOCIAL APPRAISAL with a narrow scope of attention (expert judgements / ‘evidence base’; “best / optimal / legitimate” S&T indicators, risk assessment, cost-benefit analysis; also restricted options, uncertainties in participation, incomplete knowledges), producing simple ‘unitary’ prescriptions and GOVERNANCE COMMITMENTS that ‘lock in’ to the POSSIBLE FUTURES favoured by incumbent power structures (e.g. research excellence). Multiple practices and processes, emergent and unstructured as well as deliberately designed, inform social agency.]

Source: Stirling (2010)

Background

SLIDE 28

Breadth, Plurality and Diversity

[Figure: breadth, plurality and diversity in policy dynamics. POSSIBLE PATHWAYS and MULTIPLE TRAJECTORIES feed broad-based processes of ‘precautionary appraisal’ covering multiple methods, criteria, options, frames, uncertainties, contexts, properties and perspectives; ‘opening up’ yields ‘plural conditional’ outputs to policymaking, and GOVERNANCE COMMITMENTS take the form of dynamic portfolios pursuing diverse trajectories, with options viable under different conditions, dissonant views, sensitivities, scenarios, maps, equilibria, pathways and discourses (e.g. sustainability).]

Source: Stirling (2010)

Background

SLIDE 29

Global map of science – 222 SCI-SSCI Subject Categories

[Figure: global map of science based on 222 SCI-SSCI Subject Categories, drawn with Pajek. Macro-disciplines shown include Cognitive Sci., Agri Sci., Biomed Sci., Chemistry, Physics, Engineering, Env Sci & Tech, Matls Sci., Infectious Diseases, Psychology, Social Studies, Clinical Med, Computer Sci., Business & MGT, Geosciences, Ecol Sci., Econ, Polit. & Geography, Health & Social Issues.]

Source: Rafols, Porter and Leydesdorff (2010)

SLIDE 30

Global map of science – 222 SCI-SSCI Subject Categories

◮ CD-ROM version of the JCR of SCI and SSCI of 2009
◮ Matrix of cross-citations between journals (9,000 × 9,000)
◮ Collapse to the ISI Subject Category matrix (222 × 222)
◮ Create a similarity matrix using Salton’s cosine (Rafols et al., 2010)

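The last two steps can be sketched in NumPy. The matrix below is a tiny invented cross-citation matrix standing in for the real 222 × 222 Subject Category matrix; Salton’s cosine is taken between the citing profiles of each pair of categories:

```python
import numpy as np

# Toy cross-citation matrix C between 4 hypothetical subject categories
# (C[i, j] = citations from category i to category j; invented numbers).
C = np.array([[50., 10.,  2.,  0.],
              [ 8., 40.,  5.,  1.],
              [ 1.,  6., 30., 12.],
              [ 0.,  2., 10., 25.]])

# Salton's cosine between the citing profiles of categories i and j:
# s_ij = (c_i . c_j) / (|c_i| |c_j|)
norms = np.linalg.norm(C, axis=1)
S = (C @ C.T) / np.outer(norms, norms)

# Distances used for disparity in the diversity measures: d_ij = 1 - s_ij
D = 1.0 - S

print(np.round(S, 2))
```

The resulting `S` is symmetric with ones on the diagonal, and `D` supplies the $d_{ij}$ distances used by the Rao–Stirling and disparity measures earlier in the talk.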

SLIDE 31

Backup slides Figures

References I

Grupp, H. and Schubert, T. (2010). Review and new evidence on composite innovation indicators for evaluating national performance. Research Policy, 39(1):67–78.

Hacking, I. (1991). How should we do the history of statistics? In Burchell, G., Gordon, C., and Miller, P., editors, The Foucault Effect: Studies in Governmentality. University of Chicago Press, Chicago.

Leach, M., Scoones, I., and Stirling, A. (2010). Dynamic Sustainabilities: Technology, Environment, Social Justice. Earthscan.

Leydesdorff, L. and Bornmann, L. (2011). Integrated impact indicators compared with impact factors: An alternative research design with policy implications. Journal of the American Society for Information Science and Technology, 62(11):2133–2146.

SLIDE 32

Backup slides Figures

References II

Rafols, I., Leydesdorff, L., O’Hare, A., Nightingale, P., and Stirling, A. (2012). How journal rankings can suppress interdisciplinary research: A comparison between innovation studies and business & management. Research Policy, 41(7):1262–1282.

Rafols, I., Porter, A. L., and Leydesdorff, L. (2010). Science overlay maps: A new tool for research policy and library management. Journal of the American Society for Information Science and Technology, 61(9):1871–1887.

Stirling, A. (2007). A general framework for analysing diversity in science, technology and society. Journal of The Royal Society Interface, 4(15):707–719.

Stirling, A. (2010). Keep it complex. Nature, 468:1029–1031.

Yegros, A., Amat, C., D’Este, P., Porter, A. L., and Rafols, I. (2010). Does interdisciplinary research lead to higher scientific impact? Conference paper, STI Indicators Conference, Leiden.
