A PUZZLE OF ESTONIAN SCIENCE: HOW TO EXPLAIN UNEXPECTED RISE OF THE SCIENTIFIC IMPACT.

Author: Kalmer Lauk
Report
  1. Introduction

    The quality of a country's scientific publications can be predicted, at least in part, from its GDP per capita and from the percentage of wealth it spends on R&D (Allik 2013a, King 2004, Vinkler 2018). Hence, only fairly rich nations that spend a considerable share of the wealth they produce on R&D can afford to produce high-quality scientific papers that have an impact on science. It has also been noticed that open countries whose scientists collaborate with foreign colleagues are likely to produce scientific output of higher quality (European Commission 2015, Moed 2005, Wagner and Jonkers 2017). Although wealth and money are important factors, countries differ considerably in the efficiency with which they turn financial input into bibliometrically measurable output (King 2004, Leydesdorff and Wagner 2009, Vinkler 2008). This indicates that not all R&D money is necessarily turned into high-quality scientific output; some of it is lost in translation. It has also been observed that countries differ in their ability to transform scientific research into immediate economic return (Vinkler 2008). Besides money, achieving scientific excellence requires reasonable science policies, a research ethos, and even a culture that supports the discovery of new ideas (Jurajda, Kozubek, Munich, and Skoda 2017, Moed 2005, Ntuli, Inglesi-Lotz, Chang, and Pouris 2015, van Leeuwen and Moed 2012, van Leeuwen, Visser, Moed, Nederhof, and van Raan 2003).

    In the study of factors that could determine scientific excellence, the progress of science in the three Baltic states--Estonia, Latvia, and Lithuania--may be particularly informative (Allik 2003, Kristapsons, Martinson, and Dagyte 2003). By coincidence, each of the three countries published only about 300 papers per year in journals covered by the Web of Science (WoS; now maintained by Clarivate Analytics) around the time the Soviet Union collapsed in 1991 (Allik 2003). Only fifteen or so years later, in 2007, Lithuania's scientists published about 1,300 papers in peer-reviewed journals, against only about 400 papers authored by Latvian researchers (Allik 2008; Figure 1). Although the three Baltic countries are often confused with one another, their scientific output, in both quantity and quality, has diverged remarkably in the years since they regained independence in 1991. In spite of very similar historical, political, and economic experiences, their scientific progress, as measured by bibliometric indicators, has been dramatically different (Allik 2011, 2015). To a certain extent, this resembles a natural experiment in which three subjects received different treatments so that the effect on their scientific progress could be observed.

    In this paper we provide an overview of Estonian science, using Latvia and Lithuania as benchmarks, based on the latest release (March 15, 2018) of the Essential Science Indicators (ESI; Clarivate Analytics), which covers the 11-year period from 2007 to 2017. As we hope to demonstrate, the progress of Estonian science, especially during the last decade, has been spectacularly fast. This success in turning financial input into bibliometrically measurable output could even be called miraculous because, according to Statistics Estonia, investment in R&D has diminished over the past three years and amounts to an embarrassing 0.8% of Estonia's GDP (https://www.stat.ee/news-release-2017-128). We do not expect to solve this puzzle--turning a diminishing financial input into an increasing bibliometric output--completely. Instead, we hope to provide some additional knowledge of how to avoid mistakes in nurturing such a delicate process as scientific excellence.

  2. Method

    Data were collected from the latest ESI release (updated on March 15, 2018), covering the 11-year period from January 1, 2007 until December 31, 2017. All journals, except universal ones such as Nature, Science, and the Proceedings of the National Academy of Sciences (PNAS), are divided into 21 scientific areas, in addition to a Multidisciplinary category containing papers that are difficult to assign to any of these areas. When ESI was designed, it was decided to exclude the humanities from the list of scientific areas. Thus, ESI data cannot tell anything specific about the state of the humanities in any country or institution.

    ESI tracked more than 12 million articles in more than 12,000 journals that were published during the 11-year observation period and indexed in the WoS. Inclusion in ESI depends upon meeting certain citation thresholds: only the most highly cited individuals, institutions, journals, countries, and papers are included. Researchers, institutions, and highly cited papers must exceed the top 1% citation threshold to be included. For instance, to be included as a highly cited researcher in any of the 22 areas, the total number of citations to a person's output must be in the top 1% compared to all other researchers who have published papers in that particular area during the last 11 years. Thresholds differ remarkably between areas: a computer scientist enters ESI by collecting at least 322 citations to papers published during the last 11 years, while the threshold for a physicist is as high as 7,999 citations. Countries/territories and journals, by contrast, need only be among the top 50% in order to enter ESI.
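    The percentile-based cut-off described above can be sketched in a few lines; this is a minimal illustration of how such a threshold is derived from a ranked citation distribution, not the actual ESI procedure, and the citation counts used are hypothetical:

```python
def citation_threshold(citations, top_fraction=0.01):
    """Smallest citation count that still falls within the top fraction
    of a ranked citation distribution (hypothetical illustration)."""
    ranked = sorted(citations, reverse=True)
    # Size of the top slice, at least one item.
    cutoff = max(int(len(ranked) * top_fraction), 1)
    return ranked[cutoff - 1]

# With 200 hypothetical papers cited 1..200 times, the top 1% comprises
# the 2 most-cited papers, so the threshold is 199 citations.
print(citation_threshold(range(1, 201)))  # 199
```

    The same function with a 0.5 cut-off would mimic the laxer top-50% rule applied to countries and journals.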

    Because ESI includes countries/territories that produced only a small number of papers during the 11-year observation period, we excluded from further analysis all countries/territories publishing fewer than 4,000 papers. For example, over 3,000 papers were published by Senegal, Panama, Malawi, Uzbekistan, Zimbabwe, Macedonia, Sudan, and Burkina Faso. It may also be mentioned that Bermuda, the Seychelles, and the Vatican each published fewer than 300 papers included in ESI over the 11 years.

  3. Results

    Table 1 presents a list of countries that entered ESI and published more than 4,000 documents in the period 2007-2017. The listed countries are ranked according to the mean citations per paper (the 5th column, Cites/Paper). The 6th column (Top Paper %) shows the percentage of papers that reached the top 1% in citations. The 7th column (HQSI Rank) gives each country's rank on the High Quality Science Index, which was proposed to combine the average citation rate with the percentage of papers reaching the top 1% (Allik 2013a). To compute the HQSI, both indicators, the mean citation rate and the percentage of top papers, were transformed into normalized scores, after which their mean value was taken. The last column shows the change in ranking position compared to a similar ranking for the period 1997-2007 (Allik 2008; Table 1). Several countries (Luxembourg, Nepal, Ecuador, Qatar, Macau, Bosnia and Herzegovina, and Iraq) were missing from the previous list, so the change in ranking cannot be computed for them.
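    The HQSI computation described above can be sketched as follows; reading "normalized scores" as standardization into z-scores is our assumption, and the country values below are made up for illustration:

```python
from statistics import mean, pstdev

def hqsi(cites_per_paper, top_paper_pct):
    """High Quality Science Index: the mean of the two indicators after
    each is standardized (z-scored) across countries. The input lists
    are parallel, one entry per country (illustrative values only)."""
    def zscores(values):
        m, s = mean(values), pstdev(values)
        return [(v - m) / s for v in values]
    return [mean(pair) for pair in zip(zscores(cites_per_paper),
                                       zscores(top_paper_pct))]

# Three hypothetical countries: the middle one sits at the mean of both
# indicators, so its HQSI is (up to rounding) zero.
print(hqsi([12.0, 15.0, 18.0], [0.8, 1.0, 1.2]))
```

    Because both indicators are standardized before averaging, neither the citation rate nor the top-paper percentage dominates the index through its raw scale.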

    Small countries such as Iceland, Switzerland, and Scotland were able to produce science of the highest impact. Together with the Netherlands and Denmark, they produced papers with the highest mean citation rate, of which the highest percentage reached the top of the citation distribution. Comparing the 1997-2007 ranking (Allik 2008; Table 1) with the current one, three countries, the Republic of Georgia, Singapore, and Saudi Arabia, improved their positions the most, climbing 50, 31, and 19 places respectively. The three countries that dropped the most were Vietnam (-26), Poland (-18), and Russia (-18). Estonia climbed 11 positions in the ranking, while Latvia and Lithuania dropped 13 and 16 positions respectively over the last 10 years.

    There have been worries that Americans produce higher-quality science than the EU countries, with the gap between them widening (Albarran, Crespo, Ortuno, and Ruiz-Castillo 2010, European Commission 2015, Leydesdorff, Wagner, and Bornmann 2014). Inspecting the table above, there is no foundation for these fears. The USA not only lost 5 rank positions compared with the ranking of 10 years ago, but its HQSI rank (15) is 8 positions behind its rank based on mean citations (7). This negative gap can be used as a Mediocrity Index, pointing to countries that produce an unexpectedly small number of highly influential papers relative to the total number of papers indexed in ESI (Allik 2013a). For example, experts noticed several years ago that Scandinavian countries, including Sweden, may have fallen into a comfort-zone trap, producing an...
