
http://unllib.unl.edu/LPP/

Library Philosophy and Practice 2011

ISSN 1522-0222

An Exploratory Study of Indian Science and Technology Publication Output

Dr. G. Buchandiran
Department of Library and Information Science
Loyola Institute of Technology
Chennai 602 103, India

Introduction

A nation's contribution to research promotes the knowledge-building process, and understanding that contribution is a basic component of science policy. In reality, science auditing is a complex process: policy makers employ a large number of science indicators, and not all are equally important. The metrics employed in this study are not comprehensive, as they are limited to scientific publications. Science auditing is not confined to a small set of indicators, and arriving at a broad science policy requires many others.

The paper presents the data, the analyses, and an evaluation of Indian research productivity derived from them. The descriptions presented reflect the publication profile only and are in no way conclusive.

Objectives

Most government statistics are the outcome of documenting productivity from databases. They are totals calculated along a number of dimensions and published as such. These statistics are simple numbers produced by addition rather than by complex mathematical tools (such as regressions and correlations). They refer not to a methodology for treating data but to the data themselves.

Science and technology statistics follow the same pattern.

As the primary motivation of this work is to analyse the scientific productivity of India using publication counts, the following specific objectives are set forth:

  • To observe the Indian scientific publication output for a period of five years;
  • To find and analyse the publication output of institutions contributing to the research output;
  • To observe the output in different disciplines and to document the trend in output in terms of discipline orientation;
  • To compare, to a lesser degree, the output of India with that of China and South Korea.

Data Source and Calculation of Indicators

Two crucial indicators employed in measuring research performance are publication and citation indicators, although assessment is not limited to these two. The indicators derived for the current presentation are publications in peer-reviewed journals, further parameters based on those publications, and the impact of the publications based on the citation scores of the journals.

The bibliometric measures used in this paper are:

  • The number of publications from India as indexed in the SCI, which counts research papers with Indian author addresses;
  • The number of papers produced by Indian institutions; the same assignment is used to measure the productivity of institutions; and
  • The perceived quality of publications as measured through the Subfield Impact Factor (the ratio between the total citations received in the current year for articles published in the previous two years and the total papers published in those two years; a minimal computational sketch of this ratio appears after the list below). The following databases have been used to compile the data contained in this study:

1. Science Citation Index Expanded, published by the Institute for Scientific Information (ISI). This is the primary database for the current study; it indexes more than 7,000 journals, selected by ISI from the hundreds of thousands of scientific journals on the basis of a few criteria. The ISI-indexed journals are, by and large, more significant than those not covered by ISI.

2. Journal Citation Reports (JCR) published by the Institute for Scientific Information.

3. Essential Science Indicators, the ISI product that systematically publishes consolidated and cumulative bibliometric data about journals, countries, and disciplines; and

4. Direct scanning of thousands of Indian serial publications to identify the 'real journals' in science and technology.
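
As a rough illustration of the ratio defined above, the following minimal sketch (Python, with invented placeholder numbers rather than values from the study) computes an impact-factor-style score for one journal.

```python
# Minimal sketch of the impact-factor-style ratio described above.
# The figures used are invented placeholders, not data from the study.

def impact_factor(citations_current_year: int, papers_prev_two_years: int) -> float:
    """Citations received in the current year to articles published in the
    previous two years, divided by the number of articles published in
    those two years."""
    if papers_prev_two_years == 0:
        return 0.0
    return citations_current_year / papers_prev_two_years

# Hypothetical journal: 312 citations in 2009 to its 2007-2008 articles,
# of which there were 240.
print(round(impact_factor(312, 240), 2))  # 1.3
```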

Scientific Output

In the period 1998-2009, Indian S & T output as reflected in the ISI database is uneven. In the last five years (2005-2009) significant growth is observed; in 2009 in particular, there is a remarkable increase of 25% in scientific publications over the previous year. Table 1 shows the total S & T papers produced by Indian scientists over this period.

Table 1. S & T publication productivity of India, 1998-2009 (as covered in ISI databases)

Year    Papers    % of change
1998    15652
1999    16373     + 4.60
2000    16486     + 0.69
2001    16269     - 0.01
2002    17740     + 9.04
2003    18726     + 5.55
2004    17934     - 4.22
2005    19832     + 10.58
2006    20847     + 5.11
2007    23038     + 10.5
2008    23745     + 3.06
2009    29190     + 25.03
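
The "% of change" column in Table 1 follows the usual year-over-year formula. The short sketch below (Python, illustrative only) recomputes one entry from the paper counts in the table.

```python
# Year-over-year percentage change, as used for the "% of change" column
# in Table 1: (current - previous) / previous * 100.

def yoy_change(previous: int, current: int) -> float:
    return (current - previous) / previous * 100

# Recomputing the 2007 entry from the 2006 and 2007 paper counts above.
print(f"{yoy_change(20847, 23038):+.2f}%")  # +10.51%, reported as +10.5 in Table 1
```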

Despite limitations in funding for science and technology, the contribution of Indian scientists to the world's scientific output has increased during the last five years. This performance results mainly from investments in human resource training over the last thirty years, made mostly by institutions and research laboratories. India has established a large number of institutions in the recent period, and these are currently yielding a large number of research papers.

Figure 1 below presents the growth pattern of science publications in SCI-indexed journals over the period 1998-2009.

Figure 1. Growth of Indian publications in the period 1998-2009

In the last few years, debates and discussions have been initiated about the scientific output of India in comparison with China and South Korea. The aim of this study is not to analyse or critically compare these outputs; however, it presents some data that need to be considered when drawing inferences about S & T comparisons among nations. The scientific output of the US, Canada, and the UK has remained roughly constant over the last couple of years, while there has been a significant increase for China and South Korea.

The prestigious high-impact scientific journals publish more papers of international authorship, whereas journals of moderate or low impact publish more national papers. This is true of journals published in Asian and Latin American countries. Among the Chinese, South Korean, and Indian journals, only two Chinese and one Indian journal have an impact factor of more than 1. The impact factor values of the 145 Chinese, South Korean, and Indian journals covered in ISI are much lower, and these journals can be deemed national rather than international because of their low visibility and limited international reception. Notwithstanding, the coverage of a specific country's journals influences that country's publication profile, and such influence is reflected in the databases.

Table 2 below gives the number of Chinese, Indian, and South Korean journals indexed in ISI databases and the papers published in these journals (journals/papers).

Table 2. Number of Chinese, Indian, and South Korean journals indexed in ISI databases (journals/papers)

Country        2005        2006        2007         2008         2009
China          46/6543     57/9666     60/10648     68/12417     71/14280
India          47/4237     43/4352     50/4526      47/4725      47/4704
South Korea    17/2252     18/2607     21/2805      24/3321      27/3390

Source: Journal Citation Reports 2005-2009

The coverage of Chinese and South Korean journals is increasing, while the number of Indian journals has remained almost constant over the last five years. Papers in Indian journals have also remained almost the same, while Chinese and South Korean journal papers continue to increase. If the papers in national journals are removed from the data, the publication profiles of India, China, and South Korea change.

Since database coverage has considerable influence on publication data, discussions based on a standard journal set offer more meaningful comparisons.
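
As a simple illustration of the adjustment mentioned above, the sketch below (Python) subtracts the papers that appeared in a country's own ISI-covered national journals from its total output, using India's 2009 figures from Tables 1 and 2.

```python
# Removing papers published in national journals from the total output,
# illustrated with India's 2009 figures (Table 1 total, Table 2 national
# journal papers). The same subtraction could be applied to any country
# for which both figures are available.

total_papers_2009 = 29190             # Table 1, India, 2009
national_journal_papers_2009 = 4704   # Table 2, India, 2009 (47 journals)

papers_outside_national_journals = total_papers_2009 - national_journal_papers_2009
print(papers_outside_national_journals)  # 24486
```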

The Department of Science and Technology, Government of India (DST) has a broad classification of science and technology. It recognizes eight broad S & T fields: agricultural sciences, biological sciences, chemical sciences, earth sciences, engineering and technology, medical sciences, mathematics, and physical sciences. The papers assigned ISI subject categories were relocated into these eight DST broad subjects. However, the ISI category "multidisciplinary sciences" cannot be placed in any of the eight DST categories and is kept separately. Table 3 and Figure 2 give an overview of the publication output in the analysed period.
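
The reassignment of ISI subject categories to the eight DST broad fields can be pictured as a simple lookup, as in the sketch below; the category names on the left are illustrative examples only, not the actual concordance used in the study.

```python
# Illustrative sketch of folding ISI subject categories into the eight DST
# broad fields, keeping "multidisciplinary sciences" apart. The example
# categories below are hypothetical stand-ins, not the study's mapping.

DST_FIELDS = {
    "Agronomy": "Agricultural sciences",
    "Plant Sciences": "Biological sciences",
    "Chemistry, Organic": "Chemical sciences",
    "Geosciences, Multidisciplinary": "Earth sciences",
    "Engineering, Electrical & Electronic": "Engineering and technology",
    "Oncology": "Medical sciences",
    "Mathematics, Applied": "Mathematics",
    "Physics, Condensed Matter": "Physical sciences",
}

def dst_field(isi_category: str) -> str:
    if isi_category == "Multidisciplinary Sciences":
        return "Multidisciplinary (outside the DST classification)"
    return DST_FIELDS.get(isi_category, "Unmapped")

print(dst_field("Oncology"))                    # Medical sciences
print(dst_field("Multidisciplinary Sciences"))  # Multidisciplinary (outside the DST classification)
```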

Table 3. Total number of papers in broad disciplines

No.  Subject category             2005    2006    2007    2008    2009
1    Agricultural Science         1259    1035    1100    1233    1043
2    Biological Science           1817    2148    2245    2510    2654
3    Chemical Science             3297    3958    4249    4486    5064
4    Earth Science                 700     685     523     876     732
5    Engineering & Technology     3007    3356    3263    3864    3923
6    Mathematics                   410     428     451     502     433
7    Medical Science              3330    3694    3901    4309    4517
8    Physical Science             2384    2683    2749    2838    2724
9    Multidisciplinary Science*    680     689     584     742     794

* Not recognized in the DST classification

The growth in the biological, chemical, and medical sciences is significant in terms of the number of papers over the five-year period.

Figure 2. Growth pattern of publications in broad disciplines

Institutional Productivity

In the presentation below, the productivity of institutions across the country is given.

In 2009, 6132 Indian institutions contributed 23745 papers in science and technology, of which academic institutions contributed 15880. For the remaining years as well, academic institutions contribute significantly to S & T output; they shoulder the major responsibility for S & T research in India. Table 4 gives the productivity of different types of institutions.

Table 4. Papers from different types of institutions over the five-year period

Type of institution      2005     2006     2007     2008     2009
Academic Institutions    13779    12978    13591    19714    15880
Research Institutions     6641     7446     6306     7029     7135
Private Institutions      1213      469      964      570      562

Impact Indicators

In a scientific research assessment system, considerable significance is attached to the quality of output: investment in S & T research demands a payoff. However, quality is difficult to measure and is not confined to specific criteria. Hence, certain proxy measures based on citations to scientific papers (total citations, mean number of citations, the ratio between the total number of papers and citations in a given period, and other normalized counts) are widely employed in many scientific output assessment systems.
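
The proxy measures listed above reduce to a few simple quantities. The sketch below (Python, with invented citation counts) shows how they can be computed for a set of papers.

```python
# Simple proxy measures of impact for a set of papers. The citation counts
# are invented; nothing here comes from the study's data.

citations = [0, 3, 12, 1, 7, 0, 25, 4]             # citations to eight hypothetical papers

total_citations = sum(citations)                    # total citations
total_papers = len(citations)                       # papers in the period
mean_citations = total_citations / total_papers     # mean citations per paper
uncited_share = citations.count(0) / total_papers   # one simple normalized count

print(total_citations, round(mean_citations, 2), round(uncited_share, 2))  # 52 6.5 0.25
```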

Measuring Impact

The major limitations in using the SCI papers for productivity assessment are:

  • A substantial percentage of papers has little or no subsequent impact on scientific research; and
  • The SCI total paper count includes less significant items such as letters to editors, short communications, and related material.

The quality of papers is gauged by publication in high-impact journals. Since impact factors are raw values that are not comparable across fields, a refined indicator, the subfield-corrected impact factor, is used to identify high-impact papers. Subfield-corrected impact factor values were calculated for all Indian papers in the analysed five years, and the journals scoring more than 10 were identified and listed in decreasing order of their values. Although the threshold of 10 may seem arbitrary, it identifies the top journals across subfields. The data on these journals and papers for the analysed five years are given below.
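
The selection step described above amounts to filtering journals by a threshold and sorting them by the corrected value, as in the sketch below; the journal names and scores are placeholders, not results from the study.

```python
# Keep only journals whose subfield-corrected impact factor reaches the
# threshold of 10, sorted in decreasing order of that value.
# Journal names and scores below are invented placeholders.

journals = {
    "Journal A": 24.6,
    "Journal B": 3.1,
    "Journal C": 11.8,
    "Journal D": 10.0,
}

THRESHOLD = 10
high_impact = sorted(
    ((name, score) for name, score in journals.items() if score >= THRESHOLD),
    key=lambda item: item[1],
    reverse=True,
)
print(high_impact)  # [('Journal A', 24.6), ('Journal C', 11.8), ('Journal D', 10.0)]
```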

Table 5 below presents, for each of the five years (2005-2009), the number of journals with a subfield impact factor of 10 or more in which Indian papers appeared, and the total papers published in those journals.

Table 5. Number of papers in journals with a high subfield impact factor (>= 10)

Year    No. of journals    No. of papers    % of total Indian output
2005    776                4726             26.35
2006    830                5727             28.87
2007    783                5955             28.56
2008    867                6663             28.92
2009    912                6428             27.07

The number of papers in high-impact journals increased over the analysed period, except in 2009, when a small reduction from 2008 is observed. The number of high-impact journals publishing Indian papers fluctuates across the period. However, barring the small drop in 2009, the percentage of papers in high-impact journals continued to increase marginally.

Indian Science and Technology Journals

When an Indian author writes a good scientific paper, he or she prefers to publish it in an international peer-reviewed journal. The simple reason is that, barring a very few, most Indian S & T journals lack perceived quality, and their reception at the international level is very poor. The number of Indian journals covered in international databases such as ISI and Scopus is very small, and their citation record, as assessed through the Science Citation Index, Google Scholar, or other sources, is not satisfactory. Hence, the researcher has carried out an exercise of identifying and documenting Indian S & T journals along with an important piece of data, the extent of peer review.

Peer review: The most important mechanism for ensuring the quality of a journal is the peer review system. Many Indian journals have been criticized for lacking a peer review system. Hence, in the current initiative, the peer review practices of Indian journals were investigated thoroughly. To understand them, the following questions were asked (a schematic checklist based on these questions is sketched after the list):

1. Are the Indian journals listed in peer-reviewed databases such as Web of Science and Scopus?

2. Do the journals clearly specify their peer review system?

3. Do the papers in the journals mention the dates of receipt, review (including revision), and acceptance?

4. Do the journals receive reasonable citations from international journals?
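
The four questions above can be treated as a simple checklist applied to each journal record, as in the schematic sketch below; the field names and the sample journal are hypothetical and only illustrate how the criteria might be combined.

```python
# Schematic checklist for the four peer-review questions above, applied to
# one journal record. The record and field names are hypothetical.

from dataclasses import dataclass

@dataclass
class JournalRecord:
    indexed_in_wos_or_scopus: bool          # question 1
    states_peer_review_policy: bool         # question 2
    prints_received_revised_accepted: bool  # question 3
    cited_by_international_journals: bool   # question 4

def criteria_met(journal: JournalRecord) -> int:
    """Number of the four criteria the journal satisfies (0-4)."""
    return sum([
        journal.indexed_in_wos_or_scopus,
        journal.states_peer_review_policy,
        journal.prints_received_revised_accepted,
        journal.cited_by_international_journals,
    ])

sample = JournalRecord(False, True, True, False)  # an invented journal
print(criteria_met(sample))  # 2 of the 4 criteria met
```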

It should be noted that peer review enables authors to enhance the quality of their papers and ensures that experiments and investigations are steered in the right direction by experts. In the absence of peer review mechanisms, journals are likely to publish trivial content. Encouraging and motivating young researchers remains a challenge for S & T research despite the many avenues available, and peer-reviewed journals give young researchers access to authoritative content.

We are concerned about the poor peer reviewing practice of Indian journals. Unless institutions insist that their scientists publish in peer-reviewed journals, and count only such publications, Indian journals will remain in this vicious circle.

Conclusion

The purpose of this study was to present a scientometric evaluation of India's scientific publication output over a period of five years. Output evaluation of this kind is a common procedure in scientometrics; it largely avoids conflicts of interest and was a key element of the study. The study demonstrates that such an exercise can have positive consequences for the quality of science, and it also aims to improve management and decision-making processes in science policy.

The research has two highlights:

1. The total comprehensive S & T impact of Indian publications is higher than that of China and slightly lower than that of South Korea.

2. ISI databases continue to index more Chinese and South Korean journals than Indian journals without any valid reason.

ISI databases are inadequate to represent the volume of science done in the country. The use of databases such as Scopus, which indexes more peer-reviewed journals, is advocated.

Bibliometric measures are not the only indicators for understanding the science done in a country or for arriving at science policy decisions. The pitfalls of bibliometric measures have been extensively documented; these measures should be combined with other indicators in evolving science policy decisions.

Science indicators are not limited to publications alone. Many governments regularly compile indicators to understand the performance of scientists and institutions in their countries, and they use several indicators to audit their countries' scientific investment. One of the most comprehensive sets is the Science and Engineering Indicators of the National Science Foundation (NSF) of the United States. The NSF employs a wide range of indicators, yet gives due importance to the number of papers in top-quality, refereed journals. Similarly, in many science indicator systems, papers in quality refereed journals are given priority over other indicators. Thus, the researcher defends the most crucial measure used in this study.

References

Gupta, B.M., & Dhawan, S.M. (2008). A scientometric analysis of S&T publications output by India during 1985-2002. DESIDOC Journal of Library & Information Technology 28(2): 73-85.

Moed, H.F., & Hesseling, F.Th. (1996). The publication output and impact of academic chemistry research in the Netherlands during the 1980s: Bibliometric analyses and policy implications. Research Policy 25: 819-836.

Pichappan, P., & Buchandiran, G. (2006). Peer reviewing in Indian S&T journals. Current Science 90(5): 615.

Rosenberg, N., & Nelson, R.R. (1994). American universities and technical advance in industry. Research Policy 23: 323-348.

Smith, T.E. (1985). Journal Citation Reports as a deselection tool. Bulletin of the Medical Library Association 73: 387-389.

Tassey, G. (1999). Choosing government R&D policies: Tax incentives vs. direct funding. Review of Industrial Organization 11(5): 579-600.

Tsutomu, H. (2003). Three steps in knowledge communication: The emergence of knowledge transformers. Research Policy 32: 1737-1751.

Van Raan, A.F.J., & Van Leeuwen, T.N. (2002). Assessment of the scientific basis of interdisciplinary, applied research: Application of bibliometric methods in nutrition and food research. Research Policy 31: 611-632.

Vonortas, N.S. (1997). Cooperation in Research and Development. Boston: Kluwer Academic Press.

Woolf, S. (1989). Statistics and the modern state. Comparative Studies in Society and History 31: 588-603.