An Investigation of Information Usefulness of Google Scholar in Comparison with Web of Science*
  • CC BY-NC (Non-Commercial)
ABSTRACT

Google Scholar (GS), despite certain shortcomings as a citation indexing database, is used by a large number of users because it is free and provides a large body of user-oriented data. This study aims to assess the information usefulness of Google Scholar for scholarly information retrieval by comparing it with Web of Science (WoS), an established citation indexing database built on controlled vocabulary. The empirical evaluation of information usefulness proceeded in two parts. First, a quantitative analysis examined whether the two databases differ in the number of retrieved records and cited references for journals in the field of library and information science. Second, a qualitative analysis surveyed users to determine whether GS can serve as a substitute information source for users with limited access to WoS. The results show that, in the quantitative evaluation, GS produced statistically significantly higher numbers of records and citations than WoS, suggesting that it can function as a substitute source, whereas in the qualitative evaluation users perceived little qualitative difference between GS and WoS, so the substitution function could not be confirmed.

KEYWORD
Information Usefulness, Citation Analysis, Citation Indexing, Database Evaluation, User Evaluation
    1. Research Problem

    From its beginning, even before it bore its current name, Web of Science (WoS) has long been the dominant provider of citation indexing services. Its history began with a paper version of the citation indexes, which turned into a text-based database, and it now offers much richer information including links to full-text documents. Since producing this kind of information requires great time and effort as well as professional expertise, the service cannot be provided for free. It is still used and trusted by most researchers across academic fields, but it is challenged by competitors such as Elsevier's Scopus. Google now offers a similar service, Google Scholar (GS), a web search engine that lets users find scholarly literature at no cost. When WoS and other subscription-based citation indexing services were the sole providers of scholarly information, one might have given up obtaining the full text of a cited work, or requested an interlibrary loan and waited a week to receive a scanned copy. Full-text availability of materials indexed in WoS depends on the user's affiliated organization's subscriptions, while some lucky users can find a pre-print or full text of a document through the author's self-archiving site in GS.

    The purpose of this study is to investigate whether GS can substitute for WoS for those who don't have access to the subscription-based indexing service, and whether users find GS useful for scholarly information. To achieve this purpose, the study evaluates both quantitative and qualitative aspects of the two databases with the following research questions.

    RQ 1. Comparison of the quantitative measures between WoS and GS
    1-1. Is there a statistically significant difference in the number of records between WoS and GS?
    1-2. Is there a statistically significant difference in the number of citations between WoS and GS?

    RQ 2. Comparison of the qualitative evaluation of the information service: is GS perceived as a possible substitute for WoS?
    2-1. Is ease of use perceived differently between WoS and GS by users?
    2-2. Are Searching and Customer Support different between WoS and GS?

    The number of records and citations were selected to examine the information usefulness of GS because coverage has been an issue, or even a threat, for WoS, especially in the social sciences, management, and education (De Winter et al. 2014, 1561). While 69% to 84% of the journals in chemistry, biology, physics, and the health sciences are indexed in WoS, only 4% to 19% of social science, management, and education journals are indexed in the database. Although this selectivity is why journals indexed in WoS are considered very prestigious in their fields, users may not find the database useful once they realize they are possibly missing relevant sources for their research.

    Perceived ease of use is defined as “the degree to which a person believes that using a particular system would be free of effort” (Davis 1989, 320). The concept is one of the widely used factors affecting people’s acceptance of a technology along with perceived usefulness and perceived enjoyment (Ong et al. 2013). In other words, assuming other factors being equal, people tend to accept a system that is easier to use. To investigate whether users of academic libraries will accept GS as a substitute for the existing citation indexing service, WoS, the study will survey 67 undergraduate students asking how easy they perceive GS compared to WoS.

    The result of this study is expected to help determine whether GS can be a useful retrieval tool for scholarly literature in academic libraries, in spite of its obvious weaknesses compared to WoS.

    2. Review of the Literature

       2.1 Research on Citation Indexing Services and Google Scholar

    WoS is a citation indexing service provided by Thomson Reuters. It covers various academic disciplines and document types, such as journal articles, books, conference proceedings, and technical papers. Because of its authority as a reliable source of bibliographic records and the vast amount of data available for analysis, WoS has been used both as a source of citation data and as a research subject, especially as an object of comparison with other databases. Studies involving citation analysis mostly use citation data extracted from the Web of Science Core Collection or a list of highly ranked journals from Journal Citation Reports (JCR), which also uses WoS's citation data to generate journal impact factors and other scores for ranking those journals (García et al. 2014; Crespo et al. 2014; Jokic et al. 2010; Leydesdorff et al. 2013).

    WoS is also used as an object of comparison with other similar database services. Scopus, another citation indexing database provided by Elsevier, covers journals, books, conference papers, and patents in the life sciences, social sciences, physical sciences, and health sciences. It started its service in 2004, much later than WoS, but it has been growing rapidly, expanding its coverage, and is recognized as another influential citation indexing database. After its advent, researchers rushed to test its reliability and compare it to the existing service that had long been the dominant, and sole, player (Gavel and Iselid 2008; Gorraiz and Schloegl 2008; Lopez-Illescas et al. 2008; Meho and Rogers 2008). In a more recent study, Abrizah et al. (2013) investigated journals in the field of library and information science with respect to scientific impact and subject categorization. Their study compared the coverage, scientific impact, and subject categorization of library and information science journals in WoS and Scopus, and discovered that some journals had different impact factors and journal ranks, and that only five of the top twenty journals listed in JCR were indexed in Scopus. Scopus now seems to have settled in as another option for citation indexing, or at least as a resourceful supplement to WoS.

    GS is a web search engine devoted to scholarly works of various formats and types, including journal articles and books. Since it is a web service, its major advantage over the subscription-based WoS or Scopus is accessibility. It is basically free, although most full-text articles indexed in the service are available only through subscription. Still, just as in the old days when WoS offered bibliographic information only, users can find a vast amount of useful information for free through GS. In particular, those with no affiliated organization subscribing to those expensive citation indexing services can still find bibliographic information about journal articles, books, and even some web pages that contain scholarly information. Because of its potential to complement the existing citation indexing services, GS has been studied by many researchers for its usefulness. As of August 2014, a search for "Google Scholar" as a topic returned 2,370 records in the WoS Core Collection. The search results can be analyzed with the "Analyze Results" menu, which ranks the records by fields such as author, country, document type, funding agency, language, source title, and WoS category.

    The following table shows the top 10 WoS subject categories that include research on GS. "Library Science and Information Science" is the subject field that studies the database the most, followed by medical fields. In fact, other than the two information science related fields, "information science library science" and "computer science information systems", medical areas are the ones that study GS as a research subject the most. Research on GS in LIS studies the usefulness of GS or compares GS with other services, while most of the research in medical areas uses GS as a tool for collecting sources for analysis.

    [Table: Top 10 WoS Subject Categories of Research on GS]

    Although GS was not built to compete with subscription-based citation indexing services, users and researchers still expect results similar, or at least close, to the experience of using WoS or Scopus. Cothran (2011, 293) surveyed 1,141 graduate students to analyze their "perceptions of Google Scholar as part of their research process" in terms of ease of use, usefulness, and satisfaction and loyalty. The study showed that the respondents perceived GS as fairly easy to learn, understand, access, and use, and recognized it as "a useful resource for their research" (Cothran 2011, 298). They were generally satisfied with the service, but their loyalty was not as strong.

    Harzing (2013, 1057) tested whether GS can be used as a source of citation data, using 20 Nobel prize winners as the authors of the tested articles. The test revealed that GS displays "considerable stability over time" and its comprehensive coverage seems likely to expand across all disciplines. In addition, the study tested whether citation metrics can be compared between GS and WoS, using the h-index and the number of citations calculated with data from each database. The average h-index and citations for the tested set were very similar in both databases for Medicine and Physics, while they were higher in WoS for Chemistry. Some subject fields showed dramatic differences between the two. The study argued that "in terms of comparability between disciplines GS might provide a less biased results than the Web of Science" (Harzing 2013, 1074). Harzing (2014, 565) then continued tracking the coverage of GS and indicated that the coverage was "increasing at a stable rate" and that comprehensiveness had also improved.

    However, not all researchers agree on the usefulness of GS. Lopez-Cozar et al. (2014) tested whether GS can handle false papers, which can influence the resulting bibliometric indicators. They demonstrated how easily false documents were indexed in GS and how they affected the citation data, which means that manipulation of such data is quite possible with GS. Franceschet (2010a, 243) also pointed out that "Google Scholar computes significantly higher indicators' scores than Web of Science". Yet Ortega and Aguillo (2013, 394) used GS Citations to analyze research collaborations at the institutional and country levels and created a visualization of such collaborations. Their conclusion suggested that GS Citations "is a suitable tool for collaboration studies only at macro level between countries and institutions" (Ortega and Aguillo 2013, 403).

    Studies on GS compare the service with other similar citation indexing databases, especially WoS, for particular subject fields (Amara and Landry 2012; Garcia-Perez 2010; Mikki 2010; Mingers and Lipitakis 2010). De Winter et al. (2014, 1547) examined GS and WoS across diverse research fields, and concluded that "GS has shown substantial expansion", although GS still has some weaknesses, including "false positive citations, duplicates, and lack of publication year" (De Winter et al. 2014, 1561).

    Research on GS still brings controversy and disagreement about whether the system is even comparable with the existing citation indexing services. However, some studies find that GS can be useful in academic libraries. In a study on acquiring scholarly content for academic libraries, Shim (2012) listed a few reasons why academic libraries with low budgets should consider using GS as a substitute for expensive subscription-based citation indexing services. Even though GS lacks controlled vocabulary and various functions, one of its biggest advantages is ease of use, and the other is full-text searchability. Contrary to the existing citation indexing services, where only indexed fields are searchable, GS searches the full text of the documents themselves, which enables users to find what they cannot find in citation indexing databases.

       2.2 Research on User Evaluation for Quality Assessment of Databases

    To ensure the quality and value of information services or databases, certain criteria need to be applied in evaluation. Regarding online science and technology information services, Kim et al. (2013) analyzed the influence of expectation and perceived performance on user satisfaction and loyalty. The evaluation criteria used by the study include credibility, ease-of-use, system usability, responsiveness, security, quality of information, and problem-solving capability, for both expectation and perceived performance of the online information service. The survey for this study includes selected questions from these previous studies on the evaluation of information services, and some evaluation criteria will be examined directly by the researcher. Regarding the evaluation of science and technology databases, Yoo (2004, 29-32) suggested the evaluation criteria displayed in the following table.

    [Table: Evaluation Criteria for Scientific Databases (Yoo, 2004)]

    The first four criteria, in the shaded cells, are for ensuring the quality of the information data itself; hence the remaining information service quality assessment indices, Searching, Ease-of-Use, and Customer Support, are applied to measure information usefulness in this study. Yoo (2000) also indicated that the three criteria can be used for the evaluation of web-based information services. In particular, the sub-topics of Ease of Use will be used as questions in the user survey.

    The Searching criterion is associated with the relevance of the retrieval method, tool, and media, and involves questions about response speed, systematic arrangement of menus, availability of Boolean operators, and completeness of the keyword and thesaurus system.

    For the Ease-of-Use criterion, questions involve whether standardized or general menus are available, the consistency of prompts and related screens, the usefulness of on-screen messages, and the existence of help functions.

    The User Support criterion includes the availability of user manuals and online tutorials. The qualitative aspects of the information usefulness of WoS and GS will be measured by a user survey based on these three criteria.

    3. Data Collection

    The study used a set of journals highly ranked in the Journal Citation Reports Social Science (JCRSS) edition for 2013, in the category "Information Science & Library Science". Among the 83 journals in the category, the top 36 were selected, because comparing all 83 journals' measurements resulted in an unreadable graph. The journals were ranked by their Eigenfactor Score, which is "an indicator of citation impact normalized by the size of the journal" (Davis 2008, 2186). The Eigenfactor Score is used instead of the widely used Journal Impact Factor (JIF) because it takes a different approach to measuring the importance of a journal. While the calculation of JIF counts journals, proceedings, or books indexed by WoS as citing works regardless of their topic or relatedness to the cited journal, the Eigenfactor Score only counts other influential journals as citing works.

    Basically, it considers "a journal to be important if it receives many citations from other important journals" (Bergstrom and West 2008, 1850), an idea that came from how Google ranks web sites. JIF does not consider the prestige or type of the works that cite the journal's articles, but the Eigenfactor Score limits the scope of citing works to other influential journals. The scores are also unaffected by journal self-citation, because the calculation excludes references from one article in a journal to another article in the same journal. Because it is a normalized score, it is more relevant for "complex subject areas that include several clusters of journals with a more specific focus" (Thomson Reuters 2012, 1).

    In addition, while the original JIF counts citations from the past two years only, the Eigenfactor Score counts citations over five years, which allows "a broader evaluation of journal citations, in particular for disciplines with longer cited lives" (Franceschet 2010b, 556). JIF scores may be appropriate for rapidly changing disciplines whose citation cycle is close to two years, but the Eigenfactor Score can be better for disciplines that need to consider citations over at least five years. WoS also offers a 5-year JIF from 2007 onward, but some journals are missing the 5-year JIF score.
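The "important journals cite important journals" idea behind the Eigenfactor Score can be illustrated with a simple power-iteration sketch over a journal citation matrix. This is only a rough illustration with made-up citation counts, not the actual Eigenfactor algorithm, which additionally damps the random walk and weights journals by the number of articles they publish:

```python
# Toy illustration of the idea behind the Eigenfactor Score: influence flows
# along citations, so a journal cited by influential journals scores highly.
# The citation counts below are hypothetical.

def eigenfactor_sketch(cites, iters=100):
    """cites[i][j] = citations from journal j to journal i."""
    n = len(cites)
    # Exclude self-citations, as the Eigenfactor calculation does.
    c = [[0.0 if i == j else float(cites[i][j]) for j in range(n)]
         for i in range(n)]
    # Column-normalize: each citing journal distributes one unit of influence.
    for j in range(n):
        total = sum(c[i][j] for i in range(n))
        for i in range(n):
            c[i][j] = c[i][j] / total if total else 1.0 / n
    # Power iteration: repeatedly redistribute influence until it stabilizes.
    score = [1.0 / n] * n
    for _ in range(iters):
        score = [sum(c[i][j] * score[j] for j in range(n)) for i in range(n)]
    return score

# Hypothetical counts: journal B is heavily cited by both A and C,
# so it should end up with the highest score.
cites = [
    [0, 10, 5],   # citations received by A
    [40, 0, 30],  # citations received by B
    [5, 10, 0],   # citations received by C
]
scores = eigenfactor_sketch(cites)
```

Because each column of the normalized matrix sums to one, the scores remain a probability distribution, and the journal cited most by the others (B here) receives the largest share.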

    To investigate whether GS can be helpful for those who do not have access to WoS, the study conducted a survey to evaluate the service, and the researcher compared qualitative aspects of each database based on the selected evaluation criteria: Searching, Ease-of-Use, and Customer Support. In particular, ease of use is the main topic of the survey, which asks how easy users perceive each database to be.

       3.1 Quantitative Measurement: Number of retrieved records and citations

    To evaluate the quality of GS compared to WoS in quantitative terms, the number of records and citations for each journal was counted. For the number of records indexed in WoS, abbreviated journal titles were used as the search query with the field set to publication name. The list of the 36 journals' abbreviated and full titles, in ranked order, is shown in the Appendix.

    To maximize the number of records in the result, basic search using all databases was performed without any limitations applied. For the number of citations for journals, Journal Citation Reports were used where the result page for journals in the subject category displays the number of total citations in 2013 for each journal.

    For GS, software named Publish or Perish (Harzing 2007) was used to perform the search. The software uses data from GS to show a list of documents containing the query, whether it is an author name or a journal title. A typical result page for a 'journal impact' search displays the number of papers, citations, years, cites per paper, papers per author, cites per year, citations per author per year, the average annual increase in the individual h-index, the contemporary h-index, and Egghe's g-index. Since the software only collects the first 1,000 papers to compute the metrics, the number of results was obtained by searching the GS site directly, using the advanced search, which allows users to search for the journal in a text box labeled "return articles published in".

    The number of citations was obtained from Publish or Perish, and most of the citation counts were computed from the first 1,000 papers included in the result. Among the top 36 journals, only five had fewer than 1,000 results: Journal of Informetrics, Journal of the Association for Information Systems, Information Systems Journal, International Journal of Computer-Supported Collaborative Learning, and Journal of Strategic Information Systems. For the other 31 journals, the number of citations was counted from the first 1,000 results.
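For readers unfamiliar with the metrics Publish or Perish reports, the h-index and Egghe's g-index are both computed from the citation counts of the retrieved papers. A minimal sketch of the two computations (not Publish or Perish's actual implementation) might look like this:

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the top g papers together have at least
    g squared citations (Egghe's g-index)."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

# Hypothetical citation counts for five papers of one journal:
papers = [10, 8, 5, 4, 3]
```

For these counts, four papers have at least four citations each (h-index 4), and the top five papers accumulate 30 citations, at least 5² (g-index 5).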

       3.2 Usability Measurement: User’s Evaluation of Service

    To examine whether GS can be useful enough to substitute for WoS when one has no affiliation with an organization subscribing to the service, a survey of 67 undergraduate students majoring in library and information science was conducted. The online survey was open for three days, from September 11, 2014 to September 13, 2014, and the average scores for each question were calculated with SurveyMonkey's "Analyze Results" menu. Before the survey, a brief introduction and explanation of each database were given to the respondents; then a set of tasks was given, including five known-item searches and one task for which respondents decided what to search. After completing the tasks, the respondents answered 10 Likert-scale questions for each database, shown in the survey table later in this paper, and one open-ended question for any comments. Of the 67 respondents, 34.32% had experience with WoS and 77.61% had used GS.

    4. Data Analysis

       4.1 Quantitative Aspects

    The following table shows the descriptive statistics for the number of records and citations for the selected journals in WoS and GS. The average number of records and citations for the journals indexed in WoS is much lower than the GS results. The number of records in WoS ranges from 76 to 1,835, while the GS results range from 90 to 9,160. For the number of citations, GS has a much wider range, from 101 to 287,111, while WoS citations range from 210 to 3,975.

    [Table: Descriptive Statistics for the Number of Records and Citations in WoS and GS]

    There was a peculiar result in GS for JASIST, considering the journal's well-known prestigious status in the LIS field. No results were found when searching for "Journal of the American Society for Information Science and Technology" or "Journal of the Association for Information Science and Technology", but 25 records for JASIST and 164 records for JASIS were found, so 189 was recorded as the number of records for JASIST.

    The following figures compare the number of records in each database. To make them easier to read, the first graph shows the results for the top 18 journals, and the second graph covers the lower half of the journals. The difference in the number of records indexed in each database varies among journals. Some journals, such as Journal of Health Communication and Journal of Informetrics, show very little difference, while others, such as Journal of Information Technology, are indexed far more heavily in GS than in WoS. None of the journals has more records in WoS than in GS.

    The next set of figures compares the number of citations in each indexing service. Here, the bars for the WoS data can hardly be seen because the gap between the two is so large. In particular, journals such as MIS Quarterly and Information Systems Research show large gaps between WoS and GS in the number of citations.

    The differences in these quantitative measures between the two databases were verified with a t-Test. The results show that the differences in the number of records and in the number of citations between WoS and GS are both statistically significant at the 99% confidence level. The t-Test results from SPSS 21 are shown in the following table.

    [Table: t-Test Results for Quantitative Measurements]

    The paired sample t-Test revealed that the quantitative measurements of the information usefulness of the two databases are significantly different, in terms of the number of records and the number of citations. The negative t score denotes that the mean scores of the two measurements for GS are much higher than the mean scores for WoS, which means GS offers much higher level of information usefulness than WoS in terms of the number of records and citations.
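The paired-sample t-Test reported above can be reproduced with standard statistical tools. As a sketch with hypothetical record counts (not the study's data), SciPy's `ttest_rel` produces the same kind of negative t statistic when the GS count consistently exceeds the WoS count for each journal:

```python
from scipy import stats

# Hypothetical per-journal record counts; each position is one journal,
# so the WoS and GS values form matched pairs.
wos_records = [100, 200, 150, 300, 250, 180]
gs_records = [900, 1500, 1200, 2500, 2000, 1600]

# Paired (dependent-samples) t-Test, the test type used in the study.
result = stats.ttest_rel(wos_records, gs_records)
# result.statistic is negative when the GS means are higher,
# and result.pvalue indicates whether the difference is significant.
```

With differences this consistent, the test yields a large negative t and a p-value well below 0.01, mirroring the pattern the study reports.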

       4.2 Usability Evaluation

    Before conducting the survey, the following preliminary evaluation of the databases was carried out in terms of Searching, Ease-of-Use, and Customer Support, the criteria reviewed in the literature review above.

    4.2.1 Searching

    The recently updated WoS offers three search options: All Databases, Web of Science Core Collection, and SciELO Citation Index. "All Databases" allows users to search across all database products indexed in WoS, while Web of Science Core Collection and SciELO Citation Index cover selected sets of citation indexes. The Web of Science Core Collection includes the Science Citation Index Expanded, the Social Science Citation Index, and the Arts & Humanities Citation Index, among others, and the SciELO Citation Index contains scholarly literature published in open access journals from Latin America, Portugal, Spain, and South Africa. The search page defaults to "All Databases" and "Basic Search", and users can switch to "Web of Science Core Collection" for more options such as "Author Search", "Cited Reference Search", and "Advanced Search". The SciELO Citation Index also offers "Cited Reference Search" and "Advanced Search" in addition to "Basic Search".

    The following table shows the structure of the available search options in WoS.

    [Table: Search Options in WoS]

    Advanced Search allows users to enter command line queries using field tags, Boolean operators, parentheses, and query sets, with limitations on language, document types, and timespan. The search page shows the list of available Boolean operators and most-used field tags such as TS for topic, TI for title, and AU for author.
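For example, a command-line query combining the field tags and Boolean operators mentioned above might look like the following (an illustrative query, not one used in this study):

```
TS=("google scholar") AND TI=(coverage)
```

This would retrieve records whose topic field contains "google scholar" and whose title contains "coverage".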

    The search page for GS is as simple as Google itself, with a text box and a search button. The only difference from Google is the radio buttons below the text box that let users choose between articles and case law. For articles, users may select whether patents should be included in the results. GS also offers an advanced search, if users notice the small triangle in the text box, which opens a list of the available options.

    [Figure: Advanced Search Options in GS]

    The same list of options is available on the search result page, where the search text box is also always offered. The availability of various options for menus and commands is also related to the ease-of-use criteria.

    4.2.2 Ease-of-Use

    The various search options and menus in WoS can be helpful for users with more searching experience. It can take more time for novice users to learn the available features and field tags, but an experienced user with clear information needs can make better use of the various features and options and feel more in control. In addition to the diversity of its menus, WoS maintains a search history for each session, and users can save the search history either to their WoS accounts or to their own local drives. The search history is very useful when one needs to combine results. If users don't want the whole set of search results to be saved, they can mark some records to make a "marked list", select some fields such as author(s), title, source, and abstract, and save the list online or in other file formats. All of these features and functions are explained in detail in Help. The Help pages explain each function on its corresponding page, and users can easily find other information in the contents and index of the Help pages. When an error occurs, a message appears above the text box explaining why the query cannot be completed, for example, a missing closing quotation mark or invalid query syntax. The result page for each record contains bibliographic information as well as links to full text, if available, and buttons to print or email the record. Users can also save the individual record page in other file formats such as HTML or plain text.

    As simplicity is the virtue of Google, the simple front page of GS can be bewildering for experienced users who want some level of control over the search. However, users with less searching experience, or without any specific information needs, may find the single mysterious text box useful. The search result page of GS shows a list of articles containing the query, mainly in the title, unless otherwise specified. Users can filter the search results by time and sort the list either by relevance or by date. They can also choose whether to include patents or citations, but they cannot save the results or export them to other formats. When the full text of an item is available, a link to a PDF file, with a description of where the file is from, appears on the right.

    The most notable difference of the individual record of the search result between WoS and GS is that the former has an organized structure enhancing consistency of the database while the latter has links to the original location of each item and does not have a consistent format for each item.

    4.2.3 Customer Support

    WoS provides users with multiple routes for customer feedback and support. They can contact customer service for technical support; provide feedback regarding data or citation corrections, missing articles, or missing issues; submit or recommend a journal for coverage; and give feedback about access, searching, marked lists, and other product features. Online tutorials are available either as recorded training modules or as live training with registration. Quick reference cards are available in PDF in other languages: Czech, French, German, Italian, Polish, Portuguese, and Spanish.

    A description and explanation of GS and its functions are available through "About Google Scholar" at the very bottom of the front page. The page contains information about search tips, mostly in FAQ format, listing frequently asked questions with answers below each question. However, GS does not offer a dedicated Help function, and it does not seem to care whether a quotation mark is closed or not, unlike WoS, where an error message appears for an unclosed quotation mark.

    4.2.4 Survey

    The questions in the survey of the 67 undergraduate LIS students were based on the "Ease of Use" evaluation criterion and its sub-topics described above. The survey consists of 10 Likert-scale questions with 6 levels: "don't know", "strongly disagree", "disagree", "neither agree nor disagree", "agree", and "strongly agree". One open-ended question was also given for any comments. Each level of the Likert-scale questions was weighted, from 1 for "don't know" to 6 for "strongly agree", and the reported scores denote the average score for each question.
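The weighting described above maps each response level to an integer and averages the weights per question. A small sketch of that computation, with invented responses for illustration:

```python
# Map each Likert level to its weight, 1 ("don't know") through 6 ("strongly agree"),
# following the weighting scheme described in the text.
LEVELS = ["don't know", "strongly disagree", "disagree",
          "neither agree nor disagree", "agree", "strongly agree"]
WEIGHTS = {level: i + 1 for i, level in enumerate(LEVELS)}

def average_score(responses):
    """Average weighted score for one question across all respondents."""
    return sum(WEIGHTS[r] for r in responses) / len(responses)

# Hypothetical responses to a single question:
answers = ["agree", "strongly agree", "agree", "neither agree nor disagree"]
score = average_score(answers)
```

Here the weights 5, 6, 5, and 4 average to 5.0, the kind of per-question score reported in the table below.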

    The following table shows the list of questions and the average scores for each database, as well as the differences between the scores.

    [Table: Survey Questions and Scores]

    [Table: t-Test for the Survey]

    The average scores for the questions overall are slightly higher for GS than for WoS, and some of the questions regarding the simplicity of the menus, ease of navigation, and consistency of the results show that GS is perceived as easier than WoS. However, a t-Test on the results with SPSS 21 revealed that the difference users feel is not statistically significant at the 0.05 level.

    5. Conclusion

       5.1 Summary

    The result of this study can be summarized into the following answers to the research questions.

    RQ 1. Comparison of the quantitative measures between WoS and GS.
    1-1. Is there a statistically significant difference in the number of records between WoS and GS?

    The result shows that a difference in the number of records between WoS and GS exists for the selected set of library and information science journals, and the difference is statistically significant (t = −5.218, p < 0.01). The t-Test result denotes that the average number of records is higher for GS than for WoS, which indicates that the level of information usefulness in terms of coverage is higher for GS.

    1-2. Is there a statistically significant difference in the number of citations between WoS and GS?

    The result shows that a difference in the number of citations between WoS and GS exists for the selected set of library and information science journals, and that the difference is statistically significant (t = − 4.178, p < 0.01). The t-Test result indicates that the average number of citations for GS is higher than that for WoS, which suggests that the level of information usefulness in terms of citations is higher for GS.

    RQ 2. Comparison of the usability evaluation of the information services: is GS perceived as a possible substitute for WoS?

    2-1. Is ease of use perceived differently between WoS and GS by users?

    The survey results show that the perceived ease of use does not differ significantly between WoS and GS, although GS is perceived by the respondents as slightly easier to use than WoS on some elements of the criteria.

    2-2. Are Searching and Customer support different between WoS and GS?

    The qualitative assessment shows that WoS offers much more systematic and diverse options for Searching than GS, in terms of search menus, command languages, and controlled vocabulary. WoS is also better in Customer support, providing various channels for users to give feedback about the service and to access online tutorials for each of its services.

       5.2 Discussion and Future Research

    The study investigated whether GS can be useful for those who do not have access to subscription-based citation indexing services, in terms of the number of records and citations indexed for the WoS subject category “Information Science and Library Science”. In addition, the study evaluated the usability aspects of GS compared to WoS in terms of Searching, Ease-of-use, and Customer support. In particular, the criteria for Ease-of-use were evaluated by surveying users of academic libraries.

    However, the study has some limitations that should be considered in future research. First, the scope of the study is restricted to a set of journals in a specific subject domain. Research on GS mostly focuses on a specific subject domain, but either with a vast amount of data (4,600 publications in Mingers and Lipitakis 2010) or with longitudinal analysis (De Winter et al. 2014; Harzing 2014). Additional quantitative measurements as well as qualitative assessments are necessary. On the quantitative side, GS indexes many more records and citations than WoS for the tested set of journals in the library and information science field.

    The quality of those records and citations is still questionable, because some studies suggest that citation data in GS are vulnerable to manipulation. The basic problem may be the lack of control in GS, although its ability to extract a vast amount of information automatically can be useful. Accordingly, more research on GS and its possible usefulness should follow, to utilize its extensive coverage and to find ways to filter out cases that can distort bibliometric indicators.

    On the qualitative side, in-depth interviews with more experienced users may help compare the databases in more detail. Expanding the scope to other disciplines may also yield more meaningful findings. In the process of searching for research on GS, the study found that medical-related areas mostly use GS as one of the retrieval tools for collecting sources for their analyses. There must be a reason why they acknowledge GS as a legitimate search tool alongside subscription-based databases such as PubMed, Embase, and CINAHL.

    Secondly, comparison between GS and WoS can be problematic, since WoS is such a well-established citation indexing service, not to mention that the service is offered on a subscription basis only. Comparison with freely offered web-based services, such as MS Academic Search, may be more appropriate for investigating which service is better for academic libraries with a low budget.

    References
    • 1. 김 완종, 김 혜선, 현 미환. 2013. “The Effect of Expectation Level and Perceived Performance of Online Science and Technology Information Service Quality on User Satisfaction and Loyalty.” Journal of the Korean Society for Information Management 30: 207-228.
    • 2. 심 원식. 2012. “Big Deals, Open Access, Google Scholar, and University Libraries’ Subscription to Electronic Scholarly Information.” Journal of the Korean Society for Information Management 29: 143-163.
    • 3. 유 사라 [Yoo, Sa-ra]. 2000. “An Analysis of Existing Evaluation Measures for Web Information Services I.” Journal of the Korean Society for Library and Information Science 34: 133-156.
    • 4. 유 사라 [Yoo, Sa-ra]. 2004. Information Quality and Evaluation of Information Services.
    • 5. Abrizah, A., A.N. Zainab, K. Kiran, and R.G. Raj. 2013. “LIS Journals Scientific Impact and Subject Categorization: A Comparison between GS and Scopus.” Scientometrics 94: 721-740.
    • 6. Amara, Nabil, and Rejean Landry. 2012. “Counting Citations in the Field of Business and Management: Why Use Google Scholar Rather Than the Web of Science.” Scientometrics 93: 553-581.
    • 7. Bergstrom, Carl T., and Jevin D. West. 2008. “Assessing Citations with the Eigenfactor Metrics.” Neurology 71: 1850-1851.
    • 8. Garcia, J.A., Rosa Rodriguez-Sanchez, J. Fdez-Valdivia, Nicolas Robinson-Garcia, and Daniel Torres-Salinas. 2014. “Best-in-class and Strategic Benchmarking of Scientific Subject Categories of Web of Science in 2010.” Scientometrics 99: 615-630.
    • 9. Cothran, Tanya. 2011. “Google Scholar Acceptance and Use among Graduate Students: A Quantitative Study.” Library & Information Science Research 33: 293-301.
    • 10. Crespo, Juan A., Neus Herranz, Yunrong Li, and Javier Ruiz-Castillo. 2014. “The Effect on Citation Inequality of Differences in Citation Practices at the Web of Science Subject Category Level.” Journal of the Association for Information Science and Technology 65: 1244-1256.
    • 11. Davis, Fred D. 1989. “Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology.” MIS Quarterly 13: 319-340.
    • 12. Davis, Philip M. 2008. “Eigenfactor: Does the Principle of Repeated Improvement Result in Better Estimates than Raw Citation Counts?” Journal of the American Society for Information Science and Technology 59: 2186-2188.
    • 13. De Winter, Joost C.F., Amir A. Zadpoor, and Dimitra Dodou. 2014. “The Expansion of Google Scholar versus Web of Science: A Longitudinal Study.” Scientometrics 98: 1547-1565.
    • 14. Franceschet, Massimo. 2010. “A Comparison of Bibliometric Indicators for Computer Science Scholars and Journals on Web of Science and Google Scholar.” Scientometrics 83: 243-258.
    • 15. Franceschet, Massimo. 2010. “Ten Good Reasons to Use the Eigenfactor Metrics.” Information Processing and Management 46: 555-558.
    • 16. Garcia-Perez, Miguel A. 2010. “Accuracy and Completeness of Publication and Citation Records in the Web of Science, PsycINFO, and Google Scholar: A Case Study for the Computation of h Indices in Psychology.” Journal of the American Society for Information Science and Technology 61: 2070-2085.
    • 17. Gavel, Ylva, and Lars Iselid. 2008. “Web of Science and Scopus: A Journal Title Overlap Study.” Online Information Review 32: 8-21.
    • 18. Gorraiz, Juan, and Christian Schloegl. 2008. “A Bibliographic Analysis of Pharmacology and Pharmacy Journals: Scopus versus Web of Science.” Journal of Information Science 34: 715-725.
    • 19. Harzing, Anne-Wil. 2007. Publish or Perish.
    • 20. Harzing, Anne-Wil. 2013. “A Preliminary Test of Google Scholar as a Source for Citation Data: A Longitudinal Study of Nobel Prize Winners.” Scientometrics 94: 1057-1075.
    • 21. Harzing, Anne-Wil. 2014. “A Longitudinal Study of Google Scholar Coverage between 2012 and 2013.” Scientometrics 98: 565-575.
    • 22. Jokic, Maja, Kresimir Zauder, and Srebrenka Letina. 2010. “Croatian Scholarly Productivity 1991-2005 Measured by Journals Indexed in Web of Science.” Scientometrics 83: 375-395.
    • 23. Leydesdorff, Loet, Stephen Carley, and Ismael Rafols. 2013. “Global Maps of Science Based on the New Web-of-Science Categories.” Scientometrics 94: 589-593.
    • 24. Lopez-Cozar, Emilio Delgado, Nicolas Robinson-Garcia, and Daniel Torres-Salinas. 2014. “The Google Scholar Experiment: How to Index False Papers and Manipulate Bibliometric Indicators.” Journal of the Association for Information Science and Technology 65: 446-454.
    • 25. Lopez-Illescas, Carmen, Felix de Moya-Anegon, and Henk F. Moed. 2008. “Coverage and Citation Impact of Oncological Journals in the Web of Science and Scopus.” Journal of Informetrics 2: 304-316.
    • 26. Meho, Lokman I., and Yvonne Rogers. 2008. “Citation Counting, Citation Ranking, and h-index of Human-Computer Interaction Researchers: A Comparison of Scopus and Web of Science.” Journal of the American Society for Information Science and Technology 59: 1711-1726.
    • 27. Mikki, Susanne. 2010. “Comparing Google Scholar and ISI Web of Science for Earth Sciences.” Scientometrics 82: 321-331.
    • 28. Mingers, John, and Evangelia A.E.C.G. Lipitakis. 2010. “Counting the Citations: A Comparison of Web of Science and Google Scholar in the Field of Business and Management.” Scientometrics 85: 613-625.
    • 29. Ong, Chorng-Shyong, Shu-Chen Chang, and Shwn-Meei Lee. 2013. “Website Satisfaction Dimensions: Factors between Satisfaction and Dissatisfaction.” Information Development 29: 299-308.
    • 30. Ortega, Jose Luis, and Isidro F. Aguillo. 2013. “Institutional and Country Collaboration in an Online Service of Scientific Profiles: Google Scholar Citations.” Journal of Informetrics 7: 394-403.
    • 31. Thomson Reuters. 2012. Eigenfactor Metrics in JCR Web: Frequently Asked Questions.
    Images / Tables
    • [<Table 1>] Top 10 WoS Subject Categories of Researches on GS
    • [<Table 2>] Evaluation Criteria for Scientific Databases (Yoo, 2004)
    • [<Table 3>] Descriptive Statistics for the number of records and citations in WoS and GS
    • [<Figure 1>] Record Count for Journals in WoS and GS
    • [<Figure 2>] Citation Count for Journals in WoS and GS
    • [<Table 4>] t-Test Results for Quantitative Measurements
    • [<Table 5>] Search Options in WoS
    • [<Figure 3>] Advanced Search for GS
    • [<Table 6>] Survey Questions and Scores
    • [<Table 7>] t-Test for the Survey