
THEORY AND METHODS OF PROFESSIONAL EDUCATION

UDC 303.425.2

DOI: 10.21702/rpj.2018.4.10

Systematically Searching Empirical Literature

in the Social Sciences: Results from Two Meta-Analyses

Within the Domain of Education

David I. Pickup1, Robert M. Bernard1*, Eugene Borokhovski1, Anne C. Wade1, Rana M. Tamim2

1 Centre for the Study of Learning and Performance (CSLP), Concordia University, Montreal, Canada

2 College of Education, Zayed University, Dubai, UAE

* Corresponding author. E-mail: robert.bernard@concordia.ca

Abstract

Introduction. This paper provides an overview of the information retrieval strategy employed for two meta-analyses, conducted by a systematic review team at Concordia University (Montreal, QC, Canada). Both papers draw on standards first articulated by H.M. Cooper and further developed by the Campbell Collaboration, which promote a comprehensive approach to systematically searching an extensive array of resources (bibliographic databases, print resources, citation indices, etc.) in order to locate both published and unpublished research. The goal is to verify whether searching comprehensively through multiple resources retrieves unique studies and hence improves the overall representativeness of a diverse body of literature. We also analyze the sensitivity and specificity of the results by data source. Methods. To determine source sensitivity, we consider the percentage of results from each source retrieved for full-text review. To determine source specificity, we derive a percentage from the total number of studies included in the final meta-analysis compared against the overall number of initial results found. Results. The results demonstrate the need to search beyond the subject-specific databases of a particular discipline, as unique results can be found in many places. Databases for related disciplines provided 129 unique includes to each meta-analysis, and multidisciplinary databases provided 44 and 99 unique includes for the two meta-analyses in question, respectively. Manual search techniques were much more sensitive and specific than electronic searches of databases and yielded a higher percentage of final includes. Discussion. The results demonstrate the utility of a comprehensive information retrieval methodology like that proposed by the Campbell Collaboration, which goes beyond the main subject databases to locate the full range of information sources, including grey literature.

Keywords

systematic reviews, meta-analysis, information retrieval, search strategies, grey literature, bibliographic databases, literature search, literature reviews, librarians

Highlights

► Databases for related disciplines provided over 100 unique includes to each meta-analysis.

► Multidisciplinary databases provided 44 and 99 unique studies to each meta-analysis respectively.

► Manual search techniques were much more sensitive and specific than electronic searches and yielded a higher percentage of finally included studies.

► Logic and rigor of systematic literature searches apply across fields of study in the social sciences and are useful for primary empirical research as much as for systematic reviews and meta-analyses.

For citation

Pickup D. I., Bernard R. M., Borokhovski E., Wade A. C., Tamim R. M. Systematically Searching Empirical Literature in the Social Sciences: Results from Two Meta-Analyses Within the Domain of Education. Rossiiskii psikhologicheskii zhurnal - Russian Psychological Journal, 2018, V. 15, no. 4, pp. 245-265. DOI: 10.21702/rpj.2018.4.10

Original manuscript received 28.11.2018

Introduction

This paper will provide an overview of the information retrieval strategy employed for two large-scale meta-analyses within the domain of Education, conducted by a systematic review team at Concordia University [1, 2]. The team, in consultation with library professionals, has drawn on standards first articulated by H.M. Cooper [3, 4] and further developed by the Campbell Collaboration [e.g., 5, 6], which promote a comprehensive approach - by systematically searching an extensive array of resources (bibliographic databases, search engines, print resources, citation indices, etc.), using detailed strategies tailored to make maximum use of the features of each resource, in an attempt to locate both published and unpublished research. The results demonstrate the utility of a comprehensive information retrieval methodology that goes beyond the main literature databases to locate the full range of information sources, including so-called 'grey literature'.

Theoretical Framework

When G.V. Glass [7] introduced the concept of meta-analysis, he conceived of the research paradigm as an "analysis of analyses" that would offer a statistical examination of a large collection of studies so that their results could be integrated and a clearer view of the overall picture properly understood and presented. Rather than "casual, narrative discussions", the resulting publication would be a genuine attempt to make sense of an ever-expanding and often conflicting information landscape. A meta-analysis therefore is a specific class of systematic review that relies on quantitative data from primary studies addressing a common core research question. Meta-analysis summarizes systematically collected effect sizes from individual studies to estimate either the average magnitude of the difference on a common dependent variable between a treatment group and an alternative group (d-family effect size) or the degree of association between variables of interest (r-family effect size) in the entire population (or a large-scale sample) in question, and then tries to explain the variability that surrounds the overall effect size by systematically coding and analyzing methodological, substantive, and demographic moderator variables. The main research question (or group of related questions) should be stated and substantiated a priori to inform search strategies, to set up and describe inclusion criteria, and to meaningfully guide the review process through all its steps [e.g., 4] - from information retrieval to study selection, through effect size extraction, aggregation, and analyses, toward interpretation and presentation of the findings.
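For reference, the effect size families and the weighted summary described above can be written in their standard textbook form (a generic formulation added here for clarity; the notation is not taken from the two reviews themselves):

```latex
% d-family: standardized mean difference between the treatment group (T)
% and the alternative/control group (C) on a common dependent variable
d = \frac{\bar{X}_T - \bar{X}_C}{SD_{pooled}},
\qquad
SD_{pooled} = \sqrt{\frac{(n_T - 1)\,SD_T^{2} + (n_C - 1)\,SD_C^{2}}{n_T + n_C - 2}}

% The overall meta-analytic estimate is a weighted average of the k collected
% effect sizes, with weights w_i usually the inverse of each study's variance
% (the same form applies to r-family effect sizes, typically after Fisher's z transformation)
\bar{d} = \frac{\sum_{i=1}^{k} w_i d_i}{\sum_{i=1}^{k} w_i}
```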

This paper focuses on one step in Cooper's process, the literature search stage. When first articulating the methodology of meta-analysis, G.V. Glass [7] said little about the best practices and methods to employ in searching the literature; however, as the methodology of meta-analysis became more formalized, more detailed standards were developed [3, 8, 9, 10]. The work of two international organizations, the Cochrane Collaboration and the Campbell Collaboration, further helped develop standards for the conduct of systematic reviews and meta-analyses, including standards for searching the literature [6, 11]. Essential components of these standards are the systematicity, replicability, and comprehensiveness of literature searches.

Systematicity demands a well-planned strategy, determined as part of the research question formulation, for how to proceed to gather all existing evidence, or the entire population of studies [4]. This includes such steps as keyword formulation, informed by discipline-specific reference sources such as dictionaries and encyclopedias as well as database thesauri, and determining which selection of resources to use. Once carefully planned, the strategy can then be carried out in a systematic fashion.

H.M. Cooper [4] notes that researchers conducting reviews will have varying degrees of resources available (for example, the bibliographic databases that their institutions subscribe to), but that by searching broadly using various strategies, they will produce data sets that arrive at the same overall conclusions, and the standard of replicability can be met. The standard of replicability can further be enhanced if strategies are recorded and documented so that they can be reviewed and judgments made about their potential impact on the overall review or meta-analysis. This 'search history' should also note the date searches were conducted to make clear any gaps that have since developed in the literature.

When aiming for comprehensiveness, it is important to find a proper balance between recall and precision [12], or, in other words, sensitivity and specificity [13]. A strategy that emphasizes sensitivity will yield a larger quantity of results by searching a greater number of sources and targeting a greater array of synonymous terms that may also be relevant. A strategy that aims for specificity will return fewer results, but ones that are more likely to be relevant. In truth, these two concepts are not dichotomous, and the best strategies aim to find a good balance between sensitivity and specificity, for example by employing a good mix of both the OR and AND operators in a Boolean logic-driven search.
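In information retrieval terms, the two quantities being balanced are conventionally defined as follows (standard definitions added here for clarity; the paper treats them as counterparts of sensitivity and specificity):

```latex
\text{recall (sensitivity)} = \frac{\text{relevant records retrieved}}{\text{all relevant records in existence}}
\qquad
\text{precision (specificity)} = \frac{\text{relevant records retrieved}}{\text{all records retrieved}}
```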

Study Identification and Retrieval

H.M. Cooper [4] described a broad strategy to identify all relevant studies, which made use of various tactics using what he termed informal and formal channels. The former refers to personal contacts and approaches to research communities (what he termed 'invisible colleges'), as well as browsing related websites on the Web. The latter refers to searching library catalogues and databases, as well as conference proceedings, and includes browsing reference lists of identified studies.

The search should attempt to locate both formally published research, usually in the form of journal articles, and research published in less traditional forms such as locally generated technical and evaluation studies (e.g., at the school or community level), government-commissioned reports, theses, and unpublished manuscripts - what is generally classified as "grey literature" [e.g., 14, 15]. Although grey literature has often not undergone any peer review process, it may be considered a necessary counterbalance [16] to formally published materials, which demonstrate a tendency to show greater statistical significance and higher effect sizes, i.e. 'publication bias' [17, 18].

Due to the complicated and varied requirements of a systematic and comprehensive search strategy that balances sensitivity and specificity, as well as the retrieval of identified studies, many systematic review teams seek outside help from an information specialist or search professional.

The Role of the Librarian

Many researchers have drawn on the expertise of librarians to assist with the search and information retrieval stage of systematic review and meta-analysis projects, as their expertise with both electronic and manual search techniques coincides nicely with the requirements of the methodology [e.g., 19, 20, 21, 22]. Evidence suggests that librarian involvement produces a marked improvement in the quality of the review; A. Booth [23] examined the qualitative reviews found in Medline and noted that those explicitly involving a librarian in the process had the largest number of databases searched (thus improving the scope of outreach). L. Zhang, M. Sampson and J. McGowan [24] found that it was notably easier to assess the quality of search strategies in reviews where a librarian had been involved, as they were more likely to include detailed reporting and take personal responsibility. Likewise, S. Golder, Y. Loke and H.M. McIntosh [25] report that while few of the searches they analyzed were described in enough detail to be replicated, nearly half of those that could be replicated had been conducted by a librarian.

Librarians have also been at the forefront of attempts to identify areas for improvement in systematic review standards [24, 26] and to formulate best practices and guidelines [e.g., 5, 27]. These standards, if respected, will lead to higher-quality reviews and less publication bias [18, 28].

The systematic review team of the Centre for the Study of Learning & Performance (CSLP) has for many years included a dedicated librarian on staff. The librarians employed over the years have assisted with question formulation through scoping of the literature, conducted literature searches following methods closely aligned with those promulgated by the Campbell Collaboration [5, 6], tracked and managed the retrieved studies, assisted with coding, and served as co-authors on final reports. The review team has always advocated for a high standard of methodological quality in meta-analysis [29, 30]. This report, in turn, is intended to test a high standard in information retrieval methodology.

Methods

The following section reports the information retrieval strategies and results from two recent publications that were each the culmination of projects that stretched over several years and synthesized a large body of research. The first investigated critical thinking interventions (CT) and the second looked at the effectiveness of classroom technology integration at the post-secondary level (PedTech). Both reviews investigated areas of research that spanned a variety of academic disciplines and sectors.

A comprehensive approach best articulated in the Campbell Collaboration's Information Retrieval Policy Brief [5, 6] was adopted to search widely, using a diversity of resources and methods, namely:

1) Subject databases;

2) Multidisciplinary databases;


3) Citation Indexes;

4) Web searching;

5) Branching (Hand searching).

These strategies were designed to locate publicly available published and unpublished literature of assorted publication types (i.e., articles, reports, conference papers, manuscripts, dissertations & theses, etc.) and provide the most representative (i.e., unbiased) picture of available research evidence. The goal of this paper is to verify whether searching comprehensively through multiple resources in various fields retrieves studies that are unique (not found anywhere else), and hence improves the overall replicability [4] and representativeness of a diverse body of literature. We will also analyze the specificity and sensitivity of the results by data source (i.e., by database, hand searching, etc.) to see how well they performed in each review and whether any conclusions might be drawn. To estimate the sensitivity of an information source, we will determine what percentage of the results found in the source were retrieved for closer full-text examination. To estimate the specificity of an information source, we will divide the total number of studies from a source that are included in the final meta-analysis by the total number of initial results found in that source.
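As a minimal illustration of how these two per-source proxies are computed, the sketch below (hypothetical code, not the review team's actual tooling) applies them to the ERIC row of Table 1:

```python
from dataclasses import dataclass

@dataclass
class SourceCounts:
    name: str
    total_results: int       # initial hits returned by the search of this source
    retrieved_fulltext: int  # records judged promising enough for full-text review
    included: int            # studies that entered the final meta-analysis

def sensitivity(s: SourceCounts) -> float:
    """Share of a source's total results retrieved for full-text examination."""
    return s.retrieved_fulltext / s.total_results if s.total_results else 0.0

def specificity(s: SourceCounts) -> float:
    """Share of a source's total results included in the final meta-analysis."""
    return s.included / s.total_results if s.total_results else 0.0

# Example figures taken from the ERIC row of Table 1 (Critical Thinking review)
eric = SourceCounts("ERIC", total_results=2453, retrieved_fulltext=703, included=168)
print(f"{eric.name}: sensitivity = {sensitivity(eric):.0%}, specificity = {specificity(eric):.0%}")
# ERIC: sensitivity = 29%, specificity = 7%  (cf. Tables 3 and 5)
```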

Before looking at the various data sources used and how they performed, we shall review the strategies employed for each review and provide a summary of the results.

Critical Thinking (CT)

P.C. Abrami, R.M. Bernard, E. Borokhovski, D.I. Waddington, C.A. Wade and T. Persson [1] conducted a review of studies on the development and enhancement of critical thinking skills and dispositions with a link to student achievement; it began with the research question, "What impact do instructional interventions have on the development of students' CT skills and dispositions?" The final dataset contained 341 effect sizes from experimental research (quasi- or true-experiments) that used a standardized test for critical thinking skills as an outcome such as the Watson-Glaser Critical Thinking Appraisal [31] or the Cornell Critical Thinking Test [32]. In addition, the review examined how different types of instructional interventions affect CT skills and dispositions, what impact pedagogical background (e.g., instructor training) has, and how calculated effect sizes vary with age (educational level), subject matter, and treatment duration.

To build the search strategy, keywords used were divided into two main concepts (Domain and Method). In some searches, a third concept (Context) was added. Searches were not limited to a particular population. Search strategies were customized for each database using a combination of controlled vocabulary and natural language terms.


Domain: Critical Thinking, Thinking Skills.

Method: Experiment, Studies, Intervention, Treatment, Control Group, Post test.

Context: Education, Student, Learning, Teaching.

Terms were combined within sets using the Boolean operator OR, and the sets themselves were combined using the AND operator (Figure 1).

("critical thinking" OR "thinking skills") AND (Experiment* OR Study OR Studies OR Intervention* OR Treatment* OR "Control Group" OR Posttest OR "post test") AND (education OR student* OR learn* OR teach*)

Figure 1. CT Sample Search
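The query in Figure 1 can be assembled mechanically from the three concept sets. The sketch below is purely illustrative (it is not a tool used by the review team) and assumes the quoting convention shown in Figure 1: multi-word terms are quoted, single terms are left bare.

```python
def or_group(terms):
    """OR together synonymous terms, quoting multi-word phrases, and parenthesize the group."""
    formatted = [f'"{t}"' if " " in t else t for t in terms]
    return "(" + " OR ".join(formatted) + ")"

def build_query(*concept_sets):
    """AND together one OR-group per concept set (e.g. Domain AND Method AND Context)."""
    return " AND ".join(or_group(terms) for terms in concept_sets)

domain = ["critical thinking", "thinking skills"]
method = ["Experiment*", "Study", "Studies", "Intervention*", "Treatment*",
          "Control Group", "Posttest", "post test"]
context = ["education", "student*", "learn*", "teach*"]

print(build_query(domain, method, context))
# ("critical thinking" OR "thinking skills") AND (Experiment* OR Study OR Studies OR
# Intervention* OR Treatment* OR "Control Group" OR Posttest OR "post test") AND
# (education OR student* OR learn* OR teach*)
```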

The databases selected can be classified into three groups: main subject databases, related subject databases, and multidisciplinary databases.

Main Subject databases:

► Australian Education Index (https://www.acer.org/au/library/australian-education-index-aei);

► British Education Index (https://www.leeds.ac.uk/bei/index.html);

► CBCA Education (https://www.proquest.com/libraries/academic/databases/cbca.html);

► Education Abstracts/Fulltext (https://www.ebsco.com/products/research-databases/education-abstracts);

► ERIC (https://www.ebsco.com/products/research-databases/eric).

Related Subject databases:

► ABI/Inform Global (https://www.proquest.com/products-services/abi_inform_global.html);

► EconLit (https://www.ebsco.com/products/research-databases/econlit);

► Medline (https://www.nlm.nih.gov/bsd/medline.html);

► PsycINFO (https://www.apa.org/pubs/databases/psycinfo/index.aspx);

► SocIndex (https://www.ebsco.com/products/research-databases/socindex);

► Sociological Abstracts (https://www.proquest.com/products-services/socioabs-set-c.html);

► Social Services Abstracts (https://www.proquest.com/products-services/ssa-set-c.html).

Multidisciplinary databases:

► Academic Search Complete (https://www.ebsco.com/products/research-databases/academic-search-complete);

► Dissertations & Theses Global (https://www.proquest.com/products-services/pqdtglobal.html);

► Francis (https://www.inist.fr/?FRANCIS-74&lang=en);

► PAIS International (https://www.proquest.com/products-services/pais-set-c.html);

► Web of Science (www.webofknowledge.com/).

The primary tools used for retrieval of grey literature were Yahoo (http://yahoo.com) and Google (http://www.google.com); a series of searches were run using different combinations of keywords and the first 200 results of each were browsed. As a further step, the 'open access' library OAIster (https://www.oclc.org/en/oaister.html) was searched, as was the Ed/ITLib digital library (https://www.editlib.org).

Approximately sixty review articles and previous meta-analyses were used for "branching" (their bibliographies were scanned for other relevant studies). A citation search was also conducted on many of these same review articles using the Web of Science database to locate publications that had cited them; citation searches were also conducted on the main CT tests [e.g., 31].

Pedagogical Technology (PedTech)

R.F. Schmid, R.M. Bernard, E. Borokhovski, R.M. Tamim, P.C. Abrami, M.A. Surkes, C.A. Wade and J. Woods [2] performed a meta-analysis that reviewed primary research addressing the impact of computer technology, whether in face-to-face or blended settings, on students' achievement, performance, or attitudes. The population was limited to post-secondary formal education. Results were limited to post-1990 to capture modern Internet-era technologies. The review reports the overall weighted average effects of using technology on the academic achievement and attitudes of students, while exploring moderator variables in an attempt to offer an explanation for how the technology interventions lead to positive or negative effects. The search strategy was broadly based and retrieved a total set of nearly 12,000 abstracts for review. Of these, 1,105 were chosen for further full-text review, and these produced 879 achievement effect sizes and 181 attitudinal effect sizes.

Similar to the CT review, search strategies were customized for each database using a combination of controlled vocabulary and natural language terms. Key concepts used in search strategies were grouped into three sets - domain, population and outcome.

Domain: Technology, Computers, Web-based Instruction, Online, Internet, Blended Learning, Hybrid Course, Simulation, Electronic, Multimedia, PDAs, etc.

Population: College, University, Higher Education, Postsecondary, Continuing Education, Adult Learning, etc.

Outcome: Learning, Achievement, Attitude, Satisfaction, Perception, Motivation, etc.

Terms were combined within sets using the Boolean operator OR, and the sets themselves were combined using the operator AND (Figure 2).

(technolog* OR comput* OR "web-based instruction" OR online OR Internet OR "blended learning" OR "hybrid course*" OR simulat* OR electronic OR multimedia OR PDAs) AND (colleg* OR universit* OR "higher education" OR postsecondary OR "continuing education" OR "adult learn*") AND (learn* OR achieve* OR attitud* OR satisf* OR perception* OR motivat*)

Figure 2. PedTech sample search

The main subject domain (educational technology) was quite broad and synonyms such as 'electronic' and 'computer' can appear in many different contexts. Therefore, wherever possible database-specific descriptors were used. In the case of some databases the NOT operator was used to exclude studies pertaining to "distance education" in the descriptor field (see Figure 3 for an example from ERIC).

(DE=("handheld devices" or "computer assisted instruction" or "computer uses in education" or "educational technology" or "technology integration" or "electronic learning" or "laptop computers" or "blended learning" or "computer peripherals" or "computers" or "calculators" or "graphic calculators" or "cybernetics" or "instrumentation" or "data processing" or "electronic publishing" or "computer mediated communication" or "artificial intelligence" or "hypermedia" or "multimedia instruction" or "multimedia materials" or "computer simulation" or "electronic mail" or "electronic journals" or "portfolio assessment" or "internet" or "courseware") OR (KW=("PDA" or "personal digit* assistant*" or "cell* phone*" or "learning object*11 or "elearn*" or "e-learn*" or "hybrid course*" or "hybrid learn*" or "e-portfolio" or "eportfolio" or "digital portfolio" or "world wide web"))) not (DE=("communications satellites" or "distance education" or "open universities" or "telecommunications" or "telecourses" or "virtual universities"))

Figure 3. PedTech controlled vocabulary sample

The following databases were searched:

Subject databases:

► Australian Education Index (https://www.acer.org/au/library/australian-education-index-aei);

► British Education Index (https://www.leeds.ac.uk/bei/index.html);

► CBCA Education (https://www.proquest.com/libraries/academic/databases/cbca.html);

► Education Abstracts/Fulltext (https://www.ebsco.com/products/research-databases/education-abstracts);

► Education: Sage Full Text Collection (https://us.sagepub.com/en-us/nam/education-collection);

► ERIC (https://www.ebsco.com/products/research-databases/eric).

Related Subject databases:

► ABI/Inform Global (https://www.proquest.com/products-services/abi_inform_global.html);

► Communication Abstracts (https://www.ebsco.com/products/research-databases/communication-abstracts);

► Medline (https://www.nlm.nih.gov/bsd/medline.html);

► PsycINFO (https://www.apa.org/pubs/databases/psycinfo/index.aspx).

Multidisciplinary databases:

► Academic Search Complete (https://www.ebsco.com/products/research-databases/academic-search-complete);

► Dissertations & Theses Global (https://www.proquest.com/products-services/pqdtglobal.html);

► Francis (https://www.inist.fr/?FRANCIS-74&lang=en);

► Web of Science (www.webofknowledge.com/).

Google (http://www.google.com) and Yahoo (http://www.yahoo.com) web searches were performed to locate grey literature, including a search specifically for conferences (which were then browsed manually). Online resources such as the Ed/ITLib Digital Library (http://editlib.org), Australian Policy Online (https://apo.org.au/), and the OAIster 'open access' archive (https://www.oclc.org/en/oaister.html) were searched as well, principally for reports and conference papers.

Review articles and previous meta-analyses were used for "branching", and the tables of contents of recent issues of major journals in the field of educational technology were manually scanned for additional studies. Further, a number of online-only e-journals in the subject area of educational technology had been identified in the Google searches, and these were also browsed.

Results

Tables 1 and 2 provide, for the CT and PedTech reviews respectively, the raw totals from each source (including duplicates found in more than one source), how many of those were retrieved for full-text review, how many were included in the final analysis, and how many of the includes were found uniquely in that source and no other.

Table 1. Overview of search results from Critical Thinking [1].

| Source | Total Results | Retrieved | Included | Unique Includes |
| --- | --- | --- | --- | --- |
| AACE / EdITLib | 295 | 83 | 18 | 16 |
| ABI/Inform Global | 219 | 23 | 2 | 2 |
| Academic Search Premier | 736 | 182 | 40 | 14 |
| Australian Education Index | 303 | 75 | 9 | 5 |
| Branching | 923 | 274 | 80 | 29 |
| British Education Index | 164 | 66 | 8 | 4 |
| CBCA-Education | 9 | 4 | 0 | 0 |
| Dissertations & Theses | 968 | 272 | 196 | 175 |
| EconLit | 2 | 1 | 0 | 0 |
| Education Fulltext | 166 | 76 | 25 | 1 |
| ERIC | 2,453 | 703 | 168 | 108 |
| FRANCIS | 25 | 7 | 3 | 1 |
| Google | 152 | 71 | 14 | 1 |
| Index to Theses | 81 | 9 | 3 | 3 |
| Manual Search | 15 | 6 | 2 | 1 |
| Medline | 1,011 | 241 | 75 | 22 |
| OAIster | 57 | 19 | 10 | 6 |
| PAIS International | 0 | 0 | 0 | 0 |
| PsycINFO | 995 | 333 | 160 | 105 |
| SocIndex | 39 | 17 | 2 | 0 |
| Social Services Abstracts | 28 | 4 | 1 | 0 |
| Sociological Abstracts | 15 | 5 | 2 | 0 |
| Web of Science | 611 | 238 | 92 | 29 |
| Yahoo | 231 | 123 | 39 | 11 |


Table 2. Overview of search results from PedTech [2]

| Source | Total Results | Retrieved | Included | Unique Includes |
| --- | --- | --- | --- | --- |
| AACE / EdITLib | 472 | 122 | 15 | 13 |
| ABI/Inform Global | 110 | 16 | 3 | 2 |
| Academic Search Premier | 931 | 165 | 52 | 24 |
| Australian Education Index | 492 | 77 | 19 | 19 |
| Australian Policy Online | 12 | 1 | 0 | 0 |
| Branching | 937 | 379 | 210 | 134 |
| British Education Index | 958 | 281 | 79 | 47 |
| CBCA-Education | 27 | 8 | 0 | 0 |
| Communication Abstracts | 50 | 13 | 1 | 1 |
| Dissertations & Theses | 878 | 167 | 50 | 47 |
| Education Fulltext | 108 | 27 | 5 | 2 |
| Education: Sage Fulltext Collection | 17 | 4 | 0 | 0 |
| ERIC | 4,960 | 1,197 | 425 | 269 |
| FRANCIS | 127 | 19 | 9 | 3 |
| Google | 279 | 89 | 32 | 15 |
| Manual (journals) | 664 | 302 | 63 | 39 |
| Manual (conferences) | 43 | 17 | 4 | 4 |
| Medline | 1,737 | 447 | 157 | 122 |
| OAIster | 191 | 20 | 2 | 1 |
| PsycINFO | 301 | 53 | 8 | 4 |
| Web of Science | 1,517 | 370 | 128 | 72 |
| Yahoo | 592 | 218 | 51 | 36 |

Analyzing the data sources

Using the datasets provided by these two large-scale meta-analyses, we shall now take a closer look at the results breakdown to ascertain the overall sensitivity and specificity of the various information sources. To gauge the sensitivity of each source, it is informative to consider what percentage of studies from each source was retrieved for full-text review. Tables 3 and 4 report, for each project, this percentage relative to the total number of results found by the search in each source.

Table 3. Data sources by sensitivity - Critical Thinking [1]

| Source | Total Results | % Retrieved Full-text |
| --- | --- | --- |
| AACE / EdITLib | 295 | 28% |
| ABI/Inform Global | 219 | 11% |
| Academic Search Premier | 736 | 25% |
| Australian Education Index | 303 | 25% |
| Branching | 923 | 30% |
| British Education Index | 164 | 40% |
| CBCA-Education | 9 | 44% |
| Dissertations & Theses | 968 | 28% |
| EconLit | 2 | 50% |
| Education Fulltext | 166 | 46% |
| ERIC | 2,453 | 29% |
| FRANCIS | 25 | 28% |
| Google | 152 | 47% |
| Index to Theses | 81 | 11% |
| Manual Search | 15 | 40% |
| Medline | 1,011 | 24% |
| OAIster | 57 | 33% |
| PAIS International | 0 | 0% |
| PsycINFO | 995 | 34% |
| Social Sciences Index | 39 | 44% |
| Social Services Abstracts | 28 | 14% |
| Sociological Abstracts | 15 | 33% |
| Web of Science | 611 | 39% |
| Yahoo | 231 | 53% |


Table 4. Data sources by sensitivity - PedTech [2]


| Source | Total Results | % Retrieved Full-text |
| --- | --- | --- |
| AACE / EdITLib | 472 | 26% |
| ABI/Inform Global | 110 | 15% |
| Academic Search Premier | 931 | 18% |
| Australian Education Index | 492 | 16% |
| Australian Policy Online | 12 | 8% |
| Branching | 937 | 40% |
| British Education Index | 958 | 29% |
| CBCA-Education | 27 | 29% |
| Communication Abstracts | 50 | 26% |
| Dissertations & Theses | 878 | 19% |
| Education Fulltext | 108 | 25% |
| Education: Sage Fulltext Collection | 17 | 24% |
| ERIC | 4,960 | 24% |
| FRANCIS | 127 | 15% |
| Google | 279 | 32% |
| Manual (journals) | 664 | 45% |
| Manual (conferences) | 43 | 40% |
| Medline | 1,737 | 26% |
| OAIster | 191 | 10% |
| PsycINFO | 301 | 18% |
| Web of Science | 1,517 | 24% |
| Yahoo | 592 | 37% |

Next, to focus on the specificity of each data source, we consider the total number of included studies from each source (both uniquely discovered and duplicates) as a percentage of the overall number of initial results from that source.

Table 5. Data sources by specificity - Critical Thinking [1]

| Source | Total Results | % Included |
| --- | --- | --- |
| Dissertations & Theses | 968 | 20% |
| OAIster | 57 | 18% |
| Yahoo | 231 | 17% |
| PsycINFO | 995 | 16% |
| Education Fulltext | 166 | 15% |
| Web of Science | 611 | 15% |
| Manual Search | 15 | 13% |
| Sociological Abstracts | 15 | 13% |
| FRANCIS | 25 | 12% |
| Google | 152 | 9% |
| Manual (Branching) | 923 | 9% |
| Medline | 1,011 | 7% |
| ERIC | 2,453 | 7% |
| AACE / EdITLib | 295 | 6% |
| Academic Search Premier | 736 | 5% |
| Social Sciences Index | 39 | 5% |
| British Education Index | 164 | 5% |
| Index to Theses | 81 | 4% |
| Social Services Abstracts | 28 | 4% |
| Australian Education Index | 303 | 3% |
| ABI/Inform Global | 219 | 1% |
| CBCA-Education | 9 | 0% |
| EconLit | 2 | 0% |


Table 6. Data sources by specificity - PedTech [2]

| Source | Total Results | % Included |
| --- | --- | --- |
| Manual (Branching) | 937 | 22% |
| Google | 279 | 12% |
| Manual (journals) | 664 | 10% |
| Manual (conferences) | 43 | 9% |
| Medline | 1,737 | 9% |
| ERIC | 4,960 | 9% |
| Yahoo | 592 | 9% |
| Web of Science | 1,517 | 8% |
| British Education Index | 958 | 8% |
| FRANCIS | 127 | 7% |
| Dissertations & Theses | 878 | 6% |
| Academic Search Premier | 931 | 6% |
| Education Fulltext | 108 | 5% |
| Australian Education Index | 492 | 4% |
| AACE / EdITLib | 472 | 3% |
| ABI/Inform Global | 110 | 3% |
| PsycINFO | 301 | 3% |
| Communication Abstracts | 50 | 2% |
| OAIster | 191 | 1% |
| Australian Policy Online | 12 | 0% |
| CBCA-Education | 27 | 0% |
| Education: Sage Fulltext Collection | 17 | 0% |


Discussion

The results from an analysis of the information retrieval methods used in the P.C. Abrami, R.M. Bernard, E. Borokhovski, D.I. Waddington, C.A. Wade and T. Persson [1] and R.F. Schmid, R.M. Bernard, E. Borokhovski, R.M. Tamim, P.C. Abrami, M.A. Surkes, C.A. Wade and J. Woods [2] meta-analyses demonstrate the critical importance of using a comprehensive approach to information retrieval. To begin with, these results show the need to search beyond the subject-specific databases of the research question's discipline, as unique results can be found in many different places. For the CT review, databases of related fields (not Education) provided 129 studies included in the final analysis that were not found in the subject databases or elsewhere, and the multidisciplinary databases (excluding ProQuest Dissertations & Theses) provided an additional 44 unique includes. For PedTech, the related-field databases coincidentally also yielded 129 unique included studies, and the multidisciplinary databases (excluding ProQuest Dissertations & Theses) provided a further 99. The strategy to 'cast a broad net' and search in many databases ultimately proves warranted by the inclusion in the final meta-analysis of studies not found anywhere else. In the CT meta-analysis, the ABI/Inform Global database did not produce very many relevant results, with only 23 of the original 219 warranting full-text review; however, the two studies included in the final meta-analysis were not found in any other database. Likewise, the Index to Theses database (a collection of mainly British dissertations) produced three uniquely found includes out of its original 81 results. The same can be seen in the PedTech review with the Australian Education Index: although only 19 of the original 492 results made their way into the final meta-analysis, all 19 were found only in that database.

Also of particular note was the success in both cases of the 'manual' strategies - "branching" reference lists of previous reviews and key studies, as well as scanning recent issues of important journals (in the case of PedTech these were also supplemented with browsing more obscure e-journals). In the CT review these manual strategies located a combined 30 studies included in the final analysis that were not found elsewhere, and in PedTech the combined result was 177 unique includes. Perhaps not surprisingly given the human element in selection of results, the manual searches also proved more sensitive and specific than electronic searches of databases, yielding a higher percentage of final includes.

Also of interest was the performance of searches conducted using the Google and Yahoo search engines; in both cases the searches of Yahoo resulted in more unique studies not found elsewhere - 11 studies for CT and 36 for PedTech. Please note that, at the time the Yahoo searches were conducted, Yahoo's search was managed in-house (prior to 2004 search results were provided by Google, and since 2009 they have been provided by Bing). Although academics are sometimes biased against using popular search engines for research, they do yield results not found elsewhere. Lastly, dissertations have also proved to be a fruitful source of analyzable results, especially in the case of the CT review, and a search of ProQuest's flagship Dissertations & Theses database is essential.

Overall, the results demonstrate the utility of a comprehensive search and information retrieval methodology which goes beyond the main subject databases to locate the full range of information sources, including grey literature from sources like the Web, as well as manual searches of conference proceedings and specialist collections. Further, while this paper reports the results of the methodological approach to systematic searching within the domain of Education, its principles are applicable to the Social Sciences more broadly, including Psychology, Sociology and others. Indeed, critical thinking (the subject focus of one of the two described reviews) is a cross-disciplinary concept. This same rigorous and systematic approach may be extended into primary research as well, with the same methods employed when writing literature review sections of empirical papers or standard narrative reviews. Researchers working at institutions that do not have access to a great array of bibliographic databases can take some comfort in the performance of the more 'manual' strategies - web searching and browsing online resources such as EdITLib (now LearnTechLib) and conference websites. They may also wish to pursue international partnerships and/or obtain the services of an Information Specialist to conduct searches for their review projects.

Conclusions

This analysis provides some validation of the information retrieval standards promoted by the Campbell Collaboration [5, 6]. The results demonstrate that many relevant studies may be found using a diversity of retrieval methods and resources, which has significance for primary research as well as for the conduct of meta-analyses and systematic reviews in Education and the Social Sciences more broadly. The systematic reviewer should make every possible effort to find the available studies in order to provide as unbiased a result as possible and increase replicability [4]. Information retrieval within a systematic review or meta-analysis is not a one-shot deal; it requires considerable expertise, time, and resources, and researchers may wish to consider consulting with a librarian when formulating their strategy. Ultimately, drawing on the full body of research available on a given research topic, and not simply the most easily retrieved information, will provide a solid foundation to ensure a high-quality review of the evidence.

References

1. Abrami P. C., Bernard R. M., Borokhovski E., Waddington D. I., Wade C. A., & Persson T. Strategies for Teaching Students to Think Critically: A Meta-Analysis. Review of Educational Research, 2015, V. 85, Issue 2, pp. 275-314. DOI: 10.3102/0034654314551063

2. Schmid R. F., Bernard R. M., Borokhovski E., Tamim R. M., Abrami P. C., Surkes M. A., Wade C. A., Woods J. The effects of technology use in post-secondary education: A meta-analysis of classroom applications. Computers & Education, 2014, V. 72, pp. 271-291. DOI: 10.1016/j.compedu.2013.11.002

3. Cooper H. M. The integrative research review: A systematic approach. Thousand Oaks, CA, Sage Publications, 1984. 144 p.

4. Cooper H. M. Synthesizing research: A guide for literature reviews. 5th ed. Thousand Oaks, CA, Sage Publications, 2017.

5. Hammerstrom K., Wade A., Jorgensen A.-M. K. Searching for studies: A guide to information retrieval for Campbell systematic reviews. Oslo, The Campbell Collaboration, 2010.

6. Kugley S., Wade A., Thomas J., Mahood Q., Jorgensen A.-M. K., Hammerstrom K. T., & Sathe N. Searching for studies: A guide to information retrieval for Campbell systematic reviews. Oslo, The Campbell Collaboration, 2017. DOI: 10.4073/cmg.2016.1

7. Glass G. V. Primary, Secondary and Meta-Analysis of Research. Educational Researcher, 1976, V. 5, Issue 10, pp. 3-8. DOI: 10.3102/0013189X005010003

8. Glass G. V., McGaw B., & Smith M. L. Meta-analysis in social research. Beverly Hills, CA, Sage Publications, 1981.

9. Lipsey M. W., Wilson D. B. Practical meta-analysis. Thousand Oaks, CA, Sage, 2001.

10. Borenstein M., Hedges L., Higgins J., & Rothstein H. Introduction to metaanalysis. Chichester, UK, Wiley, 2009.

11. Lefebvre C., Manheimer E., & Glanville J. Searching for studies. In J. P. T. Higgins & S. Green (eds.) Cochrane Handbook for Systematic Reviews of Interventions (Version 5.1.0, Chap. 6). The Cochrane Collaboration, 2011. Retrieved from http://handbook.cochrane.org

12. Cooper H., Hedges L. V., & Valentine J. C. (eds.) The Handbook of Research Synthesis and Meta-Analysis (2nd ed.). New York, Russell Sage Foundation, 2009. 632 p.

13. Petticrew M., & Roberts H. Systematic Reviews in the Social Sciences: A Practical Guide. Malden, MA, John Wiley & Sons, 2008.

14. Auger C. P. Use of reports literature. London, Butterworths, 1975. 226 p.


15. Auger C. P. Information Sources in Grey Literature (4th ed.). Boston, MA, Walter de Gruyter GmbH, 2017. 177 p.


16. Conn V. S., Valentine J. C., Cooper H. M., & Rantz M. J. Grey Literature in Meta-Analyses. Nursing Research, 2003, V. 52, Issue 4, pp. 256-261.

17. Polanin J. R., Tanner-Smith E. E., & Hennessy E. A. Estimating the Difference Between Published and Unpublished Effect Sizes: A Meta-Review. Review of Educational Research, 2016, V. 86, Issue 1, pp. 207-236. DOI: 10.3102/0034654315582067

18. Rothstein H. R., Sutton A. J., & Borenstein M. (eds.) Publication bias in metaanalysis - prevention, assessment and adjustments. Chichester, UK, Wiley, 2005. 356 p.

19. Swinkels A., Briddon J., & Hall J. Two physiotherapists, one librarian and a systematic review: collaboration in action. Health Information and Libraries Journal, 2006, V. 23, Issue 4, pp. 248-256. DOI: 10.1111/j.1471-1842.2006.00689.x

20. Harris M. R. The librarian's roles in the systematic review process: a case study. Journal of the Medical Library Association, 2005, V. 93, Issue 1, pp. 81-87.

21. DeLuca J. B., Mullins M. M., Lyles C. M., Crepaz N., Kay L., & Thadiparthi S. Developing a Comprehensive Search Strategy for Evidence Based Systematic Reviews. Evidence Based Library and Information Practice, 2008, V. 3, № 1, pp. 3-32. DOI: 10.18438/B8KP66

22. Helmer D., Savoie I., Green C., & Kazanjian A. Evidence-based practice: extending the search to find material for the systematic review. Bulletin of the Medical Library Association, 2001, V. 89, Issue 4, pp. 346-352.

23. Booth A. "Brimful of STARLITE": toward standards for reporting literature searches. Journal of the Medical Library Association, 2006, V. 94, Issue 4, pp. 421-429.

24. Zhang L., Sampson M., & McGowan J. Reporting of the Role of the Expert Searcher in Cochrane Reviews. Evidence Based Library and Information Practice, 2006, V. 1, № 4, pp. 3-16. DOI: 10.18438/B85K52

25. Golder S., Loke Y., & McIntosh H. M. Room for improvement? A survey of the methods used in systematic reviews of adverse effects. BMC Medical Research Methodology, 2006, 6 (3). DOI: 10.1186/1471-2288-6-3

26. Yoshii A., Plaut D. A., McGraw K. A., Anderson M. J., & Wellik K. E. Analysis of the reporting of search strategies in Cochrane systematic reviews. Journal of the Medical Library Association, 2009, V. 97, Issue 1, pp. 21-29. DOI: 10.3163/1536-5050.97.1.004

27. Wade C. A., Turner H. M., Rothstein H. R., & Lavenberg J. G. Information retrieval and the role of the information specialist in producing high-quality systematic reviews in the social, behavioural and education sciences. Evidence & Policy, 2006, V. 2, № 1, pp. 89-108. DOI: 10.1332/174426406775249705

28. Bernard R. M. Things I Have Learned about Meta-Analysis Since 1990: Reducing Bias in Search of "The Big Picture". Canadian Journal of Learning and Technology, 2014, V. 40, № 3. DOI: 10.21432/T2MW29

29. Bernard R. M., Borokhovski E., Schmid R. F., & Tamim R. M. An exploration of bias in meta-analysis: The case of technology integration research in higher education. Journal of Computing in Higher Education, 2014, V. 26, Issue 3, pp. 183-209. DOI: 10.1007/s12528-014-9084-z

30. Bernard R. M., Borokhovski E., & Tamim R. M. Detecting bias in meta-analyses of distance education research: big pictures we can rely on. Distance Education, 2014, V. 35, Issue 3, pp. 271-293. DOI: 10.1080/01587919.2015.957433

31. Watson G., & Glaser E. M. Watson-Glaser critical thinking appraisal. San Antonio, TX, PsychCorp, 1980.

32. Ennis R. H., & Millman J. Cornell critical thinking test. Pacific Grove, CA, Critical Thinking Books & Software, 1985.
