Robert M. Bernard

  • Distinguished Professor Emeritus; Systematic Review Team Leader, Centre for the Study of Learning and Performance (CSLP), Concordia University

[email protected]

scholar.google.com/citations?user=KP1-fJ0AAAAJ

Impact Metrics

  • Total Citations: 10,115
  • Peer-reviewed Journals: 11
Awards & Honors

  • Distinguished Professor Emeritus, Concordia University (2018)
  • AECT Division of Distance Learning – Distance Education Journal Article Award (First Place), Association for Educational Communications and Technology (AECT) (2012)
  • Outstanding Reviewer Award (Review of Educational Research), American Educational Research Association (AERA) (2010)
  • Award for Distinguished Scholarship (Arts & Science), Concordia University (2007)
  • Editor’s Award (Best Article of the Year), National Student Speech Language Hearing Association (NSSLHA) (2006)
  • Annual Award of Excellence in Research, Canadian Association for Distance Education (CADE) (2004)
Past Positions

  • Professor, Concordia University (1991–2017)
  • Associate Professor, Concordia University (1983–1991)
  • Assistant Professor, Concordia University (1979–1983)
  • Lecturer, University of Washington (1978–1979)
  • Pre‑Doctoral Research Associate, University of Washington (1974–1978)
  • Elementary School Teacher, Nashville, TN Public Schools (1968–1974)
Education

  • PhD, Educational Communication, University of Washington (1978)
  • MS, General Curriculum, University of Tennessee (1974)
  • BS, Elementary School Curriculum, University of Tennessee (1972)
Biography

Robert M. Bernard is Distinguished Professor Emeritus of Education at Concordia University (Montreal) and Systematic Review Team Leader at the Centre for the Study of Learning and Performance (CSLP). His work centers on meta-analysis and systematic review in educational technology, with influential syntheses on distance, online, and blended learning; technology integration; interaction treatments in distance education; and the teaching of critical thinking. Before entering academia he taught in the Nashville (TN) Public Schools. He holds a PhD in Educational Communication from the University of Washington (1978), and an MS (1974) and a BS (1972) in curriculum from the University of Tennessee.

Research Interests
  • Blended Learning
  • Critical Thinking
  • Higher Education
  • Meta-analysis
  • Quantitative Methods
  • Research Methods
  • Technology Integration
Peer-reviewed Journal Articles & Top Conference Papers (11)

Journal of Higher Education Theory and Practice • Journal

Robert M. Bernard

A meta-analysis of 140 studies contrasting student‑centered (treatment) with more teacher‑centered (control) practices in undergraduate science. The random‑effects mean favored student‑centered instruction (ḡ = 0.34). Meta‑regression identified flexibility (a negative predictor), class size, subject matter, and technology use as significant moderators (model R² ≈ .36).

Review of Educational Research • Journal

Robert M. Bernard

Synthesizes 341 effects from quasi‑experimental and true‑experimental studies using standardized critical thinking (CT) measures. The weighted random‑effects mean (g ≈ 0.30) indicates that instruction can meaningfully improve CT skills and dispositions across educational levels and disciplines. Dialogue opportunities, authentic or situated problems, and mentoring were especially effective.

Journal of Computing in Higher Education • Journal

Robert M. Bernard

Within a broader program on technology integration, this meta‑analysis focuses on comparative studies of blended learning (BL) versus classroom instruction (CI). BL outperformed CI by roughly one‑third of a standard deviation (g ≈ 0.33; k ≈ 117), with larger effects when cognitive‑support technologies and explicit interaction treatments were present. Methodological steps for conducting the meta‑analysis are detailed to inform future syntheses.

Journal of Computing in Higher Education • Journal

Robert M. Bernard

Assesses potential sources of bias across meta‑analyses on technology integration in higher education using an explicit framework (e.g., selection criteria, coding, model choice, publication bias). Several syntheses exhibited multiple methodological vulnerabilities. The paper illustrates how such biases can distort “big‑picture” estimates and offers recommendations to strengthen future meta‑analytic practice.

Computers & Education • Journal

Robert M. Bernard

A primary meta‑analysis of technology integration in postsecondary classrooms (1990–2010), excluding online/distance education comparisons. From 11,957 abstracts screened, 1,105 studies yielded 879 achievement effects and 181 attitude effects. Random‑effects means were g ≈ 0.27 (achievement) and g ≈ 0.20 (attitudes). Meta‑regressions indicated larger gains when cognitive support tools were used and when treatment–control technology differences were greater.

Distance Education • Journal

Robert M. Bernard

Explains five common bias‑producing aspects of meta‑analysis and examines 15 meta‑analyses (2000–2014) on distance, online, and blended learning that compare these modes with classroom instruction. Concludes that improving review quality and asking more actionable questions, beyond simple DE‑versus‑classroom comparisons, are essential for trustworthy “big‑picture” conclusions.

Canadian Journal of Learning and Technology • Journal

Robert M. Bernard

A reflective, practice‑oriented article cataloguing substantive and methodological sources of bias in systematic reviews and meta‑analyses (e.g., scope, inclusion criteria, model choice, handling of publication bias), with guidance for reducing distortions and improving the utility of evidence for practitioners and policymakers.

Review of Educational Research • Journal

Robert M. Bernard

A second‑order meta‑analysis of 25 meta‑analyses (1,055 primary studies) found a random‑effects mean achievement effect of g ≈ 0.35 for technology use in face‑to‑face classrooms; a study‑level validation using 574 independent effects from 13 meta‑analyses produced g ≈ 0.33. The results support a modest, positive impact of technology on learning and highlight heterogeneity and study‑quality considerations.

Distance Education • Journal

Robert M. Bernard

Surveys models and methods for integrating diverse evidence in education, emphasizing approaches suited to distance and online learning research. Advocates moving beyond DE‑versus‑classroom comparisons and demonstrates synthesis methods that accommodate varied designs and outcomes.

Review of Educational Research • Journal

Robert M. Bernard

Compares student–student, student–teacher, and student–content interaction treatments (ITs) in distance education courses across 74 studies (74 achievement effects). After adjusting for methodological quality, the random‑effects mean effect on achievement was g ≈ 0.38, favoring conditions with stronger ITs. Strengthening interaction, especially in asynchronous courses, was associated with greater cognitive engagement and higher achievement.

Review of Educational Research • Journal

Robert M. Bernard

Synthesizes 232 comparative studies (1985–2002; 599 outcomes) on achievement, attitudes, and retention in distance education (DE) versus classroom instruction. Average effects were near zero but highly variable, indicating that some DE implementations outperform classroom instruction while others underperform. Interactivity and communication features (e.g., two‑way audio/video, computer‑mediated communication) emerged as key moderators linked to better outcomes, underscoring that design and interaction quality, not delivery mode per se, drive effects.