University of Southern California
Association for Educational Communications and Technology (AECT)
Work‑Learning Research / Learning Development Accelerator
American Educational Research Association
United States Army
American Society for Training & Development (now ATD)
International Society for Performance Improvement (ISPI)
USC Rossier School of Education
SITE Foundation
International Society for Performance Improvement (ISPI)
Association of Applied Psychology
Association for Psychological Science
American Psychological Association
Director, Center for Cognitive Technology (CCT), University of Southern California
Professor of Educational Psychology and Technology, University of Southern California
Professeur associé (Visiting/Adjunct), Université de Montréal (University of Montreal)
Visiting Professor; Program Board member (Graduate Organizational Psychology), Dublin City University
Associate Professor then Professor; Chair, Department of Instructional Technology; Joint appointment in Psychology, Syracuse University
Senior R&D Associate; Manager, Technology; Director, ERIC Clearinghouse on Information Resources (Center for R&D in Teaching), Stanford University
Associate Professor of Communication Studies; Co‑Director, Center for Communication Research, California State University, Sacramento
Richard E. Clark is Emeritus Professor of Educational Psychology and Technology at the University of Southern California’s Rossier School of Education and Emeritus Clinical Research Professor of Surgery in the Keck School of Medicine. He is widely known for his evidence-based analyses of media and method effects in learning (“media are mere vehicles”) and for advocacy of fully guided instruction, cognitive task analysis, and performance improvement. Clark’s career includes faculty appointments at Stanford and Syracuse prior to joining USC in 1978; he directed USC’s Center for Cognitive Technology and has collaborated extensively with medicine and the U.S. Army on complex skills training. His degrees are BA (Western Michigan University, 1963), MS (University of Pennsylvania, 1965), and EdD (Indiana University Bloomington, 1970).
A comprehensive blueprint for complex learning that integrates: (1) whole learning tasks, (2) supportive information, (3) just‑in‑time information, and (4) part‑task practice to coordinate cognitive processes and build expertise.
An evidence‑based approach to complex skills training emphasizing fully guided instruction, worked examples, authentic tasks, and structured feedback; developed and fielded in military and professional settings.
A learning‑engineering framework to diagnose and remedy motivation problems by assessing values, self‑efficacy, emotions, and attribution errors, and by targeting strategies to increase effort, persistence, and control beliefs.
Interdisciplinary Education and Psychology • Journal
Introduces the BEC framework for diagnosing and addressing effort‑based motivation problems. Focuses on four factors—values, self‑efficacy, emotions, and attribution errors—and offers measures and evidence‑based strategies to help learners start, persist, and invest adequate mental effort.
Educational Psychologist • Journal
Argues that minimally guided approaches overload working memory and ignore human cognitive architecture. Reviews decades of evidence showing fully guided instruction is more effective and efficient for novices; benefits of reduced guidance emerge mainly for learners with high prior knowledge. Summarizes design models that support guidance.
Educational Psychologist • Journal
This article argues that minimally guided instructional approaches conflict with human cognitive architecture and decades of empirical evidence. It synthesizes research showing that guidance-heavy methods (e.g., explicit instruction, worked examples) are generally more effective and efficient for novice learners, and that the advantages of reduced guidance emerge only when learners already possess high prior knowledge to generate internal guidance. The paper outlines implications for designing instruction aligned with cognitive load theory.
Performance Improvement Quarterly • Journal
Meta‑analysis of adequately designed field and lab studies shows incentive programs yield an average 22% performance gain, with team‑directed incentives especially effective. Effects were generally robust across settings, study types, and outcome measures, while some program designs produced larger gains than others.
Educational Technology Research and Development • Journal
Presents the four‑component instructional design (4C/ID) model for complex skill acquisition. Training blueprints integrate: learning tasks, supportive information, just‑in‑time information, and part‑task practice. The article explains links to cognitive processes, offers design methods for each component, and illustrates with a worked example.
Educational Psychologist • Journal
Synthesizes theoretical and methodological distinctions between academic self‑concept and self‑efficacy. Finds that, although related, self‑efficacy tends to show stronger predictive and explanatory power for academic performance and persistence, while self‑concept more strongly relates to affective outcomes. Offers directions for future research.
Educational Technology Research and Development • Journal
Clarifies Clark’s arguments about media effects on learning, motivation, and efficiency. He reiterates that learning outcomes are driven by instructional methods, not by the medium of delivery, and responds to common reactions to and criticisms of this claim.
Educational Technology Research and Development • Journal
Reanalysis of a sample of computer-based instruction (CBI) studies suggests reported achievement gains are overestimated due to uncontrolled differences in instructional method and related confounds, rather than the computer medium itself. Argues comparable gains could be achieved with other media delivering the same methods.
Review of Educational Research • Journal
Reviews meta‑analyses and studies of media’s influence on learning and concludes that specific delivery media do not, by themselves, produce learning gains. Apparent benefits commonly arise from differences in instructional methods or novelty effects. The article critiques media‑attribute theories and recommends refocusing research on instructional methods rather than media per se.
• Book
Edited volume introducing learning analytics through contributions on theory, methods, use cases, and policy. Provides frameworks and examples for interpreting data about learners’ academic, social‑emotional, and motivational contexts to improve success and inform decision‑making.
American Educator • Journal
Summarizes research showing that, for novice learners, direct, explicit instruction with practice and feedback outperforms discovery‑oriented methods. Recommends using projects and inquiry primarily for practice and transfer after explicit teaching of essential content and procedures.
• Book
Edited collection revisiting the media‑effects debate. Brings together Clark’s key writings and critical responses to argue that instructional methods—not delivery media—explain learning outcomes, providing a foundation for evaluating technology claims in education.
Handbook of Research on Educational Communications and Technology (3rd ed.) • Chapter
Overviews methods for eliciting experts’ implicit and explicit knowledge to inform instruction and expert systems. Describes families of CTA techniques, evidence on their impact, integration with training design, and recommendations for future research.