User-Centered Assessment Design: An Integrated Methodology for Diverse Populations
Author: Madhabi Chatterji · Language: English · Hardback – April 2025
The structure and methodology of User-Centered Assessment Design are grounded in the need to anchor educational and psychological assessment in the real-world context of the end user. The material is rigorously organized, starting from the foundational concepts of design (Part I) and progressing toward the practical application of user-centered process models. Notably, Madhabi Chatterji's approach does not confine itself to traditional statistical validity; it introduces a socio-ecological perspective in which the interests, capacities, and limitations of the populations being assessed become central variables in instrument construction.
Here we find an integrated methodology that harmonizes qualitative and quantitative methods, offering clear procedures for developing multi-item tests as well as product- and portfolio-based assessments. Comparable to Robert L. Johnson's Assessing Performance in the rigor of its administration and scoring stages, the present volume is updated for contemporary demands of equity and diversity, placing greater emphasis on the consequences of score use for vulnerable populations. The work is likewise comparable to Argument-Based Validation in Testing and Assessment in its effort to narrow the gap between validation theory and practice, though Madhabi Chatterji extends the discussion toward immediate practical utility in health and business.
Within the author's body of work, this book represents a natural evolution of the themes explored in Validity and Test Use. Where the earlier work emphasized the international dialogue on equity and accountability in assessment, User-Centered Assessment Design provides the concrete technical tools for implementing those ideals. The progression of the content, supported by "Recaps" and "Reflection Breaks," points to a clear audience in academic and professional settings, where instrument design must meet increasingly complex ethical and technical standards.
Price: 531.43 lei
Original price: 625.22 lei
-15%
In stock
Economy delivery: June 03-17
Specifications
ISBN-10: 1462555489
Pages: 446
Dimensions: 243 x 195 x 30 mm
Weight: 0.96 kg
Edition: 1
Publisher: Guilford Publications
Series: Guilford Press
Why read this book
This resource is essential for specialists in psychometrics and assessment who want to create instruments that satisfy not only validity criteria but also those of social equity. The reader gains an applied methodology for designing assessments adapted to diverse populations, ensuring that the results obtained are genuinely relevant to the user's specific context, whether that user works in education, psychology, or public health.
About the author
Madhabi Chatterji is Associate Professor of Measurement, Evaluation, and Education at Teachers College, Columbia University, and holds a doctorate from the University of South Florida. With more than a decade of experience across education, public health, and the corporate sector, her expertise covers the design and validation of measurement instruments. Her academic work focuses on connecting validity theory with applied assessment practice, and she is an authoritative voice in promoting equity and accountability in test use internationally.
Contents
1. Foundational Concepts in Assessment Design
1.1 Chapter Overview
1.2 Assessments: Old and Emerging Traditions, a Starting Definition, and Some Distinctions
1.3 Viewpoints on Assessment, Measurement, Testing, and Evaluation
1.4 Role of Assessment in Scientific, Professional, and Practical Endeavors
1.5 Evaluating the Quality of Assessments and Construct Measures: Validity, Reliability, and Utility
1.6 Integrating Assessment Design, Validation, and Use: A User-Centered Process
1.7 Summary
2. Why Assess?: Measure-Based Inferences, Uses, Users, and Consequences
2.1 Chapter Overview
2.2 Back to the Future: Early Drivers, Milestones, and Consequences of Assessment Use
2.3 Modern Drivers and Consequences of Assessment Uses in Education
2.4 Modern Drivers and Consequences of Assessment Uses in Psychology, Health, Business, and Other Fields
2.5 Applying User-Centered Principles to Improve Practices
2.6 Summary
3. Whom to Assess? and How?: Specifying the Population and the Assessment Operations
3.1 Chapter Overview
3.2 Why Population Characteristics and the Socioecological Contexts of Assessments Matter
3.3 What Is Measurement Bias?: Case Studies and Hypothetical Illustrations
3.4 Selecting Assessment Operations for Diverse Populations and Multidisciplinary Constructs
3.5 Steps and Actions: Specifying Whom to Assess? and How? with the Process Model
3.6 Summary
II. Assessment Design
4. What to Assess?: Specifying the Domains for Constructs
4.1 Chapter Overview
4.2 Domain Sampling and Domain Specification: Functional Theory and Applied Illustrations
4.3 Construct Types, Domain Conceptualizations, and Structures
4.4 Domain Specification as a Part of the Process Model: Steps, Techniques, Guidelines, and Conventions
4.5 Content-Validating Specified Domains
4.6 Summary
5. Designing Assessments with Structured and Constructed-Response Items
5.1 Chapter Overview
5.2 Why the Mechanics of Item Construction Matter
5.3 Cognitive Constructs Measured Best with Structured- or Constructed-Response Items
5.4 Writing Structured-Response Items: Principles, Guidelines, and Applied Examples
5.5 Guidelines for Designing Constructed-Response and Essay Tasks
5.6 Instrument Assembly
5.7 An Application with the Process Model: A Case Study of Cognitively Based Item and Assessment Design to Foster Learning in Long Division
5.8 Summary
6. Designing Behavior-Based, Product-Based, and Portfolio-Based Assessments
6.1 Chapter Overview
6.2 Behavior-, Product-, and Portfolio-Based Assessments: Definitions, Examples, and Origins
6.3 Advantages of the Performance Assessment Format
6.4 Disadvantages of Performance Assessments: Human Vulnerabilities, Errors, and Biases
6.5 Three Case Studies: Applying the Process Model to Design and Validate Performance Assessments
6.6 Summary
7. Designing Survey-Based and Interview-Based Assessment Tools
7.1 Chapter Overview
7.2 Self-Report Instruments: Their Defining Properties and Common Applications
7.3 Historical Origins of Questionnaires and Attitude Surveys
7.4 Measurement Issues with the Self-Report Modality
7.5 General Design Guidelines for Self-Report Instruments
7.6 Ten More Guidelines for Writing Closed-Ended Survey Items
7.7 A Case Study: Applying the Process Model to Design Two Complementary Self-Report Tools
7.8 Summary
III. Validation and Use of Assessments
8. Analyzing Data from Assessments: A Statistics Refresher
8.1 Chapter Overview
8.2 Preparing for Data Analysis
8.3 Organizing the Data
8.4 Measures of Central Tendency
8.5 Measures of Variability
8.6 Graphical Displays of Data
8.7 The Standard Normal Distribution and Its Applications
8.8 Correlation Coefficients and Their Applications
8.9 Related Statistical Techniques
8.10 Summary
9. Improving the Inferential Utility of Assessment Results: Methods and Limitations
9.1 Chapter Overview
9.2 Frames of Reference and Derived Scores
9.3 Using Norms as the Frame of Reference
9.4 Using Criterion Scores or Standards as the Frame of Reference
9.5 Using Self as the Frame of Reference
9.6 Composite Scores
9.7 Grouped Scores, Equated Scales, and Linked Tests
9.8 Summary
10. A Unified Approach to Construct Validity and Validation: Theory to Evidence
10.1 Chapter Overview
10.2 Construct Validity: An Evolving Concept
10.3 Theoretical Foundations of the Unitarian View of Validation
10.4 Main Clusters and Types of Validity Evidence
10.5 Random Errors of Measurement and Types of Reliability Evidence
10.6 Utility of Measures, Assessments, and Assessment Systems
10.7 Unified Validation Plans
10.8 Chapter Summary
11. Empirical Methods of Validation
11.1 Chapter Overview
11.2 Planning Empirical Validation Studies
11.3 Evaluating Item Performance
11.4 Examining Fairness and Measurement Bias
11.5 Gathering Evidence of Content-Based Validity
11.6 Validating Response Processes: The Cognitive Interview
11.7 Gathering Correlational Evidence of Validity
11.8 Empirical Estimation of Reliability
11.9 Methods to Examine Utility
11.10 Evaluating the Evidence: The PSQI Case Revisited
11.11 Summary
12. User-Centered Assessment Design: Revisiting the Principles, Comparisons, and Conclusions
12.1 Chapter Overview
12.2 Applying the Principles Undergirding the Process Model: A Summary by Section
12.3 A User-Centered Design Process: Comparing the Old with the New
12.4 Extended Applications of the Process Model
12.5 The Process Model Compared to Existing Models of Assessment Design
12.6 Connecting the Process Model with the 2014 Standards
12.7 Summary
Glossary
References
Author Index
Subject Index
About the Author
Reviews
"Chatterji's process model integrates the 'why,' 'who,' 'what,' and 'how-to' of effective assessment. This book provides practitioners with a conceptual framework and relevant procedures for conceptualizing and developing carefully targeted measures and establishing their technical adequacy."--Paul Yovanoff, PhD, Simmons School of Education and Human Development (Emeritus), Southern Methodist University
"This well-constructed text will be useful for graduate-level courses in testing and measurement. It outlines basic concepts of test construction quite well and presents many figures and applications to make it easier to understand the material."--Matthew K. Burns, PhD, Rose and Irving Fein Endowed Professor of Special Education, University of Florida; Assistant Director, University of Florida Literacy Institute
"This book is a 'must have' for those of us in the assessment world. It covers all the basic information that is needed for high-quality assessment development, administration, and analysis. I recommend this book for district- and state-level education decision makers and anyone who provides professional development to program specialists and classroom teachers."--Beverly Fitzpatrick, PhD, School of Pharmacy and School of Education, Memorial University of Newfoundland and Labrador, Canada