Critical evaluation of publications
N. Gopi Chander
Address for correspondence: Dr. N. Gopi Chander, Professor, Department of Prosthodontics, SRM Dental College, SRM University, Chennai - 600 089, Tamil Nadu, India. E-mail: [email protected]
Received 2020 Nov 30; Accepted 2020 Dec 16; Issue date 2021 Jan-Mar.
This is an open access journal, and articles are distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 License, which allows others to remix, tweak, and build upon the work non-commercially, as long as appropriate credit is given and the new creations are licensed under the identical terms.
Critical evaluation is the process of examining research for the strengths and weaknesses of its findings and for their validity, relevance, and usefulness.[ 1 ] The sheer volume of available information, and the difficulty of distinguishing what is relevant, makes critical appraisal a primary need. In addition, it establishes superior evidence and increases its application to clinical practice.[ 2 ] More importantly, it differentiates between significant and insignificant data in the literature and helps provide up-to-date information. Critical appraisal supports informed decision-making and improves the quality of healthcare provided to patients.[ 1 , 2 , 3 ]
Research data have three possible outcomes: true findings, random variation that occurs due to chance, and biased results due to systematic error.[ 4 ] True findings may be positive or negative, and both deserve equal recognition. Random error, a deviation from the actual result, arises from uncontrollable factors such as small sample sizes and confounding variables. Random error does not systematically shift the measured value; it is an imperfection introduced by inconsistencies in the study design. Such errors are unpredictable and will not recur identically when the analysis is repeated. Biased results, by contrast, are systematic deviations in the study design, methodology, or investigation; they can stem from poor design, flawed methodology, or faulty analysis. Without critical analysis of the literature, it is difficult to tell these kinds of findings apart.[ 5 , 6 ]
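The distinction between random and systematic error is easier to see in a small simulation. Below is a minimal sketch in Python; the sample size, noise level, and the +5 offset are arbitrary assumptions for illustration, not values taken from any study.

```python
import numpy as np

rng = np.random.default_rng(42)
true_mean = 100.0          # the "true" value we are trying to measure

# Random error: unbiased noise scatters estimates around the true value.
# Small samples scatter more, but the error averages out over repetitions.
random_error_estimates = [rng.normal(true_mean, 15, size=20).mean()
                          for _ in range(1000)]

# Systematic error (bias): a flawed instrument or design shifts every
# measurement in the same direction, so repetition does not correct it.
biased_estimates = [(rng.normal(true_mean, 15, size=20) + 5).mean()
                    for _ in range(1000)]

print(f"random error only: mean estimate = {np.mean(random_error_estimates):.1f}")
print(f"with +5 bias:      mean estimate = {np.mean(biased_estimates):.1f}")
# The first average converges on ~100; the second stays near ~105 no
# matter how often the study is repeated -- the signature of bias.
```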
Various guidelines and tools have been proposed for critically evaluating the literature.[ 7 , 8 , 9 ] Because the scientific literature is in constant evolution, no single guideline or checklist is considered the gold standard. Moreover, the appraisal varies with the type of research: the checklists provided by various organizations for designing or structuring manuscripts (case reports, reviews, and original research) cannot be combined or generalized for use. Similarly, appraisal varies with the study design, whether randomized clinical trials or observational studies (case–control, cohort, and cross-sectional). Methodological guidelines such as the CONSORT statement, the CARE guidelines, PROSPERO, and the Cochrane checklists can significantly aid the evaluation of different types of research data.[ 10 ] The structured approaches and checklists provided by these organizations are valuable aids both for conducting research and for critically evaluating manuscripts. In addition, the simplified checklist proposed by Young and Solomon can serve as an adjunct tool for critical assessment of the literature.[ 1 ] It consists of 10 simple rules: the relevance of the study question, the new information added to the existing literature, the type of research question, the appropriateness of the study design, bias appraisal, adherence to the study protocol, hypothesis testing, scrutiny of the statistical analysis, validation of the conclusions, and identification of conflicts of interest. These checklists, together with up-to-date methodological guidelines for the different study designs, are valuable tools for critical appraisal of the literature.[ 1 , 10 ]
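For readers who work through such checklists systematically, the ten Young and Solomon questions can be kept as a structured record. The sketch below is our own illustration; the class, field names, and scoring are hypothetical conveniences, not part of the published checklist.

```python
from dataclasses import dataclass, field

# The ten Young and Solomon (2009) appraisal questions, paraphrased.
YOUNG_SOLOMON_RULES = [
    "Is the study question relevant?",
    "Does the study add new information to the existing literature?",
    "What type of research question is being asked?",
    "Is the study design appropriate for the question?",
    "Has potential bias been appraised?",
    "Was the study protocol adhered to?",
    "Was a hypothesis actually tested?",
    "Were the statistical analyses appropriate and correctly performed?",
    "Do the data justify the conclusions?",
    "Are conflicts of interest identified and declared?",
]

@dataclass
class Appraisal:
    citation: str
    answers: dict = field(default_factory=dict)  # question -> (answer, note)

    def record(self, question: str, answer: str, note: str = "") -> None:
        self.answers[question] = (answer, note)

    def unresolved(self) -> list:
        """Questions still unanswered or answered 'unclear'."""
        return [q for q in YOUNG_SOLOMON_RULES
                if self.answers.get(q, ("unclear", ""))[0] == "unclear"]

appraisal = Appraisal("Smith et al. (2023), J Hypothetical Res")  # invented citation
appraisal.record(YOUNG_SOLOMON_RULES[0], "yes", "timely clinical question")
print(f"{len(appraisal.unresolved())} questions still to resolve")
```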
Most of these tools assess the validity, reliability, bias, and clinical applicability of the research data. Validity helps determine the accuracy of the results, while reliability establishes their consistency. Bias is a systematic deviation of results; it takes many forms and can arise at any stage, from study initiation to manuscript publication. Various assessment tools have been proposed to detect it, the most commonly employed being GRADE, GRADEpro, the Newcastle–Ottawa scale, the Jadad scale, RoB 2, and ARRIVE 2.0.[ 11 ] Bias tools vary with the study design, and it is important to use the appropriate one. These tools assess and grade the degree of bias in a manuscript. They are mainly used for evaluating randomized controlled trials included in systematic reviews and meta-analyses, but they can be adapted to other study designs. They grade the bias and yield data that are essential for clinical application.[ 11 , 12 ]
Rapid appraisal can be done with tools such as the rapid critical appraisal checklist.[ 6 ] These are compressed tools that chiefly assess the validity, reliability, and clinical usefulness of a study. Such simplified checklists allow quicker assessment; for a more accurate judgment, however, it is essential to appraise the entire manuscript from the introduction to the conclusion. This demands a detailed check of every component of the paper against the standard guidelines. Journal indexing and metrics can also play a role in the estimate: journals with higher metrics tend to have more rigorous peer-review processes, which reduce significant errors in the manuscripts they publish.[ 3 , 4 ]
In the introduction of a manuscript, the main elements to assess are the type and content of the research question; the justification of the purpose and background of the study with articles published in the last 5 years (or older articles of significant influence); citations from peer-reviewed journals; a defined objective; and a hypothesis statement. In the methodology, appraisal should focus on the study design, the inclusion and exclusion criteria, the care taken to reduce bias, adherence to accepted procedures, control of confounding variables, and valid outcome measures. The results section should be checked for subject baseline demographics, relevant statistical tests, and statistical significance. The discussion should provide adequate literature to substantiate the results, state the study limitations, and declare any conflicts of interest.[ 6 ]
In the prosthodontic literature, extensive reports of a similar nature exist, and critical analysis of the literature is therefore a necessary skill for researchers and clinicians to master.[ 10 ] It enables clinicians to make high-quality, evidence-based healthcare decisions through thorough evaluation of the literature.
References
1. Young JM, Solomon MJ. How to critically appraise an article. Nat Clin Pract Gastroenterol Hepatol. 2009;6:82–91. doi:10.1038/ncpgasthep1331.
2. du Prel JB, Röhrig B, Blettner M. Critical appraisal of scientific articles: Part 1 of a series on evaluation of scientific publications. Dtsch Arztebl Int. 2009;106:100–5. doi:10.3238/arztebl.2009.0100.
3. Burns PB, Rohrich RJ, Chung KC. The levels of evidence and their role in evidence-based medicine. Plast Reconstr Surg. 2011;128:305–10. doi:10.1097/PRS.0b013e318219c171.
4. Mhaskar R, Emmanuel P, Mishra S, Patel S, Naik E, Kumar A. Critical appraisal skills are essential to informed decision-making. Indian J Sex Transm Dis AIDS. 2009;30:112–9. doi:10.4103/2589-0557.62770.
5. Ackley BJ, Swan BA, Ladwig G, Tucker S. Evidence-Based Nursing Care Guidelines: Medical-Surgical Interventions. St. Louis, MO: Mosby Elsevier; 2008. p. 7.
6. Fineout-Overholt E, Melnyk BM, Stillwell SB, Williamson KM. Evidence-based practice, step by step: Critical appraisal of the evidence: Part II: Digging deeper–examining the "keeper" studies. Am J Nurs. 2010;110:41–8. doi:10.1097/01.NAJ.0000388264.49427.f9.
7. Zeng X, Zhang Y, Kwong JS, Zhang C, Li S, Sun F, et al. The methodological quality assessment tools for preclinical and clinical studies, systematic review and meta-analysis, and clinical practice guideline: A systematic review. J Evid Based Med. 2015;8:2–10. doi:10.1111/jebm.12141.
8. Buccheri RK, Sharifi C. Critical appraisal tools and reporting guidelines for evidence-based practice. Worldviews Evid Based Nurs. 2017;14:463–72. doi:10.1111/wvn.12258.
9. Abt E, Bader JD, Bonetti D. A practitioner's guide to developing critical appraisal skills: Translating research into clinical practice. J Am Dent Assoc. 2012;143:386–90. doi:10.14219/jada.archive.2012.0181.
10. Chander NG. Evidence based research in prosthodontics. J Indian Prosthodont Soc. 2016;16:113. doi:10.4103/0972-4052.179316.
11. Ma LL, Wang YY, Yang ZH, Huang D, Weng H, Zeng XT. Methodological quality (risk of bias) assessment tools for primary and secondary medical studies: What are they and which is better? Mil Med Res. 2020;7:7. doi:10.1186/s40779-020-00238-8.
12. Goldet G, Howick J. Understanding GRADE: An introduction. J Evid Based Med. 2013;6:50–4. doi:10.1111/jebm.12018.
Evaluating Research – Process, Examples and Methods
Evaluating Research
Research evaluation is a systematic process used to assess the quality, relevance, credibility, and overall contribution of a research study. Effective evaluation allows researchers, policymakers, and practitioners to determine the reliability of findings, understand the study’s strengths and limitations, and make informed decisions based on evidence. Research evaluation is crucial across disciplines, ensuring that conclusions drawn from studies are valid, meaningful, and applicable.
Why Evaluate Research?
Evaluating research provides several benefits, including:
- Ensuring Credibility : Confirms the reliability and validity of research findings.
- Identifying Limitations : Highlights potential biases, methodological flaws, or gaps.
- Promoting Accountability : Helps allocate funding and resources to high-quality studies.
- Supporting Decision-Making : Enables stakeholders to make informed decisions based on rigorous evidence.
Process of Evaluating Research
The evaluation process typically involves several steps, from understanding the research context to assessing methodology, analyzing data quality, and interpreting findings. Below is a step-by-step guide for evaluating research.
Step 1: Understand the Research Context
- Identify the Purpose : Determine the study’s objectives and research questions.
- Contextual Relevance : Evaluate the study’s relevance to current knowledge, theory, or practice.
Example : For a study examining the effects of social media on mental health, assess whether the study addresses an important and timely issue in the field of psychology.
Step 2: Assess Research Design and Methodology
- Design Appropriateness : Determine if the research design is suitable for answering the research question (e.g., experimental, observational, qualitative, or quantitative).
- Sampling : Evaluate the sample size, sampling methods, and participant selection to ensure they are representative of the population being studied.
- Variables and Measures : Review how variables were defined and measured, and ensure that the measures are valid and reliable.
Example : In an experimental study on cognitive performance, check if participants were randomly assigned to control and treatment groups to ensure the design minimizes bias.
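Sample-size adequacy, in particular, can be checked with a quick power calculation. The following sketch uses statsmodels; the medium effect size (Cohen's d = 0.5), alpha of 0.05, and 80% power are conventional defaults assumed for illustration.

```python
from statsmodels.stats.power import TTestIndPower

# How many participants per group would a two-sample t-test need to
# detect a medium effect (d = 0.5) with 80% power at alpha = 0.05?
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"required sample size per group: {n_per_group:.0f}")  # roughly 64

# If the study under review enrolled far fewer participants, its
# non-significant findings may simply reflect inadequate power.
```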
Step 3: Evaluate Data Collection and Analysis
- Data Collection Methods : Assess the tools, procedures, and sources used for data collection. Ensure they align with the research question and minimize bias.
- Statistical Analysis : Review the statistical methods used to analyze data. Check for appropriate use of tests, proper handling of variables, and accurate interpretation of results.
- Ethics and Integrity : Consider whether data collection and analysis adhered to ethical guidelines, including participant consent, data confidentiality, and unbiased reporting.
Example : If a study uses surveys to collect data on job satisfaction, evaluate if the survey questions are clear, unbiased, and relevant to the research objectives.
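Where the data or summary statistics are available, re-running a reported test is a concrete way to evaluate the statistical analysis. A minimal sketch with scipy, using fabricated survey scores:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical job-satisfaction scores for two departments
# (fabricated data for illustration only).
dept_a = rng.normal(7.2, 1.1, size=40)
dept_b = rng.normal(6.6, 1.3, size=40)

# Welch's t-test does not assume equal variances -- a common point to
# check when appraising a paper that used a plain Student's t-test.
t_stat, p_value = stats.ttest_ind(dept_a, dept_b, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# An evaluator would then ask: was the test appropriate for the data's
# distribution, and is the p-value interpreted alongside effect size?
```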
Step 4: Interpret Results and Findings
- Relevance of Findings : Determine whether the findings answer the research question and contribute meaningfully to the field.
- Consistency with Existing Knowledge : Check if the results align with or contradict previous research. If they contradict, consider potential explanations for the differences.
- Generalizability : Evaluate whether the findings are applicable to a broader population or specific to the study sample.
Example : For a study on the effects of a dietary supplement on athletic performance, assess whether the findings could be generalized to athletes of different ages, genders, or skill levels.
Step 5: Assess Limitations and Biases
- Identifying Limitations : Recognize any acknowledged limitations in the study, such as small sample size, selection bias, or short duration.
- Potential Biases : Consider potential sources of bias, including researcher bias, funding source bias, or publication bias.
- Impact on Validity : Evaluate how limitations and biases might impact the study’s internal and external validity.
Example : If a study on drug efficacy was funded by a pharmaceutical company, acknowledge the potential for funding bias and whether safeguards were in place to maintain objectivity.
Step 6: Conclude with Overall Quality and Contribution
- Summarize Strengths and Weaknesses : Provide an overview of the study’s strengths and limitations, focusing on aspects that affect the reliability and applicability of the findings.
- Contribution to the Field : Assess the overall contribution to knowledge, practice, or policy, and identify any recommendations for future research or application.
Example : Conclude by summarizing whether the study’s methodology and findings are robust and suggest areas for future research, such as longer follow-up periods or larger sample sizes.
Examples of Research Evaluation
Example 1: A workplace study of stress and productivity
- Purpose : To assess whether stress levels affect productivity.
- Evaluation Process : Review if the sample includes participants with varying stress levels, if the stress is accurately measured (e.g., cortisol levels), and if the analysis properly accounts for confounding variables like sleep or work environment.
- Conclusion : The study could be evaluated as robust if it uses valid measures and controlled conditions, with future research suggested on different population groups.
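One way an evaluator can probe confounding is to compare adjusted and unadjusted estimates. The sketch below assumes a hypothetical dataset with invented column names and values:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data an evaluator might expect the authors to have
# analyzed; columns and values are illustrative, not from a real study.
df = pd.DataFrame({
    "productivity": [62, 71, 55, 80, 67, 59, 74, 66],
    "stress":       [8,  4,  9,  2,  6,  7,  3,  5],
    "sleep_hours":  [5,  7,  5,  8,  6,  6,  8,  7],
})

# Adjusting for sleep as a confounder: if the stress coefficient changes
# substantially once sleep is included, the raw estimate was confounded.
unadjusted = smf.ols("productivity ~ stress", data=df).fit()
adjusted = smf.ols("productivity ~ stress + sleep_hours", data=df).fit()
print(f"stress effect, unadjusted: {unadjusted.params['stress']:.2f}")
print(f"stress effect, adjusted:   {adjusted.params['stress']:.2f}")
```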
Example 2: A study of digital learning tools in education
- Purpose : To determine if digital learning tools improve student outcomes.
- Evaluation Process : Assess the appropriateness of the sample (students with similar baseline knowledge), methodology (controlled comparisons of digital vs. traditional methods), and results interpretation.
- Conclusion : Evaluate if findings are generalizable to broader educational contexts and whether technology access could be a limitation.
Example 3: A clinical trial of an anxiety medication
- Purpose : To determine the efficacy of a new medication for treating anxiety.
- Evaluation Process : Review if participants were randomly assigned, if a placebo was used, and if double-blinding was implemented to minimize bias.
- Conclusion : If the study follows a strong experimental design, it could be deemed credible. Note potential side effects for further investigation.
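The design safeguards named above (random assignment, placebo control, blinding) can be illustrated with a toy allocation script; the participant IDs and group sizes are invented:

```python
import random

random.seed(2024)
participants = [f"P{i:03d}" for i in range(1, 21)]  # hypothetical IDs

# Simple randomization: shuffle, then split into treatment and placebo.
random.shuffle(participants)
half = len(participants) // 2
allocation = {pid: "treatment" for pid in participants[:half]}
allocation.update({pid: "placebo" for pid in participants[half:]})

# Double-blinding: neither participants nor assessors see this table;
# an independent coordinator holds the code until analysis is complete.
for pid in sorted(allocation)[:5]:
    print(pid, "->", allocation[pid])
```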
Methods for Evaluating Research
Several methods are used to evaluate research, depending on the type of study, objectives, and evaluation criteria. Common methods include peer review, meta-analysis, systematic reviews, and quality assessment frameworks.
1. Peer Review
Definition : Peer review is a method in which experts in the field evaluate the study before publication. They assess the study’s quality, methodology, and contribution to the field.
Advantages :
- Increases the credibility of the research.
- Provides feedback on methodological rigor and relevance.
Example : Before publishing a study on environmental sustainability, experts in environmental science review its methods, findings, and implications.
2. Meta-Analysis
Definition : Meta-analysis is a statistical technique that combines results from multiple studies to draw broader conclusions. It focuses on studies with similar research questions or variables.
Advantages :
- Offers a comprehensive view of a topic by synthesizing findings from various studies.
- Identifies overall trends and potential effect sizes.
Example : Conducting a meta-analysis of studies on cognitive behavioral therapy to determine its effectiveness for treating depression across diverse populations.
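At its core, a fixed-effect meta-analysis is an inverse-variance weighted average of the study effect sizes: each study is weighted by one over its variance, and the pooled standard error is the square root of one over the total weight. A minimal sketch with invented effect sizes and standard errors:

```python
import math

# (effect size, standard error) for five hypothetical CBT trials.
studies = [(0.45, 0.12), (0.30, 0.20), (0.52, 0.15), (0.25, 0.18), (0.40, 0.10)]

# Fixed-effect model: weight each study by the inverse of its variance.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * y for (y, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect: {pooled:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

A real appraisal would also ask whether a random-effects model was more appropriate, given the heterogeneity between the included studies.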
3. Systematic Review
Definition : A systematic review evaluates and synthesizes findings from multiple studies, providing a high-level summary of evidence on a particular topic.
Advantages :
- Follows a structured, transparent process for identifying and analyzing studies.
- Helps identify gaps in research, limitations, and consistencies.
Example : A systematic review of research on the impact of exercise on mental health, summarizing evidence on exercise frequency, intensity, and outcomes.
4. Quality Assessment Frameworks
Definition : Quality assessment frameworks are tools used to evaluate the rigor and validity of research studies, often using checklists or scales.
Examples of Quality Assessment Tools :
- CASP (Critical Appraisal Skills Programme) : Provides checklists for evaluating qualitative and quantitative research.
- GRADE (Grading of Recommendations Assessment, Development and Evaluation) : Assesses the quality of evidence and strength of recommendations.
- PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) : A guideline for systematic reviews, ensuring clarity and transparency in reporting.
Example : Using the CASP checklist to evaluate a qualitative study on patient satisfaction with healthcare services by assessing sampling, ethical considerations, and data validity.
Evaluating research is a critical process that enables researchers, practitioners, and policymakers to determine the quality and applicability of study findings. By following a structured evaluation process and using established methods like peer review, meta-analysis, systematic review, and quality assessment frameworks, stakeholders can make informed decisions based on robust evidence. Effective research evaluation not only enhances the credibility of individual studies but also contributes to the advancement of knowledge across disciplines.
1 Important points to consider when critically evaluating published research papers
Simple review articles (also referred to as ‘narrative’ or ‘selective’ reviews), systematic reviews and meta-analyses provide rapid overviews and ‘snapshots’ of progress made within a field, summarising a given topic or research area. They can serve as useful guides, or as current and comprehensive ‘sources’ of information, and can act as a point of reference to relevant primary research studies within a given scientific area. Narrative or systematic reviews are often used as a first step towards a more detailed investigation of a topic or a specific enquiry (a hypothesis or research question), or to establish critical awareness of a rapidly-moving field (you will be required to demonstrate this as part of an assignment, an essay or a dissertation at postgraduate level).
The majority of primary ‘empirical’ research papers essentially follow the same structure (abbreviated here as IMRAD). There is a section on Introduction, followed by the Methods, then the Results, which includes figures and tables showing data described in the paper, and a Discussion. The paper typically ends with a Conclusion, and References and Acknowledgements sections.
The Title of the paper provides a concise first impression. The Abstract follows the basic structure of the extended article. It provides an ‘accessible’ and concise summary of the aims, methods, results and conclusions. The Introduction provides useful background information and context, and typically outlines the aims and objectives of the study. The Abstract can serve as a useful summary of the paper, presenting the purpose, scope and major findings. However, simply reading the abstract alone is not a substitute for critically reading the whole article. To really get a good understanding and to be able to critically evaluate a research study, it is necessary to read on.
While most research papers follow the above format, variations do exist. For example, the results and discussion sections may be combined. In some journals the materials and methods may follow the discussion, and in two of the most widely read journals, Science and Nature, the format does vary from the above due to restrictions on the length of articles. In addition, there may be supporting documents that accompany a paper, including supplementary materials such as supporting data, tables, figures, videos and so on. There may also be commentaries or editorials associated with a topical research paper, which provide an overview or critique of the study being presented.
Box 1 Key questions to ask when appraising a research paper
- Is the study’s research question relevant?
- Does the study add anything new to current knowledge and understanding?
- Does the study test a stated hypothesis?
- Is the design of the study appropriate to the research question?
- Do the study methods address key potential sources of bias?
- Were suitable ‘controls’ included in the study?
- Were the statistical analyses appropriate and applied correctly?
- Is there a clear statement of findings?
- Does the data support the authors’ conclusions?
- Are there any conflicts of interest or ethical concerns?
There are various strategies used in reading a scientific research paper, and one of these is to start with the title and the abstract, then look at the figures and tables, and move on to the introduction, before turning to the results and discussion, and finally, interrogating the methods.
Another strategy (outlined below) is to begin with the abstract and then the discussion, take a look at the methods, and then the results section (including any relevant tables and figures), before moving on to look more closely at the discussion and, finally, the conclusion. You should choose a strategy that works best for you. However, asking the ‘right’ questions is a central feature of critical appraisal, as with any enquiry, so where should you begin? Here are some critical questions to consider when evaluating a research paper.
Look at the Abstract and then the Discussion : Are these accessible and of general relevance or are they detailed, with far-reaching conclusions? Is it clear why the study was undertaken? Why are the conclusions important? Does the study add anything new to current knowledge and understanding? The reasons why a particular study design or statistical method were chosen should also be clear from reading a research paper. What is the research question being asked? Does the study test a stated hypothesis? Is the design of the study appropriate to the research question? Have the authors considered the limitations of their study and have they discussed these in context?
Take a look at the Methods : Were there any practical difficulties that could have compromised the study or its implementation? Were these considered in the protocol? Were there any missing values and, if so, was the number of missing values too large to permit meaningful analysis? Was the number of samples (cases or participants) too small to establish meaningful significance? Do the study methods address key potential sources of bias? Were suitable ‘controls’ included in the study? If controls are missing or not appropriate to the study design, we cannot be confident that the results really show what is happening in an experiment. Were the statistical analyses appropriate and applied correctly? Do the authors point out the limitations of methods or tests used? Were the methods referenced and described in sufficient detail for others to repeat or extend the study?
Take a look at the Results section and relevant tables and figures : Is there a clear statement of findings? Were the results expected? Do they make sense? What data supports them? Do the tables and figures clearly describe the data (highlighting trends etc.)? Try to distinguish between what the data show and what the authors say they show (i.e. their interpretation).
Moving on to look in greater depth at the Discussion and Conclusion : Are the results discussed in relation to similar (previous) studies? Do the authors indulge in excessive speculation? Are limitations of the study adequately addressed? Were the objectives of the study met and the hypothesis supported or refuted (and is a clear explanation provided)? Does the data support the authors’ conclusions? Maybe there is only one experiment to support a point. More often, several different experiments or approaches combine to support a particular conclusion. A rule of thumb here is that if multiple approaches and multiple lines of evidence from different directions are presented, and all point to the same conclusion, then the conclusions are more credible. But do question all assumptions. Identify any implicit or hidden assumptions that the authors may have used when interpreting their data. Be wary of data that is mixed up with interpretation and speculation! Remember, just because it is published, does not mean that it is right.
Other points you should consider when evaluating a research paper : Are there any financial, ethical or other conflicts of interest associated with the study, its authors and sponsors? Are there ethical concerns with the study itself? Looking at the references, consider whether the authors have preferentially (i.e. needlessly) cited their own previous publications, and whether the list of references is recent (ensuring that the analysis is up to date). Finally, from a practical perspective, you should move beyond the text of a research paper: talk to your peers about it, and consult available commentaries, online links to references and other external sources to help clarify any aspects you don't understand.
The above can be taken as a general guide to help you begin to critically evaluate a scientific research paper, but only in the broadest sense. Do bear in mind that the way that research evidence is critiqued will also differ slightly according to the type of study being appraised, whether observational or experimental, and each study will have additional aspects that would need to be evaluated separately. For criteria recommended for the evaluation of qualitative research papers, see the article by Mildred Blaxter (1996), available online. Details are in the References.
Activity 1 Critical appraisal of a scientific research paper
A critical appraisal checklist, which you can download via the link below, can act as a useful tool to help you to interrogate research papers. The checklist is divided into four sections, broadly covering:
- some general aspects
- research design and methodology
- the results
- discussion, conclusion and references.
Science perspective – critical appraisal checklist
- Identify and obtain a research article based on a topic of your own choosing, using a search engine such as Google Scholar or PubMed (for example).
- The selection criteria for your target paper are as follows: the article must be an open access primary research paper (not a review) containing empirical data, published in the last 2–3 years, and preferably no more than 5–6 pages in length.
- Critically evaluate the research paper using the checklist provided, making notes on the key points and your overall impression.
Critical appraisal checklists are useful tools to help assess the quality of a study. Assessment of various factors, including the importance of the research question, the design and methodology of a study, the validity of the results and their usefulness (application or relevance), the legitimacy of the conclusions, and any potential conflicts of interest, are an important part of the critical appraisal process. Limitations and further improvements can then be considered.
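If you prefer to script the article-selection step of this activity, PubMed's E-utilities API can be queried directly. The sketch below uses the real esearch endpoint, but the search topic, date range, and free-full-text filter are illustrative assumptions; consult the E-utilities documentation for the full query syntax.

```python
import requests

# NCBI E-utilities esearch endpoint (returns matching PubMed IDs).
url = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"
params = {
    "db": "pubmed",
    # Hypothetical topic; 'free full text[sb]' narrows to openly
    # readable articles, and the date range keeps results recent.
    "term": "dental implants AND free full text[sb] AND 2022:2024[dp]",
    "retmax": 10,
    "retmode": "json",
}
response = requests.get(url, params=params, timeout=30)
ids = response.json()["esearchresult"]["idlist"]
print("candidate PubMed IDs:", ids)
# Each ID can then be opened at https://pubmed.ncbi.nlm.nih.gov/<id>/
# to check it against the activity's selection criteria.
```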
How to read a paper: critical review
Reading a scientific article is a complex task. The worst way to approach this task is to treat it like the reading of a textbook—reading from title to literature cited, digesting every word along the way without any reflection or criticism.
A critical review (sometimes called a critique, critical commentary, critical appraisal, critical analysis) is a detailed commentary on and critical evaluation of a text. You might carry out a critical review as a stand-alone exercise, or as part of your research and preparation for writing a literature review. The following guidelines are designed to help you critically evaluate a research article.
How to Read a Scientific Article
You should begin by skimming the article to identify its structure and features. As you read, look for the author’s main points.
- Generate questions before, during, and after reading.
- Draw inferences based on your own experiences and knowledge.
- To really improve understanding and recall, take notes as you read.
What is meant by critical and evaluation?
- To be critical does not mean to criticise in an exclusively negative manner. To be critical of a text means you question the information and opinions in the text, in an attempt to evaluate or judge its worth overall.
- An evaluation is an assessment of the strengths and weaknesses of a text. This should relate to specific criteria, in the case of a research article. You have to understand the purpose of each section, and be aware of the type of information and evidence that are needed to make it convincing, before you can judge its overall value to the research article as a whole.
Understanding and Evaluating Research: A Critical Guide
- By: Sue L. T. McGregor
- Publisher: SAGE Publications, Inc
- Publication year: 2018
- Online pub date: December 20, 2019
- Discipline: Sociology , Education , Psychology , Health , Anthropology , Social Policy and Public Policy , Social Work , Political Science and International Relations , Geography
- Methods: Theory , Research questions , Mixed methods
- DOI: https://doi.org/10.4135/9781071802656
- Keywords: discipline, emotion, Johnson & Johnson, journals, knowledge, law, peer review
- Print ISBN: 9781506350950
- Online ISBN: 9781071802656
Understanding and Evaluating Research: A Critical Guide shows students how to be critical consumers of research and to appreciate the power of methodology as it shapes the research question, the use of theory in the study, the methods used, and how the outcomes are reported. The book starts with what it means to be a critical and uncritical reader of research, followed by a detailed chapter on methodology, and then proceeds to a discussion of each component of a research article as it is informed by the methodology. The book encourages readers to select an article from their discipline, learning along the way how to assess each component of the article and come to a judgment of its rigor or quality as a scholarly report.
Front Matter
- Acknowledgments
- About the Author
- INTRODUCTION
- Chapter 1: Critical Research Literacy
- PHILOSOPHICAL AND THEORETICAL ASPECTS OF RESEARCH
- Chapter 2: Research Methodologies
- Chapter 3: Conceptual Frameworks, Theories, and Models
- ORIENTING AND SUPPORTIVE ELEMENTS OF RESEARCH
- Chapter 4: Orienting and Supportive Elements of a Journal Article
- Chapter 5: Peer-Reviewed Journals
- RESEARCH JUSTIFICATIONS, AUGMENTATION, AND RATIONALES
- Chapter 6: Introduction and Research Questions
- Chapter 7: Literature Review
- RESEARCH DESIGN AND RESEARCH METHODS
- Chapter 8: Overview of Research Design and Methods
- Chapter 9: Reporting Qualitative Research Methods
- Chapter 10: Reporting Quantitative Methods and Mixed Methods Research
- RESULTS AND FINDINGS
- Chapter 11: Statistical Literacy and Conventions
- Chapter 12: Descriptive and Inferential Statistics
- Chapter 13: Results and Findings
- DISCUSSION, CONCLUSIONS, AND RECOMMENDATIONS
- Chapter 14: Discussion
- Chapter 15: Conclusions
- Chapter 16: Recommendations
- ARGUMENTATIVE ESSAYS AND THEORETICAL PAPERS
- Chapter 17: Argumentative Essays: Position, Discussion, and Think-Piece Papers
- Chapter 18: Conceptual and Theoretical Papers
Back Matter
Academic writing: Writing critically
Learn how to show critical analysis in academic writing and write critically.
Critical analysis
What does the term “critical analysis” mean in the context of academic writing? Showing critical analysis in academic writing could mean:
- Demonstrating your understanding of reading/evidence (“this appears to demonstrate that…”; “this implies…”; “this could result in…”).
- Showing reasoning and conclusions from your reading/reflections (“therefore…”; “as such…”).
- Considering questions such as “why”, “what if” and “so what”.
- Showing you understand how different ideas/evidence/perspectives relate to each other (“this is linked to Smith's concept of X…”; Building on Jones (2012), Green (2016) suggests…”).
- Demonstrating an understanding of how theories or research apply in your practice/context.
- Identifying possible limitations of research/theory and how these relate to your own arguments or own context (“in the context of international development…”; “in terms of learning in the Science classroom…”).
- Identifying how something could be interpreted or done differently (in relation to your reading and/or practice).
Criticality?
If you have been told your writing is not critical enough, it probably means that your writing treats the knowledge claims as if they are true, well-supported, and applicable in the context you are writing about. This may not always be the case.
In these two examples, the extracts refer to the same section of text. In each example, the section that refers to a source has been highlighted in bold. The note below the example then explains how the writer has used the source material.
Example a: "There is a strong positive effect on students, both educationally and emotionally, when the instructors try to learn to say students' names without making pronunciation errors (Kiang, 2004)." This is a simple paraphrase with no critical comment. It looks like the writer agrees with Kiang. This is not a good example of critical writing, as the writer has not made any critical comment.
Example b: "Kiang (2004) gives various examples to support his claim that 'the positive emotional and educational impact on students is clear' (p.210) when instructors try to pronounce students' names in the correct way. He quotes one student, Nguyet, as saying that he 'felt surprised and happy' (p.211) when the tutor said his name clearly . The emotional effect claimed by Kiang is illustrated in quotes such as these, although the educational impact is supported more indirectly through the chapter. Overall, he provides more examples of students being negatively affected by incorrect pronunciation, and it is difficult to find examples within the text of a positive educational impact as such". The writer describes Kiang's (2004) claim and the examples which he uses to try to support it. The writer then comments that the examples do not seem balanced and may not be enough to support the claims fully. This is a better example of writing which expresses criticality.
Writing a critique (or critical review)
A critique (or critical review) is not to be mistaken for a literature review. A "critical review", or "critique", is a complete type of text (or genre), discussing one particular article or book in detail. In some instances, you may be asked to write a critique of two or three articles (e.g. a comparative critical review). In contrast, a "literature review", which also needs to be "critical", is a part of a larger type of text, such as a chapter of your dissertation. Most importantly: read your article/book as many times as possible, as this will make the critical review much easier.
Read and take notes
To improve your reading confidence and efficiency, visit our pages on reading. After you are familiar with the text, make notes on some of the following questions.
Choose the questions which seem suitable:
- What kind of article is it (for example does it present data or does it present purely theoretical arguments)?
- What is the main area under discussion?
- What are the main findings?
- What are the stated limitations?
- Where does the author's data and evidence come from? Are they appropriate/sufficient?
- What are the main issues raised by the author?
- What questions are raised?
- How well are these questions addressed?
- What are the major points/interpretations made by the author in terms of the issues raised?
- Is the text balanced? Is it fair/biased?
- Does the author contradict herself?
- How does all this relate to other literature on this topic?
- How does all this relate to your own experience, ideas and views?
- What else has this author written? Do these build/complement this text?
- (Optional) Has anyone else reviewed this article? What did they say? Do I agree with them?
Organise your writing
You first need to summarise the text that you have read, not least because your reader may not have read it.
In your summary, you will:
- Focus on points within the article that you think are interesting.
- Summarise the author(s) main ideas or argument.
- Explain how these ideas/argument have been constructed. For example, is the author basing her arguments on data that they have collected? Are the main ideas/argument purely theoretical?
In your summary you might answer the following questions:
- Why is this topic important?
- Where can this text be located? For example, does it address policy studies?
- What other prominent authors also write about this?
Evaluation
Evaluation is the most important part in a critical review. Use the literature to support your views. You may also use your knowledge of conducting research, and your own experience. Evaluation can be explicit or implicit.
Explicit evaluation
Explicit evaluation involves stating directly (explicitly) how you intend to evaluate the text, e.g. "I will review this article by focusing on the following questions. First, I will examine the extent to which the authors contribute to current thought on Second Language Acquisition (SLA) pedagogy. After that, I will analyse whether the authors' propositions are feasible within overseas SLA classrooms."
Implicit evaluation
Implicit evaluation is less direct. The following section on Linguistic features of writing a critical review contains language that evaluates the text. A difficult part of the evaluation of a published text (and a professional author) is how to do this as a student. There is nothing wrong with making your position as a student explicit and incorporating it into your evaluation. Examples of how you might do this can be found in the section on Linguistic features of writing a critical review. You need to remember to locate and analyse the author's argument when you are writing your critical review. For example, you need to locate the authors' view of classroom pedagogy as presented in the book/article and not present a critique of views of classroom pedagogy in general.
Linguistic features of a critical review
The following examples come from published critical reviews. Some of them have been adapted for student use.
- This article/book is divided into two/three parts. First...
- While the title might suggest...
- The tone appears to be...
- [Title] is the first/second volume in the series [Title], edited by... The books/articles in this series address...
- The second/third claim is based on...
- The author challenges the notion that...
- The author tries to find a more middle ground/make more modest claims...
- The article/book begins with a short historical overview of...
- Numerous authors have recently suggested that... (see [Author, Year]; [Author, Year]). [Author] would also be one such author. With his/her argument that...
- To refer to [Title] as a... is not to say that it is...
- This book/article is aimed at... This intended readership...
- The author's book/article examines the... To do this, the author first...
- The author develops/suggests a theoretical/pedagogical model to…
- This book/article positions itself firmly within the field of...
- The author in a series of subtle arguments, indicates that he/she...
- The argument is therefore...
- The author asks "..."
- With a purely critical/postmodern take on...
- [Topic], as the author points out, can be viewed as...
- In this recent contribution to the field of... this British author...
- As a leading author in the field of...
- This book/article nicely contributes to the field of... and complements other work by this author...
- The second/third part of... provides/questions/asks the reader...
- [Title] is intended to encourage students/researchers to...
- The approach taken by the author provides the opportunity to examine... in a qualitative/quantitative research framework that nicely complements...
- The author notes/claims that state support/a focus on pedagogy/the adoption of...remains vital if...
- According to [Author, Year] teaching towards examinations is not as effective as it is in other areas of the curriculum. This is because, as [Author, Year] claims, examinations have undue status within the curriculum.
- According to [Author, Year]… is not as effective in some areas of the curriculum/syllabus as others. Therefore, the author believes that this is a reason for some schools…
- This argument is not entirely convincing, as...furthermore it commodifies/rationalises the...
- Over the last five/10 years the view of... has increasingly been viewed as “complicated” (see [Author, Year]; [Author, Year]).
- However, through trying to integrate... with... the author...
- There are difficulties with such a position.
- Inevitably, several crucial questions are left unanswered/glossed over by this insightful/timely/interesting/stimulating book/article. Why should...
- It might have been more relevant for the author to have written this book/article as...
- This article/book is not without disappointment from those who would view... as...
- This chosen framework enlightens/clouds...
- This analysis intends to be... but falls a little short as...
- The authors rightly conclude that if...
- A detailed, well-written and rigorous account of...
- As a Korean student I feel that this article/book very clearly illustrates...
- The beginning of... provides an informative overview of...
- The tables/figures do little to help/greatly help the reader...
- The reaction by scholars who take a... approach might not be so favourable (e.g. Author, Year).
- This explanation has a few weaknesses that other researchers have pointed out (see [Author, Year]; [Author, Year]). The first is...
- On the other hand, the author wisely suggests/proposes that... By combining these two dimensions...
- The author's brief introduction to... may leave the intended reader confused as it fails to properly...
- Despite my inability to... I was greatly interested in...
- Even where this reader/I disagree(s), the author's effort to...
- The author thus combines... with... to argue... which seems quite improbable for a number of reasons. First...
- Perhaps this aversion to... would explain the author's reluctance to...
- As a second language student from ... I find it slightly ironic that such an Anglo-centric view is...
- The reader is rewarded with...
- Less convincing is the broad-sweeping generalisation that...
- There is no denying the author's subject knowledge nor his/her...
- The author's prose is dense and littered with unnecessary jargon...
- The author's critique of...might seem harsh but is well supported within the literature (see [Author, Year]; [Author, Year]; [Author, Year]). Aligning herself with the author, [Author, Year] states that...
- As it stands, the central focus of [Title] is well/poorly supported by its empirical findings...
- Given the hesitation to generalise to... the limitation of... does not seem problematic...
- For instance, the term... is never properly defined and the reader is left to guess as to whether...
- Furthermore, to label... as... inadvertently misguides...
- In addition, this research proves to be timely/especially significant to... as recent government policy/proposals has/have been enacted to...
- On this well-researched/documented basis the author emphasises/proposes that...
- Nonetheless, other research/scholarship/data tend to counter/contradict this possible trend/assumption... (see [Author, Year]; [Author, Year]).
- Without entering into details of the..., it should be stated that [Title] should be read by... others will see little value in...
- As experimental conditions were not used in the study the word “significant” misleads the reader.
- The article/book becomes repetitious in its assertion that...
- The thread of the author's argument becomes lost in an overuse of empirical data...
- Almost every argument presented in the final section is largely derivative, providing little to say about...
- She/he does not seem to take into consideration, however, that there are fundamental differences in the conditions of…
- As [Author, Year] points out, however, it seems to be necessary to look at…
- This suggests that having low… does not necessarily indicate that… is ineffective.
- Therefore, the suggestion made by [Author, Year]… is difficult to support.
- When considering all the data presented… it is not clear that the low scores of some students, indeed, reflect…
- Overall, this article/book is an analytical look at... which within the field of... is often overlooked.
- Despite its problems, [Title] offers valuable theoretical insights/interesting examples/a contribution to pedagogy and a starting point for students/researchers of... with an interest in...
- This detailed and rigorously argued...
- This first/second volume/book/article by... with an interest in... is highly informative...
An important note
We recommend that you do not search for other university guidelines on critical reviews. This is because the expectations may be different at other institutions. Ask your tutor for more guidance or examples if you have further questions.