
Evidence-Based Practice


Occupational practices based on scientific evidence.

Appraise

Critical appraisal is the process of carefully and systematically examining research to judge its credibility, its value, and its relevance in a specific context.

Why appraise resources?

The aim of critical appraisal is to understand the strengths, weaknesses, and potential for bias in clinical research before it is applied to a patient. Validity, applicability, and clinical importance should be considered during critical appraisal to ensure that research evidence is used reliably and efficiently and that false conclusions are not drawn.

Critical appraisal is necessary to:

  • weigh the benefits and strengths of the research against its flaws and weaknesses
  • decide whether studies have been undertaken in a way that makes their findings reliable
  • make sense of the results
  • understand what the results mean in the context of clinical decision making
  • assess the usefulness of the evidence for clinical decisions.

The selection and appraisal process includes:

  • applying inclusion and exclusion criteria
  • critically appraising selected articles
  • identifying bias
  • using a second reviewer and comparing the two reviews and outcomes
  • documenting your selection process.

Becoming a critical reader is crucial because it enhances your understanding of content, develops essential cognitive and research skills, improves communication, empowers you to discern reliable information, exposes you to diverse perspectives, fosters lifelong learning, and contributes to problem-solving. This skill is valuable in various contexts, promoting success in both personal and professional endeavours.

A critical reader:

  • Does not believe everything they read
  • Questions what they read
  • Rereads if necessary
  • Understands the influence of style
  • Analyses arguments
  • Discounts arguments that are unsupported or based on faulty reasoning.

When reading critically, focus on the purpose of your literature:

  • Think about what you expect from the article or chapter before reading it
  • Skim the abstract, headings, conclusion, and the first sentence of each paragraph
  • Focus on the arguments presented rather than facts
  • Take notes as you read and start to organise your review around themes and ideas
  • Consider using a table, matrix or concept map to identify how the different sources relate to each other
  • Note four to six points for each study that summarise its main points and conclusions
  • Be as objective as possible.

By carefully evaluating your resources, you can refine your research, strengthen your arguments, and contribute to the advancement of knowledge in your field.

During the process you may discover:

  • your original query might be too specific or too broad
  • results reveal additional search words or aspects of a topic to further explore
  • using different databases provides different search results
  • search filters can be modified to limit or to broaden search results.

After constructing your search, review the search itself and consider:

  • Have you used the correct syntax (e.g. field codes, truncation) for this database?
  • Are there any spelling errors or typos?
  • Have you been comprehensive with your search terms?
  • If the database uses subject headings: have you included relevant subject headings and also added them as keywords?
  • Have you combined your terms and concepts correctly with AND and OR? (See the example search below.)
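As an illustration only (the topic and terms below are invented for the example, and exact field codes and truncation symbols vary between databases), a search on fall prevention for older adults might combine its concepts along these lines:

    ("older adult*" OR elderly OR "aged care") AND (fall* OR "fall prevention") AND ("occupational therap*" OR rehabilitat*)

Here OR broadens each concept by collecting synonyms, AND links the separate concepts together, and the asterisk is a common truncation symbol; check the database's help pages for its exact syntax.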

Next, run the search and review the results. When reviewing your search results, consider whether:

  • they can answer the review question
  • the results are relevant
  • too many or too few results are being retrieved
  • key articles are being found.

To give valid results, a piece of research needs to:

  • have clear objectives
  • include methodology to minimise bias
  • draw the appropriate conclusions from the data.

Adjust your search if it is not retrieving many relevant results, if it is retrieving too many or too few results, or if it is not finding key articles.

If you are conducting a Literature or Systematic Review, refer to our guides for additional resources.

How to appraise resources?

Once you have located relevant literature, you will need to critically appraise the evidence. The Develop your research skills pages provide a starting point for evaluating resources.

To develop your critical appraisal skills further, refer to the information below:

Types of Resources

There are many different types of study design, and the design of a study affects how you appraise it and the weight you give its findings. This is commonly represented in what is known as the Hierarchy of evidence pyramid.


Hierarchy of evidence pyramid. (Aslam, Georgiev, Mehta & Kumar, 2012) CC BY-NC-SA 3.0

The Hierarchy of Evidence Pyramid is a visual representation that ranks research study designs according to the reliability and strength of the evidence they provide. At the top are systematic reviews and meta-analyses, followed by randomised controlled trials and observational studies, with expert opinion at the base. The pyramid helps researchers and practitioners assess the strength of evidence supporting specific interventions or conclusions.

Some key considerations when appraising literature are:

Is it peer-reviewed?

Peer-reviewed articles undergo a rigorous evaluation process by experts in the field, ensuring a higher standard of quality and reliability, but not all journals use a rigorous peer-review process. If you are unsure, there are several ways to check whether a journal is peer-reviewed.

  • Visit the website of the journal or publisher: Many reputable journals explicitly state that they use a peer-review system to evaluate and select articles. This information can usually be found in the "About Us" or "Author Guidelines" sections of the journal's website. Peer-reviewed journals typically outline their editorial policies, including details about the review process.
  • Utilise database tags and filters: When you find an article in a database, there may be indications of whether the journal is peer-reviewed. Look for symbols like a referee jersey icon, "Peer-Reviewed" tags, or similar indicators in the database interface. Some databases have filters or options to display only peer-reviewed content.
  • Confirm via Ulrichsweb global serials directory: A comprehensive database that provides information about journals, including whether they are peer-reviewed. You can search for the specific journal title to find details about its editorial policies. The Library provides access to Ulrichsweb via the databases search page.

You may wish to take this process a step further by analysing the resource metrics to assess the quality, influence, and credibility of the resource.

Consider the elements of the resource

  • Abstract: this is what the author wants the reader to take away from their article - what is the starting point?
  • Introduction: provides background and a starting point - how does it guide the reader?
  • Materials and methods: often overlooked but very important - is the methodology understandable, reproducible, direct, and robust?
  • Results: a summary and analysis of the data, where the statistical reporting is just as important as the words - what do the tables, figures, and legends actually report? What do you think the data mean? Decide before reading the discussion.
  • Conclusion: were the objectives achieved? Were the hypotheses tested? How do the results relate to other studies you have found? Do the authors openly discuss any limitations of their study? What else needs to be studied in the future?

Question the research

The following questions may be asked to appraise the validity of research:

  1. What is the research question? Are the objectives of the study clearly stated? Why was this research necessary?
  2. Is the research original or important? Does the study have new findings? Is a treatment outcome clinically relevant?
  3. Does the research question consider the group or population of patients, the intervention or therapy, and the outcome (for example: in adults aged over 65, does a home-based exercise program reduce the rate of falls)?
  4. Did the authors use the relevant type of study for the research question?
  5. Did the study design minimise the risk of bias in its methodology, reporting, and patient selection? Did the study use a best practice design such as a randomised controlled trial or a systematic review? Blinding of patients and outcome assessors, randomisation, concealment, intention-to-treat analysis, similarity of patients for known prognostic factors, and completeness of follow-up are indicators of validity.
  6. Was the study designed in line with the original protocol? Is the focus of the report in keeping with the study objectives? Were changes made to the inclusion or exclusion criteria?
  7. Has the study's hypothesis been tested?
  8. Is the analysis of the data accurate? What level of uncertainty surrounds any results?
  9. Are the conclusions based on the data and analysis? Do the authors draw conclusions that are supported by the data? Have the authors discussed other work that both supports and contradicts their findings? Have the authors identified any limitations to their study?
  10. Does the study contribute to the understanding of the problem being investigated? What are the strengths and limitations of the study? Are the findings of the study useful for clinical practice? Do the risks of a treatment or diagnostic procedure outweigh the potential benefits?

Remember that while general principles apply, critical appraisal may be a little different for each type of study.

Critical appraisal tools

The following tools are available to assist in the critical appraisal of resources: 

Resource metrics 

Metrics are useful in assessing the quality, influence, and credibility of academic resources. They help researchers gauge the significance of a work, evaluate author expertise, and understand the reputation of the journals in which research is published, aiding in informed decision-making about resource reliability and relevance.

  • Citation metrics measure how often a work is cited
  • Author impact assesses an individual's scholarly impact
  • Journal quality rankings evaluate the reputation of a publication.

These metrics provide valuable quantitative data that aids researchers in making informed decisions about the reliability and significance of the information they incorporate into their work.

Citation metrics

Citation metrics involve counting how often a particular work is cited in other academic publications. These metrics are used to measure the impact and influence of a specific piece of research within the scholarly community. High citation counts typically suggest that a work has been widely recognised, referenced, and deemed influential in its field. Researchers and institutions often use citation metrics to assess the significance and relevance of academic contributions.

There is no single citation analysis tool that collects all publications and their cited references. For a thorough analysis of the impact of an author or a publication, search multiple databases to find possible cited references.

A number of resources are available via the Library that help identify cited works. These include Web of Science, Scopus, Google Scholar, Murdoch Research Portal, and other databases with limited citation data. 

Author impact metrics

Author metrics evaluate the scholarly output and impact of individual authors. These may include metrics like the h-index, which considers both the number of publications and their citation impact. Author metrics help assess an author's contribution to their field and their standing in the academic community.

Locate an author on databases such as Scopus or Web of Science to discover:

  • how many of their publications have been indexed by that database
  • how many times their publications have been cited
  • which of their publications are most highly cited
  • who they have co-authored papers with
  • their publication and citation trends for the past nine years
  • how their research output compares with that of other authors
  • their h-index.

The h-index

The h-index is an author metric that measures both the productivity and citation impact of an author's publications. This metric attempts to measure an individual's impact on the research community based upon the number of papers published and the number of citations these papers have received.

An author's h-index is the largest number h for which h of their publications indexed in a database have each been cited at least h times. For example, a researcher with an h-index of 20 has (of their total number of publications) 20 papers which have been cited at least 20 times each. The h-index is therefore not skewed by a single highly cited paper, nor by a large number of poorly cited papers.
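As a minimal sketch of this calculation (the citation counts below are invented for illustration), the h-index can be found by sorting an author's citation counts from highest to lowest and taking the largest rank at which the count is still at least equal to the rank:

    # Illustrative only: compute an h-index from a list of citation counts.
    def h_index(citation_counts):
        counts = sorted(citation_counts, reverse=True)  # highest first
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank  # at least `rank` papers have `rank` or more citations each
            else:
                break
        return h

    # Five papers cited 10, 5, 3, 2 and 1 times give an h-index of 3.
    print(h_index([10, 5, 3, 2, 1]))  # prints 3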

Note: The h-index may vary between databases as it is calculated from the publications and citations indexed in that particular database. For a thorough analysis of the impact of an author or a publication, look in multiple databases to compare author h-indices.

Journal quality rankings

Journal quality rankings are used to evaluate an academic journal's impact and quality. These metrics measure the place of a journal within its research field, the relative difficulty of being published in that journal, and the prestige associated with it. Common metrics include the journal impact factor, which assesses the average number of citations to articles in a journal. 
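As an illustration of how a journal impact factor is commonly calculated (the two-year window is typical, and the figures here are invented for the example): if the articles a journal published in 2021 and 2022 received 2,000 citations during 2023, and the journal published 500 citable items across 2021 and 2022, its 2023 impact factor would be 2,000 / 500 = 4.0.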

Databases such as Scopus and Web of Science (including CAB Abstracts) can be used to determine the quality of journals in a discipline or field of research. Each database has its own way of measuring impact and you may need to consult and compare multiple database sources. 

CiteScore provided via Scopus calculates the average number of citations received in a calendar year by all items published in that journal in the preceding three years. The calendar year to which a serial title’s issues are assigned is determined by their cover dates, and not the dates that the serial issues were made available online.
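For example (using made-up figures that follow the definition above): if the items a journal published during 2020-2022 received 1,200 citations in 2023, and the journal published 400 items over 2020-2022, its CiteScore for 2023 would be 1,200 / 400 = 3.0.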

Note: CiteScore and CiteScore percentile should not be used to compare journals from different subject areas as they are not field-normalised.

Library guides

The Library provides detailed guidance on a range of bibliometrics, including citation metrics, alternative metrics, researcher impact, journal quality and impact, book quality and impact, and university rankings - with access to a number of resources via the guides below:

Note taking 

Taking clear, legible notes will help to focus your critical reading and analysis of your literature review sources. When taking notes, avoid plagiarism by:

  •  keeping track of the difference between information from your sources and your own ideas
  •  providing clear references, including page numbers.

Note taking methods

Some effective methods of note-taking include:

  • Outlining method: Use headings, sub-headings and bullet points to organise topics
  • Cornell method: Use two columns - in one column write your summary of the authors' conclusions and evidence, and in the other column write down your own analysis and other comments
  • Charting method: Create a list of topics or points you want to write about and use a column for each one. As you read, add references and make notes in the appropriate column
  • Sentence method: Simply write down new ideas and bits of information as numbered sentences
  • Mapping method: Write down key concepts and terms, with related ideas radiating from them.

You may consider using a matrix, such as the example layout below, for your note taking and analysis:
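As an illustrative layout only (the column headings are a common pattern rather than a prescribed template), a note-taking matrix might look like this:

    Source (author, year) | Purpose / question | Methods | Key findings | Relevance to my topic | My comments
    [source 1]            | ...                | ...     | ...          | ...                   | ...
    [source 2]            | ...                | ...     | ...          | ...                   | ...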

Critical Reading & Analysis Checklist

  1. Does your literature review highlight flaws, gaps, or shortcomings of specific texts or groups of texts?
  2. Have you identified areas that have not yet been researched or have not yet been researched sufficiently?
  3. Does the literature demonstrate a change over time or recent developments that make your research relevant now?
  4. Are you able to discuss research methods used to study the topic and/or related topics?
  5. Can you clearly state why your research is necessary?

Your interpretation: 

  • Read critically
  • Note 2-4 bullet points for each study that summarise the main points and conclusions
  • Use a matrix to analyse the findings, relevance, and importance of each text
  • Draw attention to studies that are important, influential, or that bring a new understanding or method of studying your area of research.

Now that you understand why and how to critically appraise your research, you need to apply it to your own situation.

Study and research support

Beginner study and research support:

Intermediate and advanced research support:

Library resources

Access resources provided by Murdoch University Library: 

Ask our Librarians

Get study, research or teaching support from our friendly Librarians.