Skip to Main Content

Anatomy: Journals & Databases

Some key databases for Anatomy

Some key Health Sciences databases and Anatomy databases are listed below.

Have you got a smartphone or tablet? Did you know that many of our key databases are also available via mobile? When you look up the database in the Health Sciences database list or the Anatomy database list and it has the mobile icon, it's available for iPhone/iPod, iPad or Android.

Help with Google Scholar

Remember to always access Google Scholar through the Library webpage - that way you are recognised as a student from the University of Otago and you are given access to all our subscriptions.


Follow the [PDF] link or Otago Article Link on the right-hand side to get the full text of the article.

Help with PubMed

Find out how to get the most out of PubMed by using the tutorials below:

Article not online? Interloan it!

If you can't access an article online, search the journal title in Library Search | Ketu to see if the issue you need is in print.

If the article is not available at Otago, use the Library's free Interloan Service and have it delivered from another library to your desktop.

Assess your findings

Evaluating and thinking critically about sources of information are important skills to develop and apply while undertaking research.

Not all information is reliable and appropriate for academic work, and not all information is relevant to your particular topic.

You should challenge and reflect on information that you find; don’t just accept everything you read.

Assess sources, based on:
OROKOHANGA - ‘The Origins’: The source of the information
MANA - ‘The Authority’ of the information
WHAKAPAPA - ‘The Background’ of the information
MĀRAMATANGA - ‘The Content’ and usability of the information
ARONGA - ‘The Lens’ or objectivity of the information

Tutorial link

 

Work through this tutorial to develop your skills in evaluating information that you find online:

SIFT - Evaluating Information Tutorial

Even though the library databases are good sources of information, we still need to evaluate that information before we decide to use it. You can do this by asking the following questions:

  • Is the information relevant to your topic?
  • Who are the authors? Are they experts in the field? Who do they work for? What else have they written?
  • What evidence is given, what references are cited, and what methodology is used?
  • How is the study funded? Is there a bias?
  • When was the information written? Is it still relevant? Has it been updated or amended in light of new evidence?

Use the acronym BADURL to help you evaluate online sources:
B ... Bias
A ... Authority
D ... Date
U ... URL
R ... Relevance
L ... Links

Work through this tutorial to develop your skills in evaluating information that you find online:

TRAAP Test

Or apply these terms to assess if the information you have found answers your research question.

Timeliness

Relevance

Authority

Accuracy

Purpose

Download the TRAAP Test Questions

Here are some resources to help develop your evaluating skills:

  • For a simple 'commonsense' approach to evaluating claims made by the news media, read this short article by Doug Specht & Julio Gimenez from the University of Westminster, and pay close attention to the 6 'steps for reading like a scientist'.
  • If you need to verify a claim, you can check it on a fact-checking website. Check out this guide to Fact Checkers, curated by the University of California Berkeley Library, for ideas on which websites to use if you are not sure.
  • Work through this excellent module on 'evaluating information and critical thinking' created by The University of Sheffield Library.
  • Check out this fun, short, easy game created by CIVIX, a Canadian civics charity. The game is designed to improve your skills in verifying sources by teaching you tricks for checking a claim, a source and an image: FAKEOUT

Journal metric terms

CiteScore and SNIP are found using Scopus - Compare sources or Scopus Journal Sources

  • CiteScore counts the citations received over a four-year window by all peer-reviewed items published in that window, divided by the number of those items. Do not compare CiteScores across subject fields.

              2023 CiteScore data - available from 6 June 2024

  • CiteScore Tracker is updated monthly, indicating a journal's performance in the current year.
  • SNIP (Source Normalized Impact per Paper) measures contextual citation impact by weighting citations based on the total number of citations in a subject field. It aims to allow direct comparison of sources in different subject fields. In fields where citations are less common, a single citation raises the SNIP value more.

Note: The SNIP indicator may change for current and previous years when extra journal content is added to Scopus.
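As an illustrative sketch of the CiteScore arithmetic described above (all figures are invented):

```python
# CiteScore 2023 (made-up figures): citations received in 2020-2023 to
# peer-reviewed items published in 2020-2023, divided by the number of
# those items.
citations_2020_2023 = 5600
documents_2020_2023 = 1400

citescore = citations_2020_2023 / documents_2020_2023
print(citescore)  # 4.0
```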

Learn more:

Journal Impact Factor (JIF or IF) is a term from Clarivate Analytics, formerly Thomson Reuters and ISI.

= the average number of times articles from a journal, published in the previous 2 years, have been cited in the JCR year. For example, the 2018 JIF divides citations received in 2018 to items published in 2016-2017 by the number of citable items published in 2016-2017.

A 5-year JIF calculation is also available.

2023 Journal Impact Factor (JIF) data - available from 20 June 2024
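As a sketch of the two-year JIF arithmetic (all figures are invented for illustration):

```python
# Hypothetical 2018 Journal Impact Factor calculation.
cites_in_2018_to_2016_2017 = 1200  # citations in the JCR year to items from the previous 2 years
citable_items_2016_2017 = 400      # articles and reviews published in 2016-2017

jif_2018 = cites_in_2018_to_2016_2017 / citable_items_2016_2017
print(jif_2018)  # 3.0
```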

Journal Citation Reports (JCR) from InCites is available to compare journal metrics across a range of years, e.g. Journal Impact Factor (JIF), Journal Citation Indicator (JCI), and journal ranking by subject category.

This well-known journal metric uses citation data from the Web of Science Core Collection. Use it for the latest JIF data.

Journal Citation Indicator (JCI) is the average Category Normalized Citation Impact (CNCI) of citable items (articles & reviews) published by a journal over a recent three-year period. It is only available through Journal Citation Reports (JCR).

Researchers are often encouraged to publish in, and read, journals with high impact factors, to enhance their research profile and awareness.

Learn more:

h-index

The journal h-index measures a journal’s impact based on the number of citations to it.

The h-index is the number of papers (N) with at least N citations each. For example, a journal with an h-index of 9 has at least nine published papers, each of which has been cited at least nine times.
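The definition above can be sketched in a few lines of Python (the citation counts are made up for illustration):

```python
def h_index(cites):
    """h = the largest N such that at least N papers have >= N citations each."""
    counts = sorted(cites, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank        # at least `rank` papers have >= `rank` citations
        else:
            break
    return h

# Hypothetical citation counts for a journal's papers.
citations = [25, 18, 15, 12, 11, 10, 9, 9, 9, 4, 2, 0]
print(h_index(citations))  # 9: nine papers, each cited at least nine times
```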

Two key factors may impact on a journal h-index:

1. The journals indexed by the database - journals not indexed by the database will not be included in the h-index, and no database indexes every journal. Search by journal title, including subtitles, and journal abbreviations.

2. The date range over which the journal's h-index is measured; long-standing journals will likely have a higher h-index.

Web of Science and SJR are the main source databases for h-index information for journals used by scientists.

Publish or Perish can also be used - merge its data with that from Web of Science and SJR, then remove duplicate results.

The h-index is not widely used outside the Sciences. At Otago, Humanities scholars are advised not to use the h-index. Citation counts do not necessarily reflect research impact in the Humanities. Social Science scholars may find the journal h-index useful when selecting where to publish.

Learn more:

SCImago Journal Rank (SJR) indicates journal ranking or prestige, based on Scopus data. SJR normalises for differences in citation behaviour between different subjects. Subject field, quality, and reputation of the journal have a direct effect on the value of a citation.

= the average number of citations per document received over a 3-year period, within a subject category, with higher value assigned to citations from more prestigious journals.

Note: The SJR may change for current and previous years when extra journal content is added to Scopus.

Learn more:

Field Weighted Citation Impact (FWCI) shows how well cited an article is compared with similar articles of the same discipline, publication year and output type. FWCI takes into account the differences in research behaviour across disciplines, which makes it particularly valuable for cross-disciplinary research. Citations received up to 3 complete calendar years after publication are counted.

If the FWCI = 1, the output performs exactly as expected against the global average; if the FWCI = 1.48, the output has been cited 48% more than expected.

Find the FWCI using Scopus at article level; and SciVal (updated weekly from Scopus data) at author, institution and regional level.

Learn more:  

Research Metrics Guidebook - see p46-48.

What is Field Weighted Citation Impact? - tutorial

Category Normalised Citation Impact (CNCI) uses the actual count of citing items, divided by the expected citation rate for documents with the same document type, year of publication and subject area. When a document is assigned to more than one subject area, an average of the ratios of actual to expected citations is used.

If the CNCI = 1, performance is at par with the world average; a CNCI of 2 is twice the world average.
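A sketch of the multi-category case (all figures invented): the document's actual citations are divided by each subject area's expected rate, and the ratios are averaged.

```python
# CNCI for a document assigned to two subject areas: average the
# actual/expected ratios (made-up figures).
actual = 30
expected_by_area = {"Anatomy & Morphology": 12.0, "Cell Biology": 20.0}

ratios = [actual / expected for expected in expected_by_area.values()]
cnci = sum(ratios) / len(ratios)
print(round(cnci, 2))  # (2.5 + 1.5) / 2 = 2.0 -> twice world average
```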

Find the CNCI and JNCI (Journal Normalised Citation Impact) using InCites, from Clarivate Analytics (formerly Thomson Reuters and ISI). This metric is useful for benchmarking at author, institutional or regional level.

Learn more:  

Incites Indicators Handbook - see p13-15

About InCites Data: Understanding the Metrics 

 

Complementary Indicators - consider using:

  • FWCI by Subject area/s
  • Outputs in top citation percentiles
  • Publications in top journal percentiles
  • International collaboration
  • % of Articles in the top 1% (or 10%) of journals
  • Documents in Q1 indicators (# or %)
  • Highly cited papers or Hot papers

Eigenfactor Score (EF) - a measure of a journal's importance to the scientific community, based on the past 5 years' citation data from Clarivate; available for 1997-2015. Eigenfactor Scores are scaled so that the scores of all journals sum to 100.

  • Intended to reflect the influence and prestige of journals.
  • Considers which journals have contributed citations, so that highly cited journals, and those with many articles, influence the network more than less cited or smaller journals.
  • References from one article in a journal to another article from the same journal are removed, so that Eigenfactor Scores are not influenced by journal self-citation.
  • Similar to the Journal Impact Factor, but with weighting for highly cited journals. Metrics are currently available up to JCR 2015 data.

Normalised Eigenfactor Score (EFn) - scaled so that the mean journal score = 1.00. A journal with a Normalised Eigenfactor Score of 3 has three times the total influence of the average journal in the JCR. In 2014, PLoS One had the highest EFn Score of 217.451.


Article Influence Score: The mean Article Influence Score is 1.00.  An Article Influence Score greater than 1.00 indicates that the articles in a journal have an above-average influence.

  • Measures the average influence, per article, of the papers published in a journal.  
  • Determines the average influence of a journal’s articles over the first five years after publication.

See Citation Metrics: Alternative Metrics as it applies to journals.

Limitations

Obvious factors can heavily influence journal impact factors, such as journal title changes, or publishers gaming the system by requiring authors to cite articles from other journals by the same publisher. However, there are more systematic problems:

  • Journals with review articles have higher impact factors (more people read and cite these).
  • Impact factors can be biased estimates because non-articles, e.g. letters to the editor, increase a journal’s impact factor.
  • Citation rates vary widely between research fields because of the different publication patterns in those fields.
  • Differences in journal metric terms used by different companies, e.g. Journal Impact Factor v. CiteScore.
  • Scopus indexes nearly twice as many journals as Web of Science, so some metrics will differ, e.g. journal h-index.
  • Different subject categories of journals are used, so comparison of ranking metrics is fraught.
  • A journal’s impact factor is no indication of how heavily a specific article within it has been used.
  • Metrics increase when a journal publishes more issues per year, or more articles per issue.
  • Higher impact journals tend to have higher retraction frequencies - check the Retraction Watch database.