Projects metadata

TREC document-to-document relevance assessment

Get JSON-LD

Get RO-Crate

RO-Crate HTML Preview

Started on 2022-06-01, concluded on 2023-03-31

Description

The TREC 2005 Genomics Track data provide document-to-topic relevance assessments. In this project we analyze a document-to-document relevance assessment for a subset of the TREC collection, using manual annotation for the judgements. Inter-annotator agreement is evaluated with Fleiss' kappa.

Keywords

document relevance, doc2doc relevance

Url

Previous project members

Department

Semantic Technologies team at ZB MED

Visit URL (ResearchOrganization)

Parent organization, consortium or research project

Deutsche Zentralbibliothek für Medizin (ZB MED) - Informationszentrum Lebenswissenschaften

Visit URL (ResearchOrganization)

STELLA Living Labs Project

Visit URL (ResearchProject)

Funding

Visit URL (Grant)

  • Identifier: 407518790
  • Description: Project no. 407518790 (corresponding to the STELLA project)

Outcomes

Document-to-document relevance assessment for TREC Genomics Track 2005

Visit URL (ScholarlyArticle)

TREC-doc-2-doc-relevance assessment interface

Visit URL (SoftwareApplication)

TREC-doc-2-doc-relevance

Visit URL (SoftwareSourceCode)

Fleiss kappa for doc-2-doc relevance assessment

Visit URL (Dataset)

  • Conforms to: https://bioschemas.org/profiles/Dataset/1.1-DRAFT
  • Identifier: DOI:10.5281/zenodo.7338056
  • Cite as: Giraldo O, Solanki D, Rebholz-Schuhmann D, Castro LJ. Fleiss kappa for doc-2-doc relevance assessment. Zenodo; 2022. doi:10.5281/zenodo.7338056
  • Description: Fleiss' kappa measuring inter-annotator agreement on a document-to-document relevance assessment task. The table contains 7 columns:
      1. Topics (8 in total).
      2. "Reference articles", identified by their PubMed IDs and organized by topic.
      3. Fleiss' kappa results.
      4. Interpretation of the Fleiss' kappa results: i) "Poor" for results <0.20, ii) "Fair" for results within 0.21–0.40, and iii) "Moderate" for results within 0.41–0.60.
      5. PubMed IDs of evaluation articles rated by the four annotators as "Relevant" with respect to the corresponding reference article.
      6. PubMed IDs of evaluation articles rated by the four annotators as "Partially relevant" with respect to the corresponding reference article.
      7. PubMed IDs of evaluation articles rated by the four annotators as "Non-relevant" with respect to the corresponding reference article.
  • Keywords: Fleiss' kappa, inter-annotator agreement, TREC Genomics Track 2005, relevance assessment
  • License: http://spdx.org/licenses/CC-BY-4.0
  • URL: https://zenodo.org/record/7338056

  • Date published: 2022-11-19

  • Authors: https://orcid.org/0000-0003-2978-8922, https://orcid.org/0009-0004-1529-0095, https://orcid.org/0000-0002-1018-0370, https://orcid.org/0000-0003-3986-0510
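The kappa values and interpretation bands in this dataset can be reproduced from the raw per-document rating counts. The sketch below is illustrative only (it is not the project's actual analysis code; function names and the example counts are hypothetical); it implements the standard Fleiss' kappa formula for n items rated by r annotators into k categories, plus the interpretation bands from the dataset description:

```python
def fleiss_kappa(counts):
    """Fleiss' kappa from a table of per-item category counts.

    counts[i][j] = number of annotators who placed item i in category j
    (e.g. j = Relevant / Partially relevant / Non-relevant).
    Every row must sum to the same number of raters r.
    """
    n = len(counts)          # number of items (evaluation articles)
    r = sum(counts[0])       # raters per item (4 annotators here)
    k = len(counts[0])       # number of categories

    # Mean observed per-item agreement: P_i = (sum_j n_ij^2 - r) / (r(r-1))
    p_bar = sum((sum(c * c for c in row) - r) / (r * (r - 1))
                for row in counts) / n

    # Chance agreement from the marginal category proportions
    totals = [sum(row[j] for row in counts) for j in range(k)]
    p_e = sum((t / (n * r)) ** 2 for t in totals)

    return (p_bar - p_e) / (1 - p_e)


def interpret(kappa):
    """Bands as given in the dataset's fourth column; the boundary value
    0.20 itself (unassigned in the description) is treated as 'Fair'."""
    if kappa < 0.20:
        return "Poor"
    if kappa <= 0.40:
        return "Fair"
    if kappa <= 0.60:
        return "Moderate"
    return "Substantial or better"


# Hypothetical example: 2 articles, 4 annotators, 3 relevance categories.
# Each article is rated unanimously, but in different categories:
print(fleiss_kappa([[4, 0, 0], [0, 4, 0]]))  # -> 1.0
```

A statistics library such as statsmodels also provides a Fleiss' kappa implementation; the hand-rolled version above is shown only to make the computation behind the table explicit.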

Document-to-document relevance assessment for TREC Genomics Track 2005

Visit URL (Dataset)