A novel methodology for head and neck carcinoma treatment stage detection by means of model checking

Reginelli, Alfonso;
2022

Abstract

Context: Head and neck cancers account for 3% to 7% of all cancers diagnosed annually, and 50% to 75% of these new tumours occur in the upper aerodigestive tract. Purpose: In this paper we propose a formal-methods-based approach to identify the head and neck tumour treatment stage by means of model checking. We exploit a set of radiomic features to model medical imaging as a labelled transition system and verify treatment stage properties on it. Main findings: We evaluate the proposed method on a public dataset of computed tomography images acquired at different treatment stages, reaching an accuracy ranging from 0.924 to 0.978 in treatment stage detection. Principal conclusions: The study confirms the effectiveness of adopting formal methods for head and neck carcinoma treatment stage detection to support radiologists and pathologists.
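To illustrate the kind of pipeline the abstract describes, the following is a minimal, hypothetical sketch (not the authors' implementation): per-slice radiomic feature values are abstracted into atomic propositions, the slices are chained into a labelled transition system, and a simple reachability check stands in for model checking a treatment stage property. The feature names, threshold values, and the property itself are illustrative assumptions.

```python
# Illustrative sketch only: radiomic features -> labelled transition system
# -> reachability check. Feature names, thresholds and the property are
# hypothetical placeholders, not taken from the paper.
from collections import deque

# Hypothetical radiomic feature vectors, one per CT slice (assumed values).
slices = [
    {"glcm_contrast": 0.81, "first_order_entropy": 4.2},
    {"glcm_contrast": 0.55, "first_order_entropy": 3.1},
    {"glcm_contrast": 0.32, "first_order_entropy": 2.4},
]

def label(features):
    """Map raw feature values to atomic propositions (assumed thresholds)."""
    props = set()
    if features["glcm_contrast"] < 0.4:
        props.add("low_heterogeneity")
    if features["first_order_entropy"] < 2.8:
        props.add("low_entropy")
    return props

# Build the LTS: states are slice indices, transitions follow slice order,
# and each state carries the propositions derived from its features.
states = list(range(len(slices)))
labels = {s: label(slices[s]) for s in states}
transitions = {s: [s + 1] for s in states[:-1]}
transitions[states[-1]] = []

def eventually(target_props, start=0):
    """Check EF(target_props): some reachable state satisfies all props."""
    seen, queue = set(), deque([start])
    while queue:
        s = queue.popleft()
        if s in seen:
            continue
        seen.add(s)
        if target_props <= labels[s]:
            return True
        queue.extend(transitions[s])
    return False

# A toy "treatment stage" property: a state satisfying both propositions
# is eventually reached along the scan sequence.
print(eventually({"low_heterogeneity", "low_entropy"}))  # True for these values
```

In practice, radiomic features would be extracted from the CT images and their segmentation masks, and the temporal property would be expressed in a temporal logic and verified with a model checker; the sketch above only conveys the overall structure.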

Use this identifier to cite or link to this document: https://hdl.handle.net/11591/488228