


SUVR to Centiloid

Linear model

According to Rowe et al., the transformation from $SUVR_{FBB}$ to Centiloid follows the relation,

$ CL = 153.4 \times SUVR_{FBB} - 154.9 $

This is straightforward to implement, but first the pipeline's SUVR extraction method has to be calibrated against the images provided by GAAIN.
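As an illustrative sketch, the relation can be applied inline with awk (the SUVR value 1.5 below is just an example, not taken from the data):

```shell
# apply CL = 153.4 * SUVR - 154.9 (example SUVR value, for illustration only)
awk 'BEGIN { suvr = 1.5; printf "CL = %.1f\n", 153.4 * suvr - 154.9 }'
```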

Processing GAAIN

Basically, we download the images and the Centiloid values computed by GAAIN,

https://www.gaaindata.org/data/centiloid/FBBproject_E-25_MR.zip

https://www.gaaindata.org/data/centiloid/FBBproject_E-25_FBB_90110.zip

https://www.gaaindata.org/data/centiloid/FBBproject_SupplementaryTable.xlsx

And we have to compare the Centiloid values obtained by our pipeline with those obtained by GAAIN.

I am going to create a new project for this and copy all the files there. The images come in DICOM format, so they have to be converted,

[osotolongo@detritus centiloid]$ tree MRDCM/
MRDCM/
├── 1008_MR
│   ├── 100.dcm
│   ├── 101.dcm
│   ├── 102.dcm
│   ├── 103.dcm
│   ├── 104.dcm
│   ├── 105.dcm
│   ├── 106.dcm
│   ├── 107.dcm
│   ├── 108.dcm
│   ├── 109.dcm
│   ├── 10.dcm
........
 
[osotolongo@detritus centiloid]$ tree FBBDCM/
FBBDCM/
├── 1008_PET_FBB
│   ├── 10.dcm
│   ├── 11.dcm
│   ├── 12.dcm
│   ├── 13.dcm
│   ├── 14.dcm
│   ├── 15.dcm
│   ├── 16.dcm
│   ├── 17.dcm
│   ├── 18.dcm
│   ├── 19.dcm
│   ├── 1.dcm
..............

Here we go. I create the project's CSV,

[osotolongo@detritus centiloid]$ ls MRDCM/ | sed 's/\(.*\)_MR/\1;sub/' > centiloid.csv

Let's see how to convert,

[osotolongo@detritus centiloid]$ dcm2niix -z y -o tmp/ MRDCM/1008_MR/
Chris Rorden's dcm2niiX version v1.0.20180622 (JP2:OpenJPEG) (JP-LS:CharLS) GCC5.5.0 (64-bit Linux)
Found 176 DICOM file(s)
Convert 176 DICOM as tmp/1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2 (256x256x176x1)
compress: "/usr/local/mricron/pigz_mricron" -n -f -6 "tmp/1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2.nii"
Conversion required 1.217848 seconds (0.450000 for core code).
[osotolongo@detritus centiloid]$ ls tmp/
1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2.json	1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2.nii.gz
 
[osotolongo@detritus centiloid]$ for x in MRDCM/*; do y=$(echo ${x} | sed 's/.*\/\(.*\)_.*/sub\1s0001/'); dcm2niix -z y -o tmp/ ${x}; t=$(ls tmp/*.nii.gz); mv ${t} mri/${y}.nii.gz; mv ${t%.nii.gz}.json mri/${y}.json; done
 
[osotolongo@detritus centiloid]$ ls mri
sub1008s0001.json    sub1015s0001.json	  sub1022s0001.json    sub1026s0001.json    sub1030s0001.json	 sub1034s0001.json    sub1038s0001.json    sub2017s0001.json	sub2032s0001.json
sub1008s0001.nii.gz  sub1015s0001.nii.gz  sub1022s0001.nii.gz  sub1026s0001.nii.gz  sub1030s0001.nii.gz  sub1034s0001.nii.gz  sub1038s0001.nii.gz  sub2017s0001.nii.gz	sub2032s0001.nii.gz
sub1009s0001.json    sub1018s0001.json	  sub1023s0001.json    sub1028s0001.json    sub1031s0001.json	 sub1036s0001.json    sub2002s0001.json    sub2029s0001.json
sub1009s0001.nii.gz  sub1018s0001.nii.gz  sub1023s0001.nii.gz  sub1028s0001.nii.gz  sub1031s0001.nii.gz  sub1036s0001.nii.gz  sub2002s0001.nii.gz  sub2029s0001.nii.gz
sub1010s0001.json    sub1019s0001.json	  sub1024s0001.json    sub1029s0001.json    sub1032s0001.json	 sub1037s0001.json    sub2005s0001.json    sub2030s0001.json
sub1010s0001.nii.gz  sub1019s0001.nii.gz  sub1024s0001.nii.gz  sub1029s0001.nii.gz  sub1032s0001.nii.gz  sub1037s0001.nii.gz  sub2005s0001.nii.gz  sub2030s0001.nii.gz
 
[osotolongo@detritus centiloid]$ dcm2niix -z y -o tmp/ FBBDCM/1008_PET_FBB/
Chris Rorden's dcm2niiX version v1.0.20180622 (JP2:OpenJPEG) (JP-LS:CharLS) GCC5.5.0 (64-bit Linux)
Found 90 DICOM file(s)
Convert 90 DICOM as tmp/1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180 (128x128x90x1)
compress: "/usr/local/mricron/pigz_mricron" -n -f -6 "tmp/1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180.nii"
Conversion required 1.233496 seconds (0.170000 for core code).
[osotolongo@detritus centiloid]$ ls tmp
1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180.json  1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180.nii.gz
 
[osotolongo@detritus centiloid]$ for x in FBBDCM/*; do y=$(echo ${x} | sed 's/.*\/\(.*\)_PET_FBB/sub\1s0001/'); dcm2niix -z y -o tmp/ ${x}; t=$(ls tmp/*.nii.gz); mv ${t} fbb/${y}.nii.gz; mv ${t%.nii.gz}.json fbb/${y}.json; done
[osotolongo@detritus centiloid]$ ls fbb
sub1008s0001.json    sub1015s0001.json	  sub1022s0001.json    sub1026s0001.json    sub1030s0001.json	 sub1034s0001.json    sub1038s0001.json    sub2017s0001.json	sub2032s0001.json
sub1008s0001.nii.gz  sub1015s0001.nii.gz  sub1022s0001.nii.gz  sub1026s0001.nii.gz  sub1030s0001.nii.gz  sub1034s0001.nii.gz  sub1038s0001.nii.gz  sub2017s0001.nii.gz	sub2032s0001.nii.gz
sub1009s0001.json    sub1018s0001.json	  sub1023s0001.json    sub1028s0001.json    sub1031s0001.json	 sub1036s0001.json    sub2002s0001.json    sub2029s0001.json
sub1009s0001.nii.gz  sub1018s0001.nii.gz  sub1023s0001.nii.gz  sub1028s0001.nii.gz  sub1031s0001.nii.gz  sub1036s0001.nii.gz  sub2002s0001.nii.gz  sub2029s0001.nii.gz
sub1010s0001.json    sub1019s0001.json	  sub1024s0001.json    sub1029s0001.json    sub1032s0001.json	 sub1037s0001.json    sub2005s0001.json    sub2030s0001.json
sub1010s0001.nii.gz  sub1019s0001.nii.gz  sub1024s0001.nii.gz  sub1029s0001.nii.gz  sub1032s0001.nii.gz  sub1037s0001.nii.gz  sub2005s0001.nii.gz  sub2030s0001.nii.gz
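Before going on, it may be worth checking that both output directories contain the same subjects. A minimal sketch, assuming mri/ and fbb/ hold only the converted .nii.gz/.json pairs:

```shell
# sanity check (assumption: mri/ and fbb/ should contain the same subject ids)
diff <(ls mri/ | sed 's/\.json$//;s/\.nii\.gz$//' | sort -u) \
     <(ls fbb/ | sed 's/\.json$//;s/\.nii\.gz$//' | sort -u) \
  && echo "ids match"
```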

We prepare and launch FreeSurfer,

[osotolongo@detritus centiloid]$ fsl2fs.pl centiloid
 
[osotolongo@detritus centiloid]$ precon.pl centiloid
Submitted batch job 15673
[osotolongo@detritus centiloid]$ squeue
             JOBID PARTITION     NAME     USER ST       TIME  NODES NODELIST(REASON)
             15673     devel fs_recon osotolon PD       0:00      1 (Dependency)
             15648     devel fs_recon osotolon  R       0:05      1 brick01
             15649     devel fs_recon osotolon  R       0:05      1 brick01
             15650     devel fs_recon osotolon  R       0:05      1 brick01
             15651     devel fs_recon osotolon  R       0:05      1 brick01
             15652     devel fs_recon osotolon  R       0:05      1 brick01
             15653     devel fs_recon osotolon  R       0:05      1 brick01
             15654     devel fs_recon osotolon  R       0:05      1 brick01
             15655     devel fs_recon osotolon  R       0:05      1 brick01
             15656     devel fs_recon osotolon  R       0:05      1 brick01
             15657     devel fs_recon osotolon  R       0:05      1 brick01
             15658     devel fs_recon osotolon  R       0:05      1 brick01
             15659     devel fs_recon osotolon  R       0:05      1 brick01
             15660     devel fs_recon osotolon  R       0:05      1 brick01
             15661     devel fs_recon osotolon  R       0:05      1 brick01
             15662     devel fs_recon osotolon  R       0:05      1 brick01
             15663     devel fs_recon osotolon  R       0:05      1 brick01
             15664     devel fs_recon osotolon  R       0:02      1 brick01
             15665     devel fs_recon osotolon  R       0:02      1 brick01
             15666     devel fs_recon osotolon  R       0:02      1 brick01
             15667     devel fs_recon osotolon  R       0:02      1 brick01
             15668     devel fs_recon osotolon  R       0:02      1 brick01
             15669     devel fs_recon osotolon  R       0:02      1 brick01
             15670     devel fs_recon osotolon  R       0:02      1 brick01
             15671     devel fs_recon osotolon  R       0:02      1 brick01
             15672     devel fs_recon osotolon  R       0:02      1 brick01

Now, the PET images come in a format different from the one we usually handle (a single 20-minute frame), so the fbbtemp_reg.sh script has to be tweaked a bit so it can process this correctly.

fbbtemp_reg.sh
#!/bin/sh
 
study=$1
shift
 
id=$1
shift
 
tdir=$1
shift
 
wdir=$1
shift
 
items=(`ls ${tdir}/${id}* | grep -v "_" | grep -v ".json"`)
#shift
 
debug=0
 
#Now get the uncorrected PETs and register to user space MRI
for i in ${!items[*]}; do
        tf=`printf "${id}s%04d" $i`
        #${FSLDIR}/bin/fslreorient2std ${tdir}/${tf} ${tdir}/${id}_tmp
        ${FSLDIR}/bin/imcp ${tdir}/${tf} ${tdir}/${id}_tmp
        ${FSLDIR}/bin/flirt -ref ${wdir}/${id}_struc -in ${tdir}/${id}_tmp -omat ${tdir}/${tf}_pet2struc.mat -out ${tdir}/${tf}_reg
        #${FSLDIR}/bin/flirt -ref ${wdir}/${id}_brain -in ${tdir}/${id}_tmp -init ${tdir}/${tf}_pet2struc.mat -out ${tdir}/${tf}_reg
done
if [ ${#items[@]} -gt 1 ]; then
echo ${#items[@]}
a=`for i in ${!items[*]}; do printf " ${tdir}/${id}s%04d_reg " $i; done`
${FSLDIR}/bin/fslmerge -t ${wdir}/${id}_tmp_mvc $a
#${FSLDIR}/bin/fslmaths ${dir}/${id}_tmp_pet_in_struc -thr 0 -mas ${dir}/${id}_brain ${dir}/${id}_pet_in_struc
#${FSLDIR}/bin/fslmaths ${dir}/${id}_tmp_pet_in_struc -mas ${dir}/${id}_brain ${dir}/${id}_pet_in_struc
 
${FSLDIR}/bin/mcflirt -in ${wdir}/${id}_tmp_mvc -out ${wdir}/${id}_tmp_corr
${PIPEDIR}/bin/4dmean.pl ${wdir}/${id}_tmp_corr
${FSLDIR}/bin/flirt -ref ${wdir}/${id}_struc -in ${wdir}/${id}_mean -omat ${wdir}/${id}_fbb2struc.mat -out ${wdir}/${id}_fbb
else
tf=`printf "${id}s%04d" 0` # single acquisition, so the only index is 0 (was ${item[0]}, a typo)
${FSLDIR}/bin/mcflirt -in ${tdir}/${tf}_reg -out ${wdir}/${id}_tmp_corr
${FSLDIR}/bin/imcp ${wdir}/${id}_tmp_corr ${wdir}/${id}_fbb
fi
 
if [ $debug = 0 ] ; then
    rm ${tdir}/${id}_tmp*
    rm ${wdir}/${id}_tmp*
fi

From here we can do the usual,

[osotolongo@detritus centiloid]$ fbb_correct.pl centiloid

And we can check the registration report. Everything seems to be fine, so,

[osotolongo@detritus ~]$ parallel_fbb_rois_metrics.pl centiloid

Now we just have to compare the global SUVR values with those in the GAAIN table.

[osotolongo@detritus centiloid]$ awk -F ";" '{print $1,$2,$4}' FBB_suvr_centiloid.csv | grep -v Y | sort -n > reference.dat
[osotolongo@detritus centiloid]$ join calcs.dat reference.dat > toreview.dat

I'll do it with gnuplot, which is easier,

gnuplot> f(x) = m*x + n
gnuplot> fit f(x) "toreview.dat" u 2:3 via m,n
iter      chisq       delta/lim  lambda   m             n            
   0 2.2640655922e+00   0.00e+00  1.03e+00    8.467408e-01   1.869890e-02
   1 1.5434204471e-01  -1.37e+06  1.03e-01    1.015137e+00   1.868981e-02
   2 1.5172756904e-01  -1.72e+03  1.03e-02    1.021604e+00   1.342157e-02
   3 1.3453722070e-01  -1.28e+04  1.03e-03    1.072796e+00  -7.770453e-02
   4 1.3400723332e-01  -3.95e+02  1.03e-04    1.083478e+00  -9.671955e-02
   5 1.3400723101e-01  -1.72e-03  1.03e-05    1.083501e+00  -9.675931e-02
iter      chisq       delta/lim  lambda   m             n            

After 5 iterations the fit converged.
final sum of squares of residuals : 0.134007
rel. change during last iteration : -1.72206e-08

degrees of freedom    (FIT_NDF)                        : 22
rms of residuals      (FIT_STDFIT) = sqrt(WSSR/ndf)    : 0.0780464
variance of residuals (reduced chisquare) = WSSR/ndf   : 0.00609124

Final set of parameters            Asymptotic Standard Error
=======================            ==========================
m               = 1.0835           +/- 0.03745      (3.456%)
n               = -0.0967593       +/- 0.0646       (66.76%)

correlation matrix of the fit parameters:
                m      n      
m               1.000 
n              -0.969  1.000 

And with a slope of $1.08 \pm 0.04$, I think we are in good shape.
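The same least-squares fit can be cross-checked without gnuplot, for instance with awk (assuming, as in the gnuplot `u 2:3` spec, that columns 2 and 3 of toreview.dat are the x and y values):

```shell
# ordinary least-squares slope (m) and intercept (n) over columns 2 and 3
awk '{ np++; sx += $2; sy += $3; sxx += $2*$2; sxy += $2*$3 }
     END { m = (np*sxy - sx*sy) / (np*sxx - sx*sx);
           n = (sy - m*sx) / np;
           printf "m = %.4f, n = %.4f\n", m, n }' toreview.dat
```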

Summary: we have successfully validated the method used to obtain the SUVRs.

Implementing the Centiloid pipeline

In order to compare with other studies, we have to implement the method using the original templates from Klunk et al.

neuroimagen/centiloid.1554972408.txt.gz · Last modified: 2020/08/04 10:46 (external edit)