
SUVR to Centiloid

Linear model

According to Rowe et al., the transformation from $SUVR_{FBB}$ to Centiloid follows the relation,

$ CL = 153.4 \times SUVR_{FBB} - 154.9 $

This is straightforward to implement, but first we have to calibrate the pipeline's SUVR extraction method against the images from GAAIN.
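The linear transform itself is trivial; a minimal sketch (the helper name `suvr2cl` is ours, not a pipeline tool):

```shell
# Minimal sketch of the Rowe et al. relation CL = 153.4 * SUVR_FBB - 154.9.
# The helper name suvr2cl is hypothetical, not part of the pipeline.
suvr2cl() {
  echo "$1" | awk '{ printf "%.1f\n", 153.4 * $1 - 154.9 }'
}
suvr2cl 1.0   # -> -1.5
suvr2cl 2.0   # -> 151.9
```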

Processing GAAIN

Basically, we download the images and the Centiloid values computed by GAAIN,

https://www.gaaindata.org/data/centiloid/FBBproject_E-25_MR.zip

https://www.gaaindata.org/data/centiloid/FBBproject_E-25_FBB_90110.zip

https://www.gaaindata.org/data/centiloid/FBBproject_SupplementaryTable.xlsx

And we must compare the Centiloid values obtained by our pipeline with the values obtained by GAAIN.

I'm going to create a new project for this and copy all the files there. The images come in DICOM format, so they have to be converted,

[osotolongo@detritus centiloid]$ tree MRDCM/
MRDCM/
├── 1008_MR
│   ├── 100.dcm
│   ├── 101.dcm
│   ├── 102.dcm
│   ├── 103.dcm
│   ├── 104.dcm
│   ├── 105.dcm
│   ├── 106.dcm
│   ├── 107.dcm
│   ├── 108.dcm
│   ├── 109.dcm
│   ├── 10.dcm
........
 
[osotolongo@detritus centiloid]$ tree FBBDCM/
FBBDCM/
├── 1008_PET_FBB
│   ├── 10.dcm
│   ├── 11.dcm
│   ├── 12.dcm
│   ├── 13.dcm
│   ├── 14.dcm
│   ├── 15.dcm
│   ├── 16.dcm
│   ├── 17.dcm
│   ├── 18.dcm
│   ├── 19.dcm
│   ├── 1.dcm
..............

Here we go. I create the project CSV,

[osotolongo@detritus centiloid]$ ls MRDCM/ | sed 's/\(.*\)_MR/\1;sub/' > centiloid.csv
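As a quick sanity check, the sed pattern above strips the `_MR` suffix and emits an `<id>;sub` row for the project file:

```shell
# The sed pattern used to build the project CSV: strip the "_MR"
# suffix from each directory name and append the ";sub" column.
echo "1008_MR" | sed 's/\(.*\)_MR/\1;sub/'
# -> 1008;sub
```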

Let's see how to convert,

[osotolongo@detritus centiloid]$ dcm2niix -z y -o tmp/ MRDCM/1008_MR/
Chris Rorden's dcm2niiX version v1.0.20180622 (JP2:OpenJPEG) (JP-LS:CharLS) GCC5.5.0 (64-bit Linux)
Found 176 DICOM file(s)
Convert 176 DICOM as tmp/1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2 (256x256x176x1)
compress: "/usr/local/mricron/pigz_mricron" -n -f -6 "tmp/1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2.nii"
Conversion required 1.217848 seconds (0.450000 for core code).
[osotolongo@detritus centiloid]$ ls tmp/
1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2.json	1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2.nii.gz
 
[osotolongo@detritus centiloid]$ for x in MRDCM/*; do y=$(echo ${x} | sed 's/.*\/\(.*\)_.*/sub\1s0001/'); dcm2niix -z y -o tmp/ ${x}; t=$(ls tmp/*.nii.gz); mv ${t} mri/${y}.nii.gz; mv ${t%.nii.gz}.json mri/${y}.json; done
 
[osotolongo@detritus centiloid]$ ls mri
sub1008s0001.json    sub1015s0001.json	  sub1022s0001.json    sub1026s0001.json    sub1030s0001.json	 sub1034s0001.json    sub1038s0001.json    sub2017s0001.json	sub2032s0001.json
sub1008s0001.nii.gz  sub1015s0001.nii.gz  sub1022s0001.nii.gz  sub1026s0001.nii.gz  sub1030s0001.nii.gz  sub1034s0001.nii.gz  sub1038s0001.nii.gz  sub2017s0001.nii.gz	sub2032s0001.nii.gz
sub1009s0001.json    sub1018s0001.json	  sub1023s0001.json    sub1028s0001.json    sub1031s0001.json	 sub1036s0001.json    sub2002s0001.json    sub2029s0001.json
sub1009s0001.nii.gz  sub1018s0001.nii.gz  sub1023s0001.nii.gz  sub1028s0001.nii.gz  sub1031s0001.nii.gz  sub1036s0001.nii.gz  sub2002s0001.nii.gz  sub2029s0001.nii.gz
sub1010s0001.json    sub1019s0001.json	  sub1024s0001.json    sub1029s0001.json    sub1032s0001.json	 sub1037s0001.json    sub2005s0001.json    sub2030s0001.json
sub1010s0001.nii.gz  sub1019s0001.nii.gz  sub1024s0001.nii.gz  sub1029s0001.nii.gz  sub1032s0001.nii.gz  sub1037s0001.nii.gz  sub2005s0001.nii.gz  sub2030s0001.nii.gz
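The rename step inside the conversion loop maps each DICOM directory to the pipeline naming convention; the sed pattern in isolation:

```shell
# The rename pattern from the MR conversion loop:
# "MRDCM/<id>_MR" becomes "sub<id>s0001".
echo "MRDCM/1008_MR" | sed 's/.*\/\(.*\)_.*/sub\1s0001/'
# -> sub1008s0001
```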
 
[osotolongo@detritus centiloid]$ dcm2niix -z y -o tmp/ FBBDCM/1008_PET_FBB/
Chris Rorden's dcm2niiX version v1.0.20180622 (JP2:OpenJPEG) (JP-LS:CharLS) GCC5.5.0 (64-bit Linux)
Found 90 DICOM file(s)
Convert 90 DICOM as tmp/1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180 (128x128x90x1)
compress: "/usr/local/mricron/pigz_mricron" -n -f -6 "tmp/1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180.nii"
Conversion required 1.233496 seconds (0.170000 for core code).
[osotolongo@detritus centiloid]$ ls tmp
1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180.json  1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180.nii.gz
 
[osotolongo@detritus centiloid]$ for x in FBBDCM/*; do y=$(echo ${x} | sed 's/.*\/\(.*\)_PET_FBB/sub\1s0001/'); dcm2niix -z y -o tmp/ ${x}; t=$(ls tmp/*.nii.gz); mv ${t} fbb/${y}.nii.gz; mv ${t%.nii.gz}.json fbb/${y}.json; done
[osotolongo@detritus centiloid]$ ls fbb
sub1008s0001.json    sub1015s0001.json	  sub1022s0001.json    sub1026s0001.json    sub1030s0001.json	 sub1034s0001.json    sub1038s0001.json    sub2017s0001.json	sub2032s0001.json
sub1008s0001.nii.gz  sub1015s0001.nii.gz  sub1022s0001.nii.gz  sub1026s0001.nii.gz  sub1030s0001.nii.gz  sub1034s0001.nii.gz  sub1038s0001.nii.gz  sub2017s0001.nii.gz	sub2032s0001.nii.gz
sub1009s0001.json    sub1018s0001.json	  sub1023s0001.json    sub1028s0001.json    sub1031s0001.json	 sub1036s0001.json    sub2002s0001.json    sub2029s0001.json
sub1009s0001.nii.gz  sub1018s0001.nii.gz  sub1023s0001.nii.gz  sub1028s0001.nii.gz  sub1031s0001.nii.gz  sub1036s0001.nii.gz  sub2002s0001.nii.gz  sub2029s0001.nii.gz
sub1010s0001.json    sub1019s0001.json	  sub1024s0001.json    sub1029s0001.json    sub1032s0001.json	 sub1037s0001.json    sub2005s0001.json    sub2030s0001.json
sub1010s0001.nii.gz  sub1019s0001.nii.gz  sub1024s0001.nii.gz  sub1029s0001.nii.gz  sub1032s0001.nii.gz  sub1037s0001.nii.gz  sub2005s0001.nii.gz  sub2030s0001.nii.gz

We prepare and launch FreeSurfer,

[osotolongo@detritus centiloid]$ fsl2fs.pl centiloid
 
[osotolongo@detritus centiloid]$ precon.pl centiloid
Submitted batch job 15673
[osotolongo@detritus centiloid]$ squeue
             JOBID PARTITION     NAME     USER ST       TIME  NODES NODELIST(REASON)
             15673     devel fs_recon osotolon PD       0:00      1 (Dependency)
             15648     devel fs_recon osotolon  R       0:05      1 brick01
             15649     devel fs_recon osotolon  R       0:05      1 brick01
             15650     devel fs_recon osotolon  R       0:05      1 brick01
             15651     devel fs_recon osotolon  R       0:05      1 brick01
             15652     devel fs_recon osotolon  R       0:05      1 brick01
             15653     devel fs_recon osotolon  R       0:05      1 brick01
             15654     devel fs_recon osotolon  R       0:05      1 brick01
             15655     devel fs_recon osotolon  R       0:05      1 brick01
             15656     devel fs_recon osotolon  R       0:05      1 brick01
             15657     devel fs_recon osotolon  R       0:05      1 brick01
             15658     devel fs_recon osotolon  R       0:05      1 brick01
             15659     devel fs_recon osotolon  R       0:05      1 brick01
             15660     devel fs_recon osotolon  R       0:05      1 brick01
             15661     devel fs_recon osotolon  R       0:05      1 brick01
             15662     devel fs_recon osotolon  R       0:05      1 brick01
             15663     devel fs_recon osotolon  R       0:05      1 brick01
             15664     devel fs_recon osotolon  R       0:02      1 brick01
             15665     devel fs_recon osotolon  R       0:02      1 brick01
             15666     devel fs_recon osotolon  R       0:02      1 brick01
             15667     devel fs_recon osotolon  R       0:02      1 brick01
             15668     devel fs_recon osotolon  R       0:02      1 brick01
             15669     devel fs_recon osotolon  R       0:02      1 brick01
             15670     devel fs_recon osotolon  R       0:02      1 brick01
             15671     devel fs_recon osotolon  R       0:02      1 brick01
             15672     devel fs_recon osotolon  R       0:02      1 brick01

Now, the PET images come in a different format from the one we usually use (a single 20-minute acquisition), so the fbbtemp_reg.sh script needs a small tweak to handle this correctly.

fbbtemp_reg.sh
#!/bin/sh
 
study=$1
shift
 
id=$1
shift
 
tdir=$1
shift
 
wdir=$1
shift
 
items=(`ls ${tdir}/${id}* | grep -v "_" | grep -v ".json"`)
#shift
 
debug=0
 
#Now get the uncorrected PETs and register to user space MRI
for i in ${!items[*]}; do
        tf=`printf "${id}s%04d" $i`
        #${FSLDIR}/bin/fslreorient2std ${tdir}/${tf} ${tdir}/${id}_tmp
        ${FSLDIR}/bin/imcp ${tdir}/${tf} ${tdir}/${id}_tmp
        ${FSLDIR}/bin/flirt -ref ${wdir}/${id}_struc -in ${tdir}/${id}_tmp -omat ${tdir}/${tf}_pet2struc.mat -out ${tdir}/${tf}_reg
        #${FSLDIR}/bin/flirt -ref ${wdir}/${id}_brain -in ${tdir}/${id}_tmp -init ${tdir}/${tf}_pet2struc.mat -out ${tdir}/${tf}_reg
done
if [ ${#items[@]} -gt 1 ]; then
echo ${#items[@]}
a=`for i in ${!items[*]}; do printf " ${tdir}/${id}s%04d_reg " $i; done`
${FSLDIR}/bin/fslmerge -t ${wdir}/${id}_tmp_mvc $a
#${FSLDIR}/bin/fslmaths ${dir}/${id}_tmp_pet_in_struc -thr 0 -mas ${dir}/${id}_brain ${dir}/${id}_pet_in_struc
#${FSLDIR}/bin/fslmaths ${dir}/${id}_tmp_pet_in_struc -mas ${dir}/${id}_brain ${dir}/${id}_pet_in_struc
 
${FSLDIR}/bin/mcflirt -in ${wdir}/${id}_tmp_mvc -out ${wdir}/${id}_tmp_corr
${PIPEDIR}/bin/4dmean.pl ${wdir}/${id}_tmp_corr
${FSLDIR}/bin/flirt -ref ${wdir}/${id}_struc -in ${wdir}/${id}_mean -omat ${wdir}/${id}_fbb2struc.mat -out ${wdir}/${id}_fbb
else
tf=`printf "${id}s%04d" 0` # only one acquisition, so the index is 0
${FSLDIR}/bin/mcflirt -in ${tdir}/${tf}_reg -out ${wdir}/${id}_tmp_corr
${FSLDIR}/bin/imcp ${wdir}/${id}_tmp_corr ${wdir}/${id}_fbb
fi
 
if [ $debug = 0 ] ; then
    rm ${tdir}/${id}_tmp*
    rm ${wdir}/${id}_tmp*
fi

From here on we can do the usual,

[osotolongo@detritus centiloid]$ fbb_correct.pl centiloid

And we can check the report: registration report. Everything seems fine, so,

[osotolongo@detritus ~]$ parallel_fbb_rois_metrics.pl centiloid

Now we just have to compare the global SUVR values with those in the GAAIN table.

[osotolongo@detritus centiloid]$ awk -F ";" '{print $1,$2,$4}' FBB_suvr_centiloid.csv | grep -v Y | sort -n > reference.dat
[osotolongo@detritus centiloid]$ join calcs.dat reference.dat > toreview.dat

I'll do this with gnuplot, which is easier,

gnuplot> f(x) = m*x + n
gnuplot> fit f(x) "toreview.dat" u 2:3 via m,n
iter      chisq       delta/lim  lambda   m             n            
   0 2.2640655922e+00   0.00e+00  1.03e+00    8.467408e-01   1.869890e-02
   1 1.5434204471e-01  -1.37e+06  1.03e-01    1.015137e+00   1.868981e-02
   2 1.5172756904e-01  -1.72e+03  1.03e-02    1.021604e+00   1.342157e-02
   3 1.3453722070e-01  -1.28e+04  1.03e-03    1.072796e+00  -7.770453e-02
   4 1.3400723332e-01  -3.95e+02  1.03e-04    1.083478e+00  -9.671955e-02
   5 1.3400723101e-01  -1.72e-03  1.03e-05    1.083501e+00  -9.675931e-02
iter      chisq       delta/lim  lambda   m             n            

After 5 iterations the fit converged.
final sum of squares of residuals : 0.134007
rel. change during last iteration : -1.72206e-08

degrees of freedom    (FIT_NDF)                        : 22
rms of residuals      (FIT_STDFIT) = sqrt(WSSR/ndf)    : 0.0780464
variance of residuals (reduced chisquare) = WSSR/ndf   : 0.00609124

Final set of parameters            Asymptotic Standard Error
=======================            ==========================
m               = 1.0835           +/- 0.03745      (3.456%)
n               = -0.0967593       +/- 0.0646       (66.76%)

correlation matrix of the fit parameters:
                m      n      
m               1.000 
n              -0.969  1.000 

And with a slope of $1.08 \pm 0.04$ I think we are fine.
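The gnuplot fit can also be cross-checked with a plain awk least-squares sketch (the function name `ols_fit` is ours, not a pipeline tool):

```shell
# Ordinary least-squares fit y = m*x + n over columns 2 and 3
# of whitespace-separated input (the layout of toreview.dat).
# The helper name ols_fit is hypothetical.
ols_fit() {
  awk '{ n++; sx += $2; sy += $3; sxx += $2*$2; sxy += $2*$3 }
       END { m = (n*sxy - sx*sy)/(n*sxx - sx*sx);
             printf "m = %g  n = %g\n", m, (sy - m*sx)/n }'
}
# usage: ols_fit < toreview.dat
```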

Summary: we have successfully validated the method used to extract the SUVRs.

Implementing the Centiloid pipeline

In order to compare with other studies, we need to implement the method using the original templates from Klunk et al. The good part is that the preprocessing and the registration to subject space are already done. We only need to register to MNI space and extract the metrics using the templates.

I'm going to write a new script for this, reusing the old one. First, a simple script that registers to MNI space with ANTS,

fbb2std.sh
#!/bin/sh
 
id=$1
shift
 
wdir=$1
shift
 
echo "I need the FBB image at MNI space"                                                                                                                                               
ANTS 3 -m CC[${FSLDIR}/data/standard/MNI152_T1_2mm.nii.gz, ${wdir}/${id}_struc.nii.gz, 1, 4] -r Gauss[0,3] -t Elast[1.5] -i 30x20x10 -o ${wdir}/${id}_fbb_t1_mni.nii.gz
WarpImageMultiTransform 3 ${wdir}/${id}_fbb.nii.gz ${wdir}/${id}_fbb_mni.nii.gz -R ${FSLDIR}/data/standard/MNI152_T1_2mm.nii.gz ${wdir}/${id}_fbb_t1_mniWarp.nii.gz ${wdir}/${id}_fbb_t1_mniAffine.txt

and then I write a wrapper that launches it in parallel and computes the metrics,

parallel_fbb_cl_metrics.pl
#!/usr/bin/perl
 
use strict; use warnings;
 
use File::Find::Rule;
use NEURO qw(print_help get_pair load_study achtung shit_done get_lut check_or_make centiloid_fbb);
use Data::Dump qw(dump);
use File::Remove 'remove';
use File::Basename qw(basename);
use Parallel::ForkManager;
 
my %ROI_Comps = (
	"Cgray" => "voi_CerebGry_2mm.nii",
	"WCbl" => "voi_WhlCbl_2mm.nii",
	"WcblBS" => "voi_WhlCblBrnStm_2mm.nii",
	"Pons" => "voi_Pons_2mm.nii",
	"ctx" => "voi_ctx_2mm.nii",
 );
 
my $roi_paths = $ENV{'PIPEDIR'}.'/lib/Centiloid_Std_VOI/nifti/2mm/';
my $attach = 1;
my $reduce = 0;
my $withstd = 0;
my $cfile;
 
@ARGV = ("-h") unless @ARGV;
 
while (@ARGV and $ARGV[0] =~ /^-/) {
    $_ = shift;
    last if /^--$/;
    if (/^-l$/) { $attach = 0;}
    if (/^-r$/) { $reduce = 1;}
    if (/^-std$/) {$withstd = 1;}
    if (/^-cut/) { $cfile = shift; chomp($cfile);}
    if (/^-h$/) { print_help $ENV{'PIPEDIR'}.'/doc/pet_metrics.hlp'; exit;}
}
 
my $study = shift;
unless ($study) { print_help $ENV{'PIPEDIR'}.'/doc/pet_metrics.hlp'; exit;}
my %std = load_study($study);
my $w_dir=$std{'WORKING'};
my $data_dir=$std{'DATA'};
my $max_processes = 20;
 
# Redirect output to logfile (do it only when everything is fine)
#my $debug = "$data_dir/.debug_pet_fs_metrics.log";
#open STDOUT, ">$debug" or die "Can't redirect stdout";
#open STDERR, ">&STDOUT" or die "Can't dup stdout";
 
my $pm = new Parallel::ForkManager($max_processes);
our %subjects;
 
$pm->run_on_finish(
	sub { my ($pid, $exit_code, $ident, $exit_signal, $core_dump, $data) = @_;
		foreach my $tag (sort keys %{$data}){
			$subjects{$ident}{$tag}=${$data}{$tag}
		}
	}
);
my @plist = find(file => 'name' => "*_fbb.nii.gz", '!name' => "*tmp*", in => $w_dir);
my $ofile = $data_dir."/".$study."_fbb_cl.csv";
my $patt = '([A-Za-z]{1,4})(\d{1,6})_fbb';
 
my @pets;
 
if ($cfile){
	my %cuts = get_pair($data_dir."/".$cfile);
	foreach my $cut (keys %cuts){
		# keep only the PET images whose name matches a subject in the cut file
		push @pets, grep {/$cut/} @plist;
	}
}else{
	@pets = @plist;
}
 
 
foreach my $pet (sort @pets){   
    (my $dg,my $subject) = $pet =~ /$patt/;
    if($subject){
		$subjects{$subject}{'dg'} = $dg;
		$subjects{$subject}{'pet'} = $pet;
	}
}
 
foreach my $subject (sort keys %subjects){
	my $care;
	my $norm;
	my $dg = $subjects{$subject}{'dg'};
	my @care;
	my %sdata;
	$pm->start($subject) and next;
 
		# Get FBB image into MNI space
		my $order = "fbb2std.sh ".$dg.$subject." ".$w_dir;
		print "$order\n";
		system($order);
		# Apply masks to FBB
		foreach my $npf (sort keys %ROI_Comps){
			my $roi_mask = $roi_paths.$ROI_Comps{$npf};
			# get mean and std for mask
			$order = "fslstats ".$w_dir."/".$dg.$subject."_fbb_mni -k ".$roi_mask." -M -S";
			print "$order\n";
			(my $mean, my $std) = map{/(\d+\.\d*)\s*(\d+\.\d*)/} qx/$order/;
            $sdata{$npf.'mean'} = $mean;
            $sdata{$npf.'std'} = $std;
		}			
	$pm->finish($subject, \%sdata);	
	# remove temp dir
	#remove( \1, ($mdir));
}
$pm->wait_all_children;
 
open OF, ">$ofile";
 
print OF "Subject";
 
foreach my  $npf (sort keys %ROI_Comps){
        if($withstd){
                print OF ";$npf","_Mean;","$npf","_STD",";$npf","_c_Mean;","$npf","_c_STD";
        }else{  
                print OF ";$npf",";$npf","_c";
        }
}
print OF "\n";
foreach my $subject (sort keys %subjects){
	print OF "$subject";
	foreach my  $npf (sort keys %ROI_Comps){
		my $mean = $subjects{$subject}{$npf.'mean'};
		my $std = $subjects{$subject}{$npf.'std'};
		if($withstd){
			print OF ";$mean",";$std";
			print OF ";", centiloid_fbb($mean), ";", centiloid_fbb($std); 
		}else{
			print OF ";$mean";
			print OF ";", centiloid_fbb($mean);
		}
	}
	print OF "\n";
}
close OF;
 
my $zfile = $ofile.'.gz';
system("gzip -c $ofile > $zfile");
 
if ($attach){
	shit_done basename($ENV{_}), $study, $zfile;
}else{
	achtung basename($ENV{_}), $ofile, $study;
}

Problems: the registration procedure takes quite a while, so it would be good to parallelize it on the cluster and compute the masks afterwards. To do that, the processing has to be split into separate scripts, but it shouldn't be difficult.

Using the cluster

fbb_cl_metrics.pl
#!/usr/bin/perl
# Copyright 2019 O. Sotolongo <asqwerty@gmail.com>
use strict; use warnings;
 
use File::Find::Rule;
use NEURO qw(print_help get_pair load_study achtung shit_done get_lut check_or_make centiloid_fbb);
use Data::Dump qw(dump);
use File::Remove 'remove';
use File::Basename qw(basename);
 
my $withstd = 0;
my $cfile;
 
@ARGV = ("-h") unless @ARGV;
 
while (@ARGV and $ARGV[0] =~ /^-/) {
    $_ = shift;
    last if /^--$/;
    if (/^-std$/) {$withstd = 1;}
    if (/^-cut/) { $cfile = shift; chomp($cfile);}
    if (/^-h$/) { print_help $ENV{'PIPEDIR'}.'/doc/pet_metrics.hlp'; exit;}
}
 
my $study = shift;
unless ($study) { print_help $ENV{'PIPEDIR'}.'/doc/pet_metrics.hlp'; exit;}
my %std = load_study($study);
my $w_dir=$std{'WORKING'};
my $data_dir=$std{'DATA'};
my $outdir = "$std{'DATA'}/slurm";
check_or_make($outdir);
our %subjects;
 
my @plist = find(file => 'name' => "*_fbb.nii.gz", '!name' => "*tmp*", in => $w_dir);
my $ofile = $data_dir."/".$study."_fbb_cl.csv";
my $patt = '([A-Za-z]{1,4})(\d{1,6})_fbb';
 
my @pets;
 
if ($cfile){
	my %cuts = get_pair($data_dir."/".$cfile);
	foreach my $cut (keys %cuts){
		# keep only the PET images whose name matches a subject in the cut file
		push @pets, grep {/$cut/} @plist;
	}
}else{
	@pets = @plist;
}
 
 
foreach my $pet (sort @pets){   
    (my $dg,my $subject) = $pet =~ /$patt/;
    if($subject){
		$subjects{$subject}{'dg'} = $dg;
		$subjects{$subject}{'pet'} = $pet;
	}
}
 
foreach my $subject (sort keys %subjects){
	my $norm;
	my $dg = $subjects{$subject}{'dg'};
	my $subj = $dg.$subject;
	#Making sbatch scripts
	# Get FBB image into MNI space
	##### I stopped here ---> remove this and do it like the FS processing
	my $order = $ENV{'PIPEDIR'}."/bin/fbb2std.sh ".$subj." ".$w_dir;
	my $orderfile = $outdir.'/'.$subj.'_fbb_reg.sh';
	open ORD, ">$orderfile";
	print ORD '#!/bin/bash'."\n";
	print ORD '#SBATCH -J fbb2std_'.$study."\n";
	print ORD '#SBATCH --time=4:0:0'."\n"; # kill the job if it has not finished after X hours
	print ORD '#SBATCH --mail-type=FAIL,TIME_LIMIT,STAGE_OUT'."\n"; # you don't want email about everything
	print ORD '#SBATCH --mail-user='."$ENV{'USER'}\n";
	print ORD '#SBATCH -o '.$outdir.'/fbb2std-'.$subj.'-%j'."\n";
	print ORD "srun $order\n";
	close ORD;
	system("sbatch $orderfile");
}
my $order = $ENV{'PIPEDIR'}."/bin/fbb_cl_masks.pl ".$study." ".($withstd?"-std":"");
my $orderfile = $outdir.'/fbb_masks.sh';
open ORD, ">$orderfile";
print ORD '#!/bin/bash'."\n";
print ORD '#SBATCH -J fbb2std_'.$study."\n";
print ORD '#SBATCH --time=24:0:0'."\n"; # kill the job if it has not finished after X hours
print ORD '#SBATCH --mail-type=FAIL,END'."\n"; # email when it finishes or fails
print ORD '#SBATCH --mail-user='."$ENV{'USER'}\n";
print ORD '#SBATCH -o '.$outdir.'/fbbmasks-%j'."\n";
print ORD "srun $order\n";
close ORD;
my $xorder = 'sbatch --dependency=singleton'.' '.$orderfile;
exec($xorder);
fbb_cl_masks.pl
#!/usr/bin/perl
# Copyright 2019 O. Sotolongo <asqwerty@gmail.com>
use strict; use warnings;
 
use File::Find::Rule;
use NEURO qw(print_help get_pair load_study achtung shit_done get_lut check_or_make centiloid_fbb);
use Data::Dump qw(dump);
use File::Remove 'remove';
use File::Basename qw(basename);
 
my %ROI_Comps = (
	"Cgray" => "voi_CerebGry_2mm.nii",
	"WCbl" => "voi_WhlCbl_2mm.nii",
	"WcblBS" => "voi_WhlCblBrnStm_2mm.nii",
	"Pons" => "voi_Pons_2mm.nii",
	"ctx" => "voi_ctx_2mm.nii",
 );
 
my $roi_paths = $ENV{'PIPEDIR'}.'/lib/Centiloid_Std_VOI/nifti/2mm/';
my $withstd = 0;
my $cfile;
 
@ARGV = ("-h") unless @ARGV;
 
while (@ARGV and $ARGV[0] =~ /^-/) {
    $_ = shift;
    last if /^--$/;
    if (/^-std$/) {$withstd = 1;}
    if (/^-cut/) { $cfile = shift; chomp($cfile);}
}
 
my $study = shift;
unless ($study) { print_help $ENV{'PIPEDIR'}.'/doc/pet_metrics.hlp'; exit;}
my %std = load_study($study);
my $w_dir=$std{'WORKING'};
my $data_dir=$std{'DATA'};
 
our %subjects;
 
my @plist = find(file => 'name' => "*_fbb.nii.gz", '!name' => "*tmp*", in => $w_dir);
my $ofile = $data_dir."/".$study."_fbb_cl.csv";
my $patt = '([A-Za-z]{1,4})(\d{1,6})_fbb';
 
my @pets;
 
if ($cfile){
	my %cuts = get_pair($data_dir."/".$cfile);
	foreach my $cut (keys %cuts){
		# keep only the PET images whose name matches a subject in the cut file
		push @pets, grep {/$cut/} @plist;
	}
}else{
	@pets = @plist;
}
 
foreach my $pet (sort @pets){   
    (my $dg,my $subject) = $pet =~ /$patt/;
    if($subject){
		$subjects{$subject}{'dg'} = $dg;
		$subjects{$subject}{'pet'} = $pet;
	}
}
my %sdata;
foreach my $subject (sort keys %subjects){
	my $care;
	my $norm;
	my $dg = $subjects{$subject}{'dg'};
		# Apply masks to FBB
		foreach my $npf (sort keys %ROI_Comps){
			my $roi_mask = $roi_paths.$ROI_Comps{$npf};
			# get mean and std for mask
			my $order = "fslstats ".$w_dir."/".$dg.$subject."_fbb_mni -k ".$roi_mask." -M -S";
			print "$order\n";
			(my $mean, my $std) = map{/(\d+\.\d*)\s*(\d+\.\d*)/} qx/$order/;
            $subjects{$subject}{$npf.'mean'} = $mean;
            $subjects{$subject}{$npf.'std'} = $std;
		}			
}
 
open OF, ">$ofile";
print OF "Subject";
foreach my  $npf (sort keys %ROI_Comps){
        if($withstd){
                print OF ";$npf","_Mean;","$npf","_STD",";$npf","_c_Mean;","$npf","_c_STD";
        }else{  
                print OF ";$npf",";$npf","_c";
        }
}
print OF "\n";
foreach my $subject (sort keys %subjects){
	print OF "$subject";
	foreach my  $npf (sort keys %ROI_Comps){
		my $mean = $subjects{$subject}{$npf.'mean'};
		my $std = $subjects{$subject}{$npf.'std'};
		if($withstd){
			print OF ";$mean",";$std";
			print OF ";", centiloid_fbb($mean), ";", centiloid_fbb($std);
		}else{
			print OF ";$mean";
			print OF ";", centiloid_fbb($mean);
		}
	}
	print OF "\n";
}
close OF;
[osotolongo@detritus centiloid]$ fbb_cl_metrics.pl centiloid
.....
[osotolongo@detritus centiloid]$ squeue
             JOBID PARTITION     NAME     USER ST       TIME  NODES NODELIST(REASON)
             15944     devel fbb2std_ osotolon PD       0:00      1 (Dependency)
             15920     devel fbb2std_ osotolon  R       0:04      1 brick01
             15921     devel fbb2std_ osotolon  R       0:04      1 brick01
             15922     devel fbb2std_ osotolon  R       0:04      1 brick01
             15923     devel fbb2std_ osotolon  R       0:04      1 brick01
             15924     devel fbb2std_ osotolon  R       0:04      1 brick01
             15925     devel fbb2std_ osotolon  R       0:04      1 brick01
             15926     devel fbb2std_ osotolon  R       0:04      1 brick01
             15927     devel fbb2std_ osotolon  R       0:04      1 brick01
             15928     devel fbb2std_ osotolon  R       0:04      1 brick01
             15929     devel fbb2std_ osotolon  R       0:04      1 brick01
             15930     devel fbb2std_ osotolon  R       0:01      1 brick01
             15931     devel fbb2std_ osotolon  R       0:01      1 brick01
             15932     devel fbb2std_ osotolon  R       0:01      1 brick01
             15933     devel fbb2std_ osotolon  R       0:01      1 brick01
             15934     devel fbb2std_ osotolon  R       0:01      1 brick01
             15935     devel fbb2std_ osotolon  R       0:01      1 brick01
             15936     devel fbb2std_ osotolon  R       0:01      1 brick01
             15937     devel fbb2std_ osotolon  R       0:01      1 brick01
             15938     devel fbb2std_ osotolon  R       0:01      1 brick01
             15939     devel fbb2std_ osotolon  R       0:01      1 brick01
             15940     devel fbb2std_ osotolon  R       0:01      1 brick01
             15941     devel fbb2std_ osotolon  R       0:01      1 brick01
             15942     devel fbb2std_ osotolon  R       0:01      1 brick01
             15943     devel fbb2std_ osotolon  R       0:01      1 brick01
neuroimagen/centiloid.1555101307.txt.gz · Last modified: 2020/08/04 10:46 (external edit)