SUVR to Centiloid
Linear model
According to Rowe et al., the transformation from SUVR_FBB to Centiloid follows the relation,
$ CL = 153.4 \times SUVR_{FBB} - 154.9 $
This is simple to implement, but first the pipeline's SUVR estimation has to be calibrated against the images available from GAAIN.
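Just to have the relation at hand, it can be applied directly from the shell; a minimal sketch (the SUVR value here is made up):

# quick sanity check of the linear relation; the SUVR value is invented
suvr=1.42
cl=$(echo ${suvr} | awk '{printf "%.1f", 153.4*$1 - 154.9}')
echo "SUVR=${suvr} -> CL=${cl}"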
Processing GAAIN
Basically, we download the images and the Centiloid values computed at GAAIN,
https://www.gaaindata.org/data/centiloid/FBBproject_E-25_MR.zip
https://www.gaaindata.org/data/centiloid/FBBproject_E-25_FBB_90110.zip
https://www.gaaindata.org/data/centiloid/FBBproject_SupplementaryTable.xlsx
And we have to compare the Centiloid values obtained by our pipeline with the values obtained by GAAIN.
I am going to create a new project for this and copy all the files there. The images come as DICOM, so they have to be converted,
[osotolongo@detritus centiloid]$ tree MRDCM/
MRDCM/
├── 1008_MR
│   ├── 100.dcm
│   ├── 101.dcm
│   ├── 102.dcm
│   ├── 103.dcm
│   ├── 104.dcm
│   ├── 105.dcm
│   ├── 106.dcm
│   ├── 107.dcm
│   ├── 108.dcm
│   ├── 109.dcm
│   ├── 10.dcm
........
[osotolongo@detritus centiloid]$ tree FBBDCM/
FBBDCM/
├── 1008_PET_FBB
│   ├── 10.dcm
│   ├── 11.dcm
│   ├── 12.dcm
│   ├── 13.dcm
│   ├── 14.dcm
│   ├── 15.dcm
│   ├── 16.dcm
│   ├── 17.dcm
│   ├── 18.dcm
│   ├── 19.dcm
│   ├── 1.dcm
..............
Here we go. I create the project CSV,
[osotolongo@detritus centiloid]$ ls MRDCM/ | sed 's/\(.*\)_MR/\1;sub/' > centiloid.csv
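Each directory name of the form 1008_MR should therefore end up as a line like 1008;sub in centiloid.csv, that is, the subject number plus the prefix the pipeline presumably uses to name its files.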
Let's see how we convert them,
[osotolongo@detritus centiloid]$ dcm2niix -z y -o tmp/ MRDCM/1008_MR/
Chris Rorden's dcm2niiX version v1.0.20180622 (JP2:OpenJPEG) (JP-LS:CharLS) GCC5.5.0 (64-bit Linux)
Found 176 DICOM file(s)
Convert 176 DICOM as tmp/1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2 (256x256x176x1)
compress: "/usr/local/mricron/pigz_mricron" -n -f -6 "tmp/1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2.nii"
Conversion required 1.217848 seconds (0.450000 for core code).
[osotolongo@detritus centiloid]$ ls tmp/
1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2.json  1008_MR_t1_mprage_sag_p2_iso_1.0_20161003101650_2.nii.gz
[osotolongo@detritus centiloid]$ for x in MRDCM/*; do y=$(echo ${x} | sed 's/.*\/\(.*\)_.*/sub\1s0001/'); dcm2niix -z y -o tmp/ ${x}; t=$(ls tmp/*.nii.gz); mv ${t} mri/${y}.nii.gz; mv ${t%.nii.gz}.json mri/${y}.json; done
[osotolongo@detritus centiloid]$ ls mri
sub1008s0001.json    sub1015s0001.json    sub1022s0001.json    sub1026s0001.json    sub1030s0001.json    sub1034s0001.json    sub1038s0001.json    sub2017s0001.json    sub2032s0001.json
sub1008s0001.nii.gz  sub1015s0001.nii.gz  sub1022s0001.nii.gz  sub1026s0001.nii.gz  sub1030s0001.nii.gz  sub1034s0001.nii.gz  sub1038s0001.nii.gz  sub2017s0001.nii.gz  sub2032s0001.nii.gz
sub1009s0001.json    sub1018s0001.json    sub1023s0001.json    sub1028s0001.json    sub1031s0001.json    sub1036s0001.json    sub2002s0001.json    sub2029s0001.json
sub1009s0001.nii.gz  sub1018s0001.nii.gz  sub1023s0001.nii.gz  sub1028s0001.nii.gz  sub1031s0001.nii.gz  sub1036s0001.nii.gz  sub2002s0001.nii.gz  sub2029s0001.nii.gz
sub1010s0001.json    sub1019s0001.json    sub1024s0001.json    sub1029s0001.json    sub1032s0001.json    sub1037s0001.json    sub2005s0001.json    sub2030s0001.json
sub1010s0001.nii.gz  sub1019s0001.nii.gz  sub1024s0001.nii.gz  sub1029s0001.nii.gz  sub1032s0001.nii.gz  sub1037s0001.nii.gz  sub2005s0001.nii.gz  sub2030s0001.nii.gz
[osotolongo@detritus centiloid]$ dcm2niix -z y -o tmp/ FBBDCM/1008_PET_FBB/
Chris Rorden's dcm2niiX version v1.0.20180622 (JP2:OpenJPEG) (JP-LS:CharLS) GCC5.5.0 (64-bit Linux)
Found 90 DICOM file(s)
Convert 90 DICOM as tmp/1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180 (128x128x90x1)
compress: "/usr/local/mricron/pigz_mricron" -n -f -6 "tmp/1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180.nii"
Conversion required 1.233496 seconds (0.170000 for core code).
[osotolongo@detritus centiloid]$ ls tmp
1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180.json  1008_PET_FBB_Austin_18F_Neuro_Res_20160627143414_43180.nii.gz
[osotolongo@detritus centiloid]$ for x in FBBDCM/*; do y=$(echo ${x} | sed 's/.*\/\(.*\)_PET_FBB/sub\1s0001/'); dcm2niix -z y -o tmp/ ${x}; t=$(ls tmp/*.nii.gz); mv ${t} fbb/${y}.nii.gz; mv ${t%.nii.gz}.json fbb/${y}.json; done
[osotolongo@detritus centiloid]$ ls fbb
sub1008s0001.json    sub1015s0001.json    sub1022s0001.json    sub1026s0001.json    sub1030s0001.json    sub1034s0001.json    sub1038s0001.json    sub2017s0001.json    sub2032s0001.json
sub1008s0001.nii.gz  sub1015s0001.nii.gz  sub1022s0001.nii.gz  sub1026s0001.nii.gz  sub1030s0001.nii.gz  sub1034s0001.nii.gz  sub1038s0001.nii.gz  sub2017s0001.nii.gz  sub2032s0001.nii.gz
sub1009s0001.json    sub1018s0001.json    sub1023s0001.json    sub1028s0001.json    sub1031s0001.json    sub1036s0001.json    sub2002s0001.json    sub2029s0001.json
sub1009s0001.nii.gz  sub1018s0001.nii.gz  sub1023s0001.nii.gz  sub1028s0001.nii.gz  sub1031s0001.nii.gz  sub1036s0001.nii.gz  sub2002s0001.nii.gz  sub2029s0001.nii.gz
sub1010s0001.json    sub1019s0001.json    sub1024s0001.json    sub1029s0001.json    sub1032s0001.json    sub1037s0001.json    sub2005s0001.json    sub2030s0001.json
sub1010s0001.nii.gz  sub1019s0001.nii.gz  sub1024s0001.nii.gz  sub1029s0001.nii.gz  sub1032s0001.nii.gz  sub1037s0001.nii.gz  sub2005s0001.nii.gz  sub2030s0001.nii.gz
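For reference, this is a commented restatement of the renaming loop above, nothing new in it; it assumes dcm2niix is on the PATH and that tmp/ and mri/ already exist:

#!/bin/bash
# Sketch of the DICOM -> NIfTI conversion loop used above (MR case)
for x in MRDCM/*; do
  # 1008_MR -> sub1008s0001 (project naming convention)
  y=$(echo ${x} | sed 's/.*\/\(.*\)_.*/sub\1s0001/')
  dcm2niix -z y -o tmp/ ${x}
  # dcm2niix chooses its own file name, so take whatever landed in tmp/
  t=$(ls tmp/*.nii.gz)
  mv ${t} mri/${y}.nii.gz
  mv ${t%.nii.gz}.json mri/${y}.json
done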
We prepare and launch FreeSurfer,
[osotolongo@detritus centiloid]$ fsl2fs.pl centiloid
[osotolongo@detritus centiloid]$ precon.pl centiloid
Submitted batch job 15673
[osotolongo@detritus centiloid]$ squeue
             JOBID PARTITION     NAME     USER ST       TIME  NODES NODELIST(REASON)
             15673     devel fs_recon osotolon PD       0:00      1 (Dependency)
             15648     devel fs_recon osotolon  R       0:05      1 brick01
             15649     devel fs_recon osotolon  R       0:05      1 brick01
             15650     devel fs_recon osotolon  R       0:05      1 brick01
             15651     devel fs_recon osotolon  R       0:05      1 brick01
             15652     devel fs_recon osotolon  R       0:05      1 brick01
             15653     devel fs_recon osotolon  R       0:05      1 brick01
             15654     devel fs_recon osotolon  R       0:05      1 brick01
             15655     devel fs_recon osotolon  R       0:05      1 brick01
             15656     devel fs_recon osotolon  R       0:05      1 brick01
             15657     devel fs_recon osotolon  R       0:05      1 brick01
             15658     devel fs_recon osotolon  R       0:05      1 brick01
             15659     devel fs_recon osotolon  R       0:05      1 brick01
             15660     devel fs_recon osotolon  R       0:05      1 brick01
             15661     devel fs_recon osotolon  R       0:05      1 brick01
             15662     devel fs_recon osotolon  R       0:05      1 brick01
             15663     devel fs_recon osotolon  R       0:05      1 brick01
             15664     devel fs_recon osotolon  R       0:02      1 brick01
             15665     devel fs_recon osotolon  R       0:02      1 brick01
             15666     devel fs_recon osotolon  R       0:02      1 brick01
             15667     devel fs_recon osotolon  R       0:02      1 brick01
             15668     devel fs_recon osotolon  R       0:02      1 brick01
             15669     devel fs_recon osotolon  R       0:02      1 brick01
             15670     devel fs_recon osotolon  R       0:02      1 brick01
             15671     devel fs_recon osotolon  R       0:02      1 brick01
             15672     devel fs_recon osotolon  R       0:02      1 brick01
Now, the PET images come in a format different from the one we usually use (a single 20-minute frame), so the fbbtemp_reg.sh script has to be tweaked a bit to handle this case correctly.
- fbbtemp_reg.sh
#!/bin/sh

study=$1
shift
id=$1
shift
tdir=$1
shift
wdir=$1
shift
items=(`ls ${tdir}/${id}* | grep -v "_" | grep -v ".json"`)
#shift
debug=0

#Now get the uncorrected PETs and register to user space MRI
for i in ${!items[*]}; do
	tf=`printf "${id}s%04d" $i`
	#${FSLDIR}/bin/fslreorient2std ${tdir}/${tf} ${tdir}/${id}_tmp
	${FSLDIR}/bin/imcp ${tdir}/${tf} ${tdir}/${id}_tmp
	${FSLDIR}/bin/flirt -ref ${wdir}/${id}_struc -in ${tdir}/${id}_tmp -omat ${tdir}/${tf}_pet2struc.mat -out ${tdir}/${tf}_reg
	#${FSLDIR}/bin/flirt -ref ${wdir}/${id}_brain -in ${tdir}/${id}_tmp -init ${tdir}/${tf}_pet2struc.mat -out ${tdir}/${tf}_reg
done

if [ ${#items[@]} -gt 1 ]; then
	echo ${#items[@]}
	a=`for i in ${!items[*]}; do printf " ${tdir}/${id}s%04d_reg " $i; done`
	${FSLDIR}/bin/fslmerge -t ${wdir}/${id}_tmp_mvc $a
	#${FSLDIR}/bin/fslmaths ${dir}/${id}_tmp_pet_in_struc -thr 0 -mas ${dir}/${id}_brain ${dir}/${id}_pet_in_struc
	#${FSLDIR}/bin/fslmaths ${dir}/${id}_tmp_pet_in_struc -mas ${dir}/${id}_brain ${dir}/${id}_pet_in_struc
	${FSLDIR}/bin/mcflirt -in ${wdir}/${id}_tmp_mvc -out ${wdir}/${id}_tmp_corr
	${PIPEDIR}/bin/4dmean.pl ${wdir}/${id}_tmp_corr
	${FSLDIR}/bin/flirt -ref ${wdir}/${id}_struc -in ${wdir}/${id}_mean -omat ${wdir}/${id}_fbb2struc.mat -out ${wdir}/${id}_fbb
else
	tf=`printf "${id}s%04d" ${item[0]}`
	${FSLDIR}/bin/mcflirt -in ${tdir}/${tf}_reg -out ${wdir}/${id}_tmp_corr
	${FSLDIR}/bin/imcp ${wdir}/${id}_tmp_corr ${wdir}/${id}_fbb
fi

if [ $debug = 0 ] ; then
	rm ${tdir}/${id}_tmp*
	rm ${wdir}/${id}_tmp*
fi
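For reference, the script takes four positional arguments (study, subject id, directory with the converted PET NIfTIs, and working directory), which is how fbb_correct.pl calls it further down. A hypothetical manual call, with placeholder paths:

# hypothetical manual call; fbb_correct.pl passes the same four arguments
${PIPEDIR}/bin/fbbtemp_reg.sh centiloid sub1008 /path/to/centiloid/fbb /path/to/centiloid/working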
From here we can do the usual,
[osotolongo@detritus centiloid]$ fbb_correct.pl centiloid
And we can review the report,
Everything seems to be going well, so,
[osotolongo@detritus ~]$ parallel_fbb_rois_metrics.pl centiloid
Now we just have to compare the global SUVR values with those from the GAAIN table.
[osotolongo@detritus centiloid]$ awk -F ";" '{print $1,$2,$4}' FBB_suvr_centiloid.csv | grep -v Y | sort -n > reference.dat
[osotolongo@detritus centiloid]$ join calcs.dat reference.dat > toreview.dat
Let's see how that model does,
> d <- read.csv("toreview.dat", sep = " ", header = TRUE)
> m <- lm(d$Global ~ d$WC_suvr)
> summary(m)

Call:
lm(formula = d$Global ~ d$WC_suvr)

Residuals:
      Min        1Q    Median        3Q       Max
-0.121024 -0.042402  0.000411  0.047290  0.126656

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)  0.01999    0.05799   0.345    0.734
d$WC_suvr    1.15167    0.03919  29.390   <2e-16 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.07003 on 22 degrees of freedom
Multiple R-squared:  0.9752,    Adjusted R-squared:  0.974
F-statistic: 863.8 on 1 and 22 DF,  p-value: < 2.2e-16
I will do the fit with gnuplot, which is easier,
gnuplot> f(x) = m*x + n
gnuplot> fit f(x) "toreview.dat" u 2:3 via m,n
iter      chisq       delta/lim  lambda   m             n
   0 3.6936711672e+01   0.00e+00  1.41e+00    1.000000e+00   1.000000e+00
   1 3.6992605848e-01  -9.88e+06  1.41e-01    6.105751e-01   4.583953e-01
   2 7.9414981509e-02  -3.66e+05  1.41e-02    8.423745e-01   2.635659e-02
   3 7.9329087727e-02  -1.08e+02  1.41e-03    8.467400e-01   1.870027e-02
   4 7.9329087724e-02  -3.50e-06  1.41e-04    8.467408e-01   1.869890e-02
iter      chisq       delta/lim  lambda   m             n

After 4 iterations the fit converged.
final sum of squares of residuals : 0.0793291
rel. change during last iteration : -3.5012e-11

degrees of freedom    (FIT_NDF)                        : 22
rms of residuals      (FIT_STDFIT) = sqrt(WSSR/ndf)    : 0.0600489
variance of residuals (reduced chisquare) = WSSR/ndf   : 0.00360587

Final set of parameters            Asymptotic Standard Error
=======================            ==========================
m               = 0.846741         +/- 0.02881      (3.403%)
n               = 0.0186989        +/- 0.0497       (265.8%)

correlation matrix of the fit parameters:
                m      n
m               1.000
n              -0.969  1.000
I get a fit with $r^2 = 0.97$ and a slope of $0.85 \pm 0.03$. The difference between the two procedures is that our pipeline, following the ADNI method, uses cerebellar grey matter as the reference region, whereas GAAIN uses the whole cerebellum. Even so, the agreement between the measurements is good.
Implementing the Centiloid pipeline
In order to compare with other studies we have to implement the method using the original templates from Klunk et al. The good part is that the preprocessing and the registration to subject space are already done. We only have to register to MNI space and extract the metrics using the templates.
In any case, I am going to rewrite the registration process to integrate it into the cluster,
- fbb_correct.pl
#!/usr/bin/perl

use strict;
use warnings;

use Data::Dump qw(dump);
use File::Find::Rule;
use File::Copy::Recursive qw(dirmove);
use NEURO qw(populate check_subj load_study print_help get_petdata get_pair achtung shit_done check_or_make);

my $cfile;
my $sok = 0;

@ARGV = ("-h") unless @ARGV;
while (@ARGV and $ARGV[0] =~ /^-/) {
	$_ = shift;
	last if /^--$/;
	if (/^-cut/) { $cfile = shift; chomp($cfile);}
	if (/^-h/) { print_help $ENV{'PIPEDIR'}.'/doc/fbb_reg.hlp'; exit;}
}

my $study = shift;
unless ($study) { print_help $ENV{'PIPEDIR'}.'/doc/fbb_reg.hlp'; exit;}

my %std = load_study($study);
my $subj_dir = $ENV{'SUBJECTS_DIR'};
#Run this script ONLY on "Detritus"
#or change the paths to the correct ones
my $w_dir = $std{'WORKING'};
my $pet_dir = $std{'PET-FBB'};
my $db = $std{'DATABASE'};
my $data_dir = $std{'DATA'};
my $outdir = "$std{'DATA'}/slurm";
check_or_make($outdir);

my %plist = populate($db);
my @ok_pets;
print "Collecting needed files\n";
my %pets;
if ($cfile){
	my %cuts = get_pair($data_dir."/".$cfile);
	foreach my $cut (keys %cuts){
		if(grep {/$cut/} %plist){
			$pets{$cut} = $plist{$cut};
		}
	}
}else{
	%pets = %plist;
}

my $err_file = $data_dir."/.fbb_errors.debug";
open ERRF, ">$err_file" or die "Could not open error file!";
# Keep only the subjects that have both a FBB image and an extracted FreeSurfer brain
foreach my $subject (sort keys %pets){
	my @names = find(file => 'name' => "$pets{$subject}$subject*", '!name' => '*_*', 'name' => '*.nii.gz', in => $pet_dir);
	dump @names;
	if(@names){
		my @mri = find(file => 'name' => "brain.mgz", in => $subj_dir."/".$study."_".$pets{$subject}.$subject."/mri");
		if(@mri){
			push @ok_pets, $subject;
		}else{
			print ERRF "Subject $subject has no MRI brain extracted!!!\n";
		}
	}
}

foreach my $subject (@ok_pets){
	# First job: pull the structural data for this subject out of FreeSurfer
	my $orderfile = $outdir.'/'.$subject.'_get_subj.sh';
	my $order = $ENV{'PIPEDIR'}."/bin/get_fs_subj.sh ".$study." ".$pets{$subject}.$subject." ".$w_dir;
	open ORD, ">$orderfile";
	print ORD '#!/bin/bash'."\n";
	print ORD '#SBATCH -J fbb_correct_'.$study."\n";
	print ORD '#SBATCH --time=1:0:0'."\n"; # kill it if it has not finished after X hours
	print ORD '#SBATCH --mail-type=FAIL,STAGE_OUT'."\n"; # you do not want email about everything
	print ORD '#SBATCH -o '.$outdir.'/fbb_get_subj-%j'."\n";
	print ORD '#SBATCH --mail-user='."$ENV{'USER'}\n";
	print ORD "srun $order\n";
	close ORD;
	my $jobid = `sbatch $orderfile`;
	$jobid = ( split ' ', $jobid )[ -1 ];
	# Second job: register the FBB image, chained to the first one with afterok
	$order = $ENV{'PIPEDIR'}."/bin/fbbtemp_reg.sh ".$study." ".$pets{$subject}.$subject." ".$pet_dir." ".$w_dir;
	$orderfile = $outdir.'/'.$subject.'_fbb_reg.sh';
	open ORD, ">$orderfile";
	print ORD '#!/bin/bash'."\n";
	print ORD '#SBATCH -J fbb_correct_'.$study."\n";
	print ORD '#SBATCH --time=1:0:0'."\n"; # kill it if it has not finished after X hours
	print ORD '#SBATCH --mail-type=FAIL,STAGE_OUT'."\n"; # you do not want email about everything
	print ORD '#SBATCH -o '.$outdir.'/fbb_reg-%j'."\n";
	print ORD '#SBATCH --mail-user='."$ENV{'USER'}\n";
	print ORD "srun $order\n";
	close ORD;
	my $xorder = 'sbatch --depend=afterok:'.$jobid.' '.$orderfile;
	system("$xorder");
}

# Last job: build the report once every job with this name has finished
my $order = $ENV{'PIPEDIR'}."/bin/make_fbb_report.pl ".$study;
my $orderfile = $outdir.'/fbb_report.sh';
open ORD, ">$orderfile";
print ORD '#!/bin/bash'."\n";
print ORD '#SBATCH -J fbb_correct_'.$study."\n";
print ORD '#SBATCH --time=2:0:0'."\n"; # kill it if it has not finished after X hours
print ORD '#SBATCH --mail-type=FAIL,END'."\n"; # email when it ends or fails
print ORD '#SBATCH --mail-user='."$ENV{'USER'}\n";
print ORD '#SBATCH -o '.$outdir.'/fbb_rep-%j'."\n";
print ORD "srun $order\n";
close ORD;
my $xorder = 'sbatch --dependency=singleton'.' '.$orderfile;
exec($xorder);
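The cluster logic boils down to two sbatch dependency patterns: each registration job is chained to its get_fs_subj.sh job with afterok, and the final report job waits for everything sharing the same job name via singleton. A stripped-down sketch of that pattern (script names here are placeholders):

# per-subject chain: registration only runs if the FreeSurfer extraction job succeeds
jid=$(sbatch --parsable subj_get_subj.sh)
sbatch --depend=afterok:${jid} subj_fbb_reg.sh

# final step: starts once all jobs with the same job name have finished
sbatch --dependency=singleton --job-name=fbb_correct_centiloid fbb_report.sh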
And now for the new part
First I write a simple script that registers to MNI space with ANTS,
- fbb2std.sh
#!/bin/sh

id=$1
shift
wdir=$1
shift

echo "I need the FBB image at MNI space"
ANTS 3 -m CC[${FSLDIR}/data/standard/MNI152_T1_2mm.nii.gz, ${wdir}/${id}_struc.nii.gz, 1, 4] -r Gauss[0,3] -t Elast[1.5] -i 30x20x10 -o ${wdir}/${id}_fbb_t1_mni.nii.gz
WarpImageMultiTransform 3 ${wdir}/${id}_fbb.nii.gz ${wdir}/${id}_fbb_mni.nii.gz -R ${FSLDIR}/data/standard/MNI152_T1_2mm.nii.gz ${wdir}/${id}_fbb_t1_mniWarp.nii.gz ${wdir}/${id}_fbb_t1_mniAffine.txt
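The script only needs the subject prefix and the working directory; it expects ${id}_struc.nii.gz and ${id}_fbb.nii.gz there and leaves ${id}_fbb_mni.nii.gz next to them. A hypothetical manual call, with a placeholder path:

# hypothetical manual call; requires <wdir>/sub1008_struc.nii.gz and <wdir>/sub1008_fbb.nii.gz,
# writes <wdir>/sub1008_fbb_mni.nii.gz
${PIPEDIR}/bin/fbb2std.sh sub1008 /path/to/centiloid/working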
The relation reported in the reference paper takes the whole cerebellum as the reference ROI, so that is the only one worth computing.
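In shell terms, the whole-cerebellum SUVR and its Centiloid value reduce to two fslstats calls plus the linear relation above; a minimal per-subject sketch (the Perl script below does the same for every subject), where ${id} and ${wdir} stand for the subject prefix and the working directory:

# Minimal sketch, assuming the standard Centiloid VOIs shipped with the pipeline
vois=${PIPEDIR}/lib/Centiloid_Std_VOI/nifti/2mm
ctx=$(fslstats ${wdir}/${id}_fbb_mni -k ${vois}/voi_ctx_2mm.nii -M)
ref=$(fslstats ${wdir}/${id}_fbb_mni -k ${vois}/voi_WhlCbl_2mm.nii -M)
suvr=$(echo "${ctx} ${ref}" | awk '{print $1/$2}')
cl=$(echo "${suvr}" | awk '{printf "%.1f", 153.4*$1 - 154.9}')
echo "${id} SUVR=${suvr} CL=${cl}"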
- fbb_cl_masks.pl
#!/usr/bin/perl

# Copyright 2019 O. Sotolongo <asqwerty@gmail.com>

use strict;
use warnings;

use File::Find::Rule;
use NEURO qw(print_help get_pair load_study achtung shit_done get_lut check_or_make centiloid_fbb);
use Data::Dump qw(dump);
use File::Remove 'remove';
use File::Basename qw(basename);

# Standard Centiloid VOIs: cortical target and whole cerebellum reference
my $ref_roi = "voi_WhlCbl_2mm.nii";
my $ctx_roi = "voi_ctx_2mm.nii";
my $roi_paths = $ENV{'PIPEDIR'}.'/lib/Centiloid_Std_VOI/nifti/2mm/';
my $withstd = 0;
my $cfile;

@ARGV = ("-h") unless @ARGV;
while (@ARGV and $ARGV[0] =~ /^-/) {
	$_ = shift;
	last if /^--$/;
	if (/^-cut/) { $cfile = shift; chomp($cfile);}
}

my $study = shift;
unless ($study) { print_help $ENV{'PIPEDIR'}.'/doc/pet_metrics.hlp'; exit;}

my %std = load_study($study);
my $w_dir = $std{'WORKING'};
my $data_dir = $std{'DATA'};

our %subjects;
my @plist = find(file => 'name' => "*_fbb.nii.gz", '!name' => "*tmp*", in => $w_dir);
my $ofile = $data_dir."/".$study."_fbb_cl.csv";
my $patt = '([A-Z,a-z]{1,4})(\d{1,6})_fbb';
my @pets;
if ($cfile){
	my %cuts = get_pair($data_dir."/".$cfile);
	foreach my $cut (keys %cuts){
		# keep only the PET files that match the subjects listed in the cut file
		push @pets, grep {/$cut/} @plist;
	}
}else{
	@pets = @plist;
}

foreach my $pet (sort @pets){
	(my $dg, my $subject) = $pet =~ /$patt/;
	if($subject){
		$subjects{$subject}{'dg'} = $dg;
		$subjects{$subject}{'pet'} = $pet;
	}
}

foreach my $subject (sort keys %subjects){
	my $dg = $subjects{$subject}{'dg'};
	# Mean uptake inside the cortical target VOI
	my $roi_mask = $roi_paths.$ctx_roi;
	my $order = "fslstats ".$w_dir."/".$dg.$subject."_fbb_mni -k ".$roi_mask." -M";
	my $ctx = qx/$order/;
	# Mean uptake inside the whole cerebellum reference VOI
	$roi_mask = $roi_paths.$ref_roi;
	$order = "fslstats ".$w_dir."/".$dg.$subject."_fbb_mni -k ".$roi_mask." -M";
	print "$order\n";
	my $norm = qx/$order/;
	# SUVR = target / reference
	$subjects{$subject}{'mean'} = $ctx/$norm;
}

open OF, ">$ofile";
print OF "Subject; SUVR; Centiloid";
print OF "\n";
foreach my $subject (sort keys %subjects){
	print OF "$subject";
	my $mean = $subjects{$subject}{'mean'};
	print OF ";$mean";
	print OF ";", centiloid_fbb($mean);
	print OF "\n";
}
close OF;
Now I write a wrapper that launches everything,
- fbb_cl_metrics.pl
#!/usr/bin/perl

# Copyright 2019 O. Sotolongo <asqwerty@gmail.com>

use strict;
use warnings;

use File::Find::Rule;
use NEURO qw(print_help get_pair load_study achtung shit_done get_lut check_or_make centiloid_fbb);
use Data::Dump qw(dump);
use File::Remove 'remove';
use File::Basename qw(basename);

my $wcut = 0;
my $cfile;

@ARGV = ("-h") unless @ARGV;
while (@ARGV and $ARGV[0] =~ /^-/) {
	$_ = shift;
	last if /^--$/;
	if (/^-cut/) { $cfile = shift; chomp($cfile); $wcut = 1;}
	if (/^-h$/) { print_help $ENV{'PIPEDIR'}.'/doc/pet_metrics.hlp'; exit;}
}

my $study = shift;
unless ($study) { print_help $ENV{'PIPEDIR'}.'/doc/pet_metrics.hlp'; exit;}

my %std = load_study($study);
my $w_dir = $std{'WORKING'};
my $data_dir = $std{'DATA'};
my $outdir = "$std{'DATA'}/slurm";
check_or_make($outdir);

our %subjects;
my @plist = find(file => 'name' => "*_fbb.nii.gz", '!name' => "*tmp*", in => $w_dir);
my $ofile = $data_dir."/".$study."_fbb_cl.csv";
my $patt = '([A-Z,a-z]{1,4})(\d{1,6})_fbb';
my @pets;
if ($cfile){
	my %cuts = get_pair($data_dir."/".$cfile);
	foreach my $cut (keys %cuts){
		# keep only the PET files that match the subjects listed in the cut file
		push @pets, grep {/$cut/} @plist;
	}
}else{
	@pets = @plist;
}

foreach my $pet (sort @pets){
	(my $dg, my $subject) = $pet =~ /$patt/;
	if($subject){
		$subjects{$subject}{'dg'} = $dg;
		$subjects{$subject}{'pet'} = $pet;
	}
}

foreach my $subject (sort keys %subjects){
	my $dg = $subjects{$subject}{'dg'};
	my $subj = $dg.$subject;
	#Making sbatch scripts
	# Get FBB image into MNI space
	my $order = $ENV{'PIPEDIR'}."/bin/fbb2std.sh ".$subj." ".$w_dir;
	my $orderfile = $outdir.'/'.$subj.'_fbb_reg.sh';
	open ORD, ">$orderfile";
	print ORD '#!/bin/bash'."\n";
	print ORD '#SBATCH -J fbb2std_'.$study."\n";
	print ORD '#SBATCH --time=4:0:0'."\n"; # kill it if it has not finished after X hours
	print ORD '#SBATCH --mail-type=FAIL,TIME_LIMIT,STAGE_OUT'."\n"; # you do not want email about everything
	print ORD '#SBATCH --mail-user='."$ENV{'USER'}\n";
	print ORD '#SBATCH -o '.$outdir.'/fbb2std-'.$subj.'-%j'."\n";
	print ORD "srun $order\n";
	close ORD;
	system("sbatch $orderfile");
}

# Final job: compute SUVR and Centiloid values once every registration has finished
my $order = $ENV{'PIPEDIR'}."/bin/fbb_cl_masks.pl ".$study." ".($wcut?"-cut $cfile":"");
my $orderfile = $outdir.'/fbb_masks.sh';
open ORD, ">$orderfile";
print ORD '#!/bin/bash'."\n";
print ORD '#SBATCH -J fbb2std_'.$study."\n";
print ORD '#SBATCH --time=24:0:0'."\n"; # kill it if it has not finished after X hours
print ORD '#SBATCH --mail-type=FAIL,END'."\n"; # email when it ends or fails
print ORD '#SBATCH --mail-user='."$ENV{'USER'}\n";
print ORD '#SBATCH -o '.$outdir.'/fbbmasks-%j'."\n";
print ORD "srun $order\n";
close ORD;
my $xorder = 'sbatch --dependency=singleton'.' '.$orderfile;
exec($xorder);
[osotolongo@detritus centiloid]$ fbb_cl_metrics.pl centiloid
.....
[osotolongo@detritus centiloid]$ squeue
             JOBID PARTITION     NAME     USER ST       TIME  NODES NODELIST(REASON)
             15944     devel fbb2std_ osotolon PD       0:00      1 (Dependency)
             15920     devel fbb2std_ osotolon  R       0:04      1 brick01
             15921     devel fbb2std_ osotolon  R       0:04      1 brick01
             15922     devel fbb2std_ osotolon  R       0:04      1 brick01
             15923     devel fbb2std_ osotolon  R       0:04      1 brick01
             15924     devel fbb2std_ osotolon  R       0:04      1 brick01
             15925     devel fbb2std_ osotolon  R       0:04      1 brick01
             15926     devel fbb2std_ osotolon  R       0:04      1 brick01
             15927     devel fbb2std_ osotolon  R       0:04      1 brick01
             15928     devel fbb2std_ osotolon  R       0:04      1 brick01
             15929     devel fbb2std_ osotolon  R       0:04      1 brick01
             15930     devel fbb2std_ osotolon  R       0:01      1 brick01
             15931     devel fbb2std_ osotolon  R       0:01      1 brick01
             15932     devel fbb2std_ osotolon  R       0:01      1 brick01
             15933     devel fbb2std_ osotolon  R       0:01      1 brick01
             15934     devel fbb2std_ osotolon  R       0:01      1 brick01
             15935     devel fbb2std_ osotolon  R       0:01      1 brick01
             15936     devel fbb2std_ osotolon  R       0:01      1 brick01
             15937     devel fbb2std_ osotolon  R       0:01      1 brick01
             15938     devel fbb2std_ osotolon  R       0:01      1 brick01
             15939     devel fbb2std_ osotolon  R       0:01      1 brick01
             15940     devel fbb2std_ osotolon  R       0:01      1 brick01
             15941     devel fbb2std_ osotolon  R       0:01      1 brick01
             15942     devel fbb2std_ osotolon  R       0:01      1 brick01
             15943     devel fbb2std_ osotolon  R       0:01      1 brick01