FreeSurfer Longitudinal Analysis (implementation on detritus)
https://surfer.nmr.mgh.harvard.edu/fswiki/LongitudinalProcessing
https://surfer.nmr.mgh.harvard.edu/fswiki/LongitudinalStatistics
I have two time points of the FACEHBI project, so I can (just barely) run a longitudinal project. Things to keep in mind:
- At each time point of the project a subject receives a different processing ID; this ID identifies the subject within the pool at that time point. So the processing IDs have to be cross-referenced with the project IDs, which do stay the same over time. This is fairly simple, but it has to be done.
- For the longitudinal analysis a separate project must be created, with new IDs. This project is special in that no input is needed beyond the correspondence between time points. All the images are already processed; each time point only has to be reprocessed against a per-subject template built from them.
So we have something like this:
[osotolongo@detritus facehbi_long_v02]$ head match02.csv
Psubject_v0,SubjID_v0,SubjID_v2
F001,facehbi_0001,f2cehbi_0001
F002,facehbi_0002,f2cehbi_0002
F003,facehbi_0003,f2cehbi_0003
F005,facehbi_0005,f2cehbi_0004
F006,facehbi_0006,f2cehbi_0005
F007,facehbi_0007,f2cehbi_0006
F008,facehbi_0008,f2cehbi_0007
F009,facehbi_0009,f2cehbi_0008
F010,facehbi_0010,f2cehbi_0009
I'll try with a single subject first; we'll deal with the logistics later.
Single subject example
[osotolongo@detritus facehbi_long_v02]$ recon-all -base F001 -tp facehbi_0001 -tp f2cehbi_0001 -all
[osotolongo@detritus facehbi_long_v02]$ recon-all -long facehbi_0001 F001 -all
[osotolongo@detritus facehbi_long_v02]$ recon-all -long f2cehbi_0001 F001 -all
Scripting the group
The first thing I need is a file linking the different time points:
[osotolongo@detritus facehbi_long_v02]$ head flong.csv
flong_subject;SubjID_v0;SubjID_v2
flong_0001;facehbi_0001;f2cehbi_0001
flong_0002;facehbi_0002;f2cehbi_0002
flong_0003;facehbi_0003;f2cehbi_0003
flong_0005;facehbi_0005;f2cehbi_0004
flong_0006;facehbi_0006;f2cehbi_0005
flong_0007;facehbi_0007;f2cehbi_0006
flong_0008;facehbi_0008;f2cehbi_0007
flong_0009;facehbi_0009;f2cehbi_0008
flong_0010;facehbi_0010;f2cehbi_0009
Then a script is written to launch all of this on the cluster.
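As an illustration, the per-subject command list could be generated from flong.csv with a small helper like this (hypothetical sketch; the actual cluster submission is handled by plong.pl below):

```python
import csv

def build_commands(flong_csv):
    """Generate the recon-all longitudinal commands for each row of flong.csv."""
    cmds = []
    with open(flong_csv, newline="") as f:
        for row in csv.DictReader(f, delimiter=";"):
            base, tp0, tp2 = row["flong_subject"], row["SubjID_v0"], row["SubjID_v2"]
            # 1) build the unbiased within-subject template
            cmds.append(f"recon-all -base {base} -tp {tp0} -tp {tp2} -all")
            # 2) reprocess each time point against the template
            cmds.append(f"recon-all -long {tp0} {base} -all")
            cmds.append(f"recon-all -long {tp2} {base} -all")
    return cmds
```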
I create a phantom project:
[osotolongo@detritus lfacehbi]$ make_proj lfacehbi /nas/corachan/facehbi
and launch the longitudinal pipeline:
[osotolongo@detritus lfacehbi]$ plong.pl -i flong.csv lfacehbi
Extracting the tables
Trying with one table, aseg for example:
[osotolongo@detritus lfacehbi]$ asegstats2table --subjects `ls -d /nas/data/subjects/facehbi_*.long.flong_* | awk -F"/" '{print $5}' | sed ':a;N;$!ba;s/\n/ /g'` --meas volume --skip --statsfile aseg.stats --all-segs --tablefile facehbi_aseg_stats.txt
SUBJECTS_DIR : /nas/data/subjects
Parsing the .stats files
Building the table..
Writing the table to facehbi_aseg_stats.txt
[osotolongo@detritus lfacehbi]$ asegstats2table --subjects `ls -d /nas/data/subjects/f2cehbi_*.long.flong_* | awk -F"/" '{print $5}' | sed ':a;N;$!ba;s/\n/ /g'` --meas volume --skip --statsfile aseg.stats --all-segs --tablefile f2cehbi_aseg_stats.txt
SUBJECTS_DIR : /nas/data/subjects
Parsing the .stats files
Building the table..
Writing the table to f2cehbi_aseg_stats.txt
[osotolongo@detritus lfacehbi]$ sed 's/Measure:volume/Visit, Subject/;s/facehbi_.*\.long\./0, /' facehbi_aseg_stats.csv > facehbi_aseg_stats_clear.csv
[osotolongo@detritus lfacehbi]$ sed 's/Measure:volume/Visit, Subject/;s/f2cehbi_.*\.long\./2, /' f2cehbi_aseg_stats.csv > f2cehbi_aseg_stats_clear.csv
[osotolongo@detritus lfacehbi]$ tail -n +2 f2cehbi_aseg_stats_clear.csv > f2cehbi_aseg_stats_clear_nohead.csv
[osotolongo@detritus lfacehbi]$ cat facehbi_aseg_stats_clear.csv f2cehbi_aseg_stats_clear_nohead.csv > facehbi_long_aseg_stats.csv
[osotolongo@detritus lfacehbi]$ grep flong_0001 facehbi_long_aseg_stats.csv
0, flong_0001, 11069.1, 335.9, 17456.1, 55694.4, 7314.9, 3132.9, 5000.3, 2141.7, 1048.7, 1651.0, 24749.2, 3472.6, 1279.8, 1564.1, 671.2, 3900.3, 159.1, 1318.8, 12173.8, 312.2, 16618.2, 53669.2, 7055.8, 3722.3, 5238.6, 2174.4, 3642.3, 1624.8, 657.0, 3968.7, 201.4, 1371.7, 0.0, 3905.4, 0.0, 0.0, 18.6, 0.0, 0.0, 290.4, 1108.3, 772.0, 760.4, 463.8, 1146.2, 1264641.0, 1234001.0, 1233494.22632, 239473.467727, 242179.420796, 481652.888523, 273169.386984, 275225.950812, 548395.337795, 56835.0, 651931.888523, 1118979.22632, 1092494.22632, 1087419.0, 1719786.0, 0.725663, 0.986829, 1742738.86176
2, flong_0001, 12198.3, 421.5, 17554.5, 54551.3, 6946.3, 2947.5, 4946.3, 2109.3, 1106.2, 1746.9, 24346.9, 3394.6, 1203.1, 1573.0, 653.2, 3740.2, 161.3, 1289.8, 13200.3, 365.7, 16464.4, 52906.5, 6795.9, 3625.8, 5045.0, 2093.4, 3558.4, 1529.8, 640.9, 3838.2, 230.3, 1391.3, 0.0, 4726.2, 0.0, 0.0, 34.1, 0.0, 0.0, 305.1, 1095.1, 681.4, 711.8, 459.0, 1099.7, 1225467.0, 1192355.0, 1192475.46698, 225610.010964, 226004.976163, 451614.987126, 275230.775936, 265861.703922, 541092.479858, 54917.0, 618316.987126, 1082197.46698, 1053410.46698, 1047595.0, 1718068.0, 0.703185, 0.985844, 1742738.86176
Now I move to R to see if I can work out an example (see How to correct by Intracranial Volume (ICV)).
aseg <- read.csv("facehbi_long_aseg_stats.csv", header = TRUE, sep=",")
aseg$Hippocampus = 0.5*(aseg$Left.Hippocampus + aseg$Right.Hippocampus)
a <- lm(aseg$Hippocampus ~ aseg$EstimatedTotalIntraCranialVol)
b = a$coefficients[[2]]
aseg$adj_Hippocampus = aseg$Hippocampus - b*(aseg$EstimatedTotalIntraCranialVol - mean(aseg$EstimatedTotalIntraCranialVol, na.rm=TRUE))
interaction.plot(aseg$Visit, aseg$Subject, aseg$adj_Hippocampus, legend=F, col=c(1:200), xlab="Visit", ylab="adjusted Hippocampus")
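The same adjustment can be written in plain Python; a minimal sketch of the formula above (slope from a simple least-squares regression of the measure on ICV, then subtracting slope times the demeaned ICV; assumes complete data, no NA handling):

```python
def icv_adjust(measure, icv):
    """Adjust a volume measure for ICV: measure - b*(icv - mean(icv)),
    where b is the slope of the regression of the measure on ICV."""
    n = len(measure)
    mx = sum(icv) / n
    my = sum(measure) / n
    # least-squares slope: cov(icv, measure) / var(icv)
    b = sum((x - mx) * (y - my) for x, y in zip(icv, measure)) \
        / sum((x - mx) ** 2 for x in icv)
    return [y - b * (x - mx) for x, y in zip(icv, measure)]
```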
Statistics
Qdec Table
I need a table with the years elapsed between the MRI scans. First I extract the acquisition date of each scan:
[osotolongo@detritus lfacehbi]$ for x in `ls /nas/corachan/facehbi`; do f=$(find /nas/corachan/facehbi/${x}/ -type f | head -n 1); d=$(dckey ${f} -k "AcquisitionDate" 2>&1); echo "${x}, ${d}"; done > v0_dates.csv
[osotolongo@detritus lfacehbi]$ for x in `ls /nas/corachan/facehbi_2`; do f=$(find /nas/corachan/facehbi_2/${x}/ -type f | head -n 1); d=$(dckey ${f} -k "AcquisitionDate" 2>&1); echo "${x}, ${d}"; done > v2_dates.csv
[osotolongo@detritus lfacehbi]$ join -t"," v0_dates.csv v2_dates.csv > long_dates.csv
[osotolongo@detritus lfacehbi]$ head long_dates.csv
F001, 20141205, 20170124
F002, 20141205, 20170323
F003, 20141211, 20170123
F005, 20150107, 20170123
F006, 20141223, 20170124
F007, 20141219, 20170120
F008, 20141220, 20170125
F009, 20150110, 20170207
F010, 20150109, 20170208
F011, 20150110, 20170127
and to compute the difference between the dates (which is harder than it looks), I write a Perl script:
- get_years.pl
#!/usr/bin/perl
use strict;
use warnings;
use Date::Manip;
use Math::Round;

my $ifile = shift;
open IDF, "<$ifile" or die "No such file!";
while(<IDF>){
    my ($subject, $date1, $date2) = /^(.*), (\d+), (\d+)$/;
    my $d1 = ParseDate($date1);
    my $d2 = ParseDate($date2);
    my $diff = nearest(0.01, Delta_Format(DateCalc($d1,$d2),2,"%dh")/365.2425);
    print "$subject, $diff\n";
}
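For reference, the same computation with Python's standard library (assuming YYYYMMDD date strings as in long_dates.csv, and the same mean Gregorian year length):

```python
from datetime import date

def years_between(d1, d2):
    """Difference in years between two YYYYMMDD date strings,
    using the mean Gregorian year length, rounded to 2 decimals."""
    a = date(int(d1[:4]), int(d1[4:6]), int(d1[6:]))
    b = date(int(d2[:4]), int(d2[4:6]), int(d2[6:]))
    return round((b - a).days / 365.2425, 2)
```

Checking against the first rows of the output above, F001 (20141205 to 20170124) gives 2.14 and F002 gives 2.3, matching the Perl script.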
Almost there:
[osotolongo@detritus lfacehbi]$ ./get_years.pl long_dates.csv > facehbi_years_between.csv
[osotolongo@detritus lfacehbi]$ head facehbi_years_between.csv
F001, 2.14
F002, 2.3
F003, 2.12
F005, 2.05
F006, 2.09
F007, 2.09
F008, 2.1
F009, 2.08
F010, 2.08
F011, 2.05
Now let's stitch this together:
[osotolongo@detritus lfacehbi]$ sed 's/,/;/' facehbi_years_between.csv > facehbi_years_between_sc.csv
[osotolongo@detritus lfacehbi]$ sed -i '1iPSubject; years' facehbi_years_between_sc.csv
[osotolongo@detritus lfacehbi]$ awk -F";" '{print $2";"$0}' flong.csv | sed 's/facehbi_//;s/SubjID_v0/Subject/' > flong_2concat.csv
[osotolongo@detritus lfacehbi]$ cp /nas/data/facehbi/facehbi_mri.csv ./
[osotolongo@detritus lfacehbi]$ sed -i '1iSubject;PSubject' facehbi_mri.csv
[osotolongo@detritus lfacehbi]$ join -t";" -1 2 -2 1 facehbi_mri.csv facehbi_years_between_sc.csv > fyears_2concat.csv
[osotolongo@detritus lfacehbi]$ sed 's/,/;/g;s/edat_v0/age/;s/Anyos_Escolaridad_FAC_v0/Education/;s/Sex_1H_0M_v0/Gender/' demographics.csv > dg_2concat.csv
[osotolongo@detritus lfacehbi]$ join -t";" fyears_2concat.csv dg_2concat.csv | sed 's/ //' > data_2concat.csv
[osotolongo@detritus lfacehbi]$ join -t";" -1 1 -2 2 flong_2concat.csv data_2concat.csv | tail -n +2 | awk -F";" '{print $3" "$2" 0 "$7" "$8" "$9"\n"$4" "$2" "$6" "$7+$6" "$8" "$9}' > qdec/long.qdec.table.dat
[osotolongo@detritus lfacehbi]$ sed -i '1ifsid fsid-base years age education gender' qdec/long.qdec.table.dat
[osotolongo@detritus lfacehbi]$ head qdec/long.qdec.table.dat
fsid fsid-base years age education gender
facehbi_0001 flong_0001 0 71 8 0
f2cehbi_0001 flong_0001 2.14 73.14 8 0
facehbi_0002 flong_0002 0 70 12 1
f2cehbi_0002 flong_0002 2.3 72.3 12 1
facehbi_0003 flong_0003 0 70 8 0
f2cehbi_0003 flong_0003 2.12 72.12 8 0
facehbi_0005 flong_0005 0 68 20 1
f2cehbi_0004 flong_0005 2.05 70.05 20 1
facehbi_0006 flong_0006 0 64 14 0
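The row logic hiding in the join/awk pipeline above is simple: two qdec rows per subject, with age at the follow-up visit being baseline age plus the inter-scan interval. A sketch of that logic (hypothetical helper, for illustration only):

```python
def qdec_rows(base, tp0, tp2, years, age, education, gender):
    """Build the two qdec rows (baseline and follow-up) for one subject.
    Follow-up age = baseline age + years, rounded like the interval itself."""
    return [
        f"{tp0} {base} 0 {age} {education} {gender}",
        f"{tp2} {base} {years} {round(age + years, 2)} {education} {gender}",
    ]
```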
and the qdec table is ready.
Two Stage Model
Before proceeding, we should check that every subject and time point in the table actually exists:
[osotolongo@detritus lfacehbi]$ for x in `tail -n +2 flong.csv | awk -F";" {'print $2".long."$1'}`; do ls -d /nas/data/subjects/${x}; done
...
[osotolongo@detritus lfacehbi]$ for x in `tail -n +2 flong.csv | awk -F";" {'print $2'}`; do ls -d /nas/data/subjects/${x}; done
...
[osotolongo@detritus lfacehbi]$ for x in `tail -n +2 flong.csv | awk -F";" {'print $3".long."$1'}`; do ls -d /nas/data/subjects/${x}; done
...
[osotolongo@detritus lfacehbi]$ for x in `tail -n +2 flong.csv | awk -F";" {'print $3'}`; do ls -d /nas/data/subjects/${x}; done
...
If these commands produce no error lines, we carry on. If any of these directories is missing, the corresponding analysis has to be run again.
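The same check can be done in one pass; a sketch that returns whatever is missing instead of relying on ls errors (assumes the SUBJECTS_DIR used above):

```python
import csv
import os

def missing_dirs(flong_csv, subjects_dir="/nas/data/subjects"):
    """List the cross-sectional and longitudinal subject directories
    referenced by flong.csv that do not exist under subjects_dir."""
    missing = []
    with open(flong_csv, newline="") as f:
        for row in csv.DictReader(f, delimiter=";"):
            base = row["flong_subject"]
            for tp in (row["SubjID_v0"], row["SubjID_v2"]):
                # check both the cross run and the .long reprocessed run
                for d in (tp, f"{tp}.long.{base}"):
                    path = os.path.join(subjects_dir, d)
                    if not os.path.isdir(path):
                        missing.append(path)
    return missing
```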
Now then:
[osotolongo@detritus lfacehbi]$ long_mris_slopes --sd /nas/data/subjects/ --qdec ./qdec/long.qdec.table.dat --meas thickness --hemi lh --do-avg --do-rate --do-pc1 --do-spc --do-stack --do-label --time years --qcache fsaverage
This takes a while, given the number of subjects, so best arm yourself with patience.
QDEC way
Now the QDEC table has to be prepared:
$ long_qdec_table --qdec qdec/long.qdec.table.dat --cross --out qdec/cross.qdec.table.dat
and a ~/.Qdec file must be created with the list of variables:
MEASURE1 = long.thickness-avg
MEASURE2 = long.thickness-rate
MEASURE3 = long.thickness-pc1
MEASURE4 = long.thickness-spc
Now:
$ qdec --table qdec/cross.qdec.table.dat
and here, by maneuvering within QDEC, we can obtain several models.
GLM way
Here we measure the changes in cortical thickness that occurred between the two visits:
[osotolongo@detritus lfacehbi]$ long_mris_slopes --qdec qdec/long.qdec.table.dat --meas thickness --hemi lh --sd $SUBJECTS_DIR --do-pc1 --do-label --generic-time --fwhm 15 --qcache fsaverage --stack-pc1 lh.lfacehbi.thickness-pc1.stack.mgh --isec-labels lh.lfacehbi.fsaverage.cortex.label
We fit the model:
[osotolongo@detritus lfacehbi]$ mri_glmfit --osgm --glmdir lh.lfacehbi.thickness-pc1.fwhm15 --y lh.lfacehbi.thickness-pc1.stack.fwhm15.mgh --label lh.lfacehbi.fsaverage.cortex.label --surf fsaverage lh
We take a look:
freeview -f /nas/data/subjects/fsaverage/surf/lh.inflated:annot=aparc.annot:annot_outline=1:overlay=lh.lfacehbi.thickness-pc1.fwhm15/osgm/sig.mgh:overlay_threshold=4,5 -viewport 3d
We correct for multiple comparisons:
[osotolongo@detritus lfacehbi]$ mri_glmfit-sim --glmdir lh.lfacehbi.thickness-pc1.fwhm15/ --cache 4 neg --cwp 0.05 --2spaces
and see which clusters survive:
freeview -f /nas/data/subjects/fsaverage/surf/lh.inflated:overlay=lh.lfacehbi.thickness-pc1.fwhm15/osgm/cache.th40.neg.sig.cluster.mgh:overlay_threshold=2,5:annot=lh.lfacehbi.thickness-pc1.fwhm15/osgm/cache.th40.neg.sig.ocn.annot -viewport 3d
The cluster information is also stored:
[osotolongo@detritus lfacehbi]$ cat lh.lfacehbi.thickness-pc1.fwhm15/osgm/cache.th40.neg.sig.cluster.summary
# Cluster Growing Summary (mri_surfcluster)
# $Id: mri_surfcluster.c,v 1.57.2.3 2016/11/17 18:19:42 zkaufman Exp $
# $Id: mrisurf.c,v 1.781.2.6 2016/12/27 16:47:14 zkaufman Exp $
# CreationTime 2020/07/10-08:03:33-GMT
# cmdline mri_surfcluster.bin --in lh.lfacehbi.thickness-pc1.fwhm15//osgm/sig.mgh --mask lh.lfacehbi.thickness-pc1.fwhm15//mask.mgh --cwsig lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.cluster.mgh --sum lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.cluster.summary --ocn lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.ocn.mgh --annot aparc --cwpvalthresh 0.05 --o lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.masked.mgh --no-fixmni --csd /usr/local/freesurfer/average/mult-comp-cor/fsaverage/lh/cortex/fwhm17/neg/th40/mc-z.csd --csdpdf lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.pdf.dat --vwsig lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.voxel.mgh --vwsigmax lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.voxel.max.dat --oannot lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.ocn.annot --bonferroni 2 --surf white
# cwd /nas/data/lfacehbi
# sysname Linux
# hostname detritus.fundacioace.com
# machine x86_64
# FixVertexAreaFlag 1
# FixSurfClusterArea 1
#
# Input lh.lfacehbi.thickness-pc1.fwhm15//osgm/sig.mgh
# Frame Number 0
# srcsubj fsaverage
# hemi lh
# surface white
# group_avg_surface_area 82220
# group_avg_vtxarea_loaded 1
# annot aparc
# SUBJECTS_DIR /nas/data/subjects
# SearchSpace_mm2 75384.9
# SearchSpace_vtx 147377
# Bonferroni 2
# Minimum Threshold 4
# Maximum Threshold infinity
# Threshold Sign neg
# AdjustThreshWhenOneTail 1
# CW PValue Threshold: 0.05
# Area Threshold 0 mm^2
# CSD thresh 4.000000
# CSD nreps 10000
# CSD simtype null-z
# CSD contrast NA
# CSD confint 90.000000
# Overall max 3.84849 at vertex 122407
# Overall min -6.15084 at vertex 106728
# NClusters 6
# FixMNI = 0
#
# ClusterNo Max VtxMax Size(mm^2) MNIX MNIY MNIZ CWP CWPLow CWPHi NVtxs WghtVtx Annot
   1  -5.335  111613  886.69  -44.2  -69.7    9.3  0.00020  0.00000  0.00040  1561  -6564.19  inferiorparietal
   2  -6.151  106728  721.02  -43.3  -56.5   25.3  0.00020  0.00000  0.00040  1591  -7085.05  inferiorparietal
   3  -5.789  104223  577.10  -43.7  -39.2   11.4  0.00020  0.00000  0.00040  1440  -6316.08  superiortemporal
   4  -5.241  148689  225.71  -57.9  -33.7  -16.5  0.00080  0.00040  0.00140   355  -1496.33  middletemporal
   5  -4.453  135429  105.36  -51.6  -34.7    0.1  0.02938  0.02642  0.03253   249   -997.82  bankssts
   6  -4.944   72121  102.69  -40.1  -31.3   35.2  0.03155  0.02840  0.03469   321  -1362.45  supramarginal
For the right hemisphere exactly the same is done:
[osotolongo@detritus lfacehbi]$ cat rh.lfacehbi.thickness-pc1.fwhm15/osgm/cache.th40.neg.sig.cluster.summary
# Cluster Growing Summary (mri_surfcluster)
# $Id: mri_surfcluster.c,v 1.57.2.3 2016/11/17 18:19:42 zkaufman Exp $
# $Id: mrisurf.c,v 1.781.2.6 2016/12/27 16:47:14 zkaufman Exp $
# CreationTime 2020/07/11-10:21:49-GMT
# cmdline mri_surfcluster.bin --in rh.lfacehbi.thickness-pc1.fwhm15//osgm/sig.mgh --mask rh.lfacehbi.thickness-pc1.fwhm15//mask.mgh --cwsig rh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.cluster.mgh --sum rh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.cluster.summary --ocn rh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.ocn.mgh --annot aparc --cwpvalthresh 0.05 --o rh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.masked.mgh --no-fixmni --csd /usr/local/freesurfer/average/mult-comp-cor/fsaverage/rh/cortex/fwhm17/neg/th40/mc-z.csd --csdpdf rh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.pdf.dat --vwsig rh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.voxel.mgh --vwsigmax rh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.voxel.max.dat --oannot rh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.ocn.annot --bonferroni 2 --surf white
# cwd /nas/data/lfacehbi
# sysname Linux
# hostname detritus.fundacioace.com
# machine x86_64
# FixVertexAreaFlag 1
# FixSurfClusterArea 1
#
# Input rh.lfacehbi.thickness-pc1.fwhm15//osgm/sig.mgh
# Frame Number 0
# srcsubj fsaverage
# hemi rh
# surface white
# group_avg_surface_area 82167.6
# group_avg_vtxarea_loaded 1
# annot aparc
# SUBJECTS_DIR /nas/data/subjects
# SearchSpace_mm2 74448.5
# SearchSpace_vtx 146051
# Bonferroni 2
# Minimum Threshold 4
# Maximum Threshold infinity
# Threshold Sign neg
# AdjustThreshWhenOneTail 1
# CW PValue Threshold: 0.05
# Area Threshold 0 mm^2
# CSD thresh 4.000000
# CSD nreps 10000
# CSD simtype null-z
# CSD contrast NA
# CSD confint 90.000000
# Overall max 2.48051 at vertex 151862
# Overall min -10.6092 at vertex 53693
# NClusters 8
# FixMNI = 0
#
# ClusterNo Max VtxMax Size(mm^2) MNIX MNIY MNIZ CWP CWPLow CWPHi NVtxs WghtVtx Annot
   1  -10.609   53693  3009.16  51.4    1.9  -28.1  0.00020  0.00000  0.00040  5686  -29777.36  middletemporal
   2   -5.368   66629   232.21  19.8   37.8   40.7  0.00060  0.00020  0.00100   352   -1559.05  superiorfrontal
   3   -4.418    3639   191.12  50.5    3.8   17.7  0.00260  0.00180  0.00360   415   -1653.58  precentral
   4   -4.710  145684   151.25  44.1   29.2   12.8  0.00878  0.00719  0.01057   275   -1142.94  rostralmiddlefrontal
   5   -6.086  138124   149.07  25.4  -11.9   48.9  0.00898  0.00739  0.01077   384   -1799.79  precentral
   6   -4.892   48409   137.53  22.1   12.8   48.0  0.01296  0.01097  0.01494   304   -1282.61  superiorfrontal
   7   -5.038  154726    98.25   9.8   62.1    2.0  0.03115  0.02800  0.03430   123    -527.80  superiorfrontal
   8   -4.289  152608    83.01  36.8    3.4   35.4  0.04898  0.04508  0.05288   147    -570.54  caudalmiddlefrontal
Adding variables
[osotolongo@detritus lfacehbi]$ cp /nas/data/facehbi/fbb_results/fbb_centiloid.csv fbb_centiloid_v0.csv
[osotolongo@detritus lfacehbi]$ cp /nas/data/f2cehbi/fbb_results/fbb_centiloid.csv fbb_centiloid_v2.csv
[osotolongo@detritus lfacehbi]$ awk -F "," '{print $1";"$3}' fbb_centiloid_v0.csv | sed 's/Centiloid/CLv0/' > centiloid_v0.csv
[osotolongo@detritus lfacehbi]$ awk -F "," '{print $1";"$3}' fbb_centiloid_v2.csv | sed 's/Centiloid/CLv2/' > centiloid_v2.csv
[osotolongo@detritus lfacehbi]$ head data_2concat.csv
PSubject;Subject;years;age;Education;Gender
F001;0001;2.14;71;8;0
F002;0002;2.3;70;12;1
F003;0003;2.12;70;8;0
F005;0005;2.05;68;20;1
F006;0006;2.09;64;14;0
F007;0007;2.09;59;19;1
F008;0008;2.1;55;16;0
F009;0009;2.08;67;16;0
F010;0010;2.08;68;20;1
[osotolongo@detritus lfacehbi]$ head centiloid_v0.csv
Subject; CLv0
0001;-5.5932502480232
0002;12.0568081099059
0003;-7.34233394547346
0004;10.1249519364766
0005;-8.16586361015402
0006;37.3631081538974
0007;75.8928076873738
0008;-17.1955789921458
0009;3.61335051283359
[osotolongo@detritus lfacehbi]$ join -t";" -1 2 -2 1 data_2concat.csv centiloid_v0.csv > data_2concat_temp.csv
[osotolongo@detritus lfacehbi]$ head data_2concat_temp.csv
Subject;PSubject;years;age;Education;Gender; CLv0
0001;F001;2.14;71;8;0;-5.5932502480232
0002;F002;2.3;70;12;1;12.0568081099059
0003;F003;2.12;70;8;0;-7.34233394547346
0005;F005;2.05;68;20;1;-8.16586361015402
0006;F006;2.09;64;14;0;37.3631081538974
0007;F007;2.09;59;19;1;75.8928076873738
0008;F008;2.1;55;16;0;-17.1955789921458
0009;F009;2.08;67;16;0;3.61335051283359
0010;F010;2.08;68;20;1;-11.4216448690239
[osotolongo@detritus lfacehbi]$ join -t";" data_2concat_temp.csv centiloid_v2.csv > data_2concat_extra.csv
[osotolongo@detritus lfacehbi]$ head data_2concat_extra.csv
Subject;PSubject;years;age;Education;Gender; CLv0; CLv2
0001;F001;2.14;71;8;0;-5.5932502480232;-3.01177159588474
0002;F002;2.3;70;12;1;12.0568081099059;9.07793600565404
0003;F003;2.12;70;8;0;-7.34233394547346;-7.74822070800795
0005;F005;2.05;68;20;1;-8.16586361015402;39.4041710834628
0006;F006;2.09;64;14;0;37.3631081538974;80.3079451944081
0007;F007;2.09;59;19;1;75.8928076873738;-20.4246399521569
0008;F008;2.1;55;16;0;-17.1955789921458;5.61946211270271
0009;F009;2.08;67;16;0;3.61335051283359;-14.8767332876036
0010;F010;2.08;68;20;1;-11.4216448690239;0.505619291014511
[osotolongo@detritus lfacehbi]$ head facehbi_apoe4_sorted.csv
PSubject;ApoeE4
F001;1
F002;0
F003;0
F004;0
F005;0
F006;1
F007;0
F008;0
F009;0
[osotolongo@detritus lfacehbi]$ join -t";" -1 2 -2 1 data_2concat_extra.csv facehbi_apoe4_sorted.csv > data_2concat_wvars.csv
[osotolongo@detritus lfacehbi]$ head data_2concat_wvars.csv
PSubject;Subject;years;age;Education;Gender; CLv0; CLv2;ApoeE4
F001;0001;2.14;71;8;0;-5.5932502480232;-3.01177159588474;1
F002;0002;2.3;70;12;1;12.0568081099059;9.07793600565404;0
F003;0003;2.12;70;8;0;-7.34233394547346;-7.74822070800795;0
F005;0005;2.05;68;20;1;-8.16586361015402;39.4041710834628;0
F006;0006;2.09;64;14;0;37.3631081538974;80.3079451944081;1
F007;0007;2.09;59;19;1;75.8928076873738;-20.4246399521569;0
F008;0008;2.1;55;16;0;-17.1955789921458;5.61946211270271;0
F009;0009;2.08;67;16;0;3.61335051283359;-14.8767332876036;0
[osotolongo@detritus lfacehbi]$ head flong_2concat.csv
Subject;flong_subject;SubjID_v0;SubjID_v2
0001;flong_0001;facehbi_0001;f2cehbi_0001
0002;flong_0002;facehbi_0002;f2cehbi_0002
0003;flong_0003;facehbi_0003;f2cehbi_0003
0005;flong_0005;facehbi_0005;f2cehbi_0004
0006;flong_0006;facehbi_0006;f2cehbi_0005
0007;flong_0007;facehbi_0007;f2cehbi_0006
0008;flong_0008;facehbi_0008;f2cehbi_0007
0009;flong_0009;facehbi_0009;f2cehbi_0008
0010;flong_0010;facehbi_0010;f2cehbi_0009
[osotolongo@detritus lfacehbi]$ join -t";" -1 1 -2 2 flong_2concat.csv data_2concat_wvars.csv | tail -n +2 | awk -F";" '{print $3" "$2" 0 "$7" "$8" "$9" "$10" "$12"\n"$4" "$2" "$6" "$7+$6" "$8" "$9" "$11" "$12}' > qdec/long.qdec.table.extra.dat
[osotolongo@detritus lfacehbi]$ sed -i '1ifsid fsid-base years age education gender centiloid apoe' qdec/long.qdec.table.extra.dat
[osotolongo@detritus lfacehbi]$ head qdec/long.qdec.table.extra.dat
fsid fsid-base years age education gender centiloid apoe
facehbi_0001 flong_0001 0 71 8 0 -5.5932502480232 1
f2cehbi_0001 flong_0001 2.14 73.14 8 0 -3.01177159588474 1
facehbi_0002 flong_0002 0 70 12 1 12.0568081099059 0
f2cehbi_0002 flong_0002 2.3 72.3 12 1 9.07793600565404 0
facehbi_0003 flong_0003 0 70 8 0 -7.34233394547346 0
f2cehbi_0003 flong_0003 2.12 72.12 8 0 -7.74822070800795 0
facehbi_0005 flong_0005 0 68 20 1 -8.16586361015402 0
f2cehbi_0004 flong_0005 2.05 70.05 20 1 39.4041710834628 0
facehbi_0006 flong_0006 0 64 14 0 37.3631081538974 1
I am going to redo the run to make sure the subjects are the same. Why? Because if any Centiloid or APOE value is missing, the table will differ from the subject template and the analysis will not be valid.
[osotolongo@detritus lfacehbi]$ long_mris_slopes --qdec qdec/long.qdec.table.extra.dat --meas thickness --hemi lh --sd $SUBJECTS_DIR --do-pc1 --do-label --generic-time --fwhm 15 --qcache fsaverage --stack-pc1 lh.lfacehbi.thickness-pc1.stack.mgh --isec-labels lh.lfacehbi.fsaverage.cortex.label
...
[osotolongo@detritus lfacehbi]$ mri_glmfit --osgm --glmdir lh.lfacehbi.thickness-pc1.fwhm15 --y lh.lfacehbi.thickness-pc1.stack.fwhm15.mgh --label lh.lfacehbi.fsaverage.cortex.label --surf fsaverage lh
[osotolongo@detritus lfacehbi]$ mri_glmfit-sim --glmdir lh.lfacehbi.thickness-pc1.fwhm15/ --cache 4 neg --cwp 0.05 --2spaces
[osotolongo@detritus lfacehbi]$ cat lh.lfacehbi.thickness-pc1.fwhm15/osgm/cache.th40.neg.sig.cluster.summary
# Cluster Growing Summary (mri_surfcluster)
# $Id: mri_surfcluster.c,v 1.57.2.3 2016/11/17 18:19:42 zkaufman Exp $
# $Id: mrisurf.c,v 1.781.2.6 2016/12/27 16:47:14 zkaufman Exp $
# CreationTime 2020/07/16-07:40:57-GMT
# cmdline mri_surfcluster.bin --in lh.lfacehbi.thickness-pc1.fwhm15//osgm/sig.mgh --mask lh.lfacehbi.thickness-pc1.fwhm15//mask.mgh --cwsig lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.cluster.mgh --sum lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.cluster.summary --ocn lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.ocn.mgh --annot aparc --cwpvalthresh 0.05 --o lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.masked.mgh --no-fixmni --csd /usr/local/freesurfer/average/mult-comp-cor/fsaverage/lh/cortex/fwhm16/neg/th40/mc-z.csd --csdpdf lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.pdf.dat --vwsig lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.voxel.mgh --vwsigmax lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.voxel.max.dat --oannot lh.lfacehbi.thickness-pc1.fwhm15//osgm/cache.th40.neg.sig.ocn.annot --bonferroni 2 --surf white
# cwd /nas/data/lfacehbi
# sysname Linux
# hostname detritus.fundacioace.com
# machine x86_64
# FixVertexAreaFlag 1
# FixSurfClusterArea 1
#
# Input lh.lfacehbi.thickness-pc1.fwhm15//osgm/sig.mgh
# Frame Number 0
# srcsubj fsaverage
# hemi lh
# surface white
# group_avg_surface_area 82220
# group_avg_vtxarea_loaded 1
# annot aparc
# SUBJECTS_DIR /nas/data/subjects
# SearchSpace_mm2 75466.5
# SearchSpace_vtx 147613
# Bonferroni 2
# Minimum Threshold 4
# Maximum Threshold infinity
# Threshold Sign neg
# AdjustThreshWhenOneTail 1
# CW PValue Threshold: 0.05
# Area Threshold 0 mm^2
# CSD thresh 4.000000
# CSD nreps 10000
# CSD simtype null-z
# CSD contrast NA
# CSD confint 90.000000
# Overall max 3.29752 at vertex 122410
# Overall min -6.1539 at vertex 79545
# NClusters 5
# FixMNI = 0
#
# ClusterNo Max VtxMax Size(mm^2) MNIX MNIY MNIZ CWP CWPLow CWPHi NVtxs WghtVtx Annot
   1  -4.666   69976  543.01  -46.8  -63.9   -5.4  0.00020  0.00000  0.00040   950  -3790.21  inferiortemporal
   2  -6.154   79545  343.66  -59.2  -33.7  -16.6  0.00020  0.00000  0.00040   554  -2475.20  middletemporal
   3  -5.453  122049  309.70  -43.5  -57.4   26.6  0.00020  0.00000  0.00040   664  -2881.33  inferiorparietal
   4  -5.080  120104  241.40  -42.6  -38.3   10.1  0.00040  0.00000  0.00080   683  -2914.49  superiortemporal
   5  -5.054   72121  120.14  -40.1  -31.3   35.2  0.01475  0.01256  0.01693   367  -1580.34  supramarginal
I'm going to try something here:
[osotolongo@detritus lfacehbi]$ sed 's/fsid fsid-base/Variables/;s/facehbi_\([0-9]*\) flong_\([0-9]*\)/Input facehbi_\1.long.flong_\2 Main/;s/f2cehbi_\([0-9]*\) flong_\([0-9]*\)/Input f2cehbi_\1.long.flong_\2 Main/' qdec/long.qdec.table.extra.dat > qdec.body
[osotolongo@detritus lfacehbi]$ cat headers.txt qdec.body > qdec.fsgd
[osotolongo@detritus lfacehbi]$ head qdec.fsgd
Variables years age education gender centiloid apoe
Input facehbi_0001.long.flong_0001 Main 0 71 8 0 -5.5932502480232 1
Input f2cehbi_0001.long.flong_0001 Main 2.14 73.14 8 0 -3.01177159588474 1
Input facehbi_0002.long.flong_0002 Main 0 70 12 1 12.0568081099059 0
Input f2cehbi_0002.long.flong_0002 Main 2.3 72.3 12 1 9.07793600565404 0
Input facehbi_0003.long.flong_0003 Main 0 70 8 0 -7.34233394547346 0
Input f2cehbi_0003.long.flong_0003 Main 2.12 72.12 8 0 -7.74822070800795 0
Input facehbi_0005.long.flong_0005 Main 0 68 20 1 -8.16586361015402 0
Input f2cehbi_0004.long.flong_0005 Main 2.05 70.05 20 1 39.4041710834628 0
Input facehbi_0006.long.flong_0006 Main 0 64 14 0 37.3631081538974 1
[osotolongo@detritus lfacehbi]$ cat age.mtx
0 0 1 0 0 0 0
[osotolongo@detritus lfacehbi]$ cat apoe.mtx
0 0 0 0 0 0 1
[osotolongo@detritus lfacehbi]$ cat centiloid.mtx
0 0 0 0 0 1 0
[osotolongo@detritus lfacehbi]$ cat years.mtx
0 1 0 0 0 0 0
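With one class (Main) plus six covariates, the design matrix has seven columns, so each .mtx file is a length-7 row vector that selects which regression coefficient to test. The tested effect is just the dot product of the contrast with the coefficient vector; a minimal sketch (the beta values below are made up for illustration):

```python
def contrast_effect(C, beta):
    """Effect tested by contrast row vector C on coefficients beta: gamma = C . beta."""
    return sum(c * b for c, b in zip(C, beta))
```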
First I'll build the cache:
[osotolongo@detritus lfacehbi]$ mris_preproc --fsgd qdec.fsgd --target fsaverage --hemi lh --meas thickness --out lh.long.thickness.00.mgh
[osotolongo@detritus lfacehbi]$ mri_surf2surf --hemi lh --s fsaverage --sval lh.long.thickness.00.mgh --fwhm 10 --cortex --tval lh.long.thickness.10.mgh
and now I launch the analysis:
[osotolongo@detritus lfacehbi]$ mri_glmfit --fsgd qdec.fsgd --glmdir lh.long.thickness.fwhm10 --y lh.long.thickness.10.mgh --C centiloid.mtx --C apoe.mtx --C years.mtx --C age.mtx --surf fsaverage lh
Let's have a look.
Age:
$ freeview -f /nas/data/subjects/fsaverage/surf/lh.inflated:annot=aparc.annot:annot_outline=1:overlay=lh.long.thickness.fwhm10/age/sig.mgh:overlay_threshold=4,5 -viewport 3d
APOEe4:
$ freeview -f /nas/data/subjects/fsaverage/surf/lh.inflated:annot=aparc.annot:annot_outline=1:overlay=lh.long.thickness.fwhm10/apoe/sig.mgh:overlay_threshold=4,5 -viewport 3d
Paired Analysis
This is a kind of simplified ANOVA for two time points.
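At a single vertex (or ROI), the two-time-point case reduces to a paired t-test on the visit-2 minus visit-0 differences; a minimal pure-Python sketch of that statistic:

```python
from math import sqrt

def paired_t(v0, v2):
    """Paired t statistic for two-time-point data: t = mean(d) / (sd(d)/sqrt(n)),
    where d are the per-subject visit-2 minus visit-0 differences."""
    d = [b - a for a, b in zip(v0, v2)]
    n = len(d)
    m = sum(d) / n
    var = sum((x - m) ** 2 for x in d) / (n - 1)  # sample variance of differences
    return m / sqrt(var / n)
```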