ACE Pipeline (v0.4.0)
User Guide
Version 0.4 of the pipeline offers several advantages:
- Simpler and faster to use
- BIDS format, compatible with almost any neuroimaging software out there
- Fully integrated into the cluster
Important: The cluster is managed with Slurm. Jobs can be monitored and controlled with simple commands such as squeue and scancel.
Making the project
First, make sure that the project directory exists and that your user has full access to it.
[osotolongo@detritus mopead]$ pwd
/nas/data/mopead
[osotolongo@detritus mopead]$ ls -la
total 33
drwxr-xr-x 2 osotolongo osotolongo    64 Dec 15 12:28 .
drwxr-xr-x 2 root       root       32768 Dec 15 12:28 ..
Now, just make the project,
[osotolongo@detritus mopead]$ make_proj.pl mopead /nas/corachan/MOPEAD
where mopead is the project name and /nas/corachan/MOPEAD is the full path to the DICOM files.
Then, create the database,
[osotolongo@detritus mopead]$ update_mri_db.pl mopead
Here the proper database is created from the DICOM data.
BIDS
At this point we should create the hierarchical BIDS structure.
This is the most delicate part of setting up a project. It should be done slowly and reviewed until everything is correct. After this, everything is more or less automatic, and dropping or adding subjects is very easy.
First, we enter the bids directory,
[osotolongo@detritus mopead]$ cd bids/
and there we should run the dcm2bids helper,
[osotolongo@detritus bids]$ dcm2bids_scaffold
[osotolongo@detritus bids]$ dcm2bids_helper -d /nas/corachan/MOPEAD/24A8DVSH/
Example in: /nas/data/mopead/bids/tmp_dcm2bids/helper
Now we should manually edit the dataset_description.json file.
Now, we can proceed,
[osotolongo@detritus ~]$ bulk2bids.pl mopead
Submitted batch job 114634
Submitted batch job 114635
Submitted batch job 114636
Submitted batch job 114637
Submitted batch job 114638
Submitted batch job 114639
Submitted batch job 114640
Submitted batch job 114641
... ... ...
[osotolongo@detritus ~]$ squeue | wc -l
128
[osotolongo@detritus mopead]$ ls bids/
CHANGES sourcedata sub-0008 sub-0016 sub-0024 sub-0032 sub-0040 sub-0048 sub-0056 sub-0064 sub-0072 sub-0080 sub-0088 sub-0096 sub-0104 sub-0112 sub-0120
code sub-0001 sub-0009 sub-0017 sub-0025 sub-0033 sub-0041 sub-0049 sub-0057 sub-0065 sub-0073 sub-0081 sub-0089 sub-0097 sub-0105 sub-0113 sub-0121
conversion.json sub-0002 sub-0010 sub-0018 sub-0026 sub-0034 sub-0042 sub-0050 sub-0058 sub-0066 sub-0074 sub-0082 sub-0090 sub-0098 sub-0106 sub-0114 sub-0122
dataset_description.json sub-0003 sub-0011 sub-0019 sub-0027 sub-0035 sub-0043 sub-0051 sub-0059 sub-0067 sub-0075 sub-0083 sub-0091 sub-0099 sub-0107 sub-0115 sub-0123
derivatives sub-0004 sub-0012 sub-0020 sub-0028 sub-0036 sub-0044 sub-0052 sub-0060 sub-0068 sub-0076 sub-0084 sub-0092 sub-0100 sub-0108 sub-0116 sub-0124
participants.json sub-0005 sub-0013 sub-0021 sub-0029 sub-0037 sub-0045 sub-0053 sub-0061 sub-0069 sub-0077 sub-0085 sub-0093 sub-0101 sub-0109 sub-0117 sub-0125
participants.tsv sub-0006 sub-0014 sub-0022 sub-0030 sub-0038 sub-0046 sub-0054 sub-0062 sub-0070 sub-0078 sub-0086 sub-0094 sub-0102 sub-0110 sub-0118 sub-0126
README sub-0007 sub-0015 sub-0023 sub-0031 sub-0039 sub-0047 sub-0055 sub-0063 sub-0071 sub-0079 sub-0087 sub-0095 sub-0103 sub-0111 sub-0119 tmp_dcm2bids
MRI
The basic procedure is quite simple,
[osotolongo@detritus mopead]$ precon.pl mopead
FreeSurfer errors are reported via email.
Options:
- -cut text_file : only the subjects listed in the supplied file are analyzed. The file should be a plain list of IDs.
- -h : print the help
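For example, a cut file is just a plain-text list of subject IDs, one per line (the IDs below are made up for illustration):

```shell
# build a plain list of subject IDs (hypothetical IDs, one per line)
printf '0001\n0003\n0007\n' > subset.txt
cat subset.txt
```

It would then be passed as, e.g., precon.pl -cut subset.txt mopead.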
Metrics
FS metrics can be summarized with the fs_metrics.pl script. The script also reports the subjects that could not be processed (because of errors and the like). Results are stored in the fsrecon directory.
[osotolongo@detritus facehbi]$ fs_metrics.pl facehbi
SUBJECTS_DIR : /nas/data/subjects
Parsing the .stats files
Skipping /nas/data/subjects/facehbi_0047/stats/lh.aparc.stats
Building the table..
Writing the table to /nas/data/facehbi/fsrecon/aparc_area_lh.txt
SUBJECTS_DIR : /nas/data/subjects
Parsing the .stats files
Skipping /nas/data/subjects/facehbi_0047/stats/rh.aparc.stats
Building the table..
Writing the table to /nas/data/facehbi/fsrecon/aparc_area_rh.txt
SUBJECTS_DIR : /nas/data/subjects
Parsing the .stats files
Skipping /nas/data/subjects/facehbi_0047/stats/lh.aparc.stats
Building the table..
Writing the table to /nas/data/facehbi/fsrecon/aparc_thickness_lh.txt
SUBJECTS_DIR : /nas/data/subjects
Parsing the .stats files
Skipping /nas/data/subjects/facehbi_0047/stats/rh.aparc.stats
Building the table..
Writing the table to /nas/data/facehbi/fsrecon/aparc_thickness_rh.txt
SUBJECTS_DIR : /nas/data/subjects
Parsing the .stats files
Skipping /nas/data/subjects/facehbi_0047/stats/lh.aparc.stats
Building the table..
Writing the table to /nas/data/facehbi/fsrecon/aparc_volume_lh.txt
SUBJECTS_DIR : /nas/data/subjects
Parsing the .stats files
Skipping /nas/data/subjects/facehbi_0047/stats/rh.aparc.stats
Building the table..
Writing the table to /nas/data/facehbi/fsrecon/aparc_volume_rh.txt
SUBJECTS_DIR : /nas/data/subjects
Parsing the .stats files
Skipping /nas/data/subjects/facehbi_0047/stats/aseg.stats
Building the table..
Writing the table to /nas/data/facehbi/fsrecon/aseg_stats.txt
SUBJECTS_DIR : /nas/data/subjects
Parsing the .stats files
Skipping /nas/data/subjects/facehbi_0047/stats/wmparc.stats
Building the table..
Writing the table to /nas/data/facehbi/fsrecon/wmparc_stats.txt
tar: Removing leading `/' from member names
[osotolongo@detritus facehbi]$ ls fsrecon/
aparc_area_lh.csv aparc_area_rh.csv aparc_thickness_lh.csv aparc_thickness_rh.csv aparc_volume_lh.csv aparc_volume_rh.csv aseg_stats.csv wmparc_stats.csv
aparc_area_lh.txt aparc_area_rh.txt aparc_thickness_lh.txt aparc_thickness_rh.txt aparc_volume_lh.txt aparc_volume_rh.txt aseg_stats.txt wmparc_stats.txt
[osotolongo@detritus facehbi]$ head fsrecon/aseg_stats.csv
Subject,Left-Lateral-Ventricle,Left-Inf-Lat-Vent,Left-Cerebellum-White-Matter,Left-Cerebellum-Cortex,Left-Thalamus-Proper,Left-Caudate,Left-Putamen,Left-Pallidum,3rd-Ventricle,4th-Ventricle,Brain-Stem,Left-Hippocampus,Left-Amygdala,CSF,Left-Accumbens-area,Left-VentralDC,Left-vessel,Left-choroid-plexus,Right-Lateral-Ventricle,Right-Inf-Lat-Vent,Right-Cerebellum-White-Matter,Right-Cerebellum-Cortex,Right-Thalamus-Proper,Right-Caudate,Right-Putamen,Right-Pallidum,Right-Hippocampus,Right-Amygdala,Right-Accumbens-area,Right-VentralDC,Right-vessel,Right-choroid-plexus,5th-Ventricle,WM-hypointensities,Left-WM-hypointensities,Right-WM-hypointensities,non-WM-hypointensities,Left-non-WM-hypointensities,Right-non-WM-hypointensities,Optic-Chiasm,CC_Posterior,CC_Mid_Posterior,CC_Central,CC_Mid_Anterior,CC_Anterior,BrainSegVol,BrainSegVolNotVent,BrainSegVolNotVentSurf,lhCortexVol,rhCortexVol,CortexVol,lhCerebralWhiteMatterVol,rhCerebralWhiteMatterVol,CerebralWhiteMatterVol,SubCortGrayVol,TotalGrayVol,SupraTentorialVol,SupraTentorialVolNotVent,SupraTentorialVolNotVentVox,MaskVol,BrainSegVol-to-eTIV,MaskVol-to-eTIV,lhSurfaceHoles,rhSurfaceHoles,SurfaceHoles,EstimatedTotalIntraCranialVol
0001,11314.6,331.2,16315.9,57445.7,7262.1,3077.5,4603.6,2230.9,1054.8,1569.8,24259.0,3519.7,1358.0,1557.5,501.1,4081.0,35.5,626.4,12413.4,354.2,15677.6,55496.8,7033.9,3419.9,4960.3,2282.5,3663.4,1569.7,550.4,3946.2,87.5,756.4,0.0,3464.0,0.0,0.0,0.0,0.0,0.0,214.6,1149.8,771.0,831.3,491.1,1181.7,1252355.0,1222932.0,1222747.43605,232317.424583,232514.157273,464831.581856,277310.513512,277363.34068,554673.854191,55791.0,637328.581856,1105706.43605,1080296.43605,1075390.0,1716706.0,0.713002,0.977371,14.0,26.0,40.0,1756453.4998
0002,8015.3,592.7,10877.3,41680.1,7077.2,3249.0,4513.0,1899.2,1416.3,1622.4,19452.7,3421.4,1136.6,1185.6,531.2,3608.6,39.2,742.7,6839.3,498.9,10588.6,42450.1,7061.2,3369.2,4406.8,1786.1,3480.3,1419.8,445.1,3496.7,17.9,880.1,0.0,1378.7,0.0,0.0,0.0,0.0,0.0,196.4,1051.5,670.1,567.6,584.1,750.8,1118741.0,1097792.0,1097891.3851,219928.9907,223894.378738,443823.369439,251041.253828,245301.761836,496343.015664,52493.0,581557.369439,1011636.3851,994605.385103,992395.0,1595172.0,0.731484,1.042997,26.0,13.0,39.0,1529412.17649
0003,6666.2,397.3,11791.2,48058.8,6066.5,3358.3,3809.0,1592.5,832.2,1364.6,18620.1,3358.2,1230.1,868.8,387.8,3196.3,29.5,564.5,6502.5,299.7,11534.7,49200.8,5873.7,3507.3,3946.2,1524.0,3276.2,1337.3,396.1,3038.5,28.3,833.7,0.0,898.6,0.0,0.0,0.0,0.0,0.0,209.7,1033.6,567.5,426.9,391.2,838.6,977510.0,959921.0,959378.055641,201494.554347,198892.291337,400386.845684,195828.588494,195856.621464,391685.209958,47395.0,546057.845684,856065.055641,841326.055641,839846.0,1343201.0,0.736957,1.012656,14.0,18.0,32.0,1326413.64353
0004,13659.3,489.6,10112.0,45840.7,5329.2,2957.3,4039.3,1656.6,1394.3,1938.2,18888.5,3457.0,1255.7,1018.7,377.2,3088.6,78.5,733.7,11277.9,466.2,10573.7,46444.1,5301.5,3150.7,4467.3,1517.3,3402.9,1474.7,521.9,2920.4,22.7,876.8,0.0,1652.1,0.0,0.0,0.0,0.0,0.0,193.4,907.1,379.2,383.1,378.4,717.1,985232.0,953675.0,953154.249272,211068.911271,211312.154899,422381.06617,186018.611147,185870.571956,371889.183103,46433.0,561638.06617,869910.249272,842533.249272,841104.0,1416122.0,0.720025,1.034927,28.0,19.0,47.0,1368330.41095
0005,9119.2,488.1,12106.0,49738.7,5548.3,2705.3,3840.9,1639.5,1314.8,1237.2,20084.7,3380.8,1425.8,1187.9,510.3,3551.4,33.5,711.0,11314.7,240.3,12437.6,50174.7,5623.1,2778.4,3772.7,1631.3,3170.1,1526.3,551.3,3323.8,10.0,835.5,0.0,1461.0,0.0,0.0,0.0,0.0,0.0,226.3,841.1,491.1,468.5,479.5,854.0,1027148.0,1001275.0,1000483.86616,213704.947027,210791.150217,424496.097244,203332.944639,202993.824273,406326.768912,46353.0,570737.097244,900638.866156,878366.866156,877780.0,1514793.0,0.700141,1.032537,23.0,13.0,36.0,1467058.62476
0006,9845.8,257.8,11953.6,42127.0,5668.1,3058.9,3626.6,1623.6,823.7,1438.1,19297.5,3441.1,1303.1,930.3,466.1,3065.7,30.2,562.1,9530.3,279.3,11434.2,43358.6,5724.7,3173.0,3549.5,1551.9,3436.3,1363.6,486.7,3093.9,17.5,539.9,0.0,1265.9,0.0,0.0,0.0,0.0,0.0,240.2,806.0,303.1,342.3,378.3,779.9,941837.0,918249.0,917828.213412,190688.463096,192164.510327,382852.973423,191451.98054,189274.259449,380726.239989,46055.0,514947.973423,831817.213412,811217.213412,809871.0,1309048.0,0.724441,1.006892,15.0,9.0,24.0,1300087.33412
0007,8254.1,569.3,14655.5,57027.2,7220.3,3076.4,5000.4,2047.6,1444.1,2180.0,24925.8,4269.4,1646.8,1001.6,540.3,4138.4,40.0,947.1,6731.7,587.5,14976.0,56227.2,7192.1,3079.5,5037.1,1862.1,4379.1,1597.3,535.0,3967.1,26.9,907.7,0.0,1198.9,0.0,0.0,0.0,0.0,0.0,262.9,829.0,517.2,539.3,553.5,625.4,1226498.0,1204649.0,1204625.12276,233552.867924,233491.082501,467043.950425,269344.3417,268203.830633,537548.172333,56942.0,638499.950425,1081264.12276,1063852.12276,1061376.0,1709952.0,0.733114,1.022089,8.0,4.0,12.0,1672996.58646
0008,4739.9,257.8,13307.8,50309.4,5704.9,2475.9,3501.6,1505.1,672.2,1370.4,20671.8,3092.7,1043.9,689.8,407.1,3315.0,16.0,489.3,3985.9,140.1,13008.9,52329.7,5754.2,2456.4,3490.6,1527.0,3567.7,1220.0,372.4,3274.7,43.3,603.3,0.0,812.4,0.0,0.0,0.0,0.0,0.0,179.6,756.2,499.1,479.6,404.2,798.0,910640.0,898457.0,898532.130739,184673.361317,183228.623455,367901.984773,179094.484484,179136.661483,358231.145966,44238.0,515354.984773,781570.130739,771916.130739,770162.0,1273935.0,0.740047,1.035285,22.0,15.0,37.0,1230515.65242
0009,17659.9,560.9,11186.2,46549.7,5295.9,3719.8,4068.9,1616.9,1616.2,1541.8,19118.9,3503.6,1054.6,1216.6,320.6,3060.4,55.7,810.6,16694.3,408.0,11297.5,44192.0,5514.3,3828.4,3913.2,1367.9,3511.5,1222.7,381.1,3083.0,48.7,814.6,0.0,5160.4,0.0,0.0,0.0,0.0,0.0,222.0,971.0,356.7,388.4,404.7,840.6,1020601.0,979664.0,979835.424075,196883.952137,197792.698985,394676.651121,213727.223911,211588.549044,425315.772954,47171.0,533147.651121,905448.424075,868732.424075,866892.0,1482239.0,0.668526,0.970914,31.0,38.0,69.0,1526642.95225
DTI
Preproc
In DTI preprocessing we take the existing DTI and T1w images and run topup and eddy, register the MNI template to native T1 space, and then to native DTI space (now with epi_reg). Using the information from this transformation, the JHU atlases are registered to native DTI space.
Note: The files acqparams.txt and dti_index.txt must exist. See how to make index.txt and acqparams.txt.
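A minimal sketch of building both files follows. The phase-encoding rows, readout time and volume count here are assumptions for illustration; take the real values from the acquisition metadata (e.g. the BIDS JSON sidecar):

```shell
# Sketch only: assumes A>>P and P>>A phase encoding and a total readout
# time of 0.05 s -- replace with the values from your acquisition.
printf '0 -1 0 0.05\n0 1 0 0.05\n' > acqparams.txt
# dti_index.txt needs one entry per DWI volume, each pointing to the
# matching acqparams.txt row (here, all volumes use row 1).
nvols=64   # hypothetical volume count; e.g. from `fslnvols sub-0001_dwi.nii.gz`
seq $nvols | awk '{printf "1 "} END {print ""}' > dti_index.txt
```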
$ dti_reg.pl mopead
or
$ dti_reg.pl -chop -time '12:0:0' facehbi
Options:
- -cut text_file : only the subjects listed in the supplied file are analyzed. The file should be a plain list of IDs.
- -old : use the standard FSL method to register images (avoid as much as possible)
- -chop : for bad or chopped DTI images. Takes the chopped DTI images and tries to register them against chopped MNI and T1 templates.
- -time : change the maximum execution time. The default is 8 hours, but for complex operations it can be increased.
- -h : print this help
- -h : print this help
QC
A visual inspection of registration quality is required.
$ make_dti_report.pl facehbi
Options:
- -h : Print this help
This command produces a report at working/dtis/index.html that shows how well the registration went.
Metrics
Now we can get the FA and MD values for some ROIs of JHU atlases,
$ dti_metrics.pl mopead
Options:
- -cut text_file : only the subjects listed in the supplied file are analyzed. The file should be a plain list of IDs.
- -a1 : Use the ICBM-DTI-81 white-matter labels atlas
- -a2 : Use the JHU white-matter tractography atlas (default)
- -sd : Also print standard deviations
- -h : Print the help
Tractography
The tractography processing is a combined execution of FSL's bedpostx and probtrackx scripts.
$ dti_track.pl mopead
or
[osotolongo@detritus facehbi]$ dti_track.pl -t1 -time '12:0:0' facehbi
[osotolongo@detritus facehbi]$ squeue
... ... ...
121725 cuda dti_trac osotolon PD 0:00 1 (Priority)
121726 cuda dti_trac osotolon PD 0:00 1 (Priority)
121727 fast dti_trac osotolon PD 0:00 1 (Dependency)
121523 cuda dti_trac osotolon R 4:59 1 detritus
121524 cuda dti_trac osotolon R 1:32 1 brick01
121525 cuda dti_trac osotolon R 1:32 1 brick01
or
$ dti_track.pl -t1 -time '12:0:0' -uofm DMN facehbi
Options:
- -cut text_file : Only the subjects listed in the file are analyzed. The file should be a single list of IDs
- -t1 : for chopped or bad images, adds an extra step to the T1w registration, trying to put everything into native DTI space.
- -time : change the maximum execution time for registration
- -uofm <net> : use a predetermined network from the UofM atlas. If not supplied, the file dti_track.seed must exist. This file is a list of FreeSurfer LUT labels that will be combined and used as the seed ROI.
- -h : Print the help
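For instance, a dti_track.seed file is just a plain list of FreeSurfer LUT codes, one per line. As a hypothetical seed, the left and right hippocampus (codes 17 and 53 in the standard FreeSurfer color LUT):

```shell
# hypothetical seed ROI: left (17) and right (53) hippocampus,
# codes taken from FreeSurferColorLUT.txt
printf '17\n53\n' > dti_track.seed
cat dti_track.seed
```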
Tractography metrics
Now we can get FA and MD values on the calculated tracts.
But first, you should move the probtrack directory to another path. Something like,
for x in working/*_probtrack_out; do mv "$x" "${x%out}FPCustom"; done
and now you should point the script to the right path,
$ dti_metrics_tracks.pl -path FPCustom facehbi
or
$ dti_metrics_tracks.pl -path FPCustom -thr 0.5 facehbi
which puts the output into the facehbi_dti_FPCustom.csv file,
[osotolongo@detritus facehbi]$ head facehbi_dti_FPCustom.csv
Subject;FPCustom_FA;FPCustom_MD
0001;0.288294;0.000875
0002;0.290151;0.000878
0003;0.275354;0.000878
0004;0.273996;0.000894
0005;0.266883;0.000859
0006;0.281883;0.000857
0007;0.268814;0.000873
0008;0.265971;0.000888
0009;0.261239;0.000885
Options:
- -thr : define the threshold to make the tracts mask (default: 0.25)
- -path : define the path to the tractography results
- -cut text_file : Only the subjects listed in the file are analyzed. The file should be a single list of IDs
- -sd : Also print standard deviations
- -h : Print the help
TRACULA
TRACULA is included in FreeSurfer 6.0. All we have done here is integrate these tools into our cluster.
See this for more info about TRACULA
Let's assume that we have a project and we already have the FS reconstruction. We will follow five steps.
1.- Preprocessing
[tester@detritus tractest]$ ctrac_prep.pl tractest
INFO: SUBJECTS_DIR is /nas/data/subjects
INFO: Diffusion root is /nas/data/subjects
Actual FREESURFER_HOME /nas/usr/local/opt/freesurfer
Submitted batch job 134138
Submitted batch job 134139
Submitted batch job 134140
Submitted batch job 134141
Submitted batch job 134142
Submitted batch job 134143
[tester@detritus tractest]$ squeue | grep trac
134143 fast trac_pre tester PD 0:00 1 (Dependency)
134138 fast trac_pre tester R 0:27 1 brick01
134139 fast trac_pre tester R 0:27 1 brick01
134140 fast trac_pre tester R 0:27 1 brick01
134141 fast trac_pre tester R 0:27 1 brick01
134142 fast trac_pre tester R 0:27 1 brick01
This script builds a dmri.rc file with the default configuration and the file trac_step1.txt with the commands that are automagically executed on the cluster.
Note: The file dmri.rc is generated only if it does not exist. That is, to run a custom procedure, just edit the file properly and the changes will be picked up by subsequent executions. To restore the default values, just delete the file.
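For instance, a customized dmri.rc might override just a few trac-all variables (tcsh "set" syntax; the paths and subject names below are hypothetical):

```
# excerpt of a hypothetical dmri.rc (trac-all configuration file)
set dtroot = /nas/data/subjects
set subjlist = (tractest_0001 tractest_0002)
set doeddy = 1
```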
2.- BEDPOST
[tester@detritus tractest]$ ctrac_bedp.pl tractest
INFO: SUBJECTS_DIR is /nas/data/subjects
INFO: Diffusion root is /nas/data/subjects
Actual FREESURFER_HOME /nas/usr/local/opt/freesurfer
Submitted batch job 134156
Submitted batch job 134157
Submitted batch job 134158
Submitted batch job 134159
Submitted batch job 134160
Submitted batch job 134162
Submitted batch job 134163
Submitted batch job 134164
... ... ...
Here three command files are executed (preproc, proc and postproc),
[tester@detritus tractest]$ ls trac_step2.*
trac_step2.post.txt trac_step2.pre.txt trac_step2.txt
Three emails will be sent, one for each step.
3.- PROBTRAC
[tester@detritus tractest]$ ctrac_path.pl tractest
INFO: SUBJECTS_DIR is /nas/data/subjects
INFO: Diffusion root is /nas/data/subjects
Actual FREESURFER_HOME /nas/usr/local/opt/freesurfer
Submitted batch job 134822
Submitted batch job 134823
Submitted batch job 134824
Submitted batch job 134825
Submitted batch job 134826
Submitted batch job 134827
Here the file trac_step3.txt is generated and executed.
4.- Statistics
[tester@detritus tractest]$ ctrac_stat.pl tractest
INFO: SUBJECTS_DIR is /nas/data/subjects
INFO: Diffusion root is /nas/data/subjects
Actual FREESURFER_HOME /nas/usr/local/opt/freesurfer
Submitted batch job 134828
Submitted batch job 134829
Submitted batch job 134830
Submitted batch job 134831
Submitted batch job 134832
Submitted batch job 134833
Submitted batch job 134834
Submitted batch job 134835
Submitted batch job 134836
Submitted batch job 134837
Submitted batch job 134838
Submitted batch job 134839
Submitted batch job 134840
Submitted batch job 134841
Submitted batch job 134842
Submitted batch job 134843
Submitted batch job 134844
Submitted batch job 134845
Submitted batch job 134846
The file trac_step4.txt is built and executed. Group stats are collected and stored in the stats directory.
5.- Mean values of FA and MD, for each tract and subject, are calculated and tabulated.
[tester@detritus tractest]$ ctrac_metrics.pl tractest
Collecting needed files
tar: Removing leading `/' from member names
Now, this is just a parser. The result is the file proj_dti_tracula.csv (where proj is the project name),
[tester@detritus tractest]$ head tractest_dti_tracula.csv
Subject;fmajor_FA;fmajor_MD;fminor_FA;fminor_MD;lh.atr_FA;lh.atr_MD;lh.cab_FA;lh.cab_MD;lh.ccg_FA;lh.ccg_MD;lh.cst_FA;lh.cst_MD;lh.ilf_FA;lh.ilf_MD;lh.slfp_FA;lh.slfp_MD;lh.slft_FA;lh.slft_MD;lh.unc_FA;lh.unc_MD;rh.atr_FA;rh.atr_MD;rh.cab_FA;rh.cab_MD;rh.ccg_FA;rh.ccg_MD;rh.cst_FA;rh.cst_MD;rh.ilf_FA;rh.ilf_MD;rh.slfp_FA;rh.slfp_MD;rh.slft_FA;rh.slft_MD;rh.unc_FA;rh.unc_MD
0004;0.520145;0.000843642;0.410587;0.00081121;0.391214;0.000774988;0.377662;0.00081455;0.487602;0.000783913;0.455246;0.000763673;0.419238;0.000839407;0.38455;0.00077119;0.399054;0.00079663;0.384662;0.000819744;0.404623;0.000769576;0.34408;0.000861059;0.448894;0.000788219;0.453546;0.000795844;0.411876;0.000899053;0.374587;0.000811536;0.389295;0.000819078;0.347546;0.000849276
0015;0.596899;0.000820075;0.475061;0.000804296;0.40034;0.000754748;0.304636;0.000869977;0.542709;0.000784643;0.495381;0.000739234;0.483681;0.000785939;0.453369;0.000727648;0.452678;0.000749013;0.417608;0.000767704;0.392618;0.000788396;0.368962;0.000829129;0.46229;0.000752411;0.493338;0.000771179;0.473559;0.000834579;0.440308;0.000755965;0.437991;0.000774135;0.407445;0.000797903
0028;0.527555;0.00105621;0.428527;0.000852201;0.39575;0.000822193;0.301705;0.000881434;0.496036;0.000806009;0.490595;0.000764855;0.472789;0.000899013;0.43059;0.000784625;0.450091;0.000799839;0.382866;0.000849374;0.372634;0.000825224;0.345963;0.000865059;0.518088;0.000821214;0.499748;0.00078346;0.423711;0.00094588;0.401581;0.000826089;0.399061;0.000828376;0.354049;0.000894577
0059;0.669996;0.000783834;0.488369;0.000810749;0.446545;0.000758311;0.372283;0.000817222;0.559518;0.000752429;0.532042;0.000731524;0.54095;0.000753004;0.450834;0.000735433;0.471832;0.000730669;0.426412;0.000810054;0.457772;0.000766213;0.387319;0.000801796;0.510041;0.000753322;0.516689;0.000753535;0.478745;0.000788816;0.492937;0.000804665;0.446911;0.000782436;0.43206;0.000821713
0067;0.517881;0.000847023;0.431189;0.000811166;0.422534;0.000738637;0.340224;0.000875277;0.432297;0.000770767;0.501685;0.000754257;0.446785;0.000832244;0.422139;0.000721088;0.451182;0.000742156;0.388619;0.000804834;0.406584;0.000775454;0.327958;0.000909032;0.374263;0.000797442;0.514622;0.000779207;0.417395;0.000865411;0.414617;0.000784126;0.450303;0.000790643;0.381713;0.000834836
Those are the final results.
Notice: You must check that the registration was correct,
[fsluser@FSLVm7_64 ~]$ freeview -v /usr/local/fsl/data/standard/MNI152_T1_1mm_brain.nii.gz /nas/data/subjects/tractest_0004/dmri/mni/dtifit_FA.bbr.nii.gz &
What we do is to include a report tool,
[tester@detritus tractest]$ ctrac_report.pl tractest
whose output is a report in the ctrac_report directory.
fMRI
Single-subject ICA
This is just FSL individual ICA analysis,
[osotolongo@detritus bids]$ rs_ica_one.pl mopead
Collecting needed files
sbatch /nas/data/mopead/working/0001_rs.ica/scripts/feat1.sh
sbatch --depend=afterok:118305 /nas/data/mopead/working/0001_rs.ica/scripts/feat2.sh
sbatch --depend=afterok:118306 /nas/data/mopead/working/0001_rs.ica/scripts/feat4.sh
sbatch /nas/data/mopead/working/0002_rs.ica/scripts/feat1.sh
sbatch --depend=afterok:118308 /nas/data/mopead/working/0002_rs.ica/scripts/feat2.sh
sbatch --depend=afterok:118309 /nas/data/mopead/working/0002_rs.ica/scripts/feat4.sh
sbatch /nas/data/mopead/working/0003_rs.ica/scripts/feat1.sh
sbatch --depend=afterok:118311 /nas/data/mopead/working/0003_rs.ica/scripts/feat2.sh
sbatch --depend=afterok:118312 /nas/data/mopead/working/0003_rs.ica/scripts/feat4.sh
... ... ...
The final report is at working/0001_rs.ica/filtered_func_data.ica/report/00index.html.
Group ICA
This is FSL's group ICA analysis
[osotolongo@detritus bids]$ rs_ica_group.pl mopead
Collecting needed files
Counting available subjects
Getting info from images
Checking images and excluding wrong subjects
Copying FSL files and setting directories
Making global .fsf file
Making individual .fsf files and scripts
sbatch /nas/data/mopead/working/0001_rs.ica/scripts/feat1.sh
sbatch --depend=afterok:118672 /nas/data/mopead/working/0001_rs.ica/scripts/feat2.sh
sbatch /nas/data/mopead/working/0002_rs.ica/scripts/feat1.sh
sbatch --depend=afterok:118674 /nas/data/mopead/working/0002_rs.ica/scripts/feat2.sh
sbatch /nas/data/mopead/working/0003_rs.ica/scripts/feat1.sh
sbatch --depend=afterok:118676 /nas/data/mopead/working/0003_rs.ica/scripts/feat2.sh
sbatch /nas/data/mopead/working/0004_rs.ica/scripts/feat1.sh
sbatch --depend=afterok:118678 /nas/data/mopead/working/0004_rs.ica/scripts/feat2.sh
sbatch /nas/data/mopead/working/0005_rs.ica/scripts/feat1.sh
sbatch --depend=afterok:118680 /nas/data/mopead/working/0005_rs.ica/scripts/feat2.sh
sbatch /nas/data/mopead/working/0006_rs.ica/scripts/feat1.sh
sbatch --depend=afterok:118682 /nas/data/mopead/working/0006_rs.ica/scripts/feat2.sh
... ... ...
Making global script
sbatch --depend=afterok:120163,120165,120167,120169,120171,120173,120175,120177,120179,120181,120183,120185,120187,120189,120191,120193,120195,120197,120199,120201,120203,120205,120207,120209,120211,120213,120215,120217,120219,120221,120223,120225,120227,120229,120231,120233,120235,120237,120239,120241,120243,120245,120247,120249,120251,120253,120255,120257,120259,120261,120263,120265,120267,120269,120271,120273,120275,120277,120279,120281,120283,120285,120287,120289,120291,120293,120295,120297,120299,120301,120303,120305,120307,120309,120311,120313,120315,120317,120319,120321,120323,120325,120327,120329,120331,120333,120335,120337,120339,120341,120343,120345,120347,120349,120351,120353,120355,120357,120359,120361,120363,120365,120367,120369,120371,120373,120375,120377,120379,120381,120383,120385,120387,120389,120391,120393,120395,120397,120399,120401,120403 /nas/data/mopead/working/rs.gica/scripts/feat4_ica.sh
Submitted batch job 120404
PET
Update DB and DCM2BIDS
This is also a manual step. You should link the subject ID (which must already exist) with the PET DICOM directory.
Next, you should edit the conversion.json file. First, run the helper,
[osotolongo@detritus bids]$ rm -rf tmp_dcm2bids/helper/*
[osotolongo@detritus bids]$ dcm2bids_helper -d /nas/clinic/facehbi_2/FACEHBI-F001F/
Example in: /nas/data/f2cehbi/bids/tmp_dcm2bids/helper
[osotolongo@detritus bids]$ ls tmp_dcm2bids/helper/
003_FACEHBI-F001F__20170126133952.json 005_FACEHBI-F001F__20170126133952.json 006_FACEHBI-F001F__20170126133952.json
003_FACEHBI-F001F__20170126133952.nii.gz 005_FACEHBI-F001F__20170126133952.nii.gz 006_FACEHBI-F001F__20170126133952.nii.gz
003_FACEHBI-F001F_FACEHBI_20170126133952.json 005_FACEHBI-F001F_FACEHBI_Florbetaben_20min_20170126133952.json 006_FACEHBI-F001F_FACEHBI_Florbetaben_4x5min_20170126133952.json
003_FACEHBI-F001F_FACEHBI_20170126133952.nii.gz 005_FACEHBI-F001F_FACEHBI_Florbetaben_20min_20170126133952.nii.gz 006_FACEHBI-F001F_FACEHBI_Florbetaben_4x5min_20170126133952.nii.gz
and then add the right info,
{
    "dataType": "pet",
    "modalityLabel": "fbb",
    "customLabels": "single",
    "criteria": {
        "SeriesDescription": "FACEHBI_Florbetaben_4x5min",
        "ImageType": ["ORIGINAL", "PRIMARY"]
    }
},
{
    "dataType": "pet",
    "modalityLabel": "fbb",
    "customLabels": "combined",
    "criteria": {
        "SeriesDescription": "FACEHBI_Florbetaben_20min",
        "ImageType": ["ORIGINAL", "PRIMARY"]
    }
}
Last, edit the project config file and add the PET DICOM path,
[osotolongo@detritus bids]$ cat ~/.config/neuro/f2cehbi.cfg
DATA = /nas/data/f2cehbi
SRC = /nas/corachan/facehbi_2
PET = /nas/clinic/facehbi_2
WORKING = /nas/data/f2cehbi/working
BIDS = /nas/data/f2cehbi/bids
After all this you are ready to import PET images,
[osotolongo@detritus bids]$ pet2bids.pl f2cehbi
Submitted batch job 117511
Submitted batch job 117512
Submitted batch job 117513
Submitted batch job 117514
Submitted batch job 117515
... ... ...
[osotolongo@detritus bids]$ squeue | head
JOBID PARTITION NAME USER ST TIME NODES NODELIST(REASON)
117706 fast dcm2bids osotolon PD 0:00 1 (Resources)
117707 fast dcm2bids osotolon PD 0:00 1 (Priority)
117708 fast dcm2bids osotolon PD 0:00 1 (Priority)
117709 fast dcm2bids osotolon PD 0:00 1 (Priority)
117710 fast dcm2bids osotolon PD 0:00 1 (Dependency)
117511 fast dcm2bids osotolon R 1:02 1 brick01
117512 fast dcm2bids osotolon R 1:02 1 brick01
117513 fast dcm2bids osotolon R 1:02 1 brick01
117514 fast dcm2bids osotolon R 1:02 1 brick01
and if everything is OK you will have the PET NIfTI files in the BIDS tree,
[osotolongo@detritus f2cehbi]$ tree bids/sub-0001/
bids/sub-0001/
├── anat
│   ├── sub-0001_T1w.json
│   ├── sub-0001_T1w.nii.gz
│   ├── sub-0001_T2w.json
│   └── sub-0001_T2w.nii.gz
├── dwi
│   ├── sub-0001_dwi.bval
│   ├── sub-0001_dwi.bvec
│   ├── sub-0001_dwi.json
│   └── sub-0001_dwi.nii.gz
├── func
│   ├── sub-0001_task-rest_bold.json
│   └── sub-0001_task-rest_bold.nii.gz
└── pet
    ├── sub-0001_combined_fbb.json
    ├── sub-0001_combined_fbb.nii.gz
    ├── sub-0001_single_fbb.json
    └── sub-0001_single_fbb.nii.gz

4 directories, 14 files
Registration
The PET image is registered to T1 native space with ANTs,
[osotolongo@detritus f2cehbi]$ fbb_correct.pl f2cehbi
Collecting needed files
Submitted batch job 117911
Submitted batch job 117912
Submitted batch job 117913
Submitted batch job 117914
Submitted batch job 117915
Submitted batch job 117916
... ... ...
[osotolongo@detritus f2cehbi]$ squeue
JOBID PARTITION NAME USER ST TIME NODES NODELIST(REASON)
118103 fast fbb_reg_ osotolon PD 0:00 1 (Resources)
118104 fast fbb_reg_ osotolon PD 0:00 1 (Priority)
118105 fast fbb_reg_ osotolon PD 0:00 1 (Priority)
118106 fast fbb_reg_ osotolon PD 0:00 1 (Dependency)
117911 fast fbb_reg_ osotolon R 1:03 1 brick01
117912 fast fbb_reg_ osotolon R 1:03 1 brick01
117913 fast fbb_reg_ osotolon R 1:03 1 brick01
117914 fast fbb_reg_ osotolon R 1:03 1 brick01
117915 fast fbb_reg_ osotolon R 1:03 1 brick01
... ... ...
QC
The fbb_correct.pl command also gives us a report, in the working/fbbs directory, to evaluate the quality of the PET registration.
SUVR and Centiloid
Following Rowe et al., we put everything into MNI space and apply a template to calculate SUVR and, from it, Centiloid values.
[osotolongo@detritus f2cehbi]$ fbb_cl_metrics.pl f2cehbi
Submitted batch job 119962
Submitted batch job 119963
Submitted batch job 119964
Submitted batch job 119965
Submitted batch job 119966
Submitted batch job 119967
Submitted batch job 119968
... ... ...
[osotolongo@detritus f2cehbi]$ head f2cehbi_fbb_cl.csv
Subject; SUVR; Centilod
0001;0.947448581259062;-9.56138763485993
0002;0.998527551470408;-1.72587360443941
0003;0.976769346725309;-5.06358221233762
0004;0.9257683185824;-12.8871399294598
0005;1.10081272707067;13.9646723326406
0006;1.33944078045911;50.5702157224281
0007;0.921519792224461;-13.5388638727676
0008;0.986394015362684;-3.58715804336427
0009;0.94463879573264;-9.99240873461304
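The Centiloid column is a linear rescaling of SUVR. Fitting the sample rows above gives roughly CL = 153.4 x SUVR - 154.9; note these coefficients are inferred here from the table, not taken from the script, so treat them as an approximation:

```shell
# sketch of the SUVR -> Centiloid linear mapping; the slope and intercept
# are fitted from the sample table above (assumption), not the script itself
cl=$(awk 'BEGIN { printf "%.2f", 153.4*0.947448581259062 - 154.9 }')
echo "$cl"   # → -9.56
```

The result matches the Centiloid reported for subject 0001 in the sample output.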