Commandline Usage
Commandline Arguments
The main DataHIPy command-line interface is invoked as follows.
DataHIPy command line interface.
usage: datahipy [-h]
[--command {dataset.create,dataset.get,dataset.create_tag,dataset.get_tags,dataset.checkout_tag,datasets.get,dataset.release_version,dataset.publish,dataset.clone,sub.get,sub.import,sub.edit.clinical,sub.delete,sub.delete.file,project.create,project.sub.import,project.doc.import,project.create_tag,project.get_tags,project.checkout_tag,project.release_version}]
[--input_data INPUT_DATA] [--output_file OUTPUT_FILE]
[--dataset_path DATASET_PATH] [--input_path INPUT_PATH]
[--git_user_name GIT_USER_NAME]
[--git_user_email GIT_USER_EMAIL] [-v]
Named Arguments
- --command
Possible choices: dataset.create, dataset.get, dataset.create_tag, dataset.get_tags, dataset.checkout_tag, datasets.get, dataset.release_version, dataset.publish, dataset.clone, sub.get, sub.import, sub.edit.clinical, sub.delete, sub.delete.file, project.create, project.sub.import, project.doc.import, project.create_tag, project.get_tags, project.checkout_tag, project.release_version
Method to be run.
- --input_data
Input JSON data
- --output_file
Path of the file in which the output data are written after processing
- --dataset_path
Path to the dataset
Default: "/output"
- --input_path
Path to the input data (e.g. input_data.json)
Default: "/input"
- --git_user_name
Git user name to use for Datalad ops
- --git_user_email
Git user email to use for Datalad ops
- -v, --version
show program’s version number and exit
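For example, a typical invocation specifies the method to run, the JSON file holding its parameters, and the file in which the result should be written. The paths below are purely illustrative:
# Run the dataset.get command with parameters read from a JSON file and
# write the resulting dataset summary to an output JSON file
# (file paths are illustrative and depend on your setup).
datahipy --command dataset.get \
         --input_data /input/dataset_get.json \
         --output_file /output/dataset_get_output.json
# Print the installed DataHIPy version and exit.
datahipy --version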
Commands
Here is a list of all the commands available through the --command argument of the datahipy command-line interface, together with an example of the input JSON file (--input_data argument) generated and used by the test/cli/test_run.py script to test each command.
Tip
The tests in test/cli/test_run.py can also serve as a good reference for understanding the usage of each command.
Dataset
dataset.create
Create a new Datalad-controlled BIDS dataset.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "dataset_dirname": "NEW_BIDS_DS", "DatasetDescJSON": { "Name": "My New BIDS dataset", "BIDSVersion": "1.4.0", "License": "n/a", "Authors": [ "Tom", "Jerry" ], "Acknowledgements": "Overwrite test", "HowToAcknowledge": "n/a", "Funding": "Picsou", "ReferencesAndLinks": "n/a", "DatasetDOI": "n/a" } }
dataset.create_tag
Create a version tag in a Datalad-controlled BIDS dataset.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "path": "/test/tmp/NEW_BIDS_DS", "type": "bids", "tag": "1.0.0", "changes_list": [ "Import sub-carole data" ] }
dataset.checkout_tag
Checkout a Datalad-controlled BIDS dataset at a specific tag, the master branch, or the HEAD.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "path": "/test/tmp/NEW_BIDS_DS", "tag": "0.0.0" }
dataset.release_version
Make a patch (1.0.0 → 1.0.1), minor (1.0.0 → 1.1.0), or major (1.1.0 → 2.0.0) version release of a Datalad-controlled BIDS dataset.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "path": "/test/tmp/NEW_BIDS_DS", "type": "bids", "level": "patch", "changes_list": [ "Delete sub-carole data" ] }
Example of content of output JSON data:
{ "Name": "My New BIDS dataset", "BIDSVersion": "1.4.0", "License": "n/a", "Authors": [ "Tom", "Jerry" ], "Acknowledgements": "Overwrite test", "HowToAcknowledge": "n/a", "Funding": [ "Picsou" ], "ReferencesAndLinks": [ "n/a" ], "DatasetDOI": "n/a", "AgeMin": "nan", "AgeMax": "nan", "ParticipantsCount": 0, "ParticipantsGroups": [ null ], "Participants": [], "Size": "109M", "BIDSSchemaVersion": "v1.6.0", "BIDSErrors": [ { "key": "SUBJECT_FOLDERS", "severity": "error", "reason": "There are no subject folders (labeled \"sub-*\") in the root of this dataset.", "files": [ { "key": "SUBJECT_FOLDERS", "code": 45, "file": null, "evidence": null, "line": null, "character": null, "severity": "error", "reason": "There are no subject folders (labeled \"sub-*\") in the root of this dataset.", "helpUrl": "https://neurostars.org/search?q=SUBJECT_FOLDERS" } ], "additionalFileCount": 0, "helpUrl": "https://neurostars.org/search?q=SUBJECT_FOLDERS", "code": 45 } ], "BIDSWarnings": [], "BIDSIgnored": [ { "key": "NO_T1W", "severity": "ignore", "reason": "Dataset does not contain any T1w scans.", "files": [ { "key": "NO_T1W", "code": 53, "file": null, "evidence": null, "line": null, "character": null, "severity": "ignore", "reason": "Dataset does not contain any T1w scans.", "helpUrl": "https://neurostars.org/search?q=NO_T1W" } ], "additionalFileCount": 0, "helpUrl": "https://neurostars.org/search?q=NO_T1W", "code": 53 } ], "BIDSValid": false, "DataTypes": [], "Formats": [ ".json", ".tsv" ], "SessionsCount": 0, "Tasks": [], "RunsCount": 0, "EventsFileCount": 0, "FileCount": 4, "DatasetVersion": "1.0.1" }
dataset.get
Get a JSON summary of the dataset consisting of all fields, participants, and existing entities.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "owner": "hipadmin", "path": "/test/tmp/NEW_BIDS_DS" }
Example of content of output JSON data:
{ "Name": "My New BIDS dataset", "BIDSVersion": "1.4.0", "License": "n/a", "Authors": [ "Tom" ], "Acknowledgements": "Overwrite test", "HowToAcknowledge": "n/a", "Funding": [ "Picsou" ], "ReferencesAndLinks": [ "n/a" ], "DatasetDOI": "n/a", "AgeMin": "25", "AgeMax": "25", "ParticipantsCount": 1, "ParticipantsGroups": [ null ], "Participants": [ { "participant_id": "sub-carole", "age": 25, "sex": "M", "hospital": "CHUV", "handedness": "n/a", "Subject_ready": true } ], "Size": "281M", "BIDSSchemaVersion": "v1.6.0", "BIDSErrors": [], "BIDSWarnings": [], "BIDSIgnored": [], "BIDSValid": true, "DataTypes": [ "anat", "ct", "ieeg" ], "Formats": [ ".json", ".vmrk", ".eeg", ".vhdr", ".bidsignore", ".tsv", ".nii" ], "SessionsCount": 2, "Tasks": [ "stimulation" ], "RunsCount": 2, "SEEGChannelCount": 128, "SamplingFrequency": 512, "RecordingDuration": 255, "EventsFileCount": 2, "FileCount": 19 }
datasets.get
Get a list of JSON summaries of all datasets.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "owner": "hipadmin", "datasets": [ { "path": "/test/tmp/NEW_BIDS_DS" }, { "path": "/test/tmp/NEW_BIDS_DS" } ] }
Example of content of output JSON data:
[ { "Name": "My New BIDS dataset", "BIDSVersion": "1.4.0", "License": "n/a", "Authors": [ "Tom" ], "Acknowledgements": "Overwrite test", "HowToAcknowledge": "n/a", "Funding": [ "Picsou" ], "ReferencesAndLinks": [ "n/a" ], "DatasetDOI": "n/a", "AgeMin": "25", "AgeMax": "25", "ParticipantsCount": 1, "ParticipantsGroups": [ null ], "Participants": [ { "participant_id": "sub-carole", "age": 25, "sex": "M", "hospital": "CHUV", "handedness": "n/a", "Subject_ready": true } ], "Size": "281M", "BIDSSchemaVersion": "v1.6.0", "BIDSErrors": [], "BIDSWarnings": [], "BIDSIgnored": [], "BIDSValid": true, "DataTypes": [ "anat", "ct", "ieeg" ], "Formats": [ ".json", ".vmrk", ".eeg", ".vhdr", ".bidsignore", ".tsv", ".nii" ], "SessionsCount": 2, "Tasks": [ "stimulation" ], "RunsCount": 2, "SEEGChannelCount": 128, "SamplingFrequency": 512, "RecordingDuration": 255, "EventsFileCount": 2, "FileCount": 19 }, { "Name": "My New BIDS dataset", "BIDSVersion": "1.4.0", "License": "n/a", "Authors": [ "Tom", ], "Acknowledgements": "Overwrite test", "HowToAcknowledge": "n/a", "Funding": [ "Picsou" ], "ReferencesAndLinks": [ "n/a" ], "DatasetDOI": "n/a", "AgeMin": "25", "AgeMax": "25", "ParticipantsCount": 1, "ParticipantsGroups": [ null ], "Participants": [ { "participant_id": "sub-carole", "age": 25, "sex": "M", "hospital": "CHUV", "handedness": "n/a", "Subject_ready": true } ], "Size": "281M", "BIDSSchemaVersion": "v1.6.0", "BIDSErrors": [], "BIDSWarnings": [], "BIDSIgnored": [], "BIDSValid": true, "DataTypes": [ "anat", "ct", "ieeg" ], "Formats": [ ".json", ".vmrk", ".eeg", ".vhdr", ".bidsignore", ".tsv", ".nii" ], "SessionsCount": 2, "Tasks": [ "stimulation" ], "RunsCount": 2, "SEEGChannelCount": 128, "SamplingFrequency": 512, "RecordingDuration": 255, "EventsFileCount": 2, "FileCount": 19 } ]
dataset.publish
Publish a Datalad-controlled BIDS dataset to the HIP public space.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "sourceDatasetPath": "/test/tmp/NEW_BIDS_DS", "targetDatasetPath": "/test/tmp/public/PUBLIC_NEW_BIDS_DS" }
Example of content of output JSON data:
{ "Name": "My New BIDS dataset", "BIDSVersion": "1.4.0", "License": "n/a", "Authors": [ "Tom", "Jerry" ], "Acknowledgements": "Overwrite test", "HowToAcknowledge": "n/a", "Funding": [ "Picsou" ], "ReferencesAndLinks": [ "n/a" ], "DatasetDOI": "n/a", "AgeMin": "25", "AgeMax": "25", "ParticipantsCount": 1, "ParticipantsGroups": [ null ], "Participants": [ { "participant_id": "sub-carole", "Subject_ready": true, "age": 25, "sex": "M", "hospital": "CHUV" } ], "Size": "109M", "BIDSSchemaVersion": "v1.6.0", "BIDSErrors": [], "BIDSWarnings": [], "BIDSIgnored": [], "BIDSValid": true, "DataTypes": [ "anat", "ct", "ieeg" ], "Formats": [ ".json", ".tsv", ".nii", ".eeg", ".vhdr", ".vmrk" ], "SessionsCount": 2, "Tasks": [ "stimulation" ], "RunsCount": 2, "SEEGChannelCount": 128, "SamplingFrequency": 512, "RecordingDuration": 255, "EventsFileCount": 2, "FileCount": 20, "DatasetVersion": "0.0.0" }
dataset.clone
Clone a Datalad-controlled BIDS dataset from the HIP public space to the private space of the user.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "sourceDatasetPath": "/test/tmp/public/PUBLIC_NEW_BIDS_DS", "targetDatasetPath": "/test/tmp/PUBLIC_NEW_BIDS_DS" }
Example of content of output JSON data:
{ "Name": "My New BIDS dataset", "BIDSVersion": "1.4.0", "License": "n/a", "Authors": [ "Tom", "Jerry" ], "Acknowledgements": "Overwrite test", "HowToAcknowledge": "n/a", "Funding": [ "Picsou" ], "ReferencesAndLinks": [ "n/a" ], "DatasetDOI": "n/a", "AgeMin": "25", "AgeMax": "25", "ParticipantsCount": 1, "ParticipantsGroups": [ null ], "Participants": [ { "participant_id": "sub-carole", "Subject_ready": true, "age": 25, "sex": "M", "hospital": "CHUV" } ], "Size": "110M", "BIDSSchemaVersion": "v1.6.0", "BIDSErrors": [], "BIDSWarnings": [], "BIDSIgnored": [], "BIDSValid": true, "DataTypes": [ "anat", "ct", "ieeg" ], "Formats": [ ".json", ".tsv", ".nii", ".eeg", ".vhdr", ".vmrk" ], "SessionsCount": 2, "Tasks": [ "stimulation" ], "RunsCount": 2, "SEEGChannelCount": 128, "SamplingFrequency": 512, "RecordingDuration": 255, "EventsFileCount": 2, "FileCount": 20, "DatasetVersion": "0.0.0" }
Participant
sub.import
Import and update files for a given participant into an existing BIDS dataset. An appropriate record is added to or updated in the participants.tsv tabular file if needed.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "subjects": [ { "sub": "carole", "age": "25", "sex": "M", "hospital": "CHUV" } ], "files": [ { "modality": "ieeg", "subject": "carole", "path": "/test/test_data/sub-carole/SZ1.TRC", "entities": { "sub": "carole", "ses": "postimp", "task": "stimulation", "acq": "1024hz" } }, { "modality": "ieeg", "subject": "carole", "path": "/test/test_data/sub-carole/SZ2.TRC", "entities": { "sub": "carole", "ses": "postimp", "task": "stimulation", "acq": "1024hz" } }, { "modality": "T1w", "subject": "carole", "path": "/test/test_data/sub-carole/3DT1post_deface.nii", "entities": { "sub": "carole", "ses": "postimp", "acq": "lowres", "ce": "gadolinium" } }, { "modality": "T1w", "subject": "carole", "path": "/test/test_data/sub-carole/3DT1post_deface_2.nii", "entities": { "sub": "carole", "ses": "postimp", "acq": "lowres", "ce": "gadolinium" } }, { "modality": "T1w", "subject": "carole", "path": "/test/test_data/sub-carole/3DT1pre_deface.nii", "entities": { "sub": "carole", "ses": "preimp", "acq": "lowres" } }, { "modality": "ct", "subject": "carole", "path": "/test/test_data/sub-carole/3DCTpost_deface.nii", "entities": { "sub": "carole", "ses": "postimp", "acq": "electrodes" } } ] }
sub.get
Get information about data available for a given participant of a dataset.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "owner": "hipadmin", "path": "/test/tmp/NEW_BIDS_DS", "sub": "carole" }
Example of content of output JSON data:
sub.edit.clinical
Edit the participant’s information stored in the participants.tsv
tabular file.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "subject": "carole", "clinical": { "age": "30", "sex": "M", "handedness": "L", "hospital": "CHUGA" } }
sub.delete
Remove a participant from a given BIDS dataset. The record will be deleted from the participants.tsv
tabular file.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "subject": "carole" }
sub.delete.file
Remove data file(s) from a BIDS dataset.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "files": [ { "subject": "carole", "modality": "Anat", "fullpath": "sub-carole/ses-postimp/anat/sub-carole_ses-postimp_acq-lowres_ce-gadolinium_run-02_T1w.nii" } ] }
Project
project.create
Create a new Datalad-controlled project dataset in the collaborative space of the HIP.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "path": "/test/tmp/NEW_PROJECT", "title": "New Project Title", "description": "Project Description that would be put in the README.md file", "datasetDescription": { "Name": "BIDS Dataset Title", "BIDSVersion": "1.6.0", "License": "CC-BY-4.0", "Authors": [ "Author 1", "Author 2" ], "Acknowledgements": "Acknowledgement 1", "Funding": [ "Funding 1" ], "ReferencesAndLinks": [ "Reference 1", "Reference 2" ], "DatasetDOI": "" } }
project.sub.import
Import an existing sub-<participant_label> folder from a BIDS dataset of the center space of the HIP to the BIDS dataset of the project (located in <project_directory>/inputs/bids-dataset).
Example of content of input JSON data for the --input_data
argument when using this command:
{ "sourceDatasetPath": "/test/tmp/NEW_BIDS_DS", "participantId": "sub-carole", "targetDatasetPath": "/test/tmp/NEW_PROJECT/inputs/bids-dataset" }
Example of content of output JSON data:
{ "Name": "BIDS Dataset Title", "BIDSVersion": "1.6.0", "License": "CC-BY-4.0", "Authors": [ "Author 1", "Author 2" ], "Acknowledgements": "Acknowledgement 1", "Funding": [ "Funding 1" ], "ReferencesAndLinks": [ "Reference 1", "Reference 2" ], "DatasetDOI": "", "AgeMin": "30", "AgeMax": "30", "ParticipantsCount": 1, "ParticipantsGroups": [ "n/a" ], "Participants": [ { "participant_id": "sub-carole", "age": 30, "sex": "M", "group": "n/a", "hospital": "CHUGA", "Subject_ready": true, "handedness": "L" } ], "Size": "172M", "BIDSSchemaVersion": "v1.6.0", "BIDSErrors": [], "BIDSWarnings": [], "BIDSIgnored": [], "BIDSValid": true, "DataTypes": [ "anat", "ct", "ieeg" ], "Formats": [ ".json", ".tsv", ".nii", ".eeg", ".vhdr", ".vmrk" ], "SessionsCount": 2, "Tasks": [ "stimulation" ], "RunsCount": 2, "SEEGChannelCount": 128, "SamplingFrequency": 512, "RecordingDuration": 255, "EventsFileCount": 2, "FileCount": 22 }
project.doc.import
Import an existing document from the center space of the HIP to the documents/ folder of the project.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "sourceDocumentAbsPath": "/test/tmp/NEW_BIDS_DS/participants.tsv", "targetProjectAbsPath": "/test/tmp/NEW_PROJECT", "targetDocumentRelPath": "documents/other/participants.tsv" }
project.create_tag
Create a version tag in a Datalad-controlled project dataset.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "path": "/test/tmp/NEW_PROJECT", "type": "project", "tag": "1.0.0", "changes_list": [ "Import sub-carole data from existing dataset" ] }
project.checkout_tag
Checkout a Datalad-controlled project dataset at a specific tag, the master branch, or the HEAD.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "path": "/test/tmp/NEW_PROJECT", "tag": "0.0.0" }
project.release_version
Make a patch (1.0.0 → 1.0.1), minor (1.0.0 → 1.1.0), or major (1.1.0 → 2.0.0) version release of a Datalad-controlled project dataset and its nested BIDS dataset.
Example of content of input JSON data for the --input_data
argument when using this command:
{ "path": "/test/tmp/NEW_PROJECT", "type": "project", "level": "patch", "changes_list": [ "Delete sub-carole data" ] }
Example of content of output JSON data:
{ "Name": "BIDS Dataset Title", "BIDSVersion": "1.6.0", "License": "CC-BY-4.0", "Authors": [ "Author 1", "Author 2" ], "Acknowledgements": "Acknowledgement 1", "Funding": [ "Funding 1" ], "ReferencesAndLinks": [ "Reference 1", "Reference 2" ], "DatasetDOI": "", "AgeMin": "nan", "AgeMax": "nan", "ParticipantsCount": 0, "ParticipantsGroups": [], "Participants": [], "Size": "76M", "BIDSSchemaVersion": "v1.6.0", "BIDSErrors": [ { "key": "QUICK_VALIDATION_FAILED", "severity": "error", "reason": "Quick validation failed - the general folder structure does not resemble a BIDS dataset. Have you chosen the right folder (with \"sub-*/\" subfolders)? Check for structural/naming issues and presence of at least one subject.", "files": [ { "key": "QUICK_VALIDATION_FAILED", "code": 61, "file": { "name": "bids-dataset", "path": "bids-dataset", "relativePath": "bids-dataset" }, "evidence": null, "line": null, "character": null, "severity": "error", "reason": "Quick validation failed - the general folder structure does not resemble a BIDS dataset. Have you chosen the right folder (with \"sub-*/\" subfolders)? Check for structural/naming issues and presence of at least one subject.", "helpUrl": "https://neurostars.org/search?q=QUICK_VALIDATION_FAILED" } ], "additionalFileCount": 0, "helpUrl": "https://neurostars.org/search?q=QUICK_VALIDATION_FAILED", "code": 61 } ], "BIDSWarnings": [], "BIDSIgnored": [], "BIDSValid": false, "DataTypes": [], "Formats": [ ".json", ".tsv" ], "SessionsCount": 0, "Tasks": [], "RunsCount": 0, "EventsFileCount": 0, "FileCount": 4, "DatasetVersion": "1.0.1" }
Running DataHIPy in Docker
Please have a look at the dedicated REST API service of the HIP Gateway, which creates and executes the different DataHIPy commands in Docker. Its source code is available at the following URL: