Distribution of Action Movements Descriptor

On this site you'll find:

We also provide detailed information about the action instances we corrected or left out for each of these datasets.

For more information, please contact fronchetti (at) lidi.unlp.edu.ar

How to cite:

(soon)


Since our model needs to be tested on different datasets, and each has its own format, we provide scripts to convert each dataset from its original format to our own unified format.

In all cases, the result is a single .mat MATLAB/Octave file containing all the action information. The file can be opened in MATLAB/Octave with the load function.
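
For example, a minimal sketch of loading the converted file in MATLAB/Octave; the file name below is only a placeholder, use whatever name the conversion script produced:

    % Load the unified dataset into a struct (file name is a placeholder).
    dataset = load('unified_dataset.mat');
    % List the variables stored in the file.
    disp(fieldnames(dataset));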

The .mat file contains:

Download the script to convert the MSRC12, MSR Action3D, and Berkeley MHAD datasets to our own format.

Additionally, the script animate_dataset.m can be used to view the gestures of the MSRC12 and Action3D datasets.
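
The exact call signature of animate_dataset.m is not documented here; as an assumption only, a typical invocation might pass the path of the converted .mat file:

    % Hypothetical usage: the actual arguments of animate_dataset may differ;
    % check the script's help text (help animate_dataset) before running it.
    animate_dataset('msrc12_unified.mat');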

The MSRC12 is an action dataset with 12 gesture classes recorded by 30 subjects. More information about the dataset, along with a download link for a zip file with the action data, can be found on its homepage. The zip contains several files, one for each action sequence. To convert the dataset to our format:

  1. Download the dataset, and extract the data directory somewhere.

  2. Download and extract the MATLAB code.

  3. Download and extract the data folder of Hussein et al.'s annotation, which contains the start:end metadata used to segment the action sequences.

  4. Use the sample script generate_dataset_from_msrc12.m to call the function msrc12_load_files with the path to the data directory of the MSRC12 dataset (the one with the .csv and .tagstream files) and the path to the folder containing Hussein et al.'s extracted annotation (the one named sepinst, with the .sep files) to generate the .mat file in our unified format, as sketched below.
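
As an illustration only, a minimal sketch of such a call; the argument order and the way the result is saved are assumptions, so check generate_dataset_from_msrc12.m for the exact usage:

    % Paths below are placeholders; point them at your local copies.
    msrc12_data_dir = 'path/to/MSRC12/data';   % .csv and .tagstream files
    annotation_dir  = 'path/to/sepinst';       % Hussein et al.'s .sep files
    % Assumed call: msrc12_load_files may instead save the .mat file itself.
    dataset = msrc12_load_files(msrc12_data_dir, annotation_dir);
    save('msrc12_unified.mat', 'dataset');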

We also provide a list of the action instances we excluded from MSRC12.

NOTE: We do not take any credit whatsoever for the creation of this dataset. We only provide a script to read the data and convert it to a more appropriate format. If you use it and want to cite its authors, please visit the dataset's homepage for more information on how to cite.

The MSR Action3D is an action dataset with 20 gesture classes recorded by 10 subjects. More information about this dataset (and others!) can be found on its homepage, along with a download link for a zip file with the action data. Be careful: the website contains information about many datasets; we used Ferda Ofli's version of the dataset, and that's the one you should download. The zip contains several files, one for each action sequence. To convert the dataset to our format:

  1. Download the dataset and extract it to any directory.

  2. Download and extract Jiang's Experiment File List, a commonly employed whitelist.

  3. Download and extract the MATLAB code.

  4. Use the sample script generate_dataset_from_action3d.m to call the function action3d_read_and_convert with the path to the directory of the original dataset and the path to Jiang's Experiment File List, to generate the .mat file in our unified format, as sketched below.
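
Again as an illustration only, a minimal sketch under the assumption that the function takes the two paths in this order and returns the data; see generate_dataset_from_action3d.m for the exact usage:

    % Paths below are placeholders; point them at your local copies.
    action3d_dir   = 'path/to/MSRAction3D';           % original skeleton files
    file_list_path = 'path/to/experiment_file_list';  % Jiang's Experiment File List
    % Assumed call and return value; the real script may save the .mat itself.
    dataset = action3d_read_and_convert(action3d_dir, file_list_path);
    save('action3d_unified.mat', 'dataset');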

Note that this process also reorders the joints of the actions so that they match the order in the MSRC12 dataset (the default order for Kinect devices).

We also provide a list of the action instances we excluded and corrected in MSR Action3D.

NOTE: We do not take any credit whatsoever for the creation of this dataset. We only provide a script to read the data and convert it to a more appropriate format. If you use it and want to cite its authors, please visit the dataset's homepage for more information on how to cite.

(Soon)