ICL-Gait: A Real-World Multi-View Pathological Gait Dataset

Dataset | Supplementary Code


This dataset was collected for Cross-Subject, Cross-View, Sim2Real, and Cross-Modality gait pose estimation, as well as abnormal gait pattern recognition.

We provide data in multiple modalities, including Depth, Point Cloud, Kinematics, Segmentation Map, and 2D Keypoints. Access to the raw RGB data has been temporarily suspended due to ethics considerations.


Real-World Dataset

  • 5 Gait Conditions + 5 Viewpoints + 8 Subjects

  • RGB and Depth images were collected with a RealSense D435.
  • 3D skeletons and kinematics were collected with a Vicon system and the CGM2 model.
  • Segmentation Maps were generated by CDCL.
  • 2D Keypoints were generated by OpenPose.
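The Depth and Point Cloud modalities are linked by the pinhole camera model: each depth frame can be back-projected into a point cloud given the camera intrinsics. A minimal sketch of that back-projection (the intrinsic values and depth scale below are illustrative placeholders, not the D435 calibration shipped with the dataset):

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy, depth_scale=0.001):
    """Back-project a depth image (H, W) in raw sensor units to an (N, 3)
    point cloud in metres, dropping invalid (zero-depth) pixels."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth.astype(np.float64) * depth_scale       # raw units -> metres
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]

# Tiny synthetic 2x2 depth map: one pixel is invalid (zero depth).
depth = np.array([[1000, 0], [2000, 1500]], dtype=np.uint16)
pts = depth_to_point_cloud(depth, fx=600.0, fy=600.0, cx=0.5, cy=0.5)
```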

Synthetic Data

  • Synthetic data generated with the SMPL model.
  • Joint parameters derived from the real-world dataset.
  • Varied body shapes, viewpoints, and occlusions.
  • Occlusion rendered with Open3D.
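The core idea behind rendering occlusion from a given viewpoint is that, among points projecting to the same pixel, only the one nearest the camera is visible. In practice this can be done with Open3D's hidden-point-removal utilities; a self-contained NumPy sketch of the per-pixel z-buffer idea (the projection parameters and grid resolution here are illustrative assumptions, not the dataset's rendering settings):

```python
import numpy as np

def visible_mask(points, f=500.0, cx=64.0, cy=64.0, size=128):
    """Return a boolean mask of points that survive z-buffer occlusion culling.

    points: (N, 3) array viewed from the origin along +z (all z > 0).
    """
    u = np.round(points[:, 0] / points[:, 2] * f + cx).astype(int)
    v = np.round(points[:, 1] / points[:, 2] * f + cy).astype(int)
    in_view = (u >= 0) & (u < size) & (v >= 0) & (v < size)

    # First pass: record the nearest depth landing in each pixel.
    zbuf = np.full((size, size), np.inf)
    for i in np.flatnonzero(in_view):
        zbuf[v[i], u[i]] = min(zbuf[v[i], u[i]], points[i, 2])

    # Second pass: a point is visible iff it matches the nearest depth.
    mask = np.zeros(len(points), dtype=bool)
    for i in np.flatnonzero(in_view):
        mask[i] = points[i, 2] <= zbuf[v[i], u[i]]
    return mask

# Two points on the same ray: only the nearer one remains visible.
pts = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 2.0]])
mask = visible_mask(pts)
```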

Dataset

An example for visualization can be downloaded here. Please sign the form, and you will then be given further instructions for downloading the dataset.

Supplementary Code

We provide scripts for visualizing data from the different modalities, generating synthetic data, and reproducing the benchmark experiment settings.


Please cite the following work if you use this dataset in your research.

  • Xiao Gu, Jianxin Yang, Hanxiao Zhang, Jianing Qiu, Frank Lo, Yao Guo, Guang-Zhong Yang, Benny Lo, "Occlusion-Invariant Rotation-Equivariant Semi-Supervised Depth Based Cross-View Gait Pose Estimation", 2021. BibTeX

  • Please also consider citing the following if you find this work helpful.

  • Xiao Gu, Yao Guo, Guang-Zhong Yang, Benny Lo, "Cross-Domain Self-Supervised Complete Geometric Representation Learning for Real-Scanned Point Cloud Based Pathological Gait Analysis", IEEE JBHI, 2021. BibTeX
  • Xiao Gu, Yao Guo, Fani Deligianni, Benny Lo, Guang-Zhong Yang, "Cross-Subject Cross-Modal Transfer for Generalized Abnormal Gait Recognition", IEEE TNNLS, 2020. BibTeX

Acknowledgement

This project was approved by the College Ethics Committee of Imperial College London under reference No. 18IC4915. We would like to express our gratitude to the subjects who participated in our study.

For any questions regarding this dataset, please contact Xiao Gu (xiao.gu17@imperial.ac.uk).