<?xml version='1.0' encoding='utf-8'?>
<eprints xmlns='http://eprints.org/ep2/data/2.0'>
  <eprint id='https://opendata.ljmu.ac.uk/id/eprint/133'>
    <eprintid>133</eprintid>
    <rev_number>12</rev_number>
    <documents>
      <document id='https://opendata.ljmu.ac.uk/id/document/923'>
        <docid>923</docid>
        <rev_number>7</rev_number>
        <files>
          <file id='https://opendata.ljmu.ac.uk/id/file/2488'>
            <fileid>2488</fileid>
            <datasetid>document</datasetid>
            <objectid>923</objectid>
            <filename>ReadMe.txt</filename>
            <mime_type>text/plain</mime_type>
            <hash>fcccbe8793457626daf6f96e9f945912</hash>
            <hash_type>MD5</hash_type>
            <filesize>949</filesize>
            <mtime>2022-10-27 10:27:58</mtime>
            <url>https://opendata.ljmu.ac.uk/id/eprint/133/1/ReadMe.txt</url>
          </file>
        </files>
        <eprintid>133</eprintid>
        <pos>1</pos>
        <placement>1</placement>
        <mime_type>text/plain</mime_type>
        <format>text/plain</format>
        <language>en</language>
        <security>public</security>
        <license>cc_by_4</license>
        <main>ReadMe.txt</main>
        <content>readme</content>
        <resourcetype>Text</resourcetype>
      </document>
      <document id='https://opendata.ljmu.ac.uk/id/document/924'>
        <docid>924</docid>
        <rev_number>1</rev_number>
        <files>
          <file id='https://opendata.ljmu.ac.uk/id/file/2490'>
            <fileid>2490</fileid>
            <datasetid>document</datasetid>
            <objectid>924</objectid>
            <filename>indexcodes.txt</filename>
            <mime_type>text/plain</mime_type>
            <hash>8a8224239cd162e8712b678fec6ba3cf</hash>
            <hash_type>MD5</hash_type>
            <filesize>479</filesize>
            <mtime>2022-10-27 10:28:06</mtime>
            <url>https://opendata.ljmu.ac.uk/id/eprint/133/2/indexcodes.txt</url>
          </file>
        </files>
        <eprintid>133</eprintid>
        <pos>2</pos>
        <placement>2</placement>
        <mime_type>text/plain</mime_type>
        <formatdesc>Generate index codes conversion from text/plain to indexcodes</formatdesc>
        <language>en</language>
        <security>public</security>
        <main>indexcodes.txt</main>
        <relation>
          <item>
            <type>http://eprints.org/relation/isVersionOf</type>
            <uri>https://opendata.ljmu.ac.uk/id/document/923</uri>
          </item>
          <item>
            <type>http://eprints.org/relation/isVolatileVersionOf</type>
            <uri>https://opendata.ljmu.ac.uk/id/document/923</uri>
          </item>
          <item>
            <type>http://eprints.org/relation/isIndexCodesVersionOf</type>
            <uri>https://opendata.ljmu.ac.uk/id/document/923</uri>
          </item>
        </relation>
      </document>
      <document id='https://opendata.ljmu.ac.uk/id/document/925'>
        <docid>925</docid>
        <rev_number>5</rev_number>
        <files>
          <file id='https://opendata.ljmu.ac.uk/id/file/2492'>
            <fileid>2492</fileid>
            <datasetid>document</datasetid>
            <objectid>925</objectid>
            <filename>Gait Dataset.zip</filename>
            <mime_type>application/zip</mime_type>
            <hash>a5ddb14e9aad66787dd3c6c63776d413</hash>
            <hash_type>MD5</hash_type>
            <filesize>18700445600</filesize>
            <mtime>2022-10-28 12:52:11</mtime>
            <url>https://opendata.ljmu.ac.uk/id/eprint/133/3/Gait%20Dataset.zip</url>
          </file>
        </files>
        <eprintid>133</eprintid>
        <pos>3</pos>
        <placement>3</placement>
        <mime_type>application/zip</mime_type>
        <format>archive</format>
        <language>en</language>
        <security>public</security>
        <license>cc_by_4</license>
        <main>Gait Dataset.zip</main>
        <content>data</content>
        <resourcetype>Dataset</resourcetype>
      </document>
    </documents>
    <eprint_status>archive</eprint_status>
    <userid>7</userid>
    <dir>disk0/00/00/01/33</dir>
    <datestamp>2022-10-28 14:15:16</datestamp>
    <lastmod>2024-03-12 17:36:10</lastmod>
    <status_changed>2022-10-28 14:15:16</status_changed>
    <type>data_collection</type>
    <metadata_visibility>show</metadata_visibility>
    <sword_depositor>7</sword_depositor>
    <creators>
      <item>
        <name>
          <family>Topham</family>
          <given>Luke</given>
        </name>
        <orcid>0000-0002-6689-7944</orcid>
      </item>
      <item>
        <name>
          <family>Khan</family>
          <given>Wasiq</given>
        </name>
      </item>
    </creators>
    <title>360 Degree Gait capture: A diverse and multi-modal gait dataset of indoor and outdoor walks acquired using multiple video cameras and sensors</title>
    <ispublished>datapub</ispublished>
    <divisions>
      <item>ict</item>
    </divisions>
    <full_text_status>public</full_text_status>
    <keywords>Gait; Person Identification; Machine Learning</keywords>
    <note>Please cite the following paper:  Topham, L.K., Khan, W., Al-Jumeily, D., Waraich, A. and Hussain, A.J., 2023. A diverse and multi-modal gait dataset of indoor and outdoor walks acquired using multiple cameras and sensors. Scientific Data, 10(1), p.320. https://doi.org/10.1038/s41597-023-02161-8</note>
    <abstract>Many existing gait datasets are limited by their lack of diversity in terms of participants (e.g., gender, age, height, weight, ethnicity), recording environments (e.g., recording angles, indoors/outdoors), and availability. We therefore present a gait dataset containing 65 diverse participants recorded in both indoor and outdoor environments. The data were acquired using 2 digital cameras and a digital goniometer (used to measure joint angles). Each participant provided 24 walking sequences covering a full range of viewing angles (360 degrees in 45-degree increments). Each participant also provided an alternative outfit to add diversity in personal appearance.

This dataset will be of value to applications such as gait identification, human pose estimation, and more.</abstract>
    <date>2022-10-27</date>
    <date_type>published</date_type>
    <publisher>Liverpool John Moores University</publisher>
    <id_number>10.24377/LJMU.d.00000133</id_number>
    <copyright_holders>
      <item>Luke Topham</item>
    </copyright_holders>
    <field_of_research>
      <item>46</item>
    </field_of_research>
    <collection_method>65 people were recorded for this dataset using 2 digital cameras and a digital goniometer.</collection_method>
    <language>English</language>
    <metadata_language>English</metadata_language>
    <legal_ethical>UREC Ref: 21/CMP/004

Participants' identities, such as names, have been removed; instead, data relating to each person is labelled with a participant ID number.</legal_ethical>
    <collection_date>
      <date_from>2021-10-01</date_from>
      <date_to>2022-10-27</date_to>
    </collection_date>
    <related_resources>
      <item>
        <url>https://doi.org/10.1038/s41597-023-02161-8</url>
        <type>pap</type>
      </item>
    </related_resources>
    <ukri_date_sub>2022-10-27</ukri_date_sub>
  </eprint>
</eprints>
