No.

Dataset

Description

1

A Matlab (MathWorks, USA) user interface for the FES module developed by TUM. The interface allows intuitive, graphical control of the module. The program was developed as part of WP5 activities.

2

A Unity game for grip force training using GripAble (https://gripable.co/). The game was developed to collect grip force data during FES applications to stroke patients. The program was developed by ICL as part of WP5 activities.

3

The repository contains C++ code for spasticity assessment coaching using IMU and EMG measurements in the Unity environment. The program was developed as part of WP5 activities.

4

The repository contains C++ code, logs and other relevant materials for building and controlling the LEGO arm exoskeleton designed and developed by DTU. The system consists of a low-cost 2-DoF upper-limb exoskeleton built from LEGO and the accompanying control software. Both were developed as part of WP7 activities.

5

For the development of novel human-exoskeleton control and modelling techniques, a high-fidelity simulation environment is made publicly available. In particular, a 4-DoF upper-limb exoskeleton interacting with a musculoskeletal model of the human upper limb is simulated in the MuJoCo physics engine, with a particular focus on faithful contact-based interaction.
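As an illustration only (not the released software itself), the following minimal Python sketch shows how such a MuJoCo model could be loaded and stepped with the official mujoco bindings; the model file name exo_arm.xml and the joint layout are placeholder assumptions.

```python
# Minimal sketch, not the project's actual code: load a hypothetical MuJoCo model
# of a 4-DoF exoskeleton coupled to a musculoskeletal arm and step the physics.
import mujoco

model = mujoco.MjModel.from_xml_path("exo_arm.xml")  # placeholder file name
data = mujoco.MjData(model)

for _ in range(1000):                # advance 1000 simulation steps
    data.ctrl[:] = 0.0               # placeholder actuator commands (zero input)
    mujoco.mj_step(model, data)      # contact-based forward dynamics step

print(data.qpos)                     # resulting generalized coordinates (joint angles)
```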

6

These are sEMG recordings from three muscle-fatigue experiments. Data were recorded using a PLUX Biosignals EMG sensor.

7

All data will be anonymised during the research, and consent will be sought as part of the ethics approval process if it is deemed necessary to share the data with another partner or to publish it. Nothing will occur without prior approval by the participant. A minimal amount of sensitive personal data will be collected to describe demographically the population on which the technology was tested. Where necessary, videos and photos will be blurred to anonymise them. Participants will also be informed about this during the informed consent process.

8

The dataset will include a collection of motion and force profiles of humans performing a tracking task in two-dimensional Cartesian coordinates while supported by different types of controllers. The experiments are executed on a two-DoF manipulandum, which consists of two orthogonally mounted single-rail stages (Copley Controls ThrustTube modules), each driven by a linear servo motor. Both rail stages are equipped with optical encoders that measure the position of a cart on the upper rail with 1 μm precision. Additionally, a six-DoF force-torque sensor (JR3-75M25) is mounted below the handle, through which the human interacts with the system, to measure forces in the horizontal plane (1000 Hz). The data are completely anonymous and will be used as dissemination material.

9

The dataset will include data from healthy participants performing a target-tracking task with wrist flexion/extension movements. The experiments are executed using a 1-DoF wrist robot and an electrical stimulation system acting on the wrist flexor and extensor muscles. The experiment follows a repeated-measures design in which participants perform the task alone, with robotic assistance, with functional electrical stimulation, and with both robotic assistance and stimulation.

10

The dataset will include a collection of movement recordings of humans performing unstructured daily activities or engaging in scripted daily activities. The movements of the participants will be recorded (at approx. 60 Hz) using motion tracking systems with inertial measurement units (Shimmer, USA) worn on a body segment such as the wrist. The inertial sensor consists of a gyroscope (measuring angular velocity), an accelerometer (linear acceleration and the gravity direction), and a magnetometer (magnetic field), which are used to estimate the pose of a body segment at a given time. Furthermore, although the device is capable of wireless data streaming, this functionality is not used because the streamed data are not encrypted and could be intercepted by other devices.
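For illustration, a minimal complementary-filter sketch in Python (an assumption for this document, not the vendor's or project's actual fusion algorithm) shows how gyroscope and accelerometer samples taken at roughly 60 Hz could be combined into a tilt estimate for a body segment:

```python
import numpy as np

def complementary_tilt(gyro, accel, dt=1/60, alpha=0.98):
    """Fuse gyroscope and accelerometer samples into roll/pitch estimates.
    gyro: (N, 2) roll/pitch rates [rad/s]; accel: (N, 3) accelerations [m/s^2]."""
    angle = np.zeros(2)
    estimates = []
    for w, a in zip(gyro, accel):
        # Accelerometer gives an absolute but noisy tilt from the gravity direction.
        acc_angle = np.array([np.arctan2(a[1], a[2]),
                              np.arctan2(-a[0], np.hypot(a[1], a[2]))])
        # Gyroscope integration is smooth but drifts; blend the two sources.
        angle = alpha * (angle + w * dt) + (1 - alpha) * acc_angle
        estimates.append(angle.copy())
    return np.array(estimates)
```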

11

The dataset will include a collection of motion and force profiles of humans performing reaching motions. The motion data will be human motion in Cartesian coordinates, tracked at 100 Hz using a marker-based motion capture system. The force/torque applied by a participant will be measured with a six-DoF force/torque sensor (1000 Hz). The dataset will be a collection of motion/force data from multiple individuals, each performing reaching motions, and will be valuable for benchmarking algorithms for estimating human arm impedance. The raw data are images of the reflective markers taken by each motion tracking camera at the preset frequency. The cameras use reflections of infrared light from the special markers to locate their positions, so the raw data do not record any personal information. The centroid of each marker image is then triangulated from multiple cameras to estimate its position in Cartesian coordinates. The position data are completely anonymous and will be used as dissemination material.
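As an illustrative sketch (assuming calibrated 3×4 camera projection matrices are available; this is not the motion capture vendor's actual implementation), marker centroids seen by several cameras can be triangulated into a Cartesian position with a linear least-squares (DLT) solve:

```python
import numpy as np

def triangulate_marker(proj_mats, centroids):
    """Estimate the 3-D position of one marker from two or more calibrated cameras.
    proj_mats: list of 3x4 projection matrices; centroids: list of (u, v) pixel coords."""
    rows = []
    for P, (u, v) in zip(proj_mats, centroids):
        rows.append(u * P[2] - P[0])   # each camera view adds two linear constraints
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.stack(rows))
    X = vt[-1]                         # null-space vector = homogeneous 3-D point
    return X[:3] / X[3]                # convert to Cartesian coordinates
```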

12

The dataset will include surface electromyography (sEMG) recordings (Shimmer Sensing, USA). The sensors use sensitive electrical amplifiers to detect the small electrical signals that naturally originate in active muscles and can be measured by electrodes attached to the skin overlying the muscle. The amplified signals are used to measure the timing and intensity of muscle activation during movement/rehabilitation routines, to support algorithms for human arm impedance estimation. The data are stored on the device's flash memory and transferred directly to a host computer via a docking station. Furthermore, although the device is capable of wireless data streaming, this functionality is not used because the streamed data are not encrypted and could be intercepted by other devices.
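A minimal sketch of how activation timing and intensity could be derived from such a recording (an illustrative pipeline under simple assumptions, not the project's actual processing code):

```python
import numpy as np

def emg_envelope(emg, fs, win_s=0.1):
    """Estimate muscle-activation intensity: remove the baseline offset,
    full-wave rectify, then compute a moving RMS envelope."""
    x = emg - np.mean(emg)
    n = max(1, int(win_s * fs))                     # window length in samples
    return np.sqrt(np.convolve(x**2, np.ones(n) / n, mode="same"))

def activation_onsets(envelope, fs, threshold):
    """Return the times (s) at which the envelope rises above a chosen threshold."""
    above = envelope > threshold
    return np.flatnonzero(np.diff(above.astype(int)) == 1) / fs
```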

13

This dataset includes behavioural data from healthy participants, collected both for usability evaluation and for validation of the personalised coach. More specifically, user demographics and internal state, preferences, interventions performed and resulting performance, user actions and motion primitives will be collected while participants interact with the ReHyb system, followed by questionnaires and video recordings of interviews.