Kinesthesia: “Awareness of the position and movement of the parts of the body.”
This project encompasses the development of a user-friendly interface between the Microsoft Kinect SDK and National Instruments' LabVIEW, and the subsequent development of a selection of tools for use in the fields of stroke rehabilitation, gait analysis and laparoscopic surgery. This LabVIEW-Kinect toolkit is designed to allow both ourselves and future users to easily interface the depth and skeletal tracking functionalities of the Kinect with any LabVIEW system. In addition to the medical applications detailed, a number of further examples are demonstrated in order to showcase the potential for connectivity between the two systems.
Products: NI LabVIEW, NI Vision Development Module, Microsoft Kinect, Microsoft Kinect SDK
The video below shows an example of how the Microsoft Kinect can be interfaced with LabVIEW to produce an intuitive control system: the Kinect's skeleton tracking ability is used to control a VTOL aircraft demonstration rig via an NI DAQ board. Users have the option to control the rig using a hand control, a signal generator, or body position via the Kinect toolkit developed as part of this project, and all three can be operated through either a manual or a fly-by-wire control system.
Figure 1: Kinect and LabVIEW controlled VTOL demonstration rig.
Besides the development of the Kinect toolkit as a whole, three key challenges are addressed in this project. These are:
Stroke, the disturbance of blood supply to the brain, is the leading cause of disability in adults in the USA and Europe, and has a profound effect on the quality of life of those affected. The physical effects of stroke are numerous, and a substantial field of study is devoted to the assistance and rehabilitation of stroke sufferers. Rehabilitation is a complex and multidisciplinary affair, requiring continual, monitored cognitive and physical therapy. As such, there is a need for a system that provides mental stimulation to the patient whilst accurately recording body position and movement, allowing physiotherapists both to maintain patient motivation and to extract detailed information on their movements.
With the advent and continued development of laparoscopic (keyhole) surgery, it is essential that the operating surgeon has accurate, up-to-date information on an area that may not be directly visible to them. Before operating, the abdomen is usually inflated to provide the surgeon with the space necessary to both view and access the internal organs. However, it is difficult for the surgeon to accurately gauge the level of inflation and the working space available to them. A tool that can determine the inflation of the abdomen and provide an estimate of the inflated volume would increase the safety of procedures and give the surgeon more information about the patient prior to operation.
In addition, work will be carried out to see whether the skeleton tracking abilities of the Kinect can be used to programmatically and objectively assess a surgeon's skill through economy of movement, accuracy and time.
It has been suggested that a person’s gait is more unique than their fingerprint. Indeed, the way in which we walk can offer insight into a number of medical problems which may take longer to present in other forms. Stroke, for example, can result in a pronounced limp on one side of the body, and the extent of this limp may offer further information on the severity of the stroke. Similarly, analysis of a person’s gait can offer insight into deteriorating hip, knee or ankle joints, and can suggest not only that a joint replacement is required, but also that a specific type of replacement would be beneficial. A low-cost investigative tool that can be used prior to expensive specialist referrals would offer a significant benefit to clinicians.
The Microsoft Kinect has already revolutionised the gaming industry with its ability to track users’ motions, marking a key move away from traditional control systems. This project aims to take advantage of Microsoft’s innovative technology and interface it with NI LabVIEW, via the development of a fully functional LabVIEW driver and toolkit. With this in place, we are developing a selection of motion tracking tools and programs specifically aimed at tackling three key areas:
Currently, a number of technologies exist to track patient movements. However, not only can these systems cost upwards of £40,000 per camera, they also have the added chore of requiring the user to wear markers placed on the skin or clothing, and are often significantly more accurate than is necessary, providing no extra insight into patients’ movements at a large capital expense. As such, a significant advantage is presented by an easy-to-use, Kinect-based system which can produce similar results at a fraction of the cost. A system has been developed that records normal camera footage of a patient alongside a full 3D rendering of the user’s skeleton, allowing the operator to rotate and explore the user’s movements. Further to this, a virtual stroke rehabilitation environment has been created within LabVIEW that provides patients with tasks to perform, based upon the industry-standard ARAT (Action Research Arm Test), that mimic day-to-day human movements. An example of this application can be seen in Figures 2 and 3.
Figure 2: A demonstration of the Virtual ARAT test under development, showing the user’s arms moving in real time with camera control driven by the user’s head position.
Figure 3: The 3D scene created for the Virtual ARAT tests.
Using the depth-mapping functionality of the Kinect, the depth from the camera to the abdomen for each pixel in the camera’s range is determined. This information is fed into LabVIEW, and processed, using approximations of the geometry of the abdomen, in order to determine the inflated volume, and provide surgeons with a more accurate picture of the intra-abdominal cavity.
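The geometric idea behind the volume estimate can be sketched in a few lines. This is a minimal Python illustration, not the LabVIEW implementation: the function name is hypothetical, and it assumes a fixed per-pixel surface area and a depth change measured against a pre-inflation baseline, both simplifications of the real abdominal geometry.

```python
import numpy as np

def estimate_inflation_volume(depth_before_mm, depth_after_mm, pixel_area_mm2):
    """Estimate inflated volume (litres) by summing per-pixel depth change.

    depth_before_mm / depth_after_mm: 2D arrays of camera-to-surface
    distance in millimetres, before and after insufflation.
    pixel_area_mm2: approximate surface area one pixel covers at the
    working distance (assumed constant here; in reality it varies with
    depth and lens geometry).
    """
    rise_mm = depth_before_mm - depth_after_mm   # abdomen rises toward camera
    rise_mm = np.clip(rise_mm, 0.0, None)        # discard noise below baseline
    volume_mm3 = float(np.sum(rise_mm) * pixel_area_mm2)
    return volume_mm3 / 1e6                      # 1 litre = 1e6 mm^3
```

For example, a 200 × 200 pixel region rising uniformly by 100 mm, with each pixel covering 1 mm², corresponds to 4 litres of insufflated gas.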
As with stroke rehabilitation, gait analysis is often undertaken by applying various sensors to the patient’s body, the locations of which are then fed into a computer system. This is generally a long and drawn-out process which may cause patient discomfort. Using the Kinect to track the user’s skeleton, gait can be quickly analysed, and important metrics concerning the patient can be calculated in LabVIEW and fed back to the operator. Figure 4 shows the real-time position tracking of a patient’s right hand in the X axis. A user can select any of the 20 tracked joints and access a number of metrics on that joint.
Figure 4: An example of the data that can be obtained on the fly from the system. The program above allows the smoothness and magnitude of movements to be monitored, as well as the velocity and acceleration of any one of the 20 joints in the X, Y and Z axes.
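The velocity and acceleration metrics reduce to finite differences over the streamed joint positions. The sketch below shows the idea in Python rather than LabVIEW; the function name is illustrative, and the fixed frame interval assumes the Kinect's nominal ~30 fps skeleton stream.

```python
import numpy as np

def joint_kinematics(positions_m, dt):
    """Velocity and acceleration of one tracked joint via finite differences.

    positions_m: (N, 3) array of X, Y, Z joint positions in metres,
    one row per skeleton frame.
    dt: frame interval in seconds (~1/30 for the Kinect skeleton stream).
    """
    velocity = np.gradient(positions_m, dt, axis=0)      # m/s per axis
    acceleration = np.gradient(velocity, dt, axis=0)     # m/s^2 per axis
    return velocity, acceleration
```

The same differencing extends naturally to jerk (the derivative of acceleration), a common proxy for the movement smoothness mentioned above.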
In order to assess the accuracy of the Kinect’s skeletal tracking, we tested the Kinect against the system currently used by the university, Optotrak. This camera system is the current industry benchmark and is capable of extremely accurate sensor tracking, down to sub-millimetre levels. The system at the university is currently set up to track arm motion in stroke patients, monitoring speed, accuracy and fluidity of motion. Small wired sensors are attached at various points on the user’s upper body; their positions are determined by a triscopic camera and relayed to a LabVIEW processing environment. The high accuracy of this system lends itself perfectly to validation, effectively providing a gold standard against which we can compare the Kinect. We used what is known as a ‘far reach’ exercise for our primary validation. This consists of the user sitting up straight in a chair and attempting to touch a target on an LCD screen with their hand whilst maintaining a vertical trunk.
Below are photos of the Optotrak system being used to record our primary validation measurements.
Figure 5: Optotrak™ camera, an industry-standard motion tracking system capable of sub-millimetre accuracy.
Figure 6: A team member using the Optotrak rig developed for stroke patient reach exercises, being compared with the Kinect (visible in the upper right).
Figure 7: Optotrak rig from another angle, showing the LCD screen which the patient must attempt to touch.
Figure 8: Video footage of the Optotrak rig in use.
Figure 9: Video footage of the Optotrak rig in use.
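Because the Kinect and the Optotrak sample at different rates, comparing the two trajectories requires resampling onto a common timebase before computing an error metric. The following is an illustrative Python sketch of that comparison, not the validation code itself; real validation would also need spatial and temporal alignment of the two coordinate frames.

```python
import numpy as np

def rmse_against_reference(t_kinect, x_kinect, t_ref, x_ref):
    """Root-mean-square error of a Kinect track against a gold-standard track.

    t_kinect, x_kinect: Kinect timestamps (s) and positions along one axis.
    t_ref, x_ref: reference (e.g. Optotrak) timestamps and positions.
    The Kinect track is linearly interpolated onto the reference
    timestamps so the two can be compared sample-for-sample.
    """
    x_interp = np.interp(t_ref, t_kinect, x_kinect)
    return float(np.sqrt(np.mean((x_interp - x_ref) ** 2)))
```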
The toolkit is approaching full functionality, allowing the user to initialise and close the Kinect’s different components (RGB camera, depth camera and skeletal tracking) through polymorphic VIs, depending on the functionalities they require. A number of sub-VIs relating to the processing, extraction and display of data from the Kinect are either completed or approaching completion. Figure 10 shows the polymorphic VIs that the user can simply drag into the block diagram; the code displayed lets users access the video, depth and skeleton data. In the example below, the code also produces a 3D picture control plot of the skeleton on the fly.
Figure 10: The simple code required, using the polymorphic VIs built for the toolkit, to access the video, depth and skeleton plotting functions of the Kinect.
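The initialise → read → close pattern the polymorphic VIs expose can be mirrored in a conventional language. The Python mock below is purely illustrative: the class and method names are invented, the joint data is a placeholder, and the real toolkit exposes these steps as LabVIEW VIs rather than a Python API.

```python
# Hypothetical Python analogue of the toolkit's session lifecycle:
# initialise the streams you need, read frames, then close.
class KinectSession:
    def __init__(self, streams=("video", "depth", "skeleton")):
        self.streams = set(streams)   # components initialised, like the polymorphic VI
        self.open = True

    def read_skeleton(self):
        # Placeholder frame: the Kinect SDK tracks 20 joints per skeleton,
        # each as an (x, y, z) position in metres.
        if "skeleton" not in self.streams or not self.open:
            raise RuntimeError("skeleton stream not initialised")
        return [(0.0, 0.0, 2.0)] * 20

    def close(self):
        self.open = False

session = KinectSession(streams=("skeleton",))
joints = session.read_skeleton()   # one frame of 20 tracked joints
session.close()
```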
A VI has been developed to record live RGB and skeletal data concurrently, embedding the skeletal data directly into an .avi file. A further VI has been developed which allows this video to be reviewed, trimmed and saved, giving cherry-picked footage of a physiotherapy session and providing a 3D, rotatable rendering of the patient’s skeleton alongside the raw video footage, as can be seen in Figure 11.
Figure 11: Video analysis suite developed for clinicians to play back, edit and analyse data recorded from assessments.
The system has great scope for future development and additional uses. It also has the potential to allow a rehabilitation patient to be completely independent: after an initial meeting with a clinician to put a rehabilitation regime in place, the patient could perform all the tasks at home, with the data, including recorded video, sent back to the clinician for analysis at the hospital. This would be especially useful for patients who currently struggle to leave the house, or who may feel burdened by regular rehabilitation appointments.