Friday, August 31, 2012

Summary of Medical Research - Video


I have compiled a video summarizing the work done over the summer. I hope it is helpful to everyone.

Monday, August 27, 2012

Presentation under construction

I am making a presentation for the project. The slides are done; I am currently adding a voiceover to the presentation.
For the presentation, please refer to:
https://docs.google.com/presentation/d/1w8bTrBoPRzjik5Nfkr136YjJ8bo2K5W54tyugfEmUmg/edit

Tuesday, August 21, 2012

Prototype

Ultrasound Guided Needle Placement Training Using Kinect

The prototype is complete!
The rendering module and the tracking module are now fully integrated, and the needle visualization is also working. Right now the needle's radius (it is actually rendered as a cylinder) is rather large; I did this to make the needle easy to see on screen, since a realistically thin needle would be very hard to spot. This parameter is easy to change, though, and can be tuned to the needs of ultrasound-guided needle placement training.

Challenges:
Many of the challenges we are facing are mechanical in nature. Even so, the probe and needle tracking is working well despite the fact that we are not using any elaborate setup to track them.

What has been found?
At the start of the project we were quite concerned about the accuracy of the Kinect for tracking the probe and needle, but we can now see that the Kinect works very well for this application.

Future Work:
1. Improving tracking accuracy by using a more sophisticated marker-tracking setup
2. Developing the software as a game so that new surgeons can monitor their own skill level

Monday, August 20, 2012

Long Due Blog Update - Rendering Ultrasound Images based on Kinect Input

Rendering based on unmodified Kinect input, discarding the probe's up and normal vectors
Rendering based on stabilized Kinect input, discarding the up and normal vectors from the Kinect
Rendering based on stabilized Kinect input, using the up and normal vectors from the Kinect

For a number of reasons, I did not get a chance to update my blog. This was my progress as of August 3, 2012, just before the SIGGRAPH conference.
I had completed the integration of the rendering module with the Kinect tracking module.
The first video shows rendering based on data obtained from the Kinect without any modifications; as you can see, the output jitters a lot, which is undesirable.
To remove the jitter, I followed a simple approach recommended by Dr. Norman Badler, which was to simply ignore some of the data provided by the Kinect. The second and third videos show outputs obtained using this approach. To further improve the quality of the output, Dr. Badler suggested using a B-spline to interpolate between frames; I will implement this in the future.
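As a rough sketch of that interpolation idea (my own illustration, not the project's actual code), a uniform cubic B-spline can blend the last four tracked positions into one smoothed point; the function name and the choice of t=0.5 are assumptions:

```python
def bspline_smooth(p0, p1, p2, p3, t=0.5):
    """Evaluate a uniform cubic B-spline at parameter t in [0, 1],
    using the last four tracked 3D positions as control points."""
    b0 = (1 - t) ** 3 / 6.0
    b1 = (3 * t**3 - 6 * t**2 + 4) / 6.0
    b2 = (-3 * t**3 + 3 * t**2 + 3 * t + 1) / 6.0
    b3 = t**3 / 6.0
    # The four basis weights sum to 1, so the result stays close
    # to the recent samples while averaging out per-frame noise.
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

# Example: four samples along a straight line
smoothed = bspline_smooth((0, 0, 0), (1, 0, 0), (2, 0, 0), (3, 0, 0))
print(smoothed)  # lands midway between the two middle samples
```

Evaluating the spline once per incoming frame trades a little latency for much smoother probe motion.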
The second video discards the up vector and normal vector of the slice plane obtained from the Kinect, which adds stability; however, this prevents doctors from orienting the probe along oblique directions, so this method cannot be used in practice.
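The trade-off in the second video can be sketched as follows; the function name and the fixed fallback orientation are my own illustrative choices, not the renderer's actual code:

```python
def make_slice_plane(position, up=None, normal=None):
    """Build the slicing plane from Kinect data. If the up/normal
    vectors are discarded for stability, fall back to a fixed
    orientation -- stable, but oblique probe angles are lost."""
    if up is None or normal is None:
        up = (0.0, 1.0, 0.0)      # fixed fallback orientation
        normal = (0.0, 0.0, 1.0)
    return {"origin": position, "up": up, "normal": normal}

plane = make_slice_plane((0.1, 0.2, 0.3))  # orientation data discarded
print(plane["normal"])                     # always the fixed normal
```

Only the position channel drives the slice, which is exactly why the output is steadier but oblique views become impossible.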
The third video shows the result of the integrated Tracking and Rendering module, without the needle.

Thursday, July 26, 2012

All Spatial Information Collected for Both Probe and Needle

The above video shows that all of the markers on the probe and the needle are being tracked by the camera.

The user should wear a white coat, and white gloves if necessary, because skin color is similar to the needle's marker colors and confuses the camera. We will try to adjust the color calibration to make this more convenient, though the requirement might also be reasonable for trainees. The background should preferably be white as well.

As we noticed before, the collected data jitters a lot. We are not sure how this will look in the mock ultrasound output, since we have not yet integrated our code. However, we are also trying to apply a filter (a Kalman filter, or a simpler low-pass filter) to smooth the data.
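The simplest low-pass filter is an exponential moving average; this small sketch (my own illustration, not our integrated code) shows how it damps a sudden jump in the tracked position:

```python
class LowPassFilter:
    """Exponential moving average over 3D samples.
    Smaller alpha = smoother output, but more lag."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.state = None

    def update(self, sample):
        if self.state is None:
            self.state = list(sample)   # first sample passes through
        else:
            self.state = [self.alpha * s + (1 - self.alpha) * prev
                          for s, prev in zip(sample, self.state)]
        return tuple(self.state)

lp = LowPassFilter(alpha=0.3)
lp.update((0.0, 0.0, 0.0))
damped = lp.update((10.0, 0.0, 0.0))  # a sudden 10-unit jump...
print(damped)                         # ...comes out as a much smaller step
```

A Kalman filter would additionally model the probe's velocity, letting us smooth the jitter with less lag than a plain moving average.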

Cheers!

Wednesday, July 25, 2012

Multiple Updates - Needle Rendering/Integration


Needle Visualization

Now we have the needle visualization along with the medical data rendering. In the video I am moving the slicing plane back and forth, so the needle is only visible at times (when the slicing plane and the needle are nearly coplanar). As the slicing plane approaches the needle's plane, the needle visualization becomes brighter and brighter. Also note that the slicing algorithm updates the needle visualization as the needle penetrates into or comes out of the body.
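The brightness falloff can be sketched as a function of the needle's distance from the slicing plane; the function names and the falloff constant below are illustrative assumptions, not the renderer's actual parameters:

```python
def plane_distance(point, plane_point, plane_normal):
    """Signed distance from a point to a plane (unit normal assumed)."""
    return sum((p - q) * n
               for p, q, n in zip(point, plane_point, plane_normal))

def needle_brightness(needle_point, plane_point, plane_normal, falloff=0.05):
    """Full brightness when the needle lies in the slicing plane,
    fading linearly to zero at `falloff` distance away."""
    d = abs(plane_distance(needle_point, plane_point, plane_normal))
    return max(0.0, 1.0 - d / falloff)

origin, normal = (0.0, 0.0, 0.0), (0.0, 0.0, 1.0)
print(needle_brightness((0.2, 0.1, 0.0), origin, normal))    # in-plane: full
print(needle_brightness((0.2, 0.1, 0.025), origin, normal))  # halfway out
```

Evaluating this per needle vertex gives exactly the brighter-as-it-nears behavior seen in the video.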


Integration Methodology

Robin and I have very specific settings for each of our modules, so we decided not to integrate them at the code level. Instead, I thought it would be good to have Robin's module send the data it captures with the Kinect to my module via shared memory. After a little searching on the internet, I found a surprisingly easy way to implement shared memory.
In the video above, I created a dummy process, "Data Provider", which just sends random data about the probe's position and orientation. The rendering module renders according to the data it receives from the dummy process.
Next we will add a small piece of code to Robin's module so that it can send data to mine.
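The pattern can be sketched in Python using the standard library's `multiprocessing.shared_memory` (our actual modules use their own shared-memory API); the block name "probe_pose" and the six-float pose layout are assumptions for illustration:

```python
import struct
from multiprocessing import shared_memory

FMT = "6f"                       # x, y, z position + 3 orientation angles
SIZE = struct.calcsize(FMT)

# "Data Provider" side: create a named block and write one probe pose.
writer = shared_memory.SharedMemory(name="probe_pose", create=True, size=SIZE)
struct.pack_into(FMT, writer.buf, 0, 1.0, 2.0, 3.0, 0.0, 0.5, 0.0)

# Rendering-module side: attach to the same block by name and read it.
reader = shared_memory.SharedMemory(name="probe_pose")
pose = struct.unpack_from(FMT, reader.buf, 0)
print(pose)

reader.close()
writer.close()
writer.unlink()
```

Because both processes only agree on a name and a byte layout, neither module needs to know anything about the other's code, which is exactly why we chose this over code-level integration.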

Tuesday, July 24, 2012

Probe 3D Spatial Information

We obtained 3D spatial information for all of the markers and for the bottom surface of the probe.

The values vibrate heavily because the raw inputs are unstable. We are now integrating our code to see the final results.

In the video, the bottom line shows the position of the probe surface.