According to our site description, Krotalov Studios is all about science fiction video games, software, and writing. So far, though, there hasn't been much of anything except writing. My intention was to focus primarily on the video games, with a few pieces of interesting software on the side, mostly to either supplement the games or to be somewhat similar to a game, such as simulation software. However, I've been a bit busy with real life, and software takes a long time to develop, especially with only one person. Writing, on the other hand, is easy to do (difficult to do well, perhaps), and with short stories and novellas, it's much easier to find the time for it. That's why my focus has been mostly on writing since launching this site.
However, I think it's about time I started showing off some of the other projects I've been working on lately, even if they're nowhere close to being finished. I'll likely still be focused on writing, but this will help bring some more diverse content to the site. So without further ado, I'd like to share my first software project after the break.
This project started as part of my summer research in school, and I'm trying to continue the work on my own. It first requires a bit of explanation. For those who aren't aware, amputees with missing arms or legs can often suffer from what is known as Phantom Limb Pain, especially when the injury that resulted in the amputation was accompanied by a lot of pain (meaning they were conscious when it happened, as opposed to being under during surgery, which is more common among amputees than you might think). Essentially, the brain still expects nervous input to be coming from the missing limb, but since there is no longer a limb to provide that input, the brain becomes much more sensitive to phantom input. This results in illusory sensations within the missing limb that are often painful, most commonly taking the form of fingers or toes clenched so tightly as to cause pain. Being unable to move a limb that isn't actually there, amputees are most often unable to do anything about this pain, and simply have to live with it. When the pain gets severe, that's obviously a serious problem.
One of the more effective treatments to help alleviate the pain is known as mirror box therapy. Essentially, the amputee makes use of a mirror carefully positioned in front of their body so as to create a mirror image of their existing limb right where their missing one would be (obviously this only works for single amputees; double amputees are out of luck here). When the brain sees the image of this limb exactly where it expects a limb to be, it creates a strong subconscious illusion that the limb is indeed actually there, even if the amputee consciously knows it's fake. The amputee then makes mirror-symmetric movements with both limbs (even though they're missing one, they can still pretend to move it), and in real time, the mirror creates the illusion of the missing limb moving as it's expected to. This deeply strengthens that initial illusion. If amputees watch themselves unclench their fist or toes, it can go a long way toward relieving phantom limb pain.
While cheap, this method has the downside that the illusion is limited to the size of the mirror. To help make the illusion convincing, the mirror is also often placed in a box to isolate the relevant images the brain needs. With this method, the amputee is stuck wherever they decide to set up, and their range of motion is quite limited.
I've been working to develop software that will achieve this same mirror-image effect in an AR and VR environment, which will give the amputee a much greater range of motion and achieve an overall more convincing illusion of a functional limb. A software version of mirror box therapy has several advantages over traditional treatment. First, as has already been mentioned, is the greater range of motion, both for the limb itself, and for the amputee to move about. Second, while a mirror can be pretty cheap, software can be cheaper. Yes, there's software out there that can cost thousands of dollars (I'm looking at you, 3ds Max). But I'm not out to make money; I'm out to make a product that will work and that people will want to use. I could sell it for far below the cost of a mirror, or better yet, give it away for free to those who need it (maybe even go open-source if it becomes successful and I don't have the time to keep development internal). And lastly, I can turn the therapy into a game. The whole point is to relieve pain. With the traditional method, the user's focus is on relieving the pain, and the act of consciously thinking about the pain can make it that much worse. I've found that when I'm sick or have a headache, one of the best ways I can get through it is by focusing my attention on a video game. It might not make the sickness go away, but the shift in focus completely erases the perception of any pain. If I can turn this software into a fun little game, it can not only treat the cause of the pain, but help shift focus away from it as well.
When I originally worked on this project, I developed an AR Android app which uses computer vision techniques to mirror the user's body on screen. You can see a video of how it works below.
Obviously, there are a few issues with how it works. The body-part segmentation is based on skin color, so depending on the current lighting and background, the software can quite often mirror parts of the image it shouldn't and vice versa. My programming skills are still developing, so there are also a few quirks and bugs that need quite a bit of polishing. But the biggest flaw is the frame rate, which is quite low. Frame rate is generally considered the most important aspect of creating a convincing illusion. A mirror creates the image in real time, but with the amount of math being done on a mobile processor to achieve this effect, there is a significant amount of lag. Even the slightest delay can ruin the desired illusion, since the brain easily picks up on a mismatch between what it's seeing and what it expects or feels (this is a bit of a problem in VR in general).
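To make the skin-color idea a bit more concrete, here's a minimal sketch of the general technique (this is not the app's actual code, which runs on Android; the thresholds here are rough illustrative values I've picked for the example): classify each pixel with a crude RGB skin-tone heuristic, flip the frame left-to-right, then paste the flipped frame back in wherever the flipped mask says skin was found.

```python
import numpy as np

def mirror_skin_regions(frame_rgb):
    """Mirror skin-colored regions onto the opposite side of the frame.

    frame_rgb: uint8 array of shape (height, width, 3).
    Returns a new frame where horizontally mirrored copies of
    skin-colored pixels overwrite the original image.
    """
    r = frame_rgb[..., 0].astype(int)
    g = frame_rgb[..., 1].astype(int)
    b = frame_rgb[..., 2].astype(int)

    # Crude skin-tone heuristic: reddish pixels above a brightness floor.
    # Lighting and background easily fool a rule like this, which is
    # exactly the segmentation weakness described above.
    mask = (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

    flipped = frame_rgb[:, ::-1]       # whole frame mirrored left-right
    flipped_mask = mask[:, ::-1]       # where mirrored skin pixels land

    out = frame_rgb.copy()
    out[flipped_mask] = flipped[flipped_mask]
    return out
```

Even this toy version hints at the performance problem: every frame touches every pixel several times, and the real app does far more work than one threshold test, which is where the lag on a mobile processor comes from.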
While I'm sure there are ways to potentially improve this, it seems to be beyond my ability at the moment. I've done a lot of thinking on several different approaches to the problem at hand, and I've decided to shift from AR to VR. It was actually something I wanted to do as soon as the project was described to me. I haven't had a better VR experience than Elite Dangerous on the Oculus Rift, and seeing my pilot's hand move the stick as I controlled my own very-similar-looking X52 was very convincing. But since my summer research was part of a larger project for a pair of AR glasses, my initial development was for AR.
Now that I'm continuing on my own, I've already begun working on a VR prototype using an HTC Vive and its positional controllers. This is in a very early stage of development, but it's already showing a lot more promise than the app. Frame rate is no longer an issue at all. Developing the game component is much easier on a PC platform. And I can even give the user varying body forms, as this has, perhaps counter-intuitively, little effect on the illusion. The only downside is the HTC Vive itself, which is quite an expensive piece of hardware and defeats the purpose of a cheaper alternative to mirror box therapy. I have several ideas at the moment on how to shift to alternative hardware, but for now, my priority is to develop something that works effectively, regardless of cost. As technology develops, I can count on the needed hardware becoming cheaper and more accessible. So long as I can prove the task possible, market forces will make it feasible if there's truly a need. In the meantime, I'll continue development, and as the prototype becomes more polished, I'll keep posting status updates on my progress here on the site.