not entirely a "game/art" topic, but I'll share it anyway. Last semester I did a project for neuroscientists: an interactive viewer of a rat's brain. The task involved creating surface models of the brain's structures in 3dsmax, and the viewer program with luxinia.
reconstruction of the structures was done from spline slices: the splines were voxelized and interpolated between slices.
I made a maxscript/maxplugin combo that can load multiple splines (AI files) and stack them on top of each other, and/or select a bunch of such splines and create a mesh from them (lots of tweakables)
http://crazybutcher.cottages.polycount.com/wip/brainviewer/reconstruct/nodgeandinfluencefix.png
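A common way to interpolate between stacked contour slices like this is to turn each slice into a signed distance field and blend those, then threshold at zero to get in-between contours. This is only a sketch of that general technique, not the actual plugin code; all function names and the grid resolution are made up for illustration:

```python
import math

def point_in_polygon(px, py, poly):
    # even-odd ray-cast test against a closed 2d contour
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > py) != (y2 > py):
            xint = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < xint:
                inside = not inside
    return inside

def dist_to_polygon(px, py, poly):
    # shortest distance from a point to any edge of the contour
    best = float("inf")
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        dx, dy = x2 - x1, y2 - y1
        t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)))
        best = min(best, math.hypot(px - (x1 + t * dx), py - (y1 + t * dy)))
    return best

def signed_distance_slice(poly, res, size):
    # voxelize one spline slice into a res x res signed distance grid:
    # negative inside the contour, positive outside
    grid = []
    for j in range(res):
        row = []
        for i in range(res):
            px, py = (i + 0.5) * size / res, (j + 0.5) * size / res
            d = dist_to_polygon(px, py, poly)
            row.append(-d if point_in_polygon(px, py, poly) else d)
        grid.append(row)
    return grid

def interpolate_slices(a, b, t):
    # blend two distance grids; thresholding the result at 0 yields
    # a smooth in-between contour for the reconstructed volume
    return [[(1 - t) * a[j][i] + t * b[j][i] for i in range(len(a[0]))]
            for j in range(len(a))]
```

Thresholding the blended grid at zero gives the intermediate slice, which is why distance fields interpolate gracefully where raw binary masks would just flicker between shapes.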
the viewer supports a slice plane, and you can click/select/highlight the parts. I also added a simple vertexpainter for marking things, and the scientists can load statistics data and color parts accordingly.
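The slice plane boils down to a signed-distance test against a plane. Here is a minimal CPU-side sketch of the idea (not the luxinia code; names and the whole-triangle policy are my own simplification):

```python
def signed_distance(point, plane_normal, plane_d):
    # plane defined by dot(n, p) = d; positive side is the cut-away side
    nx, ny, nz = plane_normal
    x, y, z = point
    return nx * x + ny * y + nz * z - plane_d

def clip_triangles(vertices, triangles, normal, d):
    # keep only triangles fully on the visible side of the slice plane.
    # a real viewer would either split straddling triangles or simply
    # discard per-fragment in the pixel shader
    keep = []
    for tri in triangles:
        if all(signed_distance(vertices[i], normal, d) <= 0.0 for i in tri):
            keep.append(tri)
    return keep
```

Doing the same test per-fragment on the GPU gives a pixel-exact cut without touching the mesh at all.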
a preview of slicing in action
http://crazybutcher.cottages.polycount.com/wip/brainviewer/brainviewer.gif
so much for the rats... they inject the animals with some weird liquid to colorize the blood, freeze the brain, extract it, cut it into slices, and photograph the slices under a microscope... yuck.
---
a more recent project, and the topic of my diploma thesis, is a brain surgery simulator (for humans!). Specifically the volume renderer for it. Another student already did the "haptic" device stuff, so that doctors can virtually move their endoscope through the nose into the brain to remove tumors and the like. Yes, they actually do that; it's less "destructive" than breaking open the skull somewhere else...
now those force-feedback devices
http://www.sensable.com/haptic-phantom-premium-6dof.htm
are extremely cool to play with. They really can generate some solid forces, and it is quite an experience when you virtually move, throw, and catch virtual boxes, and all of a sudden a polygon "weighs" something for real...
The forces have to be computed at 1000 Hz, because at lower update rates we humans perceive the feedback as jerky.
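A standard way haptic rendering produces that "weight" is a penalty force: when the tool tip penetrates a surface, push back with a spring force proportional to penetration depth, recomputed every millisecond. This is a generic sketch of that technique, not the other student's code; the stiffness value and flat-floor surface are made up:

```python
def penalty_force(tool_pos, surface_y, stiffness=1000.0):
    # classic penalty-based haptic rendering: spring force pushes the
    # tool back out of the surface, proportional to penetration depth
    # (stiffness in N/m). This runs in the ~1000 Hz haptic loop.
    penetration = surface_y - tool_pos[1]       # depth below the surface
    if penetration <= 0.0:
        return (0.0, 0.0, 0.0)                  # no contact, no force
    return (0.0, stiffness * penetration, 0.0)  # push straight back out
```

The stiffness is what makes a polygon feel solid; too low and surfaces feel mushy, too high (relative to the update rate) and the device starts to buzz unstably.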
Anyway, to free the CPU for that, the renderer is supposed to use the GPU as much as possible, directly rendering the 3d volume from patients' CT data. Ideally fast enough to render two images for the 3d monitor.
So I started playing with volume rendering techniques, and here are some first results using gpu-raytracing on my crappy system.
the data values can be changed on the fly with a simple transfer function.
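The core of this kind of volume raycasting is: march a ray through the voxel grid, map each sampled density through the transfer function to a color and opacity, and composite front-to-back. Here is a CPU toy version of that loop for a single ray; on the GPU the same thing runs per pixel in a fragment shader. The transfer function ramp and all parameters below are invented for illustration, not taken from the actual renderer:

```python
def transfer_function(density):
    # map a raw density sample (0..1) to (r, g, b, a); in the real
    # renderer this lookup is editable on the fly. Toy ramp: air is
    # transparent, soft tissue reddish, dense material bright and opaque
    if density < 0.2:
        return (0.0, 0.0, 0.0, 0.0)
    if density < 0.6:
        return (0.8, 0.3, 0.2, 0.05)
    return (1.0, 1.0, 0.9, 0.6)

def raymarch(volume, origin, direction, steps=64, step_size=0.5):
    # front-to-back alpha compositing along one ray through the grid
    r = g = b = 0.0
    alpha = 0.0
    x, y, z = origin
    dx, dy, dz = direction
    for _ in range(steps):
        i, j, k = int(x), int(y), int(z)
        if 0 <= i < len(volume) and 0 <= j < len(volume[0]) and 0 <= k < len(volume[0][0]):
            sr, sg, sb, sa = transfer_function(volume[i][j][k])
            w = (1.0 - alpha) * sa          # remaining transparency
            r += w * sr; g += w * sg; b += w * sb
            alpha += w
            if alpha > 0.99:                # early ray termination
                break
        x += dx * step_size; y += dy * step_size; z += dz * step_size
    return (r, g, b, alpha)
```

Because the transfer function is just a lookup, remapping densities (e.g. hiding soft tissue to expose bone) changes nothing in the marching loop, which is why it can be tweaked interactively.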
and now some nose-to-brain flying
I will need to create better lighting effects, and if possible some sort of "gory" surface material to make it look a bit more real. Also interaction with regular trimesh objects, and a couple of quality improvements...
Running the first versions already showed the immense speed improvements on newer hardware: on my own gf6600 I had like 30 fps, on the university's gf7900 it was 400 fps, and on the gf8800 800 fps...
Once the stuff looks like more gore, I will pimp some images again.
Replies
that image of the skull scan is quite unintentionally spooky, i love it
be sure to post more when you have more!
ACE!
some shading
playing with detail textures/slimy shininess
I said I would add an image when there is more gore; I got sidetracked with other work, but now I am back to finalizing this project. I still need to remove some artefacts that are mostly visible in motion, but I hope that by next month it will be far enough along to be used at a medical facility for testing