I've set up 6 cameras in my environment with the FOV set to 90° and rendered out 6 images, so I get a perfect cube map of my world, QuickTime VR style. The problem is these look great when they have the fancy cubemap node hooked up to them so they are distorted correctly, but when I apply them to a polygon model there is shearing where all the edges meet up. Is there any way to distort the UVs of my skybox model, or run a filter on the textures themselves, to remove the distortion but still have all the edges meet up seamlessly?
I don't understand. Skybox? But a skybox uses a cubemap. It only works from a central point of view, so it has to be linked to your cam. Are you, like, turbosmoothing the box?
Noors, yeah, I figured out HDRShop after posting. You can convert an environment cross into a lat-long image and apply that to a physically modeled sphere. I noticed there is still some distortion in the image though. I'll make a cross today that has nothing in it but straight lines and see if it converts without distortion.
Hey EQ, the source is not distorted. It was captured from a Maya scene where I set up 6 cameras with a 90° FOV, converted those renders into an environment cross with ATI CubeMapGen, then brought the cross into HDRShop and converted it to a lat-long image, but that image has "some" distortion in certain places even when mapped to a sphere in Maya. Basically what I'm trying to do is convert a sports stadium environment into a skybox so it renders faster. I have a cube with the textures applied and it looks pretty good, but I wanted to get rid of the remaining distortion. So far the best luck I've had was just cutting up the environment cross, stacking it horizontally in Photoshop, and mapping each section of the texture to one face of the cube. That works pretty well for faking the environment, but it has distortion issues at the edges due to the 90° FOV that was used to capture it.
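For anyone trying to reproduce that capture rig, here is a rough maya.cmds sketch of it, a minimal version with my own face names and rotation convention, assuming each camera gets rendered at a square resolution:

import maya.cmds as cmds

# Six cameras at the origin, each with a square film back and a 90 degree FOV,
# looking down +/-X, +/-Y and +/-Z. Maya cameras look down -Z by default.
face_rotations = {
    'front': (0,    0, 0),   # -Z
    'back':  (0,  180, 0),   # +Z
    'right': (0,  -90, 0),   # +X
    'left':  (0,   90, 0),   # -X
    'up':    (90,   0, 0),   # +Y
    'down':  (-90,  0, 0),   # -Y
}

for name, rot in face_rotations.items():
    cam, shape = cmds.camera(horizontalFilmAperture=1.0,
                             verticalFilmAperture=1.0,
                             horizontalFieldOfView=90)
    cam = cmds.rename(cam, 'cubeCam_' + name)
    cmds.xform(cam, rotation=rot, worldSpace=True)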
But originally you have the environment you are trying to capture in Maya. You render from 6 cameras. Convert to cubemap format. Then try to convert back to use as a skybox?
Can't you just use your 6 original images and map them to 6 planes (a cube)? Skipping all the CubeMapGen and HDRShop?
kodde, yes, I thought that too! The way the cube maps work, as I just realized, is they look up some normal/projection thing in real time. I thought you could just map them to a box, but there is distortion at the edges since the camera FOV is 90°, so you basically have shearing in places.
Noors, I tried smoothing the cube too, no luck. It actually makes the problem worse, because then the mesh curves in a different direction than the distortion from the 90° FOV.
Worked on this a bit with our rendering programmer and he pointed out the cube map lookup can't really be baked down to the geometry. Then I figured out the lat-long conversion does what I want, but it leaves some distortion that looks wonky.
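As a side note, here is a minimal plain-Python sketch of what that runtime lookup does (simplified sign conventions of my own; real implementations flip axes per face so the seams line up). Each view direction picks the face of its dominant axis and divides by it to get the UVs, which is why the result depends on the eye position and can't be frozen into static UVs once the camera leaves the cube's centre:

def cubemap_lookup(x, y, z):
    """Map a view direction to (face, u, v) with u and v in [0, 1]."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        face, u, v = ('+x' if x > 0 else '-x'), z / ax, y / ax
    elif ay >= az:
        face, u, v = ('+y' if y > 0 else '-y'), x / ay, z / ay
    else:
        face, u, v = ('+z' if z > 0 else '-z'), x / az, y / az
    return face, 0.5 * (u + 1.0), 0.5 * (v + 1.0)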
What if you bake your environment on a reflective sphere, then flip its normals? Like a real HDRI ball?
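For reference, the classic mirror-ball ("sphere map") lookup that suggestion is based on is roughly the standard OpenGL-style formula below; just a sketch, not anything Maya-specific:

import math

def sphere_map_uv(rx, ry, rz):
    """Reflection direction to mirror-ball UVs, viewer looking down -Z.
    Undefined for the single direction pointing straight back at the viewer."""
    m = 2.0 * math.sqrt(rx * rx + ry * ry + (rz + 1.0) ** 2)
    return rx / m + 0.5, ry / m + 0.5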
I just remembered V-Ray has a spherical camera. It's certainly possible to do something with it :shifty: Maybe your renderer has such an option?
Yeah, that is not an issue and it is very easy to get working with the Maya environment cube node. The problem with that is as the camera moves the environment moves, think QuickTime VR style. I want a static ground plane and just some ghetto wrap-around environment texture/skybox that from a distance looks 3D.
malcolm> I'm not sure what you are getting at to be honest. Is this used in some form of game engine? Is the skybox following the camera's translations? Or are you trying to re-create this scenario in Maya? Did you point constrain the skybox to your camera?
No it's just in Maya, nothing fancy like moving the sky with the camera. I actually want the sky to not move with the camera. I'll throw together some example shots on the weekend as I can't actually post what I need it for here. Basically I want to bake a 3d environment to a static texture and map that to a cube without any distortion.
That scenario would always have distortion, wouldn't it? I mean, it works with a cube as long as it moves with the camera's movement. But as soon as you break that original positional relationship between the skybox <-> camera, you would start to see different artifacts depending on your skybox mesh's shape. At least that's what my head can work out.
Yeah, cubemap has zero distortion, as long as the camera is exactly in the middle of the cube.
Mapping a cubemap to a cube can be done seamlessly, but you need to tell the textures not to tile. They need to use Clamp UV mode (not sure about the terminology in Maya).
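In Maya that corresponds to turning off Wrap U / Wrap V on each file texture's place2dTexture node and keeping the face UVs exactly in the 0 to 1 range; roughly like this (the node names are made-up examples):

import maya.cmds as cmds

# Hypothetical place2dTexture nodes, one per cube face texture.
for p2d in ['skyFront_p2d', 'skyBack_p2d', 'skyLeft_p2d',
            'skyRight_p2d', 'skyUp_p2d', 'skyDown_p2d']:
    cmds.setAttr(p2d + '.wrapU', 0)   # stop tiling in U
    cmds.setAttr(p2d + '.wrapV', 0)   # stop tiling in V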
3ds Max has a Panorama rendering utility that makes a perfect lat-long pano, as long as the camera is exactly in the middle of your cube and the camera is level.
Since I'm generating the initial cube map from a real environment, do I need to model the cube the texture is going to be applied to first and capture the data that way, to remove all distortion when it's converted to a lat-long image? Or does HDRShop just not have the ability to make a lat-long from a cube without distortion?
I don't use HDRShop, sorry. I would be surprised if Maya doesn't have a lat-long rendering option somewhere, maybe as a MEL script or plugin. Then you could render the scene directly into that, instead of through a cube first.
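For context, the cube-to-lat-long conversion HDRShop performs is essentially a resampling pass, roughly like the numpy sketch below (nearest-neighbour, reusing the face names and cubemap_lookup() from the earlier sketch; 'faces' is assumed to be a dict of six equal-sized square float images). That resampling is one place where the leftover softening and wobble can creep in:

import numpy as np

def cube_to_latlong(faces, height):
    """Resample six cube faces into a 2:1 lat-long panorama."""
    width = height * 2
    size = faces['+x'].shape[0]
    out = np.zeros((height, width, 3), dtype=np.float32)
    for j in range(height):
        for i in range(width):
            # Output pixel centre -> longitude/latitude -> unit direction.
            lon = (i + 0.5) / width * 2.0 * np.pi - np.pi
            lat = np.pi / 2.0 - (j + 0.5) / height * np.pi
            x = np.cos(lat) * np.sin(lon)
            y = np.sin(lat)
            z = -np.cos(lat) * np.cos(lon)
            face, u, v = cubemap_lookup(x, y, z)  # from the earlier sketch
            px = min(int(u * size), size - 1)
            py = min(int((1.0 - v) * size), size - 1)
            out[j, i] = faces[face][py, px]
    return out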
http://www.outpt.co.uk/how-to-convert-a-skybox-to-a-skydome/
or use RTT projection, yeah, should work too.
A sphere might have less of this issue?
I don't have much experience doing it myself since a co-worker handled it at my last job, but I found this:
http://www.andrewhazelden.com/blog/2011/01/latlong_lens-and-cubemap_lens-mental-ray-shaders-compiled-for-maya-2011-x64-on-windows/