
zSpace 3D Display

AlexCatMasterSupreme
They make a custom monitor that has four tracking cameras built into it. You wear a pair of polarized glasses with small markers on the front, and you hold a stylus that's also tracked.

[image: zspace-system.png]

I used this today; our company had them come in and demo it, and it was really, REALLY cool. We got to look around models just by moving our heads. Modeling in Maya with it was really fun, and you can still use a mouse and keyboard while you model. It was beyond fun seeing my particle systems in 3D. The best part was the Unity demos and playing Angry Bots: when you're looking low on the screen you can't see over the walls until you move your head above them. I think we might buy one, and if we do I'll tell you all about it.

https://www.youtube.com/watch?v=-TwBtZC4ES0

http://zspace.com/

Replies

  • MikeF
    Seems pretty cool. Do you see this as practical at all in a work environment, though? I'm not a big fan of 3D stuff involving polarized lenses, as they tend to give me a headache after 30 minutes or so.
  • AlexCatMasterSupreme
    MikeF wrote: »
    Seems pretty cool. Do you see this as practical at all in a work environment, though? I'm not a big fan of 3D stuff involving polarized lenses, as they tend to give me a headache after 30 minutes or so.

    I think it easily could be. The only problem with modeling in it right now is that it felt like it wasn't optimized for a mouse. I think that can be fixed, and for game art it could be really cool. The stuff in Unity was straight-up awesome, and it adds a whole new layer of depth that feels seamless in a game. I only had about 25 minutes total, but knowing my boss he will probably want one. It's a really awesome thing to pick your model up, bring it next to your face, and play with it. I can see it being used heavily for educational purposes and simulations. I think with a small amount of tweaking it would be nice for modeling. Obviously not necessary, but it doesn't seem like you need to jump through extra hoops to do stuff. Sorry if this is typed badly; I'm on my phone.
  • Ace-Angel
    Any information on the hardware required on the PC side? Is all the depth pushing being done by the display? What about stuff like ZBrush and its 2.5D tech, does that trip it up?

    Really curious about it.
  • AlexCatMasterSupreme
    Ace-Angel wrote: »
    Any information on the hardware required on the PC side? Is all the depth pushing being done by the display? What about stuff like ZBrush and its 2.5D tech, does that trip it up?

    Really curious about it.

    It was running on a laptop when it was demoed.
  • breakneck
    I'm glad to see this company is still going strong.
    I demoed this several years ago at their office (about an hour in Maya). At the time, I didn't see any production value in it; waving a pen through the air at your model wasn't efficient by any means. I do, however, see it being of some use in other fields, for example medical procedure demos (surgery simulator, lolz) or something of that nature. Or if there were ever some sort of force feedback + ZBrush, I could see this being a useful piece of hardware.
    I'm really curious how far along this has come since the last time I saw it.
  • Eric Chadwick
    When I checked out the hardware at an event last spring, they only supported specific 3D apps. There was no ZBrush support then, but they said they were interested in adding it. I think they have to write a plugin to generate the cameras on the fly in whichever app.
  • AlexCatMasterSupreme
    Eric Chadwick wrote: »
    When I checked out the hardware at an event last spring, they only supported specific 3D apps. There was no ZBrush support then, but they said they were interested in adding it. I think they have to write a plugin to generate the cameras on the fly in whichever app.

    This is the case. They had support for Unity and it worked perfectly; the Maya support was in the viewport, and then they had a special model-viewer program. When you get it you also get the SDK. (There's a rough sketch at the end of this post of what generating the cameras every frame amounts to.)
    breakneck wrote: »
    I'm glad to see this company is still going strong.
    I demoed this several years ago at their office (about an hour in Maya). At the time, I didn't see any production value in it; waving a pen through the air at your model wasn't efficient by any means. I do, however, see it being of some use in other fields, for example medical procedure demos (surgery simulator, lolz) or something of that nature. Or if there were ever some sort of force feedback + ZBrush, I could see this being a useful piece of hardware.
    I'm really curious how far along this has come since the last time I saw it.

    When I used it in Maya I ended up using a mouse and keyboard, and I really thought it could be something doable. I really hope I get to use it more; it was so much fun. As for other uses, it would definitely be good for that kind of thing.

    Did you ever get to try the Unity app?
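
    Also, for anyone curious what "generating the cameras on the fly" amounts to: every frame, the tracked head position gets split into two eye positions, and each eye gets an off-axis (asymmetric) frustum aimed at the physical screen. Below is a minimal sketch of that idea in Python, assuming a screen centred at the origin; all the function names and numbers are made up for illustration and aren't the actual zSpace SDK.

    ```python
    import numpy as np

    def eye_positions(head_pos, ipd=0.064):
        # Offset the tracked head position by half the interpupillary
        # distance along the screen's x axis to get the left/right eyes.
        half = np.array([ipd / 2.0, 0.0, 0.0])
        return head_pos - half, head_pos + half

    def off_axis_frustum(eye, screen_w, screen_h, near=0.01, far=100.0):
        # Asymmetric frustum edges at the near plane, for a physical screen
        # centred at the origin in the x/y plane with the viewer on +z.
        dist = eye[2]                  # eye-to-screen distance
        scale = near / dist            # project screen edges onto the near plane
        left   = (-screen_w / 2.0 - eye[0]) * scale
        right  = ( screen_w / 2.0 - eye[0]) * scale
        bottom = (-screen_h / 2.0 - eye[1]) * scale
        top    = ( screen_h / 2.0 - eye[1]) * scale
        return left, right, bottom, top, near, far

    if __name__ == "__main__":
        head = np.array([0.05, 0.10, 0.60])   # tracked head position, metres
        for eye in eye_positions(head):
            print(off_axis_frustum(eye, screen_w=0.52, screen_h=0.29))
    ```

    That's presumably why every app needs its own plugin: the per-eye projection changes every frame as your head moves, so the app's normal fixed camera can't be reused as-is.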