again, not something that needs to be "implemented" in code.. we do this, it's part of the rig, no programmers required.
it's just a matter of whether it's important to your game. crying eyeball tear tech is great for a game that involves you doing tons of crying and twitchy eyeball movements close up on camera.. but the reason we haven't seen much of this is that it's not something that needs to be done realtime, and i can't see a whole lot of gameplay possibilities it creates.. you can prerender the crying crazy lady cinematic, cause i don't think i want to actively participate in crazy girl crying.. she can go crazy and cry, but i am going to do the same thing in the game i would do in real life.. sit there, watch her go ape shit, and think she is being ridiculous. I don't think that needs to be realtime.. it can be completely canned, not anything to participate in, besides maybe leaving the house so i don't have to listen to the BS.
[ QUOTE ]
besides maybe leaving the house so i don't have to listen to the BS.
[/ QUOTE ]
Ah man, laughed hard at that!
I agree with most folks here: while it looks decent, it's by no means realistic enough to pass the valley. The eye demo looked pretty weak to me in terms of realism. The movement is good; the rest is very CG. The advancement is nice, but it doesn't seem to be the final step in getting out of the valley.
As far as whether or not a girl crying and emotionally freaking out in real time is needed... I'll hold judgment until the game is out. This is the company that did the Indigo Prophecy so I'm definitely curious to see what the end product is like.
I think a crying non-cutscene character could work well if implemented right. I remember how interesting it was in Half Life 1 when the scientists were turning their heads to follow me while they went through their lines.
It fell apart real fast, though, when I started jumping around the room and they just continued to read through their lines. If you could give the character enough AI and variation that she reacts to my disinterest by taking it as an affront to her vulnerability, that could get real interesting. Her gaze could follow me around the room only some of the time, not locked on 100% and mechanical.
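Purely as an illustration of that "only some of the time" idea (nothing from the actual game, just a sketch with made-up class names and numbers), a gaze controller along these lines could periodically decide whether to track the player at all, and ease toward its target instead of snapping:
[ CODE ]
import random

class GazeController:
    """Toy gaze model: the character tracks the player only part of the time,
    re-deciding every couple of seconds, and eases toward the target instead
    of snapping, so the attention never reads as locked-on and mechanical.
    All names and numbers here are made up for illustration."""

    def __init__(self, attention=0.6, decision_interval=2.0, ease=0.15):
        self.attention = attention              # chance of tracking per decision
        self.decision_interval = decision_interval
        self.ease = ease                        # fraction of the gap closed per update
        self.tracking = True
        self.timer = 0.0
        self.gaze = (0.0, 0.0)                  # current yaw/pitch in degrees

    def update(self, dt, player_angles, idle_angles=(0.0, 0.0)):
        self.timer += dt
        if self.timer >= self.decision_interval:
            self.timer = 0.0
            self.tracking = random.random() < self.attention
        target = player_angles if self.tracking else idle_angles
        self.gaze = tuple(g + (t - g) * self.ease
                          for g, t in zip(self.gaze, target))
        return self.gaze

gc = GazeController()
for frame in range(5):
    print(gc.update(0.033, player_angles=(25.0, -5.0)))
[/ CODE ]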
yeah i think i am being way too cynical, but that's just me. it just really pisses me off when companies make grandiose claims and don't back them up.. no, it doesn't pass the uncanny valley in any way, shape, or form; it doesn't just look fake, it looks wrong, therefore they fail.. but if they didn't go around stating they cured cancer, world hunger, aids, and the common cold, i would think, hey, neat eyeballs.. it's one thing to make something cool, it's another to blow hot steam out of your ass and call it a magic rainbow.
Sure. That's just marketing though. All the complaining isn't going to change the way Fifth Avenue works. All the discussion about it just cements the brand name further into the public consciousness, and today's bad feelings tend to soften over time. Human nature. What was that saying... any publicity is good publicity...
Real car:
Granted Gran Turismo focuses on cars... the in-game visuals are very close to photorealism. One of the major things that give it away is the overall image clarity of the game.
http://ca.youtube.com/watch?v=GT0W1PLTmN0
Huh Huh? Come on
I couldn't agree more, Soul. I think the suspension of disbelief for a lot of games like that, sports titles included, is broken due to image clarity. It lacks the imperfection needed to sell the realism.
I haven't watched the latest clip, it's still loading, but I just watched the old youtube link, and how can anyone be proud of that?
EDIT: Just watched a pair of eyes and kept waiting to see the rest of the face react. Hopefully they fixed the parts of their tech that cause the "whinny like a horse" animations to play whenever the character is talking.
[ QUOTE ]
again not something that needs to be "implemented" in code.. we do this, its part of the rig, no programmers required.
[/ QUOTE ]
Please correct me if I'm wrong, but your engine still has to read that rig, right? As far as I understand it, eye controls are typically done using look-at constraints, with some sort of dummy controlling where the eyes are looking. It seems like with the addition of a Max (or whatever program the rigging is being done in) constraint, you've gone beyond the regular ole IK/FK bone setup, and it may require a programmer to integrate it and to make your exporter recognise it. Now I doubt that's really a big deal, but just adding whatever you want to the rig doesn't guarantee the engine is going to know what to do with it.
Since it's already part of the rig you're working with, I assume that sort of thing has already been worked through and integrated into the engine and exporters you're using. If you're working on a game with an engine that is still in development, then there's really no guarantee that ANYTHING will work without the attention of a programmer. I seriously doubt the artists and animators that worked on Crysis sat around with their thumbs up their asses until the engine was completely finished and every desired feature was running smoothly.
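For what it's worth, the math the engine side has to reproduce for a basic look-at constraint is small. Here's a minimal sketch (Python, with made-up axis conventions and clamp limits, not any particular engine's or exporter's API) of the aim calculation both the rig and the runtime would have to agree on:
[ CODE ]
import math

def eye_look_at(eye_pos, target_pos, max_yaw_deg=40.0, max_pitch_deg=30.0):
    """Return (yaw, pitch) in degrees that aim an eye at a world-space target.

    Assumes the eye's rest forward axis is +Z and +Y is up (purely
    illustrative conventions). Angles are clamped to rough anatomical
    limits so the eye can't roll back into the skull.
    """
    dx = target_pos[0] - eye_pos[0]
    dy = target_pos[1] - eye_pos[1]
    dz = target_pos[2] - eye_pos[2]

    # Horizontal angle around the up axis, vertical angle from the horizon.
    yaw = math.degrees(math.atan2(dx, dz))
    pitch = math.degrees(math.atan2(dy, math.hypot(dx, dz)))

    # Clamp so the constraint stays inside believable eye rotation limits.
    yaw = max(-max_yaw_deg, min(max_yaw_deg, yaw))
    pitch = max(-max_pitch_deg, min(max_pitch_deg, pitch))
    return yaw, pitch

# Example: eye at the origin, target slightly up and to the right.
print(eye_look_at((0.0, 0.0, 0.0), (0.3, 0.2, 1.0)))
[/ CODE ]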
http://www.ageia.com/flash/heavyrain.swf (best to use the HD download)
my main complaint about this vid is that the lipsyncing is HORRIBLE. Her mouth moves like she's deaf. In terms of emotion and storytelling, though, it's awesome. I've shown this to non-artist folks and it's affected them more than any game I've ever seen.
That last video definitely hasn't crossed the valley. The lip sync is horrible. It's nice to see more of the little twitches & movements that are usually missed, but I definitely have an innate unsettling feeling when I watch it.
That video is nice, kind of a sum-is-more-than-its-parts thing. As far as her mouth is concerned, there just isn't enough articulation in the lip movements, in my opinion. Still, as a whole it's a pretty impressive bit of entertainment, especially for a game following in the footsteps of The Indigo Prophecy.
Yuck, I just saw the HD "casting" vid. Yeah, the lipsync truly is horrible; her lips don't close very often. Ever made an MM or a BUH or a PUH without your lips touching?
As an aside, the end was funny, reminded me of casting tapes we made for one of those 90's FMV CDROM games I worked on. Seeing actors act their hearts out, and asking for the verdict that never comes. "We'll let you know." Heh.
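To put the MM/BUH/PUH point another way: bilabial consonants physically require the lips to touch, so any sync pass that never drives a closed-lips pose on M/B/P frames will read as wrong instantly. A rough sketch of the mapping (hypothetical viseme names, not any real pipeline's):
[ CODE ]
# Bilabial consonants can't be produced without the lips touching, which is
# why open-mouthed "M/B/P" frames read as wrong immediately.
BILABIALS = {"M", "B", "P"}

# Hypothetical viseme names; real pipelines have their own pose sets.
PHONEME_TO_VISEME = {
    "M": "lips_closed", "B": "lips_closed", "P": "lips_closed",
    "F": "lip_bite",    "V": "lip_bite",
    "AA": "jaw_open",   "IY": "wide",      "UW": "pucker",
}

def viseme_track(phonemes):
    """Map a phoneme sequence to viseme poses, defaulting to a neutral pose."""
    return [PHONEME_TO_VISEME.get(p, "neutral") for p in phonemes]

# "Mommy" -> the M frames must land on the closed-lips pose.
print(viseme_track(["M", "AA", "M", "IY"]))
[/ CODE ]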
i guess the fact that this thread made it to page 3 already suggests more studios should claim they've "crossed the barrier" when marketing their projects
If they can pull off a new set of lip sync (quickly) that at least matches the work they've put into the eye movements, I'll feel less inclined to roast them about their two-year-old sad attempt to jump uncanny gorge on a moped. As someone already pointed out (might have been Arsh? too lazy to go digging), facial animations aren't even a side dish; what they are showing is impressive, sure, but not really anything close to a full set of facial expressions. I think they are obsessed with being the first to some mysterious goal that everyone else in the industry could already have gotten to if they thought it was important.
did anyone mention that the tech demo was labeled as a raw mocap dump and realtime 3d? probably more than a little premature to use that as marketing material given all the aforementioned problems... but if it really is a raw dump, then yeah, i'd say it's promising.
[ QUOTE ]
Is this supposed to be in game graphics???
now complain!
this shits AWESOME!
[/ QUOTE ]
and here we have the "realdoll" i was referring to. thanks indian.
joking aside. it does look great. but marketing needs to shut its mouth.
Great mouths..
[ QUOTE ]
now complain!
this shits AWESOME!
[/ QUOTE ]
Sure most of the stills look fine, its the motion and the blending that kill it. Even in this shot her teeth aren't touching her lips. At times her upper lip doesn't touch her lower for entire sentences. Peeling the upper lip over a characters head is a bad way to fake reality.
Gauss, I don't think it was only a raw mocap dump; there are attempts to clean it up. It looks like the mocap they used didn't include fingers, which is pretty common, so the fingers need to be animated by hand. If you watch her hands, they are pretty wooden at times; they did make attempts to toss poses in, but it comes across as pretty static compared to the smooth mocap. I also doubt the face was mocapped, because it has a lot of floating keyframes that carry on far too long at times, a hallmark of someone working on a big file and not paying attention to curves. The mixed quality leads me to believe they used some automated sync tool and then attempted to clean it up, only concentrating on key parts.
Long story short, it's not just a raw dump, but it's pretty close to it. Hopefully in the two years since, they've managed to hire a one-fingered chimp to clean up their mocap. But if in those two years all they have to show is a pair of eyes... it's going to be a long time before we see anything playable.
Now, if it was a raw realtime dump and they had the actress in the shot with her 3D counterpart reacting in real time to her movements, I could bow in awe. But what they showed was a half-assed attempt at crappy mocap cleanup.
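As a side note on the "floating keyframes" observation above: spans where a curve barely moves for a long stretch are easy to flag automatically. A rough heuristic sketch (made-up thresholds, not any studio's actual cleanup tool):
[ CODE ]
# Rough heuristic: flag spans in an animation curve where the value barely
# changes for a long stretch ("floating" keys left by an auto-sync pass),
# as candidates for manual cleanup.

def floating_spans(keys, min_duration=0.5, max_delta=0.05):
    """keys: list of (time_seconds, value) sorted by time."""
    spans = []
    start = 0
    for i in range(1, len(keys)):
        if abs(keys[i][1] - keys[start][1]) > max_delta:
            if keys[i - 1][0] - keys[start][0] >= min_duration:
                spans.append((keys[start][0], keys[i - 1][0]))
            start = i
    if keys and keys[-1][0] - keys[start][0] >= min_duration:
        spans.append((keys[start][0], keys[-1][0]))
    return spans

# A jaw-open channel that sits nearly still for over a second mid-phrase:
curve = [(0.0, 0.2), (0.1, 0.6), (0.2, 0.61), (1.3, 0.6), (1.4, 0.1)]
print(floating_spans(curve))  # -> [(0.1, 1.3)]
[/ CODE ]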
[ QUOTE ]
Forget about uncanny valley and make a fun game.
[/ QUOTE ]
amen to that.. let's hope all this focus on the tech doesn't distract them from the main task, and that they use it as an enhancement rather than the main show.
Her teeth aren't shadowed properly. I noticed that when this first came out.. The lighting in the shot above is very diffuse, like daylight through cloud cover - but the source (reflected in her eyes) is a spotlight. Our brains pick up that stuff straight away, whether we're conscious of it or not.
vig, i was referring to the newest one with the tight shot of the eyes. of course the older actress-in-the-kitchen clip is in no way a raw dump. probably the new one isn't either, but it's labeled as such in the video.
and come on about the "just make a fun game" comment. there are a lot of professionals in this thread; we know that games don't work that way. there are artists who do art and there are designers who design. the people working on these characters are just trying to do their jobs the best they can; less emphasis on visuals doesn't somehow magically make a game more fun, because fun is the job of different people.
sure maybe they're putting too much of a high-level emphasis on this performance capture stuff, but so what? they've already got results that look stronger than a lot of the robert zemeckis stuff. if that's how they want to spend their money, fine.
[ QUOTE ]
[ QUOTE ]
Forget about uncanny valley and make a fun game.
[/ QUOTE ]
amen to that.. let's hope all this focus on the tech doesn't distract them from the main task, and that they use it as an enhancement rather than the main show.
[/ QUOTE ]
If the game is about an emotional, character-driven story, isn't it vital to the main show to have this tech?
The emotional, character-driven story should be the main show, enabled by the tech. I hope they don't end up with super-real characters doing and saying stuff I don't really care about. Having said all that, the preview demo is really well done, so maybe it's only the marketing guys with the fixation on the valley.
Considering most studios are well on their way to beta in the time it has taken this studio to crank out two tech demos, I really hope they aren't wasting their time trying to bridge a gap when they could have just cleaned up what they had, and it would have delivered the same emotional punch in a much more realistic way.
It should have taken 2-6 weeks, not 2 years, to clean up that scene, and all they have to show for it is a close-up of some eyes? Not winning me over with that. Sure, it was an improvement, but they could have improved the whole thing...
Someone has a bruised ego and is trying to make a point. I hope they don't forget why they started down this path, and I hope they don't forget about their investors while they chase after some pointless goal. Think back to any game that championed tech this much, then think about how long you cared about that tech while you were playing. For me it's roughly 10 minutes before I get used to it and it blends into the game's landscape. If they want to keep people playing, they'll need more substance, not less. They've taken a tech step forward but a substantial leap backward in content.
When I look back over their E3 trailer, the eyes are one of the last things I would have chosen to touch up, so why did they choose them first? Why fix your strong point when there is so much more that lags behind? Show me impressive lip sync with realistic hand animation, show me facial emotions that blend correctly, show me they've learned how to mix story and tech together to make a playable game, show me characters reacting in real time to random input from a player. Then I'll be impressed. If all they are doing is trying to make their cutscenes realtime, they haven't bridged any gaps; plenty of games have chosen to render cutscenes in-engine, choosing gritty over pretty.
Was the eye twitch video cool? Sure, but how interactive was that?
Do customers care if it was running realtime? Maybe a few diehards who piss away their days debating which tech is better, 360 or PS3.
Was it a necessary improvement? The jury is still out...
Will it bring them closer to their goal of selling a story? Maybe, but if they don't have a story, they'll end up in the bargain bin...
Vig, they are trying so hard to get their "cut-scenes" realtime because the entire game is a cut-scene. Much like Indigo Prophecy, the game is more akin to an interactive movie than a standard action game, and the major point of the game is your interaction with the characters and the emotions you feel when dealing with them.
There isn't a lot of room for gameplay innovation in a genre where the focus isn't on the gameplay. Sure, you walk around and interact with things, but that's as complicated as it gets. Any action sequences are handled by timed button presses (i.e., a left arrow pops up on screen, you press left, and you watch as your character dodges a punch, jumps through the doorway, and knocks out the enemy).
I understand the initial resistance, simply because they claimed to be perfect and clearly aren't there yet, but if you guys haven't played Indigo Prophecy I highly recommend it; it's an incredible game. Just be sure to turn the game off at the first appropriate ending and skip the atrocity of an ending that they tacked on for more gameplay hours.
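As a minimal illustration of that timed-button-press structure (a toy console sketch, not how the studio actually implements it; input() stands in for polling a controller):
[ CODE ]
import random
import time

# Hypothetical prompt set; a real game would map these to controller input.
PROMPTS = ["LEFT", "RIGHT", "UP", "DOWN"]

def quick_time_event(window_seconds=1.5):
    """Show a prompt and succeed only if the matching input arrives in time."""
    prompt = random.choice(PROMPTS)
    print(f"Press {prompt}!")
    start = time.time()
    response = input()  # stand-in for polling a gamepad
    elapsed = time.time() - start
    if response.strip().upper() == prompt and elapsed <= window_seconds:
        return "dodge_punch_animation"   # canned success clip plays
    return "take_hit_animation"          # canned failure clip plays

print(quick_time_event())
[/ CODE ]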
I think someone has to make a whole new rig that really emulates the muscle structure found in a face. All I see is a hole that is supposed to be a mouth opening and closing while audio plays. Granted, it's not *that* bad, but it is still jarring to see. Her lips don't appear to move over her teeth, and wrinkles don't form as she speaks.
So if these people are going to make groundbreaking tech, then at least make a rig that can express it well.
[ QUOTE ]
When I look back over their E3 trailer, the eyes are one of the last things I would have chosen to touch up, so why did they choose them first? Why fix your strong point when there is so much more that lags behind?
[/ QUOTE ]
To me, this is how the whole production seems to be working right now: too many execs with no real focus or ideas, focusing on things that, in the grand scheme of things, don't really matter but seem like a showstopper that particular week.
So we get into situations where, like, a week or two is spent tweaking pore size on our characters when the whole animation pipeline sucks and the chars look like stickmen walking with their guns up their asses.
Roll on next week, and it's the way the belt buckle casts an unrealistic reflection.