Looks like something you'd get out of Autodesk 123D Catch, even the artifacts and jagginess on smaller objects. Not to mention it's also a cloud-based scanning method.
I hate techno-babble; they could easily state 10-millimetre accuracy or 16^2 voxels per inch. I'd also like to hear the texture density, but those really only matter once they have a working game demo level.
Look at the edges of those manmade objects like the paint can... It's all wobbly! Looks awful. I noticed the 'realism level' as well, what a load of shit.
I hate techno-babble; they could easily state 10-millimetre accuracy or 16^2 voxels per inch. I'd also like to hear the texture density, but those really only matter once they have a working game demo level.
Pssst, they don't have texture density; voxels instead of pixels!
So I did some rummaging - they've literally made no progress at all since last year; all they've done is scan in some static scenes and render them the same way they always have done.
They're still totally CPU bound, too data intensive to actually compile any real scene, and they still can't even light or shadow their geometry (it's all baked into the voxel colour); let alone handle anything like rudimentary animation. So far they haven't even demonstrated that they can translate or rotate a single object within the scene.
I literally think that they've done nothing other than plug in a new scene.
What garbage. I've seen university students make more progress in less time than this supposedly professional team of engineers.
So I did some rummaging - they've literally made no progress at all since last year; all they've done is scan in some static scenes and render them the same way they always have done.
What they have done is set up a business selling their tech to display large-scale laser-scan data for building and real estate businesses and the like. That would appear to be a far more plausible use of the software they've put together.
Ooooh... first that cardboard-box scene looking just like a bunch of 123catch captures, and now this stuff about real estate visualisation, Jack?
It *really* sounds like the artists they hired are basically there to clean up 123catch "scan" data. (Not scans at all, actually - more like semi-automated photogrammetry... working okay for organics, but in need of complete re-modelling for architectural assets.)
I suppose they are thinking that by "scanning" everything in they can get something somehow realistic - even though it will obviously shade like shit and has no dynamic lighting whatsoever. But even if it looks like crap to us (and rightfully so - by now it's even more obvious that their past claims of competing with solid current engines were totally BS), this kind of dry photo-based representation seems quite appropriate for real estate agents.
Now the irony is that all that stuff is already available in 3DCoat today
So basically... they are now totally giving up the whole instanced rainbow-island unlimited detail thing in order to build... some kind of realtime voxel 3D walkthrough tool?
It's covered in the link that Equil posted on the previous page http://www.euclideon.com/
As a tool for compressing, networking and displaying ultra high detail LiDAR information, it's probably pretty handy.
Based on the interview, I think they're still noodling with their voxel game engine, but by the sounds of things they've realised that if it's possible to make anything worthwhile at all, it's going to take a lot of time, work and money.
Yeah totally - but they cannot help it; they still love their trademark BS claims:
1 million point cloud atoms per virtual cubic inch
Even though they seem to be going for a more practical application of their tech now... I still wouldn't trust them in the slightest. I mean, their whole brochure is like that: full of vapid statements that sound big ("Compress your data!" "Drag and drop your data!" "No need to copy to every computer!") yet anyone in their target audience knows this is all meaningless. It is as if they were marketing a super-high-end solution to people who have never used a computer for anything other than Facebook.
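For what it's worth, that "1 million point cloud atoms per virtual cubic inch" line is easy to sanity-check. Assuming the atoms are uniformly spaced (my assumption - they never say), the headline number works out to a fairly ordinary point spacing:

```python
# "1 million point cloud atoms per virtual cubic inch":
# on a uniform grid that's the cube root per linear inch.
points_per_cubic_inch = 1_000_000
points_per_linear_inch = round(points_per_cubic_inch ** (1 / 3))  # 100
spacing_mm = 25.4 / points_per_linear_inch                        # 0.254 mm

print(f"{points_per_linear_inch} points per linear inch")
print(f"~{spacing_mm:.3f} mm between points")
```

So the big-sounding figure is just another way of saying roughly quarter-millimetre point spacing - dense, but nothing a decent laser scanner can't already produce.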
If I were them, I'd have built a server side solution for handling a really high octane, high detail scan for use in something like arch-viz, and the software to funnel the rendered image down the web streamed as a web application. That kind of thing is viable.
As far as I know, and when sifting through their BS, that is exactly what they're doing, data is contained on a central computer, and streamed to the user.
As far as I know, and when sifting through their BS, that is exactly what they're doing, data is contained on a central computer, and streamed to the user.
Lol, you must be covered in shit now to have sifted through and gotten that nugget. :poly124:
They may actually be of some use if that turns out.
I always feel the urge to comment on these YouTube videos, but when I try I just give up after a few sentences. I mean, people are so gullible it's ridiculous. I can't even muster the strength to argue with them.
I might not be as technical as many of the people in this thread, but it's clear as day that it's fishy. I think everyone has pointed out the flaws of the videos (like how they lack any technical information at all, how he doesn't even try to prove the counter-arguments wrong, etc.).
I mean, in the long interview with that biased reporter (or whatever he is) they actually talk about Notch's and Carmack's statements, and the only thing he replies with is "one guy said it can't be done and the other guy says it has been done a million times before" and then laughs it off. What kind of answer is that?
TL;DR: I'm getting pissed off watching these videos.
Lol, you must be covered in shit now to have sifted through and gotten that nugget. :poly124:
They may actually be of some use if that turns out.
Lol, quite.
The thing is, what they're doing is nothing new; I've seen plenty of point-cloud-based apps, and there are quite a few used in architecture and land surveying.
Sometimes a building's old blueprints are missing, and as a land surveyor you need to measure the inside and outside of the building. The outside is most easily done with a tool that measures the distance to points via laser and just plots the points in 3D space inside an app. Pretty much exactly what this app does.
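Converting one of those laser measurements into a 3D point is just trigonometry - a minimal sketch of the idea, with function and variable names of my own invention (real instruments add calibration, instrument height, and coordinate-system transforms on top):

```python
import math

def survey_point(distance, azimuth_deg, elevation_deg):
    """Convert a laser distance plus horizontal/vertical angles
    into an (x, y, z) point relative to the instrument."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    horizontal = distance * math.cos(el)   # ground-plane component
    x = horizontal * math.sin(az)          # east
    y = horizontal * math.cos(az)          # north
    z = distance * math.sin(el)            # height
    return (x, y, z)

# A wall corner 10 m away, due north, level with the instrument:
print(survey_point(10.0, 0.0, 0.0))  # -> (0.0, 10.0, 0.0)
```

Sweep the laser across the building, collect thousands of these points, and you have exactly the kind of point cloud these apps display.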
I always feel the urge to comment on these YouTube videos, but when I try I just give up after a few sentences. I mean, people are so gullible it's ridiculous. I can't even muster the strength to argue with them.
I might not be as technical as many of the people in this thread, but it's clear as day that it's fishy. I think everyone has pointed out the flaws of the videos (like how they lack any technical information at all, how he doesn't even try to prove the counter-arguments wrong, etc.).
I mean, in the long interview with that biased reporter (or whatever he is) they actually talk about Notch's and Carmack's statements, and the only thing he replies with is "one guy said it can't be done and the other guy says it has been done a million times before" and then laughs it off. What kind of answer is that?
TL;DR: I'm getting pissed off watching these videos.
Paraphrasing here: "We sort through the hard drive, temporarily send the data to RAM, and then straight to the screen. Most will tell you this is impossible!"
Feel free to correct me if I'm wrong, but isn't this how most, if not all, software operates?
Seems like they found their niche. Cool demo, nice to see them actually apply UD to something xD
+1
The games industry just wasn't a good fit. Or at least their message and "product", as packaged, just weren't right.
"Ello game industry! Bow to us! Redesign everything, throw it all out, engines, tools and hardware, toss it all out. Then redo everything from scratch, now give us lots of money..."
They obviously put a lot of years into developing this stuff and I'm glad they found their niche.
Paraphrasing here: "We sort through the hard drive, temporarily send the data to RAM, and then straight to the screen. Most will tell you this is impossible!"
Feel free to correct me if I'm wrong, but isn't this how most, if not all, software operates?
Coming back to this; it's how Euclideon operates: they spout a bunch of factual nonsense and misinformation in the desperate hunt for investors or new markets. They also had that funny chart of processor speeds vs storage space that ignored the fact that processors started achieving more per clock cycle than they did before, and the wider use of multiple cores - and that if we want to make the exponential comparison, we should look at transistor counts.
Every time they pitch, they claim that everyone in the field has been doing it wrong all along and that their solution will solve everything. They also neglect to mention what it cannot do - only that it does everything.
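The point about clock speed being a misleading proxy can be made concrete: rough effective throughput scales with clock x instructions-per-cycle x cores, so comparing raw GHz across a decade (as their chart did) badly understates CPU progress. The numbers below are hypothetical, purely for illustration - not real benchmarks:

```python
def effective_throughput(ghz, ipc, cores):
    """Very rough: billions of instructions per second."""
    return ghz * ipc * cores

# Hypothetical mid-2000s single-core chip vs a 2013-era quad-core:
old = effective_throughput(ghz=3.0, ipc=1.0, cores=1)   # 3.0
new = effective_throughput(ghz=3.5, ipc=4.0, cores=4)   # 56.0

print(f"clock speed grew {3.5 / 3.0:.2f}x")   # ~1.17x
print(f"throughput grew {new / old:.2f}x")    # ~18.67x
```

Nearly flat GHz, wildly different actual performance - which is exactly why their speed-vs-storage chart was misleading.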
Kinda cool-looking tool, a better use for Unlimited Detail than games, but I've seen 3D scan videos that seem to deal with just as massive point-cloud data without issues; I'm sure there are more apps than the "slow" ones they demoed.
[start speculatively flippant comment]You guys do realise that they don't actually want to do anything with this other than sell the tech to Google for Street View right?[/end]
they also had that funny chart of processor speeds vs storage space that ignored the fact that processors started achieving more per clock cycle than they did before, and the wider use of multiple cores - and that if we want to make the exponential comparison, we should look at transistor counts.
I suspect they're still single threaded and heavily CPU bound anyway.
Paraphrasing here: "We sort through the hard drive, temporarily send the data to RAM, and then straight to the screen. Most will tell you this is impossible!"
Feel free to correct me if I'm wrong, but isn't this how most, if not all, software operates?
What I got from the video was this:
Instead of reading a very large file off the hard drive, and storing the whole thing in RAM, they can index the file and pull smaller parts of it to RAM, or something. I dunno.
Not sure if they are talking out of their asses or not, but I do think they are better suited in that industry than the games industry. Good for them for finding a niche.
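That reading is plausible, and it's ordinary indexed file access: seek to the records you need rather than loading everything into RAM. A minimal sketch under that assumption (the fixed-size point format here is made up for illustration):

```python
import os
import struct
import tempfile

RECORD = struct.Struct("<fff")  # one point: x, y, z as 32-bit floats

def read_points(path, start, count):
    """Read only `count` records starting at record `start`,
    instead of loading the whole file into RAM."""
    with open(path, "rb") as f:
        f.seek(start * RECORD.size)          # jump straight to the slice
        data = f.read(count * RECORD.size)
    return [RECORD.unpack_from(data, i * RECORD.size)
            for i in range(len(data) // RECORD.size)]

# Write a 100,000-point file, then pull two points from the middle.
path = os.path.join(tempfile.mkdtemp(), "points.bin")
with open(path, "wb") as f:
    for i in range(100_000):
        f.write(RECORD.pack(float(i), 0.0, 0.0))

print(read_points(path, 50_000, 2))
# -> [(50000.0, 0.0, 0.0), (50001.0, 0.0, 0.0)]
# Only 24 bytes were read, not the ~1.2 MB file.
```

Scale the same idea up with a spatial index (e.g. an octree) over the file, so the renderer only ever touches the blocks the current view needs, and "bigger than RAM" datasets stop being impressive.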
3D graphics innovator John Carmack says the technology is potentially feasible in the near future: "[N]o chance of a game on current gen systems, but maybe several years from now."
Last night I asked Carmack why he thinks it'll take 5 years for something like the Unlimited Detail demo to be feasible, and he kindly explained, also outlining his own plans for a voxel renderer:
"You can real time ray trace a static world on high end hardware today as a demo, but there is a long path between a demo and something that is competitive with rasterization in a real product. My plan for such technologies has always been to emit a depth buffer as well as color from the voxel/point cloud renderer and continue to use existing technologies for characters/particles/etc.
"It took us five years to go from a megatexture demo to a (almost) shipping game."
So in 5 years, we may see something like Unlimited Detail... only there's a good chance it'll come from Carmack's id Software. (I've re-titled this post to reflect these updates.)
In five years (probably less) we will have the first quantum computers, and I know that will be a game changer for realism, but it will be decades before games actually benefit from them.
Interesting times we live in :thumbup:
PS: I still have my working ZX Spectrum 16K & 48K computers :poly124:
If I'm not mistaken, that quote is probably something like five years old now. There are some pretty impressive voxel and hybrid renderers out there - this isn't one of them.
Still comparing this tech to video games when they don't have reflections, lighting, shadows, or any sort of Fresnel or specular. It's fullbright textures: the game.
I love the quote at 4:08 "A lot of professionals on the internet said our claims were impossible, but they were wrong and now our technology is being used by some of the world's biggest companies!"
It's like they understand how shady they look and sound, but then say "It's alright, don't worry about it!"
They said "Here is some real-life footage", and as soon as I saw it I thought, "This has been rendered; this is not real life." It's pretty easy to spot, to be honest.
I love how he uses DSA (or Realms of Arkania) - Shadow over Riva for the build-up (I think it's Riva)... where did he find that game?
Still looks shady. I mean, they show a camera flying through a static voxel environment that looks alright on the inside... and horrible on the outside. Honestly, I think current engines do a decent job. Scan data is nice... but scan data of an alien spaceship or a city in a Final Fantasy game might be hard to come by.
Replies
These guys really love their technobabble
I hate techno-babble; they could easily state 10-millimetre accuracy or 16^2 voxels per inch. I'd also like to hear the texture density, but those really only matter once they have a working game demo level.
Of course!
That's what I've been doing wrong with my scenes! I just didn't turn it up high enough.
Pssst, they don't have texture density; voxels instead of pixels!
I know right?! Don't you hate it when that happens, although mine caps out at 60%, what about you?
They're still totally CPU bound, too data intensive to actually compile any real scene, and they still can't even light or shadow their geometry (it's all baked into the voxel colour); let alone handle anything like rudimentary animation. So far they haven't even demonstrated that they can translate or rotate a single object within the scene.
I literally think that they've done nothing other than plug in a new scene.
What garbage. I've seen university students make more progress in less time than this supposedly professional team of engineers.
Wait...
No, that's just a post-it note stuck to my monitor.
It *really* sounds like the artists they hired are basically there to clean up 123catch "scan" data. (Not scans at all, actually - more like semi-automated photogrammetry... working okay for organics, but in need of complete re-modelling for architectural assets.)
I suppose they are thinking that by "scanning" everything in they can get something somehow realistic - even though it will obviously shade like shit and has no dynamic lighting whatsoever. But even if it looks like crap to us (and rightfully so - by now it's even more obvious that their past claims of competing with solid current engines were totally BS), this kind of dry photo-based representation seems quite appropriate for real estate agents.
Now the irony is that all that stuff is already available in 3DCoat today
So basically... they are now totally giving up the whole instanced rainbow-island unlimited detail thing in order to build... some kind of realtime voxel 3D walkthrough tool?
Lol
http://www.euclideon.com/
As a tool for compressing, networking and displaying ultra high detail LiDAR information, it's probably pretty handy.
Based on the interview, I think they're still noodling with their voxel game engine, but by the sounds of things they've realised that if it's possible to make anything worthwhile at all, it's going to take a lot of time, work and money.
Even though they seem to be going for a more practical application of their tech now... I still wouldn't trust them in the slightest. I mean, their whole brochure is like that: full of vapid statements that sound big ("Compress your data!" "Drag and drop your data!" "No need to copy to every computer!") yet anyone in their target audience knows this is all meaningless. It is as if they were marketing a super-high-end solution to people who have never used a computer for anything other than Facebook.
Lol, you must be covered in shit now to have sifted through and gotten that nugget. :poly124:
They may actually be of some use if that turns out.
Now it's just embarrassing.
But who knows, maybe one day they'll surprise us.
I might not be as technical as many of the people in this thread, but it's clear as day that it's fishy. I think everyone has pointed out the flaws of the videos (like how they lack any technical information at all, how he doesn't even try to prove the counter-arguments wrong, etc.).
I mean, in the long interview with that biased reporter (or whatever he is) they actually talk about Notch's and Carmack's statements, and the only thing he replies with is "one guy said it can't be done and the other guy says it has been done a million times before" and then laughs it off. What kind of answer is that?
TL;DR: I'm getting pissed off watching these videos.
Lol, quite.
The thing is, what they're doing is nothing new; I've seen plenty of point-cloud-based apps, and there are quite a few used in architecture and land surveying.
Sometimes a building's old blueprints are missing, and as a land surveyor you need to measure the inside and outside of the building. The outside is most easily done with a tool that measures the distance to points via laser and just plots the points in 3D space inside an app. Pretty much exactly what this app does.
Get this: http://www.tannr.com/herp-derp-youtube-comments/
hehehe, no
I think you just saved my life.
[ame="http://www.youtube.com/watch?v=Irf-HJ4fBls"]Euclideon Geoverse 2013 - YouTube[/ame]
Feel free to correct me if I'm wrong, but isn't this how most, if not all, software operates?
I like how he is "flying around in realtime" without touching the mouse ;-)
+1
The games industry just wasn't a good fit. Or at least their message and "product", as packaged, just weren't right.
They obviously put a lot of years into developing this stuff and I'm glad they found their niche.
If this is half as nice as in the demo, it's really a useful tool for the industry.
Coming back to this; it's how Euclideon operates: they spout a bunch of factual nonsense and misinformation in the desperate hunt for investors or new markets. They also had that funny chart of processor speeds vs storage space that ignored the fact that processors started achieving more per clock cycle than they did before, and the wider use of multiple cores - and that if we want to make the exponential comparison, we should look at transistor counts.
Every time they pitch, they claim that everyone in the field has been doing it wrong all along and that their solution will solve everything. They also neglect to mention what it cannot do - only that it does everything.
I suspect they're still single threaded and heavily CPU bound anyway.
Every year for the last 10 years they've gone "but we're working on hardware-accelerating it!"
What I got from the video was this:
Instead of reading a very large file off the hard drive, and storing the whole thing in RAM, they can index the file and pull smaller parts of it to RAM, or something. I dunno.
Not sure if they are talking out of their asses or not, but I do think they are better suited in that industry than the games industry. Good for them for finding a niche.
http://nwn.blogs.com/nwn/2011/08/is-the-future-of-immersive-3d-in-atoms-euclideoncom.html
In five years (probably less) we will have the first quantum computers, and I know that will be a game changer for realism, but it will be decades before games actually benefit from them.
Interesting times we live in :thumbup:
PS: I still have my working ZX Spectrum 16K & 48K computers :poly124:
https://www.youtube.com/watch?v=5AvCxa9Y9NU
It's like they understand how shady they look and sound, but then say "It's alright, don't worry about it!"
https://www.youtube.com/watch?v=BpT6MkCeP7Y
I'll believe it when it runs on my PC.
Still looks shady. I mean, they show a camera flying through a static voxel environment that looks alright on the inside... and horrible on the outside. Honestly, I think current engines do a decent job. Scan data is nice... but scan data of an alien spaceship or a city in a Final Fantasy game might be hard to come by.