Two years ago I read a book about cloud computing, the big server complexes being built in Washington state, and how it's all going to change the world by making it so that all you need is a keyboard and a monitor (or the mobile equivalent) to do all your computing.
This book was frightening, because it means every office building can cut its operating costs by getting rid of all its computers and most of its networking.
All those IT jobs gone, all those computer sales gone, and, for Asia, all those manufacturing jobs gone.
Well at least computers will survive as a niche market for gamers, right?
http://www.nvidia.com/object/cloud-gaming.html
and the worst part is this...
The cable that runs from your graphics card to your monitor transmits data at almost 1.25 gigabytes/sec. An average cable service here in America offers data rates of up to 12.5 megabytes/sec.
So what is 1% of 1920 x 1080 @ 30 fps?
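To put numbers on that rhetorical question, here's a quick back-of-the-envelope sketch in Python (assuming 24-bit color and zero compression; the two link speeds are just the figures quoted above):

```python
# Rough bandwidth math for raw 1080p30 video, assuming 24-bit color
# and no compression at all.
width, height, fps = 1920, 1080, 30
bytes_per_pixel = 3  # 24-bit RGB

raw = width * height * bytes_per_pixel * fps  # bytes/sec of uncompressed video
display_link = 1.25e9   # ~1.25 GB/s, the graphics-cable figure above
broadband = 12.5e6      # ~12.5 MB/s, the cable-internet figure above

print(f"uncompressed 1080p30: {raw / 1e6:.0f} MB/s")         # ~187 MB/s
print(f"times more than broadband: {raw / broadband:.0f}x")  # ~15x
print(f"broadband vs display link: {broadband / display_link:.0%}")  # the 1%
```

Even before you get near the display cable's full rate, raw 1080p30 alone needs roughly fifteen times what that broadband line delivers, which is why every streaming service has to lean so hard on compression.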
The odds of landlines being upgraded at this point are slim to none; it's a massive amount of money to get that done. The little we have in America (which is better off than most countries) was more or less subsidized by the government during the Cold War.
Wireless? Ever seen a map of the broadcast spectrum? It's beyond crowded, it's a mess; just ask the U.S. military.
In summary, I'm sad. :(
Replies
The future is here. Actually, it was here last year.
Back then I got the OnLive Game System, and it came with a free copy of Homefront.
I was playing it with just a keyboard, mouse, and TV. It was pretty cool, to say the least: almost no power usage, pretty much maxed-out graphics, and lag-free multiplayer as well (with the exception of getting the picture to your screen), since the multiplayer games are all hosted on their internal network, "locally."
The bitrate problem you mention does take a toll on the video quality, but I thought it was alright for what it is. You don't really notice it with all the bullets whizzing by when you're focusing on the game. OnLive does a great job with compression and is REALLY bandwidth efficient. Read this: OnLive Desktop (The Verge).
IDK who said it, maybe Sweeney or Carmack, but they theorized that we will eventually get eye-tracking hardware that renders only the pixels you're focusing on at full priority, rather than what's in your periphery. That would be very interesting, especially if you factor in compression and the added potential for graphical fidelity and power savings.
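Nobody ships that yet as far as I know, so here's just a toy estimate of why it would matter. Every number in it (the fovea radius, the peripheral downscale factor) is a made-up assumption for illustration, not any real spec:

```python
import math

# Toy pixel-budget estimate for foveated rendering: render a small
# circle around the gaze point at full resolution and everything
# else at reduced resolution. All figures are illustrative guesses.
width, height = 1920, 1080
total = width * height

fovea_radius = 200   # px of full-resolution circle around the gaze point
downscale = 4        # periphery rendered at 1/4 resolution per axis

fovea = math.pi * fovea_radius ** 2
periphery = (total - fovea) / downscale ** 2
budget = fovea + periphery

print(f"full frame: {total:,} px")
print(f"foveated:   {budget:,.0f} px ({budget / total:.0%} of full)")
```

Even with those generous guesses, the per-frame pixel budget drops to roughly an eighth, which is exactly the kind of headroom that would help both rendering cost and the stream's bitrate.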
That sounds tremendously helpful.
Not to mention more RAM for running tons of apps simultaneously, cloud storage, "thin/portable clients," and even mobile clients.
There is also great potential for software developers, because of the lack of piracy and the ability to have subscription-based business models with instant updates and a consistent target hardware platform. No more problems with drivers, or software not working on a certain series of AMD cards, for example.
Plus, if you read that Verge article I posted, take note of the speed of their internal internet connection: it's INSANELY fast. You could download anything to your cloud computer almost instantly, be it an update for your 3D software or some game you find online.
There will be less IT work, perhaps, in the sense that IT will move over to the server side and have to worry about fewer types of hardware, but that also makes the job easier, with instant, direct technical-support access to customers via screen sharing and other benefits.
I see so many benefits, I hardly know what you're worried about. The biggest problem I see is privacy concerns.
Well, I spent some time researching it, but from what I can tell it's not really comparable. It's an innovation, sure, but of its own kind. No doubt we'll have more and more devices that cut out the middlemen of certain content-providing hardware, but that doesn't mean the hardware is on the verge of being overthrown.
Now, consoles are one thing, mainly because all they do is provide limited services built on often unstable hardware; we use them because they're an idea from an era of limited technology. Computers, however, are not consoles, and you can't really look at them as middlemen either. They're not just for gaming, or movie watching, or document writing; at this point they're for pretty much everything anyone could ever need. That, and the fact that hardware developers are coming out with new tech all the time, like SSDs, multi-terabyte storage, better video cards, and more RAM, points to something that's doing very well for itself.
Saying that this device (which is no doubt very cool and utilitarian) will replace computers is a bit like saying e-book readers will replace pen and paper. No doubt, though, this device could emulate more and more digital services given time... if given a lot of time.
But, that's just my two cents.
You do recognize that computers haven't been around for that many centuries, and that they've already changed A LOT since they first came up?
We had this discussion yesterday in Polycount chat.
And all this now fits into your pocket, all at once! Unthinkable two centuries ago. The cloud stuff will take over certain parts; heck, Adobe has already started licensing its products via the cloud. It's just a matter of time before you don't need your own machine anymore. It's super useful to the big software suppliers: with this technology they can finally make sure you pay for using their software, because without paying there will simply be no stream for you. Right now it lacks standards, but I assume that's just a matter of time.
Indeed, I understand that, but my point was that computers have reached a certain point of fundamentalness/basicness/intrinsicness. As you've said, they haven't been around for many centuries, and their circumstantial rise had nothing slowing it down at the time. We're also talking about an era when not many people had computers; they were a niche item, where selling one million units was a landmark. Between 1980 and the turn of the millennium, sources such as the International Data Corp released statistics saying fewer than one billion computers were sold: 835 million. According to Gartner Dataquest's figures, the billionth personal computer shipped in April 2002, and the second billion mark was supposedly reached in 2007. We're talking about reaching just 1/7th to 2/7ths of the world in almost 30 years. It's not really niche anymore, and the spread was also relatively very slow.

You're talking about throwing a market that will clearly have access to the entire world in no time flat into the same crevice that computers were in when they first started. And as the trendiest of first-worlders, we're also forgetting the rest of the world in that respect. When you step outside the metropolis, you begin to take a more pragmatic look at things, especially in what are now growing and developing nations, but also in the first world.

So while, yes, I agree things will change, like your picture shows: it fits in your pocket. It's still there, it's still physical. When you're not connected to the internet, it's still there. When you feel like you don't want to upgrade this year, it's still there. There was nothing like the computer before it; it was its own precedent. Yes, it has evolved into smaller formats, but they're all still computers, they're all still hardware.
I'm simply saying that I think people are underestimating buyer choice and suppliers' intentions, as well as overestimating the adoption of new technologies. By the same logic, I don't think matter transporters would be the next logical step in the evolution of transportation, or that they would destroy the automobile market. And I also don't think that, just because humans have evolved to be extremely special relative to nature, our next evolutionary step must be something fundamentally destructive, even if that evolution appears to have abilities beyond what we have now, viewed in a pragmatic light.
That's all.
I think this comic illustrates some of what I'm talking about, even though it's about a different subject.
These work great because input latency isn't a limiting factor; any delay goes unnoticed.
Now take real-time applications: these require real-time input, and even with the best tech put to use, latency will be a constant issue. Some people don't care about latency, but many others will notice even small amounts of it, much like you'd notice if your cursor in Windows wasn't moving where you wanted it to.
There's room for both kinds of tech: low-intensity stuff, or anything whose computations aren't latency-sensitive, will be handled pretty well by cloud computing.
High-intensity, low-latency stuff like gaming will be done on a dedicated machine in your house (see the rough latency budget below), because in reality hardware is cheap compared to the cost of the games you'll end up buying. And the irony in computing is that we moved away from cloud computing and terminals way back, precisely because computers became so cheap we could have them in our homes.
What did you spend the most money on, your console or the games for it? Would cloud gaming have saved you money? Would it have been worth the compressed feed and the latency?
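To make the latency point concrete, here's a rough input-to-photon budget comparison. Every figure is an illustrative assumption, not a measurement of OnLive or anyone else:

```python
# Back-of-the-envelope latency budgets, local vs. cloud rendering.
# All numbers are rough illustrative assumptions, not measurements.

def total_latency(stages):
    return sum(stages.values())

local = {
    "input sampling":  8,   # ms
    "render frame":    16,  # one frame at 60 fps
    "display scanout": 8,
}

cloud = {
    "input sampling":   8,
    "uplink to server": 20,  # one-way network trip
    "render frame":     16,
    "encode video":     10,
    "downlink to home": 20,
    "decode video":     10,
    "display scanout":  8,
}

print(f"local: {total_latency(local)} ms")  # ~32 ms
print(f"cloud: {total_latency(cloud)} ms")  # ~92 ms
```

Even with a decent connection, the network trips plus encode/decode roughly triple the time from input to photons, and that's the one part no amount of server hardware can buy back.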
Actually, laptops have been around quite a while; they just weren't feasible. They never replaced desktops, they just gave people on the go an alternative. Then there were people who needed to do things a laptop would've been overkill for, like checking mail or browsing the web, and then came smartphones.
They're all essentially computers though.
Much like the massive arrival of casual gaming, none of it killed the predecessor, it's a tree-like evolution.
Couldn't agree more.
lol, nope, don't agree!
And what is up with people talking about centuries? ^^ In a century computers will be long gone, at least in the form we know them now.
So in a sense this isn't new; it's just a level-up of the old dumb-terminal network, on a larger geographical scale.
I mainly fear greed-driven development gaining an even stronger foothold than it already has. Sure, games are a business and need profit, but give 'em an inch...
That said, I gotta admit I've also wished Autodesk would offer some kind of cloud-based PLE. Of course, that brings up the problem of who owns your work.
The school my mum works at made the change to this at the start of the year. All the PC labs have only monitors, keyboards, and mice, all linked to one central computer/server. Apparently it's a world of problems, just to save the school a few pounds.
I don't see it coming into effect yet; just look at the backlash against always-on games and the issues it produces.
Indeed, and schools don't have the best track record for problem solving (at least not here)
Who said 20 years? A century is 100 years.
And phones outsold desktops long ago. It's important to look at some basic laws surrounding computers: There will always be a desktop that is cheaper than a laptop, and there will always be a laptop that is cheaper than a tablet.
I can see a future where I can joke around and play Xbox 360 games on an emulator on my lightweight tablet, but when that happens I'll still have to start up my desktop to enter our super-advanced virtual-reality molecular-simulation games.
Of course, in the far future, when we've built Dyson spheres around suns, mastered all there is to master, and changed our own brains, there might no longer be such issues.
Wheels have been around for ages, math is essentially the same, and computers work the same way they did when they first arrived, even though at the hardware level we keep figuring out new and more effective ways to have them do those things.
Computers will not change. Hardware will, software will, but the very basic, fundamental way that computers execute what we want them to execute will stick around. They're advanced calculators, and, much like math, that will not change.
I myself have gone through a few netbooks, but that fact, and the fact that sales for tablets and portables have gone up will not in any way decrease the importance of desktop computers.
eld: that is not the same thing. I hardly know anyone outside of game devs or hardcore PC gamers who owns a moderately modern desktop; everyone I know has switched to laptops. Also, there will not always be a cheaper desktop, because in a declining market the low-end stuff may well disappear, leaving only high-end workstations.
Exactly, and this is already true, we're already doing cloud computing for things like this.
Google search relies on Google to compute results for us; Siri on the iPhone relies on Apple's online hardware to calculate answers.
Polycount relies on a server somewhere deciding what happens to what you just posted; all your browser knows is that it got an HTML page with a box you can type in.
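Here's a minimal sketch of that thin-client pattern using nothing but Python's standard library; it's a toy stand-in, obviously not Polycount's actual stack:

```python
# The "client" (any web browser) only sends a request and displays
# whatever HTML comes back; the actual computation happens server-side.
from http.server import BaseHTTPRequestHandler, HTTPServer

class ComputeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Server-side "computation": the browser never sees how this is done.
        result = sum(n * n for n in range(1000))
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(f"<html><body>{result}</body></html>".encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), ComputeHandler).serve_forever()
```

Point a browser at http://localhost:8000 and it plays the dumb terminal: it has no idea how the number was computed, only that it got an HTML page back.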
You're right, that comic isn't making the point you're trying to make at all. And your point doesn't have much evidence to support it; technology moves quite fast. It takes only about five years for things to change very significantly in the technology world.
Cloud computing is practical, and I don't think it will result in a net loss of jobs; all of those servers still need parts, maintenance, and engineering, and all of the money companies save on IT will go other places. Who lays all the cables needed to network the entire world fast enough for cloud computing to be cost-effective versus local computing? Do you have any idea how many people actually need to be connected? How many parts need to be produced for this infrastructure to be built? I personally believe cloud computing is coming, because it's a safe, logical bet; it's a cost-effective, practical use of resources. But it will take an otherworldly amount of resources to make it happen across the entire world. There will be work for a long time. Roles can be rendered obsolete, but the laws of scarcity remain: everyone will always need things they don't have.
When making games for desktops you have to factor in many kinds of software and hardware setups.
Maybe I am misunderstanding you?
Re: cloud computing in general: I'm not sure I really want it to happen, but in terms of hardware it would be kind of good; the server-side box gets upgraded as new tech arrives, so the end user is always up to date.