
News Flash: Intel Extreme Graphics not so extreme

http://www.news.com/8301-13579_3-9883439-37.html

Looks like game developers aren't the only ones fed up with Intel's crappy chipsets polluting the PC market. Tell me you didn't see this coming...

Replies

  • Sage
    It's an extreme piece of crap for sure. I hate how that thing is just about everywhere now. Graphics chip my ass.

    Alex
  • CrazyButcher
    While I don't like it from the gamedev and "multimedia" point of view, to be fair most desktop PCs are for business / office work... it's not Intel's fault that graphical "gimmicks" are added to the OS, when you could just as well use Win NT4 or whatever for the things you actually do...

    The only reason you can't actually use NT4 and have to adopt new OS versions is contracts and licensing deals, as well as MS dropping support for older stuff...

    More "powerful" hardware often means more power consumption as well...
    I found this article rather interesting with regard to "everyday use":
    http://hubpages.com/hub/_86_Mac_Plus_Vs_07_AMD_DualCore_You_Wont_Believe_Who_Wins
  • ViPr
    I need to know why Intel's graphics chips seem so much slower than the ones from ATI and Nvidia.

    Plus I want to know why Tim Sweeney is praising Intel in this interview:
    http://interviews.teamxbox.com/xbox/2169/Epic-Games-Tim-Sweeney-Interview/p1/

    I have my own theories on all this, but I want to know what other people think.
  • Sage
    CrazyButcher, I see what you are saying about the office thing, but the unpleasant reality is that XP runs terribly with that chip. I blame both Microsuck and Intel for releasing crap. :) That chip is an extreme piece of crap, and the whole BS about integrating graphics chips into the motherboard blows, not because it's not a good idea but because of what these companies end up doing with the components. They grab the crappiest shit they can get away with and sell it for top dollar.

    Alex
  • Mark Dygert
    ViPr, he isn't praising Intel's most common integrated video chipset; he was comparing console CPUs, IBM versus Intel.
    [ QUOTE ]
    Q: "In your opinion, is IBM’s Power architecture better than x86?"
    A: "... Let me be perfectly clear: Intel’s CPUs are much better than IBM’s CPUs. But the economics of that—performance-per-watt and performance-per-dollar are huge part of the decisions for consoles.

    [/ QUOTE ] I bet if you asked him whether he liked the "Intel 82845G EXTREME Graphics controller" over anything put out by Nvidia, he would laugh.
  • ViPr
    I wasn't referring to that section, Vig. At the end of the interview, on the last page, Tim Sweeney was basically saying the CPU makers will make the GPU makers obsolete instead of the other way around. All the other evidence I had been seeing prior to that indicated the opposite would happen. Considering what a major programmer Tim Sweeney is, we have to take him seriously, so I am quite confused now.
  • Fish
    [ QUOTE ]
    I wasn't referring to that section, Vig. At the end of the interview, on the last page, Tim Sweeney was basically saying the CPU makers will make the GPU makers obsolete instead of the other way around. All the other evidence I had been seeing prior to that indicated the opposite would happen. Considering what a major programmer Tim Sweeney is, we have to take him seriously, so I am quite confused now.

    [/ QUOTE ]

    Put it this way: once a CPU grows to 16+ cores, GPUs won't be necessary. You can already see the effects of parallel computing on real-time graphics (the various real-time raytraced demos). All we need now is more cores to add the game logic layer to the computation, and you've got yourself a game that runs entirely on the CPU.
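
    For a rough sense of what that kind of CPU-side parallelism looks like, here's a minimal sketch that splits per-pixel work across however many cores are available. Everything in it (the resolution, the toy shade() function, the row-interleaved split) is made up for illustration, not taken from any real engine.

    [ CODE ]
    // Minimal sketch: spreading per-pixel work across CPU cores with std::thread.
    // Purely illustrative; a real renderer would do far more work per pixel.
    #include <cstdint>
    #include <thread>
    #include <vector>

    static const int kWidth  = 640;
    static const int kHeight = 480;

    // Stand-in for a per-pixel job (a ray cast, a shading computation, etc.).
    static uint32_t shade(int x, int y) {
        return static_cast<uint32_t>(((x * 255 / kWidth) << 16) |
                                     ((y * 255 / kHeight) << 8));
    }

    int main() {
        std::vector<uint32_t> framebuffer(kWidth * kHeight);

        // One worker per hardware thread; each one shades every Nth row.
        unsigned cores = std::thread::hardware_concurrency();
        if (cores == 0) cores = 4;  // fall back if the core count is unknown

        std::vector<std::thread> workers;
        for (unsigned c = 0; c < cores; ++c) {
            workers.emplace_back([&, c]() {
                for (int y = static_cast<int>(c); y < kHeight; y += static_cast<int>(cores))
                    for (int x = 0; x < kWidth; ++x)
                        framebuffer[y * kWidth + x] = shade(x, y);
            });
        }
        for (auto& w : workers) w.join();  // frame is done when all rows are shaded
        return 0;
    }
    [/ CODE ]

    Double the cores and the same loop finishes roughly twice as fast, which is the whole "more cores = software rendering becomes viable" argument in a nutshell.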
  • hawken
    Integrated Intel graphics seems to work fine for me on the Mac mini. At 1080p everything still runs buttery smooth. Admittedly the only 3D game I've run on it is Homeworld 2 (which runs fine). 1080p video and other 3D apps (Cinema 4D) are smooth and fault-free.

    Intel GMA 950, 64MB
  • Mark Dygert
    [ QUOTE ]
    I wasn't referring to that section, Vig. At the end of the interview, on the last page, Tim Sweeney was basically saying the CPU makers will make the GPU makers obsolete instead of the other way around. All the other evidence I had been seeing prior to that indicated the opposite would happen. Considering what a major programmer Tim Sweeney is, we have to take him seriously, so I am quite confused now.

    [/ QUOTE ]

    Ahh, gotcha. I have no problem with CPUs overtaking GPU duties. Intel makes great CPUs, and if they advance to the point that programmers would rather use them instead of GPUs, then it's a happy world we live in. Removing one twitchy piece of hardware that varies widely from the business of making games will be great.

    The problem MS, the industry, and I have is that Intel can't make a GPU to save its life, and they couldn't care less as long as people buy whatever they sell in mass quantities. Hopefully Sweeney is right and this whole GPU-vs-CPU thing will become a thing of the past. Honestly, I can't wait: it will be cheaper for everyone, and much more stable.
  • CrazyButcher
    Intel is working on a graphics card themselves (or rather an architecture that can be applied to various things); like AMD, they see the future in a CPU/GPU blend... anyway, Intel has hired some gamedevs and also bought that "Project Offset" studio, which will also help them forge their new graphics chip...
    Intel doesn't like the fact that people buy Nvidia GPUs to do "CPU" work (GPGPU); they want that market back...

    http://arstechnica.com/articles/paedia/hardware/clearing-up-the-confusion-over-intels-larrabee.ars
  • Sage
    Back in 1999 I saw SGI machines. The advantage the sales reps were selling was that because the graphics hardware was integrated more directly with the processor, the calculations were faster and so was the machine. A short time later graphics cards got better and made those very good but expensive machines not worth it. I'm sure if Microsuck wrote more efficient software to use the hardware, PCs would be a whole lot better as a whole, but the same can be said of Intel. If they made better processors and graphics cards they would be ahead of the game. They are ahead in the CPU market? At any rate, anytime these companies merge (Autodesk buying Alias, AMD getting ATI) it sends up red flags in my mind, because usually what happens is that things get more expensive but the product doesn't improve that much. AMD, ATI, and Nvidia forced Intel to improve. Then Intel got complacent and lazy.

    People buy graphics cards more than CPUs because they are cheaper and easier to upgrade, period. Maybe they should stop making so many different sockets and shit and things would be different. The reality is people can't afford to spend 800 bucks every two or three years to get a brand new CPU, because half the time you need a new motherboard, RAM, vid card, and power supply along with it. If they have 200 bucks and a better graphics card seems to make things better, people are happy. Do you really need a new socket every time new CPUs are made? Maybe, but if we could get the same mileage out of the motherboard by making CPUs fit the old socket, people would save a lot of money, wouldn't they? I would like to upgrade my CPU, but it costs almost 200 bucks to get an Athlon XP that is faster than what I have, and only 100 bucks to get a video card that helps a lot more. All I needed was faster viewport display, and I figured a 512MB vid card would do it. It did. I don't like spending money on older hardware, but it would take me a few years to save up 800 bucks to be able to use the faster hardware. If it were easier to upgrade CPUs, it would happen, and Intel would not have lost the market share they had. If vid cards get sucked into CPUs, I hope Nvidia starts making their own CPUs to kick Intel's and AMD's ass. Why? Simple: competition. The reason Intel doesn't make great graphics cards is simple: they don't have to. Hence the problem.

    Alex
  • sonic
    [ QUOTE ]
    Put it this way: once a CPU grows to 16+ cores, GPUs won't be necessary. You can already see the effects of parallel computing on real-time graphics (the various real-time raytraced demos). All we need now is more cores to add the game logic layer to the computation, and you've got yourself a game that runs entirely on the CPU.

    [/ QUOTE ]

    Why do people on the internet like to make stuff up? :P

    The advantage of having a GPU over a CPU is that it can work with floating point numbers at a much higher speed. There are several reasons for this, one being that it uses wide 128-bit vector registers. This allows them to calculate more data at once and more efficiently, but their precision is much lower than CPUs'. CPUs adhere to the IEEE 754 precision standard ( http://en.wikipedia.org/wiki/IEEE_754 ), which doesn't apply to GPUs. This means that although GPUs are much faster at doing FP operations, they are much less accurate. That is fine when calculating pixels, but not fine when calculating other types of data. Even the latest Nvidia 8xxx cards only have 32-bit precision at best; CPUs have 64-bit double precision. They aren't even comparable (there's a quick sketch of that gap at the end of this post).

    Basically, what I'm saying is that the GPU can do certain things a lot better than the CPU, but only certain things. We could have a 500-core CPU at current speeds and architecture and it wouldn't make the GPU obsolete in the least. Not everything can be made quicker by parallelization. We need both a CPU and a GPU (for now).

    [ QUOTE ]
    Back in 1999 I saw SGI machines. The advantage the sales reps were selling was that because the graphics hardware was integrated more directly with the processor, the calculations were faster and so was the machine. A short time later graphics cards got better and made those very good but expensive machines not worth it. I'm sure if Microsuck wrote more efficient software to use the hardware, PCs would be a whole lot better as a whole, but the same can be said of Intel. If they made better processors and graphics cards they would be ahead of the game. They are ahead in the CPU market? At any rate, anytime these companies merge (Autodesk buying Alias, AMD getting ATI) it sends up red flags in my mind, because usually what happens is that things get more expensive but the product doesn't improve that much. AMD, ATI, and Nvidia forced Intel to improve. Then Intel got complacent and lazy.

    People buy graphics cards more than CPUs because they are cheaper and easier to upgrade, period. Maybe they should stop making so many different sockets and shit and things would be different. The reality is people can't afford to spend 800 bucks every two or three years to get a brand new CPU, because half the time you need a new motherboard, RAM, vid card, and power supply along with it. If they have 200 bucks and a better graphics card seems to make things better, people are happy. Do you really need a new socket every time new CPUs are made? Maybe, but if we could get the same mileage out of the motherboard by making CPUs fit the old socket, people would save a lot of money, wouldn't they? I would like to upgrade my CPU, but it costs almost 200 bucks to get an Athlon XP that is faster than what I have, and only 100 bucks to get a video card that helps a lot more. All I needed was faster viewport display, and I figured a 512MB vid card would do it. It did. I don't like spending money on older hardware, but it would take me a few years to save up 800 bucks to be able to use the faster hardware. If it were easier to upgrade CPUs, it would happen, and Intel would not have lost the market share they had. If vid cards get sucked into CPUs, I hope Nvidia starts making their own CPUs to kick Intel's and AMD's ass. Why? Simple: competition. The reason Intel doesn't make great graphics cards is simple: they don't have to. Hence the problem.

    Alex

    [/ QUOTE ]

    Clearly you have things figured out. After all, everything on every OS is slow because of Microsuck. After all, Microsuck makes the only OS in the entire world and they are single-handedly keeping all computers from being as amazing as they should be.

    You'll find that minimum system requirements are the same across all OSes that share the same features. Compiz Fusion's plugins that are comparable to Vista's Aero require the same as, and sometimes more than, Aero does. And on top of that it doesn't work with a lot of graphics cards and it's full of bugs. Damn those Linoosers and their crappy skills.

    And people buy new graphics cards because they make things that require graphics faster. A CPU makes certain things faster and a GPU makes certain things faster.
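
    Tying back to the precision point above, here's a tiny standalone comparison of the same arithmetic done in 32-bit float and 64-bit double. Nothing GPU-specific, just the two widths side by side; the iteration count is arbitrary and only there to make the drift visible.

    [ CODE ]
    // Tiny demo of the gap between 32-bit float and 64-bit double precision:
    // repeatedly adding a value that isn't exactly representable drifts far
    // more in single precision than in double precision.
    #include <cstdio>

    int main() {
        float  f = 0.0f;
        double d = 0.0;
        for (int i = 0; i < 10000000; ++i) {  // ten million additions of 0.1
            f += 0.1f;
            d += 0.1;
        }
        // The exact answer is 1,000,000. The float result is visibly off;
        // the double result is off only far past the decimal point.
        std::printf("float : %f\n", f);
        std::printf("double: %f\n", d);
        return 0;
    }
    [/ CODE ]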
  • arshlevon
    After GDC I am almost certain the next generation of hardware will not have GPUs at all. The PS3 was supposed to just have two Cell processors and no GPU, but it was too expensive. The Cell is better than the GPU in the PS3 and we are always taking advantage of that. I also talked with a few guys from Intel about their new graphics chip; although I didn't see anything with my own eyes, they told me it blows away anything Nvidia or ATI has done (I took this with a huge grain of salt; it might be true, but by the time it comes out, will it still be?). But he seemed to agree with our lead engine programmer and many others that soon GPUs will be a thing of the past, thanks to faster CPUs and the architecture changes that IBM and Intel are making right now and have already made.

    I am going to trust my sources on this and believe that in 10 years there will be no such thing as a GPU.
  • CrazyButcher
    Yes, it's all about vector processing / texture fetching; after all, the only "3D GPU"-specific work that sits outside this now more generalized process is primitive assembly and rasterizing, and that could easily be emulated on the CPU at low cost, too... I think what we will see is "add-on" cards that let you beef up vector processing units, whatever they may be used for (physics, graphics, ...).
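
    To back up the "rasterizing could be emulated on the CPU" bit: the core of a triangle rasterizer is just a bounding-box loop plus three edge tests per pixel. A bare-bones, purely illustrative version (no clipping, no sub-pixel precision, no perspective, made-up resolution and vertices) looks roughly like this:

    [ CODE ]
    // Bare-bones CPU triangle rasterizer using edge functions (half-space tests).
    // Toy code for illustration only.
    #include <algorithm>
    #include <cstdint>
    #include <vector>

    struct Vec2 { float x, y; };

    // Signed area test: which side of edge a->b the point c lies on.
    static float edge(const Vec2& a, const Vec2& b, const Vec2& c) {
        return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    }

    static void rasterize(Vec2 v0, Vec2 v1, Vec2 v2,
                          std::vector<uint32_t>& buf, int width, int height,
                          uint32_t color) {
        // Bounding box of the triangle, clamped to the framebuffer.
        int minX = std::max(0,          (int)std::min({v0.x, v1.x, v2.x}));
        int maxX = std::min(width - 1,  (int)std::max({v0.x, v1.x, v2.x}));
        int minY = std::max(0,          (int)std::min({v0.y, v1.y, v2.y}));
        int maxY = std::min(height - 1, (int)std::max({v0.y, v1.y, v2.y}));

        for (int y = minY; y <= maxY; ++y) {
            for (int x = minX; x <= maxX; ++x) {
                Vec2 p{x + 0.5f, y + 0.5f};  // sample at the pixel center
                float w0 = edge(v1, v2, p);
                float w1 = edge(v2, v0, p);
                float w2 = edge(v0, v1, p);
                // Inside if the pixel is on the same side of all three edges.
                if ((w0 >= 0 && w1 >= 0 && w2 >= 0) ||
                    (w0 <= 0 && w1 <= 0 && w2 <= 0))
                    buf[y * width + x] = color;
            }
        }
    }

    int main() {
        std::vector<uint32_t> buf(320 * 240, 0);
        rasterize({20, 20}, {300, 60}, {160, 220}, buf, 320, 240, 0xFFFFFFFFu);
        return 0;
    }
    [/ CODE ]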
  • arshlevon
    This is pretty interesting as well:
    http://crave.cnet.com/8301-1_105-9782443-1.html

    I am really interested in real-time raytracing; it seems like it would throw out the need for rasterizing altogether.
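
    The raytracing side is conceptually even simpler: instead of projecting and filling triangles, you shoot one ray per pixel and intersect it with the scene. A toy example to show the core idea (one sphere, no shading, made-up numbers and a made-up hitSphere() helper):

    [ CODE ]
    // The core of raytracing in one function: shoot a ray per pixel and test it
    // against scene geometry (here a single sphere) instead of rasterizing.
    // Illustrative only; a real tracer adds shading, acceleration structures, etc.
    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };
    static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // True if a ray from 'origin' along normalized 'dir' hits the sphere.
    static bool hitSphere(Vec3 origin, Vec3 dir, Vec3 center, float radius) {
        Vec3 oc = sub(origin, center);
        float b = 2.0f * dot(oc, dir);
        float c = dot(oc, oc) - radius * radius;
        return b * b - 4.0f * c >= 0.0f;  // discriminant of the quadratic
    }

    int main() {
        Vec3 eye{0, 0, 0};
        Vec3 sphere{0, 0, -5};
        // One ray per "pixel" of a tiny 8x8 image plane at z = -1.
        for (int y = 0; y < 8; ++y) {
            for (int x = 0; x < 8; ++x) {
                Vec3 dir{(x - 3.5f) / 8.0f, (y - 3.5f) / 8.0f, -1.0f};
                float len = std::sqrt(dot(dir, dir));
                dir = {dir.x / len, dir.y / len, dir.z / len};
                std::putchar(hitSphere(eye, dir, sphere, 1.0f) ? '#' : '.');
            }
            std::putchar('\n');
        }
        return 0;
    }
    [/ CODE ]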
  • Mark Dygert
    [ QUOTE ]
    People buy graphics cards more than CPUs because they are cheaper and easier to upgrade, period.

    [/ QUOTE ] Or they bought a computer that didn't come with a quality video card and now must upgrade... In my years doing tech support I broke a lot of hearts of people who were in love with their new machine but didn't understand why their game wouldn't play.
  • aesir
    [ QUOTE ]
    [ QUOTE ]
    People buy graphics cards more than CPUs because they are cheaper and easier to upgrade, period.

    [/ QUOTE ] Or they bought a computer that didn't come with a quality video card and now must upgrade... In my years doing tech support I broke a lot of hearts of people who were in love with their new machine but didn't understand why their game wouldn't play.

    [/ QUOTE ]

    I've always felt that there needs to be something similar to sex education that explains PCs and how to play games to all the normal people out there.
  • Robert Headley
    Well, if you remember any of Tim Sweeney's interviews from years gone by, he did make a prediction that eventually voxels would be used to render everything. He's a little off with the time frame though. Sub-pixel triangles in 2005... not quite.

    http://www.scribd.com/doc/93932/Tim-Sweeney-Archive-Interviews

    Someone's got a fan.