
Get your parts while you can. AMD is well......

Tyres100 Member Posts: 704

AMD is hurting badly, so badly that they have lost all interest in owning FABs. They are getting rid of them entirely and, in another business move, have teamed up with another company to run them.

This could potentially ruin competition in the CPU market and drive up prices across all manufacturers. I can see future Intel CPUs going for 20% or more over today's prices. It is even possible that AMD will screw up ATI graphics with its future plans to put the GPU onto the CPU and get away from separate video cards.



We want AMD to be doing all right in the real world because it increases competition and drives prices down. It also helps spur new technology advances. Intel may become the new MONOPOLY in the CPU market within the next couple of years. Watch out for rising prices and a return to the $2000+ mid-range machine.

Who let you in the VIP section?

Comments

  • saint4God Member Posts: 699

    As a long-time fan and user of AMD, I was saddened by the news.  I appreciate having affordable chips that out-performed their Intel counterparts in gaming.  I even built two computers ->exactly the same<- with the only difference being the chip (the person insisted on Intel, and the motherboard maker offered both an AMD version and an Intel version).  There was so much complaining on their end that I had to convert theirs to the same setup as mine to fix the problem.  I hope this turns around for AMD; it would be the greatest loss for me as a consumer since Albatron.

  • Thradar Member Posts: 949

    A fab for a new process technology costs $4-5 billion, which is more than the whole company is worth right now.  They had no choice.  I don't expect their quality, capacity, throughput, or roadmap schedules to get any better by outsourcing their fabs.

  • humanrogue67 Member Posts: 14

    I'm a big fan of AMD. Sad to hear this news.

  • Cleffy Member RarePosts: 6,414

    I agree their FABs weren't generating money.  That is mainly because of the limited volume their FABs process: they aren't running at full capacity because they only work for AMD, IBM, and a couple of other close partners.  This move opens the FABs up to any business.  FABs are also extremely expensive to maintain.

    This move will probably free up resources and get AMD back to where it was when it started producing good chips: just a chip designer that goes to a 3rd-party FAB.  They will also probably keep close ties with the new company being created to run the FABs, and get a discounted price from it.

  • ohsofresh42 Member Posts: 68

    Wow, this is really disheartening news, since I am a HUGE AMD fanboy (not afraid to admit it). I don't care what Intel users say, I wouldn't trade my AMD X2 6000+ 3.0 GHz for the world right now... Well, maybe I'd trade it for a quad, but you'd better give me something else in the deal because I'd have a hard time parting with it...

     

    Here's hoping things get better, since I love their chipsets!

  • Wharmaster Member Posts: 234

    Very sad to hear this. I've been using AMD for years.

    I remember back in 1999, folks kept asking me if I was stockpiling food. I always answered, "No, I'm stockpiling ammo and making a list of people who are stockpiling food."

  • Zorvan Member CommonPosts: 8,912
    Originally posted by Tyres100


    [...] It is even possible that AMD will screw up ATI graphics with its future plans to put the GPU onto the CPU and get away from separate video cards. [...]

    That would be a stupid way to go.

    1.) An onboard graphics chip cannot compete with a separate dedicated GPU (or two, or three, or four).

    2.) Anti-trust laws forbid shutting out a competitor's product by bundling your own (Microsoft learned this expensive lesson when they were forced to stop shipping Internet Explorer and Windows Media Player as integrated parts of Windows). So they would have to make sure you could put an Nvidia graphics card in a system with that processor/GPU combo without taking away from Nvidia's performance or functionality. That in turn could mean throttling some of the performance the AMD/ATI chip would have had as a standalone card.

    3.) Intel and Nvidia would have a field day, since neither of them would have to worry about throttling performance or functionality because they keep their CPUs and GPUs separate. Both companies have talked in the past about wanting to make integrated CPU/GPUs, but they intend to do it while also maintaining the separate products. That's where they're smart and why they'll stay ahead.

    4.) Expense. The combo CPU/GPU will be an excessive heat producer, which means a combo chip would have to run at slower clock speeds than a separate chip to stay in the same heat range as a higher-clocked solo processor (see the rough power sketch at the end of this post). So the consumer will not only pay the same price for a slower processor compared to Intel, but will have to pay extra for the same speed and performance as the solo Intel, PLUS water-cool their computer. Intel will be the easier alternative. And hey, we geeks like faster and easier every time when it comes to hardware.

    In short, the combo CPU/GPU will be great for programs like SETI or for mom and dad using a computer solely for web and email. But for gamers, who have been AMD/ATI's bread and butter since the beginning, Intel and Nvidia will have the undisputed advantage.
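    For a rough sense of the heat numbers point 4 is talking about, here is a back-of-envelope sketch in Python. It is only an illustration: every wattage figure in it is an assumed value, not a real product spec.

# Back-of-envelope sketch of the point-4 heat argument.
# All wattage figures are illustrative assumptions, not real product specs.

CPU_TDP_W = 95.0        # assumed standalone CPU
GPU_TDP_W = 110.0       # assumed mid-range discrete GPU
COOLER_LIMIT_W = 130.0  # assumed limit of a typical single-socket air cooler

# A combined die has to dissipate both power budgets through one package.
combined = CPU_TDP_W + GPU_TDP_W
print(f"CPU + GPU on one die: ~{combined:.0f} W to dissipate")

# Dynamic power scales roughly linearly with clock speed (P ~ C * V^2 * f),
# so ignoring voltage changes, fitting under the cooler limit means cutting
# clocks by roughly this factor.
clock_scale = COOLER_LIMIT_W / combined
print(f"Clock reduction needed to fit under {COOLER_LIMIT_W:.0f} W: ~{clock_scale:.2f}x")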

  • Tyres100 Member Posts: 704
    Originally posted by Zorvan

    [...]

    4.) Expense. The combo CPU/GPU will be an excessive heat producer, which means a combo chip would have to run at slower clock speeds than a separate chip to stay in the same heat range as a higher-clocked solo processor. So the consumer will not only pay the same price for a slower processor compared to Intel, but will have to pay extra for the same speed and performance as the solo Intel, PLUS water-cool their computer. Intel will be the easier alternative. And hey, we geeks like faster and easier every time when it comes to hardware.

    [...]

    I have to disagree with you on #4, because the CPU is optimized and so would be a GPU placed on the CPU. Intel and AMD both have this in R&D right now and probably already have a reference chip made. They want to take advantage of using a separate PCIe graphics card alongside their integrated designs, giving the consumer higher performance when the two are used together.

    There is no worry about heat if you place the GPU on the CPU die, because it would be optimized and would most likely be fabbed on a 45 nm process or close to it. Right now an E8400 will overclock to 3.5 GHz on stock voltage, and a small voltage increase will take it to 4.0 GHz with little to no extra heat generated: 36-42 C temps on stock cooling.

    The reason your current graphics cards get so hot is how the card is set up. Over PCIe you have to feed in more power to push the frequency high enough to get the bandwidth needed. Why do you think the card needs extra power plugged in when it sits in the PCIe slot? It is much like when they started increasing speeds on AGP: extra power was needed until they hit the limit. On a PCIe or AGP lane, getting more bandwidth out of the GPU design means adding more power, and that power is mostly dissipated as heat.

    There is no real extra heat from using a CPU/GPU-integrated design.
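    For anyone who wants to sanity-check the overclocking numbers above, the usual rule of thumb for dynamic CPU power is P ~ f * V^2. The Python sketch below takes the E8400 from the example as its baseline (3.0 GHz stock clock, 65 W rated TDP); both core voltage values are assumptions made purely for illustration.

# Rough power estimate at overclocked settings using the common dynamic-power
# rule of thumb P ~ f * V^2.  The 65 W / 3.0 GHz baseline is the E8400 from the
# post above; both Vcore values are assumptions for illustration only.

BASE_POWER_W = 65.0   # rated TDP at stock clocks
BASE_FREQ_GHZ = 3.0   # stock clock of the E8400
BASE_VCORE = 1.20     # assumed stock core voltage

def estimated_power(freq_ghz, vcore):
    """Scale the stock power figure by frequency and by voltage squared."""
    return BASE_POWER_W * (freq_ghz / BASE_FREQ_GHZ) * (vcore / BASE_VCORE) ** 2

print(f"3.5 GHz at stock volts: ~{estimated_power(3.5, 1.20):.0f} W")
print(f"4.0 GHz at 1.30 V     : ~{estimated_power(4.0, 1.30):.0f} W")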

    Who let you in the VIP section?

  • Jetrpg Member UncommonPosts: 2,347
    Originally posted by Tyres100

    [...]

    There is no real extra heat from using a CPU/GPU-integrated design.

    AMD Xbox, anyone?

    Seriously, I'd like to keep my own RAM in my video card. Why take a step backwards?

    "Society in every state is a blessing, but government even in its best state is but a necessary evil; in its worst state an intolerable one ..." - Thomas Paine
