Sunday 21 July 2013

AMD FX 9590 Benchmarks Part III



More Benchmarks from PCTuning

Stock AMD FX 9590 @ 4.7 GHz / 5.0 GHz Turbo
vs.
Stock Intel i7 4770K @ 3.5 GHz / 3.9 GHz Turbo


[Benchmark charts: higher is better]

[Memory bandwidth chart]

[Power and temperature charts: lower is better]

POWER
Remember that the FX 9590 is paired with a 7990!

TEMPS
Intel runs HOT... no surprise.



Compare the results for the FX 6300 and the FX 9590. Imagine if AMD had instead released a 12-core variant that overclocks only to 4.1 GHz: it would match the FX 9590 across all cores, but with much lower per-core performance. It's no wonder AMD scrapped its 10-core FX chips in favor of higher-clocked 8-cores.

Steamroller FX will trounce the FX 9590 at lower clocks and a lower TDP.

212 comments:

  1. Intel Haswell still uses far less power than ANYTHING AMD has right now. Steamroller won't change that, and Intel will have 14nm in 2014 vs. 28nm from AMD. FIGHT! AMD badly loses, and thus ends AMD!!!!

  2. Ahahahahah... AMD's power usage is SKY HIGH!!!! Intel Haswell puts AMD TO SHAME in more ways than one. SUCK IT, AMD. Not even Kaveri will help you now. YOU ARE FINISHED, AMD!!!!!!

    1. You do realize that if AMD goes under, Intel will have no real competition? Intel will raise its CPU prices very high, and it won't spend much time designing new CPUs, so each new Intel line won't be much of an upgrade over the previous one.

    2. Sorry, fool, that has already happened. Since AMD is no longer competition for Intel, Intel is now competing with ARM, so it's designing its CPUs to go against ARM, not AMD. AMD IS TOAST!!!!

  3. Intel's temperature is sky high... how is that any different from power consumption? You base your idea of what defines a good chip on absolute bullshit, lol. Remember, the FX 9590 is paired with a 7990... and look at the power consumption. 1000W PSUs aren't even needed anymore! Power consumption is "good enough" for 99.999% of people because it is a DESKTOP CPU...

    1. So what big deal it just stands make AMD look even worse they there already are. Wrong Power consumption is important in desktops too. The electric bill from using a desktop 24/7 that use's an AMD CPU is ridiculous high compared to Intel's haswell which uses half the power. NO IT IS NOT GOOD ENOUGH FAR FROM IT!!!! Now if you said Intel's haswell power usage was good enough I MIGHT believe you but you said AMD's high power usage is good enough NO IT ISN'T FAR FROM IT!!!! You know what makes it worse Intel haswell with the HD7870 only uses 11 more watts of power but IS SO MUCH FASTER like night an day difference. It just goes to show you how bad AMD's CPU's REALLY ARE!!!!!

    2. Grammar correction:
      So what is the big deal! The AMD FX-9590 makes AMD look even worse than they already are! High power consumption is important in desktops too! The electric bill from using a desktop 24/7 with an AMD CPU is ridiculously high compared to Intel's Haswell i7 4770K, which uses less than half the power! No, this CPU is not good enough, and is far from being good enough! If you said the Intel i7 4770K's power usage (84 watts) was good enough, I might believe you, but you said AMD's high power usage (248 watts) is good enough. No, it isn't good enough, and it is far from being good enough! You know what makes this worse? An Intel Haswell i7 4770K CPU has an 84 watt TDP. The AMD 9590 has a 248 watt TDP! The AMD FX-9590 uses 164 more watts than the Intel i7 4770K! This just goes to show you how bad AMD's CPUs really are!

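For what that TDP gap could mean on a power bill, here is a minimal sketch. The 24/7 full-load usage is the worst case assumed in the comment above, and the $0.12/kWh electricity rate is a hypothetical figure, not from the source:

```python
# Annual electricity cost of running a CPU at a constant wattage,
# assuming 24/7 full load and a hypothetical $0.12/kWh rate.
def annual_cost(watts, rate_per_kwh=0.12, hours_per_year=24 * 365):
    return watts / 1000 * hours_per_year * rate_per_kwh

delta = annual_cost(248) - annual_cost(84)  # the 164 W TDP gap
print(f"${delta:.2f} per year")  # $172.40 per year
```

Real desktops idle far below TDP most of the time, so this is an upper bound, not a typical bill.
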
  4. Not to mention the countless benchmarks where AMD trounces Intel... NEW games and NEW benchmarks put AMD in a better position than Intel...

    1. I see you are trying to make AMD look good when IT FLAT OUT ISN'T. AMD SUCKS, and they SUCK down power like no tomorrow. AMD is never faster than Intel's ALMIGHTY Haswell!!!!! Intel for the WIN and AMD FOR THE LOSING!!!!!!

  5. The sole reason Haswell is more economical is manufacturing technology:
    3D FinFETs.

    AMD and GlobalFoundries will reach this level in 2014.

    1. Really, that quickly, huh? Too late for AMD, though.

  6. At idle the power consumption is OK; under very heavy load it is a bit expensive. But the FX-8320 and 6300, for example, are OK on power consumption. There are no big differences against Intel in practice. The truth is, the AMD FX chips are better than the older Bulldozer FX in everything. Multithreaded performance is good and usually goes hand in hand with the 4c/8t i7. The price of AMD CPUs is lower.
    The reasons for Intel's lower power consumption are:
    - 22nm
    - the lower die size of Ivy/Haswell (Gulftown and FX have similar die sizes and similar power consumption at 32nm)

    1. AHAHAHAHAH... Wrong. Intel is 2 to 3 times more efficient than the AMD FX, depending on whether you overclock the FX. The FX series still sucks ass and isn't much better than SHITDOZER was in 2011. More cores ARE NOT BETTER. Intel's high IPC and performance per watt crush AMD into the GROUND!!!!

  7. http://www.agner.org/optimize/blog/read.php?i=49#269 - The truth

    1. SO what, I already knew about that. I have known about it for a very long time. I like Intel chips because they are superior to AMD chips and have far better performance per watt. When AMD has something I want, I will buy from them instead; till then, Intel still gets my $$$.

    2. So you and I and the rest of the world do not know the true performance of AMD chips; many, many games and programs are built with this compiler, which cripples the performance of AMD's processors, so those programs and games are using an illegal compiler. So what if all that was fixed and it showed that AMD is actually on par with or better than Intel? Wouldn't you budge a little because of that? Because in reality you are being scammed by Intel if its compilers are used, and in the end that's false advertising and fraud.

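The Agner Fog post linked above describes the Intel compiler's runtime dispatcher choosing code paths by CPU vendor rather than by supported features. A toy sketch of that dispatch logic (the function and the feature flag are illustrative; the real dispatcher is compiled-in machine code):

```python
# Toy model of vendor-based dispatch as described in Agner Fog's writeup:
# the optimized path is taken only when the vendor string is "GenuineIntel",
# so AMD/VIA CPUs fall back to a generic path even when they support the
# same instruction-set extensions.
def pick_code_path(vendor: str, supports_sse2: bool) -> str:
    if vendor == "GenuineIntel" and supports_sse2:
        return "sse2-optimized"
    return "generic"  # non-Intel vendors land here regardless of features

print(pick_code_path("GenuineIntel", True))   # sse2-optimized
print(pick_code_path("AuthenticAMD", True))   # generic
```

"GenuineIntel" and "AuthenticAMD" are the actual CPUID vendor strings; everything else here is a simplified model of the behavior being complained about.
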
    3. True, and it's unfortunate. Although Metro 2033 is much faster on the AMD fx-8350 than any of Intel's chips, most people will still recommend Intel because most people think it's faster. Yeah, I would budge a little bit if that was fixed. Getting scammed by Intel does suck, but right now AMD doesn't have anything that matches Intel in performance per watt and IPC.

  8. The more I see of the FX-9590, the more disappointed I feel in AMD... they promised a new series of FX CPUs alongside new Radeon GPUs for this year in their latest roadmap; now I am starting to think there will be no new FX this year, and we may not see a new FX through all of 2015.

    I am not willing to pay 800 bucks for a horrible CPU. I was totally expecting a new FX with at least a 15% performance improvement over Piledriver... they should have used the Piledriver+ arch from the Richland APUs and released a new damn FX CPU on at least a 28nm process; it will take GlobalFoundries another year to catch up with 14nm 3D FinFETs.

    1. The 9590 IS Piledriver; the 9650 is the Steamroller chip.
      I get the feeling that the architecture they're using in the APUs isn't the same as the one in the FX chips. Very similar, based on the same thing, but not the same. The APUs use less power for the CPU and seem to be built for efficiency, whereas the FX chips are built less for efficiency and more for computational power.

  9. Oh, and there are tons of rumors about Kaveri being delayed to the first half of 2014... that means no FX Steamroller for at least another 6 months after Kaveri is released.

    I am starting to think GlobalFoundries is to blame for AMD's horrible performance these last 2 years. AMD is at their mercy, and GF seems to be screwing them on each new node; AMD totally needs to get rid of GF as soon as they can.

  10. OK, I find this really annoying. JML12, or Jesse Lee, whatever you want to go by: grow up, please?

    I mean, is this all you do, spend countless hours writing trolling posts on AMD forums because you're clearly butthurt about something? Trolling tends to be funny at times, but not when the person drives it so far into the ground it starts to sound pathetic and just gets uncomfortable. Your trolling lost every bit of funny after you made hundreds upon hundreds of the same exact post, with the same troll tone and argument, calling everyone reading a blog a fanboy over and over.

    As far as Intel goes, they right now make the most powerful performance-per-watt desktop chips on the market, and the price is outrageous in most cases. Is that a bad thing? No: they have the highest IPC and it shows, so of course they are going to charge a premium. But their CPUs in the $1000 range offer poor power consumption and are not much better than the $500 6-core chip they offer. Both the FX-series 8-core chips and the Intel 6-core chips chug down power like no tomorrow, especially when overclocked; the Intel is superior, but on a dated architecture that is starting to show its age in some areas.

    Now I, personally, am looking forward to Ivy Bridge-E and the Steamroller chips, to see who is going to be at the top of the food chain and by how much. I was psyched waiting for that beautiful $500-600 6-core chip on the 2011 socket to come out, and then my dream got crushed when I noticed that they have only announced a 6-core at the $1000 mark and the 4-core E-series chip. That really hurt my chance at getting something that nice, because I will not pay $1000 for a CPU unless it is at least 1.5 times the power of the $500 chip, which it isn't.

    On the AMD side, I can't wait to see what Steamroller does, and that's partially why I'm reading this blog: hearing some rumors, preliminary tests, things like that. I want to see how the tides turn and what the future brings. That's what I love about computers: the future is never certain. It's fun, and it's always been that way. The tides of computing are ever changing, and as of right now both companies have made totally different decisions on what they believe is the future; we will see what happens this round.

    As the tides of battle rage on, the battle stands like this:

    AMD:
    Better GPU on the Chip (Mobile and Desktop)
    Better Multi-Threading
    Better Price
    Monopoly on all gaming consoles
    Better Overclocking

    Intel:
    Better IPC
    Lowest power Usage
    Better mobile gaming (When paired with a high end GPU)
    The most powerful Desktop Chip on the market

    I love my MSI gaming laptop carrying an Ivy Bridge i7 and a GTX 675MX, which stomp away at games pretty well. But on my desktop, I love my FX 8350 and HD 6990s in CrossFire. Does my desktop drain a lot of power? Yeah, it does. Does it matter to me, though? Nope! On my laptop does it matter? Yeah, somewhat; it matters to me only on battery mode, which limits my GPU to keep the power down if I game on the battery. On my tablet, I have an i5 3337U with HD 4000; it has good battery life and does a decent job with games. Though honestly, I feel the processor is better and the GPU holds it back. I want a good balance of GPU and CPU. Intel does not offer that, sadly: the HD 4000 and 5000 series are not all that great, and it shows in the mobile market. AMD chips do hurt in sheer CPU performance but are even for the most part on the low-end spectrum, except when it comes to the GPU, which is better. Because of that the chip uses more power, but we get better balanced performance. Having the most powerful and power-efficient chip in the world means nothing to me if your GPU blows, because you can't do high-resolution games. But then again, neither side is really up to that except AMD's top-of-the-line APUs (at least in the GPU spectrum).

    1. Now, there is my opinion, and here's my last comment to you, JML: take a shower, make yourself more appealing to look at before posting a picture like that, do some research, grow up, learn when trolling gets stupid, and then start talking about chips when you actually understand what you're talking about.

      Now I will be blocking you, which, people, is very easy: click on his name and go to his profile to block posts by him. That's what I'm doing right now. It will make things so much easier to read, and I won't have to scroll through a mountain of trolling that hit rock bottom 10 pages back.

    2. Hey, idiot, thanks for the tip; I just blocked you. You're the ugly one, but that's irrelevant.

    3. Having a monopoly on consoles is almost worthless. It sucks to be AMD right now, and you dumb fanbois make me want to puke in disgust at AMD. Wrong: Intel has better multi-threading, NOT AMD. AMD uses twice the power of any Intel chip out there. AMD doesn't have a good balance of CPU and GPU; it's mostly GPU, and the CPU sucks ASS. Gamers like you MAKE ME WANT TO VOMIT, POWER HOG!!!!

    4. "You're the ugly one but that's irrelevant." That was your comeback... Sorry, I did not know you were 5 years old. Great comeback; it must have taken you a couple of hours to work out that one.

    5. Oh, and also:

      "Although Metro 2033 is much faster on the AMD fx-8350 than any of Intel's chips." - Jesse Lee

      Uh oh, did you just say that AMD is better than Intel? Who's the fanboy now?

    6. Yeah, I did: 20 fps for AMD compared to 12 fps for Intel. I am not an Intel fanboi; it's just that I like small systems and I don't like overly high electric bills, so AMD isn't the best option for me. Sarcasm, I see.

    7. I think the developers of Metro 2033 never used Intel compiler code in their engine; that is the only logical explanation in a world full of Intel compilers.

    8. Really? That is just sad for them. They should have; then Intel would have been SO MUCH faster than it currently is in that game.

    9. They probably used GCC, which is arguably the best compiler.

      Intel's compiler builds different versions of the code: one to use on Intel, one to use on "other".

      GCC builds a few different versions too; a different one runs depending on the available instruction sets. In tests on Intel cores it is usually less than a single percent off the Intel compiler, but it can increase performance on AMD and VIA cores by up to a double-digit percentage.

      The reason they would use it is probably that with GCC, a wider number of people get better performance, and those with Intel don't suffer.

      If the game favored multi-threading without extensive memory reads/writes, then CMT (AMD's threading approach) is far, far more effective than SMT (Intel's Hyper-Threading), and no compiler could change that.

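The GCC behavior described above, several code versions with one selected by the CPU's actual instruction sets rather than its vendor, can be sketched as a toy dispatcher. The ISA list and priority order here are illustrative only:

```python
# Toy model of feature-based dispatch: pick the best code path the CPU
# actually supports, ignoring the vendor string entirely. This is why
# AMD and VIA chips also get the fast path under this scheme.
ISA_PRIORITY = ("avx", "sse4.2", "sse2")  # best first; illustrative list

def pick_by_feature(features: set) -> str:
    for isa in ISA_PRIORITY:
        if isa in features:
            return f"{isa}-optimized"
    return "generic"  # ancient CPUs with none of the listed ISAs

print(pick_by_feature({"sse2", "sse4.2"}))  # sse4.2-optimized
print(pick_by_feature(set()))               # generic
```

The key design difference from a vendor check: capability is queried per feature, so any CPU that reports the instruction set gets the optimized path.
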
    10. Really? Well, that's good; I have heard of GCC. I wonder why so many people use Intel's compilers, then? Really? I thought CMT sucked because its performance is mediocre and it guzzles down power like no tomorrow.

    11. People use Intel's compiler because it is endorsed by Intel and easy to use.
      People don't use GCC because it is difficult to use and usually takes longer to compile, though that's because it generates code for more possibilities.

    12. Then it works great for Intel's efficient CPUs, and it sucks to be an AMD fanboi in a world where almost every game is pro-Intel. I wouldn't care if it took more time; I would, however, care if it's much harder to use. No wonder Intel WINS: they've got the best tools, the best CPUs, and the best performance per watt. With that, INTEL CAN'T BE BEAT EVER!!!!!!

    13. I'm not sure you get this...
      The reason GCC is harder to use is that it is so versatile.
      There are far more options with GCC than with Intel's compiler (which, by the way, costs money).
      The reason it takes longer is that it is compiling optimal paths for different situations. It compiles for ALL instruction sets, not just Intel-approved ones.

      Crysis 3
      Tomb Raider
      Far Cry 3
      DMC
      Hitman Absolution
      Bioshock Infinite

      Optimized for AMD, not Intel.

      Crysis 3 really benefits from the extra cores.
      3770K vs. 8350 (at least in my experience), with the same cards and settings, are almost even in loading times, but I usually get about 5 fps more (out of 50-60 fps) with the 8350.

    14. If I was a game developer, I wouldn't care too much if it took more time; I would care if it's much harder to use. Oh good, a whole list of games to avoid while using an Intel CPU, because those games are going to suck on Intel since they don't use the special compiler that increases performance on Intel-only CPUs. Good thing I don't play those games anyway; I don't like any of them, so SUCK IT AMD, SUCK IT LONG AND SUCK IT HARD AMD!!!!!

    15. "If I was a game developer I wouldn't care too much if took more time I would care if its much harder to use."

      Exactly why you are not and never will be one.

      You're going to avoid games that are optimized to use more cores and that don't use a compiler biased toward Intel... OK, have fun missing most of the top-rated games of the year.

    16. Uh, that's fine; I don't play those "top" games of the year because they don't suit me. Oh, and why is that? It's not on purpose, I'm afraid, but I am not a big fan of any of the games you just listed. I also dislike Call of Duty with a passion!!!

    17. Well, what sort of games do you like?
      Which games do you like?
      I'll see if I can buy one and bench it with my 8350.

      Who doesn't dislike CoD? I mean, anyone who is conscious should at least be starting to question why you are constantly killing people with foreign accents.

      I actually let one of my CoD-mad friends play Metro 2033. He stopped about 10 minutes in. I let him play Spec Ops. He played through it with glee. He then wanted to know which i7 I had, because his i5 didn't feel as smooth.

      True story. He was almost as much of a fanboy as you.

    18. Games like Unreal Tournament and Doom 3 BFG Edition. I have recently been playing Postal 2, which kinda sucks, but it's fun. Also games like id's Rage; I really like that game. Another game I like is A.R.E.S: Extinction Agenda, and also Borderlands 2 occasionally. Interesting; he's almost as much of an Intel fanboi as I am?

    19. Also, I forgot about Half-Life 2, Unreal Tournament 2004, and Unreal Tournament 3.

    20. Postal 2?
      Portal 2, you mean...
      You do kinda need an Nvidia card to play it at max settings because of the PhysX and CUDA utilization, but I managed 60+ FPS constantly at max settings (HD 7950) thanks to the extra cores running the PhysX code.

    21. No, but I did play Portal 2; I meant Postal 2 is not that great a game. Really? I didn't know that; thanks for telling me. Another thing AMD isn't good at. Is Portal 2 really that demanding? Yeah, I know I play some really weird games. I did play GTA IV, but that runs like shit on my Core i3-2100T / AMD Radeon 5570 combo, which is also a piece of dog crap.

    22. Wait, what?
      You just said that even though I got 60+ FPS (my monitor is only 60 FPS), it's bad?

      It is that demanding if you only have 2 or 4 cores to run all the code without an Nvidia card.

      Postal 2...
      Hmmm...

      It looks kinda like a sadistic shooter.

    23. No, I just thought Portal 2 wasn't that demanding a game, especially on default settings. Also, I was referring to PhysX and CUDA and how they suck on anything that isn't Nvidia. How do you enable PhysX and CUDA in Portal 2? You would be correct in guessing that Postal 2 is a sadistic shooter. It's a lot of fun, but the gameplay and graphics leave much to be desired!!!!

    24. They should automatically be enabled if you have a higher-end Nvidia card or have installed the Nvidia CUDA/PhysX CPU patch. (I think this is in the Metro 2033 install.)

      There should be some links on the Steam forums where people want their games to look better.
      Just search for "Run PhysX on CPU".

      http://www.nvidia.com/object/physx_8.09.04_whql.html
      http://www.nvidia.com/object/physx-9.13.0604-driver.html
      http://www.nvidia.com/object/physx-9.13.0604-legacy-driver.html

      Install in that order.

      That SHOULD let you run PhysX on your i3 2100T, but it won't help at all: Portal 2 is already a dual-core game, and adding PhysX will just slow it down more.

      Portal 2 isn't really demanding until you get past the bit where you fall and have to start dealing with massive scenes and fluids everywhere.

    25. I have neither one. What difference does having PhysX and CUDA make anyway? I don't think my CPU is beefy enough to run Portal 2 with PhysX and CUDA turned on. Really? I didn't know that.

    26. 1. I figured.
      2. The fluid simulations are, pardon the pun, generally more fluid.
      If you have an Nvidia card, it will offload the PhysX and CUDA calculations to that. It will do the number crunching, and you get to play the game at a higher framerate. If you have the patch installed, it basically goes: "What cores are available? Those? Let them eat physics calculations!"

      So no, it won't help on a 2100T, because i3s only have 2 cores anyway. I'm guessing that in the huge scenes with fluids and stuff you get some framerate drops?

      If you didn't, I would be really surprised; my mate with a 2500K gets drops around those parts. (He has an HD 6950.)

    27. Really? I see. So the patch is worthless to me right now. I need something that uses far less power than what my PC uses now. And yes, I do get some framerate drops while playing Portal 2. You can blame ARM for my much higher standards.

  11. http://www.youtube.com/watch?v=UuCL1gv4vR0 - love this video :3

    1. That video was biased toward AMD. The author is a PROUD AMD fanboi. He had the stupidity to switch from a faster Intel CPU to a power-GUZZLING crappy AMD FX CPU.

    2. You should have watched his previous video, you idiot...
      He was on Intel first; then he got an FX 8350 and switched from Intel to AMD because AMD is cheaper and the FX 8350 beat the i5 3570K in streaming, and you know the i5 3570K is defeated by the FX 8350 in streaming.

      You fail. Just because he started complimenting AMD, listing pros, and switched from Intel to AMD, that does not mean he is an AMD fanboi. He did a performance comparison and said the i5 3570K is better in GW2; thus you fail, and your hypothesis that he is an AMD fanboi has failed. So deal with it.

    3. I know he switched his Intel PC for a slower AMD one, and that's why I call him biased and a fanboi.

    4. The i5 3570K is slower; the FX 8350 is faster... You fail.

    5. Wrong: the i5-3570K is faster than the FX 8350, and when you factor in single-core performance there is no contest. INTEL WINS, AMD LOSES!!!!! Intel's IPC advantage can't be beat.

    6. Yet benchmarks show otherwise.

      http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i5-3570K+%40+3.40GHz

      That's why the i5 does so poorly in sales except with people who only want a straight budget gaming machine.

    7. Most people who buy an i5 are only gaming, and not doing super-high-end gaming. People who buy an FX 8350 are doing editing and other tasks along with gaming. People who buy a 4-core i7 are normally gaming as well as doing video editing or other tasks, but are still mostly focused on gaming. People who buy the 6-core i7s are not too limited by budget, want the most powerful system money can buy, and are doing hardcore gaming/editing.

    8. The i5 has far better IPC than the SHITTY FX-8350. Yeah, it is true that people who buy an i5 want gaming, and that's what they will get. See: Intel for the WIN, AMD for the losing!!!!

    9. Yet the FX 8350 benchmarks better in CPU Mark, because the chip has a better overall balance of performance across all the other areas.

      The i5 is a budget-to-mid-range gamer PC chip designed only for the mid-range areas. Most people who have them don't have anything beyond a mid-range card (or, in rare cases, a top-of-the-line single card). With the FX and i7, you see more people with ultra setups, including dual, triple, and quad cards.

      I know very few i5 gamers or hear about them; I hear more about i7 gamers or FX 8-series gamers. But if I want a high-end gaming machine, I'm getting a 6-core i7 or an 8-core FX for the multi-threaded power and to do quad SLI/CFX gaming; the i5 cannot do that.

    10. It barely beats an i5 and gets CRUSHED by an i7. Most of the time I don't want to raise my electric bill through the roof, so AMD is out of the question, and so are top-of-the-line cards. Yeah, if you want a high-end power hog, you moron.

    11. OK, let's do math, since apparently you can't do that either. The average score of a 3570K is 7118, whereas the average score of an FX-8350 is 9127. The ratio of the 8350 to the 3570K is ~1.28.

      Now, the i7 4770K scores a 10152 average; compared to the FX average, that's a ratio of 1.11 over the FX. So if by your definition the i7 "crushes" the FX-8350, then the FX-8350 destroys the i5 while costing a whole lot less.

      Oh yes, I love my high-end machine. I just bumped the clocks on my system to 5.0 GHz and 1 GHz on both 6990s; I can feel my 1300-watt power supply sucking the power down to my system while playing BF3.

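The ratio arithmetic in the comment above can be checked directly from the quoted CPU Mark averages (the scores come from the thread; the variable names are just for illustration):

```python
# CPU Mark averages as quoted in the comment above.
scores = {"i5-3570K": 7118, "FX-8350": 9127, "i7-4770K": 10152}

fx_over_i5 = scores["FX-8350"] / scores["i5-3570K"]
i7_over_fx = scores["i7-4770K"] / scores["FX-8350"]

print(f"FX-8350 over i5-3570K: {fx_over_i5:.2f}x")  # 1.28x
print(f"i7-4770K over FX-8350: {i7_over_fx:.2f}x")  # 1.11x
```

So by these aggregate scores, the 8350's lead over the i5 (28%) is larger than the i7's lead over the 8350 (11%), which is the point the comment is making.
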
    12. Yep, you are using way too much power with your setup. Wrong, it doesn't cost a whole lot less; in fact it doesn't cost less at all. The price is competitive, remember. Face it, AMD sucks, and there are only a few 8-core games, so SUCK IT AMD!!!!!

    13. So besides math, you can't read either. Besides contradicting yourself every other line, you can't read or do simple math. The FX 8350 costs 200 US while the 3570K costs 230. That's less, in case that's too hard for you to understand as well. As for games that actually use 8 cores: 2 years ago there were hardly any games that used more than 2 cores, yet now they're all over the place.

      My system beats anything you have, based on the fact that you're probably gaming on a very low-end machine. Also, to reiterate on the whole "8-core game" argument: both next-gen consoles sport 8-core processors, meaning developers are going to have to start learning to use 8 cores to be efficient.

      Benchmarks don't lie, and you even confessed the i5 is worse than the FX chip. Yet the i5 and its motherboards cost a whole lot more while not doing anything different.

    14. The AMD FX 8320 costs 159 dollars, or as low as 149, and an Ivy Bridge i3 is 129. The FX 8320 can overclock higher than a stock FX 9590... What does Intel have in that price range??? Locked i5s are an embarrassment next to an unlocked 5 GHz 8-core Piledriver FX, and cost more.

    15. I see, but Intel is faster than AMD for most tasks and most games. You see, most games don't use 8 cores; in fact, most barely use 4 cores as of right now. Now, that will change with the advent of the new-gen consoles. But for now, Intel WINS, AMD loses, and besides, AMD is far behind in everyday tasks, where IPC and clock speed win out over more cores. Web browsing isn't heavily threaded, nor is emulation. SO AMD IS STILL IN HOT WATER!!!!! INTEL FOR WINNING, AMD FOR THE LOSING!!!!!

    16. Two years ago games barely used 2 cores, and 4-core machines were new to the market. So it will happen, and it has already begun: many new games already use 6 cores and are optimized for it. If clock speed was all that mattered, AMD would be destroying Intel, no contest, so you have once again made an error. "Web browsing isn't heavily threaded"? Are you kidding me? So we are going to discuss which processor opens Internet Explorer or Google Chrome faster? Yeah, there's any difference in that low-end task...

      "Wrong it doesn't cost a whole less in fact it doesn't cost less at all. The price is competitive remember."
      Wow, you really can't read very well. Intel motherboards and processors cost a whole lot more, and most of the chips are locked so they can't be overclocked, while the ones that are unlocked cost a whole lot more. The i5, as you have even admitted, is worse than the FX-8350, yet it costs more, and so do the motherboards for it.

      AMD is not in any hot water; they are doing exactly as planned. If anything, especially with that recent Intel announcement about locking the Haswell chips even more, Intel is the one in hot water, and it is pissing more and more people off.

    17. Wrong: AMD is still in hot water, and Intel is pissing off people like you, I know that already. Some games now use 4 cores; this is not new to me. Wrong: Intel would still crush AMD into the ground due to much higher IPC. It's not that much less, and you get sucky AMD motherboards in that price range; the Intel ones, at about $40 more, don't suck at all. Intel, with its much higher IPC, is faster for low-end tasks. In a few games the FX-8350 somehow pulls ahead, but for most, Intel is KING, and AMD won't be able to uncrown Intel.

    18. jm12... you IGNORE the fact that:

      i3 3220 - 129 USD - locked
      AMD FX 8320 - 159 USD - unlocked

      Intel motherboard: more than 30 dollars more than an AMD one...

      i.e., SAME COMBO PRICE.

      i3 3220: stuck at 3.3 GHz on all 2 cores.
      AMD FX 8320: OCs to 5 GHz on all 8 cores.

      The AMD has HIGHER SINGLE-CORE PERFORMANCE... QUADRUPLE the CORES... the same price for mobo + CPU...

      Your argument will most likely fall back on power consumption... which is pathetic.

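The "same combo price" claim above is simple arithmetic. Only the CPU prices and the roughly $30 Intel board premium come from the thread; the $60 AMD board price is an assumed baseline for illustration:

```python
# Assumed baseline AMD board price ($60 is illustrative); the ~$30 Intel
# board premium and the CPU prices ($129 i3-3220, $159 FX-8320) are the
# figures quoted in the comment above.
amd_board = 60
intel_board = amd_board + 30

amd_combo = 159 + amd_board      # FX-8320 + AMD board
intel_combo = 129 + intel_board  # i3-3220 + Intel board
print(amd_combo, intel_combo)    # 219 219
```

The $30 CPU premium on the AMD side and the $30 board premium on the Intel side cancel out, which is the whole argument: whatever the AMD board costs, the combos land at the same total.
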
    19. The fact is, AMD has a much better price/performance advantage at certain price points... to say otherwise is completely ignorant.

    20. IPC and power consumption are the only arguments you have, and even then IPC is not everything, as proven by benchmarks including CPU Mark. You already admitted the i5 in all its forms gets beaten by the FX-8350 while costing less. Name a feature Intel boards have over AMD boards; I would love to know what amazing features you're speaking of while constantly smiting motherboards you have obviously never used, so you have no idea how good they are.

      Intel is pissing everyone off; only raging fanboys like you are ignoring stupid announcements like BIOS updates restricting chipsets and overclocking.

      I like how your first argument is "wrong, AMD is still in hot water", yet you don't follow up with any reasoning, which is common for you. "Some games use 4 cores now"? Almost every game that comes out now is at least optimized for 4 cores, you dolt; the only game I can even think of that didn't do that was StarCraft 2 HOTS, because the original engine already had dual-core support, and they can't change that with an expansion.

      "In a few games somehow the FX-8350" - there's no "somehow" about it; it's called MULTI-THREADING, something you understand nothing about. It's called game designers optimizing for higher core counts. There's no mystery about it; it's part of being a game designer to look toward the future when making games.

      I see you switched accounts again. What, trying to hide that unwashed face of yours in shame because you realize your arguments have no merit? Could you maybe get a life and leave? You're at the point where even other Intel fanboys are cringing at how stupid you sound.

    21. Because I don't care about overclocking, I only care about performance per watt and IPC, so AMD SUCKS for that!!!! The i3-3220 uses a lot less power but is likely not much faster than my current PC. AMD uses far too much power to be in any of my rigs. I don't want to raise my power bill through the roof in non-summer months with an AMD rig, because I would have to pay for it and then some. No, that isn't why I changed accounts. Nope, you are wrong: StarCraft 2 only uses 2 cores and it's a recent game, so it sucks to be you right now, AMD FANBOI!!!!! Intel boards with their awesome CPUs have much better IGPs than cheapo AMD boards!!!!! Of course Intel is pissing people like you off; they are going full-on mobile, which is good for me because of higher performance per watt and much better power usage.

      Delete
    22. Your i3-2100T is based on the Nehalem architecture and is a gen 1 Core series chip; it's complete garbage. You have zero room to talk shit about ANY AMD rig when that's what you're gaming on.

      You claim you only care about power consumption, so you chose an i3 to game on because it's a 35-watt TDP chip... But you're going to be hitting that TDP constantly, because almost everything runs at least two threads, so your chip is constantly under stress. Meaning you use more power than people doing the same task on a quad core from either company.

      You have never used an AMD board and have no intention to, so you have no idea what you're talking about, nor can you prove or show it in any way. So your word is useless in that area. My board smokes your crappy board, I guarantee it.

      If you're worried about your power bill to the point of buying an i3 to save on power, then you have lots more issues and need to get your life on track. You've obviously got some serious financial issues if a whole maybe 20 dollars a year to run a computer forces you to buy extremely low-end junk.

      Also, StarCraft 2 came out July 27, 2010; I would know, I pre-ordered and played the game with friends all the time. That was back when dual cores reigned supreme. I played it on my Core 2 Duo machine a lot, at max settings. StarCraft 2 HOTS is an expansion to the previous game, not a new game; it's still on the old engine. You're an idiot; even the 4300 can run StarCraft 2 on Ultra, and my old laptop with a Core 2 Duo did.

      Delete
    23. My mistake was not caring or paying attention: the i3-2100T is LGA 1155 Sandy Bridge, so it's gen 2. It's still complete garbage.

      Delete
    24. I'll admit it sucks, and my current PC combo uses 85 watts at full load, if not more when it's fully stressed. I won't always use more power than a quad-core user would. Maybe, but the game is still being updated with new expansions. Not until AMD actually provides something, oh I don't know, GOOD FOR ONCE!!!!! It has to meet my very high standards, which Intel barely meets. My PC is on 24/7, so AMD WON'T CUT IT!!!!! But I care about performance per watt instead of ramping up my power bill using inferior AMD products!!!!

      Delete
    25. your very high standards?
      what, your standards are satisfied with a 2100T?
      that's extraordinarily high....

      Delete
    26. No they are not, which is why I need a new PC. I just couldn't find anything good enough in the two years I've had this PC. I see the sarcasm there. ARM has raised my standards for the performance per watt I expect from a PC.

      Delete
    27. He has a standard of believing power is the most important thing about a computer, yet he leaves his on 24/7 doing nothing, wasting power.

      He says he could not find anything better than a low-power Sandy Bridge i3. I can name 40 chips off the top of my head that outperform it, and 15 with better power consumption.

      He's a trolling Intel fanboy who has no proof of his claims.

      Delete
    28. 15 chips that are better in power consumption? Please do tell. Almost anything these days will outperform it, I know that. It is the most important thing.

      Delete
  12. off topic

    Asus "FM2 +" for AMD Kaveri

    http://www.computerbase.de/news/2013-07/erste-asus-mainboards-mit-fm2-plus-fuer-amds-kaveri/

    ReplyDelete
    Replies
    1. Wow... this could indicate that Kaveri could be released soon. When is the APU13 conference? I think they will showcase AMD's APUs there. If Kaveri is delayed, then I think AMD just wants to replace FM2 boards with FM2+ boards to ease the transition from Trinity/Richland to Kaveri, and the FM2+ socket/motherboard most likely has native USB 3.0 support.

      Delete
    2. I don't think so; since FM2 CPUs are compatible with FM2+ boards, it means they are just preparing the boards for whenever Kaveri gets released.

      There are a few rumors about Kaveri getting delayed again till 2014 because of bad yields at GF... AGAIN... fucking GlobalFoundries, leeching AMD to death... If AMD had access to FinFET and all the new tech Intel has, it would not take them more than a year to beat Intel.

      Delete
    3. But AMD doesn't have access to FinFET and GF isn't great, so AMD IS DEAD without much of a fight AHAHAHAHAH. Kaveri is delayed to all hell and AMD is going to pay for it. Intel 14nm vs AMD 28nm, FIGHT!!!!! AMD LOSES BADLY and gets CRUSHED by Broadwell Mobile MWHAHAHAHAH!!!! If I were to buy a new motherboard it would be an Intel one, because right now AMD's prime advantage is pretty much gone.

      Delete
    4. It'd be kinda funny though. I mean, you have to be embarrassed when your competitor has tools half as powerful as yours and still manages to put up a considerable fight. Intel's 14nm vs AMD's 28nm... I reckon AMD will probably get close to Intel's performance even two nodes behind. They can't win, but what they're doing would (and pretty much does, apart from the high end) smash Intel's offerings at a given node for efficiency and price/performance. It's just that Intel has a node advantage and is always a step ahead.

      Delete
    5. Ahahahaha... that's a laugh right there. AMD won't even come close to Intel Broadwell. AMD Kaveri will not be able to match Broadwell's ultra-low power consumption; in fact Kaveri won't even compete with Intel Haswell. Intel Broadwell will even beat AMD in the GPU department, so AMD: https://www.youtube.com/watch?v=1rwNILBBXU4.

      Intel Broadwell will win by a landslide over AMD, and AMD will soon cease to exist AHAHAHAHAHAH!!!!! Intel is that much better than AMD.

      Delete
    6. Meanwhile...

      AMD is developing Excavator on a 20nm node and implementing the "High Density" libraries used in AMD graphics cards, reducing overall chip size by up to 30%, so the 20nm process plus high density gives a size/density comparable to 14nm.

      Delete
    7. That's for 2015, and by then it will be too late because Intel will have refined its 14nm process. It's great to hear that AMD is doing that, but it won't come close to Intel's performance per watt. INTEL WINS, AMD LOSES!!!!!

      Delete
  13. GlobalFoundries will have 14nm FinFET in 2014; will it give it to AMD? Most probably not... fucking asshole, I hate that foundry... it should die right away and leave room for less mediocre foundries like Samsung and TSMC.

    Intel's lead is due to smaller nodes, not architecture, and without their FinFET tech Intel is nothing out of the ordinary.

    ReplyDelete
    Replies
    1. As you can see, AMD is giving Intel quite a fight: AMD is at 32nm against Intel's 22nm Tri-Gate, two nodes behind, and still AMD is very close to Intel in the multi-threading race.

      Just imagine what AMD could do with 14nm and 3D-stacked transistors. New shrinks are what AMD needs to keep up with Intel; their architecture is more efficient than whatever Intel throws out, but they can't truly compete unless they are on the same node.

      Delete
    2. AMD won't get 14nm FinFET in 2014; ARM will most likely get it, though. Those shrinks would help, but it won't happen for AMD, so AMD is: http://www.youtube.com/watch?v=hzEwJ31F82A and http://youtu.be/MtCdQtbHJiY?t=15s. And GloFo is fabbing AMD's chips right now, so AMD is pretty much screwed with no way out.

      Delete
  14. Yes, Kaveri further delayed to the beginning of 2014:

    http://wccftech.com/amd-kaveri-apu-delay-confirmed-launches-pushed-midfebruary-2014/

    ReplyDelete
    Replies
    1. That's right, and by then an already-late product will be of no use to AMD, because Broadwell will be out soon and it will CRUSH AMD IN EVERY WAY POSSIBLE!!!! Not to mention Haswell, which already crushes Kaveri into the GROUND. Which equals only one thing: AMD getting SMACKED DOWN INTO THE GROUND BY INTEL!!! GOODBYE AMD, NICE KNOWING YOU!!!!

      Delete
    2. It's called delaying to improve. Broadwell is not coming out fast either and won't be out for another year, kid. The fact that Richland already smokes Intel in GPU performance, even against their new HD 4500 chips, means the gap will only grow.

      Mad fanboy is mad.

      Delete
    3. Nope, Intel Broadwell is coming out in 2H 2014, which is a lot more than I can say for AMD. Face it, AMD is history right now, THEY SUCK!!!!!! It doesn't smoke it; Richland is just a little faster. Mad AMD fanboi is MAD!!!! IT SUCKS TO BE AN AMD FANBOI RIGHT NOW!!!!!

      Delete
    4. "It doesn't smoke it Richland is just a little faster"

      Once again you admit AMD is better, yet you pretend you're not mad. You have to switch accounts constantly because you're so embarrassed of yourself. You've now admitted on two occasions that everything you have said is crap, by admitting the FX and Richland chips are faster than Intel.

      Now we see why you rage so much: you do know AMD is faster, and you can't stand it being better, so you spend most of your day on your computer crying about it on the Internet.

      Delete
    5. What are you smoking, YOU IDIOT!!!!!! What are you talking about, YOU MORON? While AMD does pull ahead in a few games, Intel is still BY FAR THE BETTER CHOICE!!!!! I rage so much because you don't get IT!!!!! And you are wrong!!

      Delete
    6. So far the only one constantly wrong is you. Every post you say one thing, then change up and say, "OK, it's worse, but blah blah blah, lower power."

      You're running a 2100T and an ancient low-power video card (ironically an AMD card). You say you have ridiculously high standards, yet you run on a POS and call AMD garbage, when every FX processor in the lineup will outperform that chip and is unlocked. Why anyone leaves a desktop on 24/7, especially someone so concerned with power consumption, is beyond me. My computer is off when not in use and boots in 3 seconds flat.

      My Microsoft Surface Pro will outperform your PC and uses the same amount of power.

      Delete
    7. Wrong, the Surface Pro first off is a piece of DOGSHIT!!!! Second, the Surface Pro uses far less power but has less performance than my aging PC. Mine takes about 1 minute to boot up, because I am too lazy to turn it off and on again. Anyone who owns a Surface is an M$ fanboi. But every AMD chip will use at least two to three times the power that my PC uses and won't be at least 3 times faster.

      Delete
    8. Ok let's put your theory to the test

      I3-2100t cpumark
      http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i3-2100T+%40+2.50GHz
      Score 2998

      I5-3317U cpumark
      http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i5-3317U+%40+1.70GHz
      Score 3130

      So my surface outperforms yours and uses less power at lower clocks.

      So you care about power consumption but you are too lazy to turn it off and on... that by itself speaks volumes about your character.

      Ok let's put your other theory to the test.

      FX-8350
      http://www.cpubenchmark.net/cpu.php?cpu=AMD+FX-8350+Eight-Core
      Score 9126

      Power Consumption under load
      http://www.bit-tech.net/hardware/2012/11/06/amd-fx-8350-review/7
      160w

      I3-2100t
      http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i3-2100T+%40+2.50GHz
      Score 2998

      Power consumption under load
      http://www.avsforum.com/t/1303066/official-sandy-bridge-lga1155-for-htpcs-thread/1470
      65w

      Can't argue with benchmarks and facts. Next time don't open your mouth if you don't want to be proven wrong; I just countered everything you said with actual benchmarks and proof.
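      A quick way to sanity-check the perf-per-watt argument above is to divide the quoted CPU Mark scores by the quoted load wattages. This sketch reuses the figures cited in the thread (9126 / 160 W for the FX-8350, 2998 / 65 W for the i3-2100T) exactly as quoted; they are the thread's numbers, not fresh measurements.

```python
# Perf-per-watt from the scores and load wattages quoted in the thread.
# These figures are taken from the comment above, not re-measured.
chips = {
    "FX-8350": {"score": 9126, "load_watts": 160},
    "i3-2100T": {"score": 2998, "load_watts": 65},
}

for name, c in chips.items():
    ppw = c["score"] / c["load_watts"]
    print(f"{name}: {ppw:.1f} CPU Mark points per watt")

# How the two chips compare on that metric
ratio = ((chips["FX-8350"]["score"] / chips["FX-8350"]["load_watts"])
         / (chips["i3-2100T"]["score"] / chips["i3-2100T"]["load_watts"]))
print(f"FX-8350 vs i3-2100T perf-per-watt ratio: {ratio:.2f}")
```

      By these quoted numbers the FX-8350 comes out ahead on points per watt, which is the point the comment is making; later posts in the thread dispute the 160 W figure, so treat the inputs as contested.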

      Delete
    9. No, you're DEAD WRONG ABOUT AMD'S POWER consumption; you read the graphs wrong, you IDIOT. The AMD FX-8350 uses 160 watts AT IDLE overclocked, but 364 watts under load at the same speed. So no, it's faster but it uses almost 5 TIMES the power that my PC uses. It also uses 213 watts at full load at stock settings. Wrong, I can argue with your facts because they are WRONG!!!!! You didn't counter much of anything. Sorry, TRY HARDER!!!! But you did get the CPU scores for my CPU and the Surface correct. The Surface is still a piece of DOG shit for other reasons.

      Delete
    10. You're an idiot; I was still flat-out right. That figure was overclocked to 4.8GHz, retard; of course a CPU's power consumption jumps with an 800MHz boost. At stock: 65 x 3 = 195 watts versus the FX's 213 watts (a ratio of 1.09), and 2998 x 3 = 8994 versus the FX's 9126 score (a ratio of 1.015). So a CPU with 3 times the performance and 4 times the cores shows a practically nonexistent difference on those ratios (~0.08 apart), comparing STOCK speeds. Comparing an overclocked processor to a locked, non-overclockable processor on power and performance is stupid.

      Also, the 160 watts was overclocked idle, which ironically is lower at idle than all the Intel counterparts listed there.

      I love how a kid with the shittiest gaming rig I have ever heard of is trying to compare facts when he has to hide behind a different account now. If the Surface is crap and it outperforms your PC, then what does that make your machine?

      You're right, I did read the graph wrong, but so did you, because you're still proven wrong even at the increased power. My chip has 6 more cores than yours, is clocked higher, will outperform yours, and will last a whole lot longer for gaming than that thing, which was considered old when you bought it new.

      Delete
    11. Hey idiot, more cores are NOT better; Intel has proven you wrong time and time again. Wrong, AMD's performance per watt is DOWNRIGHT TERRIBLE on all of their chips. Sure, your thing might last longer, but it was also outdated when you got it; you should have bought an Intel/Nvidia combo. No it really isn't, for what I look for. 160 watts at idle, HOLY SHIT that's WAY too much power. Your chip has 6 more SUCKY CORES than mine does, so what; it's not that much faster in single-core tasks like emulation. For me the AMD FX-8350 is NOT ONLY a piece of utter DOG shit, it also uses 3 to 4 times more power and has horrible IPC.

      Delete
    12. Yet my math and proof above have proven you wrong and you just can't accept it. At stock speeds, my chip destroys your chip with more than 3 times the performance for about 3 times the power while carrying 6 more cores. Sure my rig is outdated; that's why it plays every game on Ultra in 3-monitor Eyefinity while you're struggling to play games on low settings.

      I have an Intel/Nvidia laptop, idiot; it's great. But AMD cards are superior at this time; Nvidia has lost the crown until they release their next gen, and once AMD does the same we will see again who holds it.

      The 160 watts at idle, once again, is overclocked, idiot. I like how you bring up the overclocked idle when comparing your shitty stock chip to the FX chips; if we overclocked yours on an 1155 Z board, it would be sucking down power like no other. The stock idle is lower than the Intel counterparts, and under load it's not that much higher; the TDP value is set higher, meaning that's how much heat it can dissipate (that's why AMD chips generally overclock better).

      Benchmarks show my machine dominates yours; even with 4 times the cores it uses only about 3 times the power for 3 times the performance, so each core is actually more efficient than yours, completes tasks faster, and returns to idle sooner, meaning mine spends a lot less time under load than yours.

      Also, as you have said yourself, not everything uses all 8 cores; that max power figure only applies when all 8 cores are stressed to 100%. In a quad-core-optimized program I'm only drawing a little over half the top power.

      Your logic fails and has no proof to back it up, so you're just a troll that not one person here or anywhere listens to.

      Delete
    13. Wow, you're an IDIOT, but you are right, no one here will listen because you are already knee-deep DRUNK in AMD'S KOOL-AID!!!! That aside, it's only 3 times faster WHEN ALL THE CORES ARE IN USE, which would be ALMOST NEVER for ME!!!! In emulation AMD's FX series SUCKS SERIOUS EGGS!!!!! In most games INTEL trumps AMD OUTRIGHT!!!!! Only in a few games that I DON'T PLAY does AMD have ANY sort of advantage. But you FAIL, and you only get half the performance of Intel in quad-core-optimized APPS. I don't think my i3-2100T can EVEN BE overclocked? I bought it for the low power and it somewhat delivers. AMD's drivers SUCK ASS and they are even MORE HORRENDOUS on LINUX. I am not like you who doesn't care about his power bill; unlike you, I have to pay mine. So I can't afford the vast amount of power used by an AMD rig.

      Delete
    14. Yeah, you don't play those games because your system couldn't play them. In most games the performance is at least 60 FPS, which is what we all run games at (well, you're lucky to get 30 on LOW settings).

      Oh yes, on a Z77 board you can raise the base clock to overclock the chip.

      AMD drivers are just fine, idiot. I work with servers and I have seen just as many Opteron servers on Linux as Xeons, so you can quit making that up. The only issue was with graphics drivers in the past, which they resolved years ago.

      I pay my electric bill as well, kid; my bill was 46 bucks this month with an AMD rig, and I play loads of intensive games, on top of an AC, TVs, lights, and a water heater. Quit pretending you're the only one who pays for electricity, or that you even care that much about it, since it's such a huge issue for you yet you leave your computer on 24/7.

      I don't get half the performance of the Intel i7 in quad-core apps; in the scoring, an i7 uses Hyper-Threading to boost its multi-threaded numbers, and mine performs well enough in everything, including BF3 (optimized for a quad core, yet I maintain a constant 60 FPS, and can even hold a constant 80 if I want to).

      Not one person here likes you or believes anything you say. You're trolling, you can't provide facts to back up any of your claims, you think TDP is what the processor actually draws, and you have been proven wrong and have backtracked your comments so many times that no one takes you seriously.

      Delete
    15. Only partially true; I don't play those games because none of them suit me. You don't keep things on for very long, do you? Yeah, I know, you DUMB AMD FANBOIs, and there's no going back from being a troll anyway. That's right, and until more apps and games are optimized for 8 cores, AMD is going to be left in the dust by Intel. Yeah, I do think that, because it's true. Wrong, AMD's drivers still SUCK ASS ON LINUX!!!! Wrong, they still haven't fixed their low-FPS-in-games-on-Linux issue.

      Delete
    16. All the major games that came out this year are optimized for 6 or more cores, excluding a few optimized for 4 (the list provided above covers a big chunk of them), and Battlefield 4, a huge upcoming title, is being optimized for higher core counts as well.

      Gaming on Linux? Really, that's your argument? I have a server with dual quad-core Xeons and an AMD FirePro GPU that I have run Minecraft and some other games on, and it works fine (though its primary function is a game server, so it rarely gets used for that).

      You keep calling us fanbois for defending AMD, yet you're constantly waving an Intel flag around and making up benchmark results off the top of your head. So far everyone here has proven you wrong with cold hard facts, yet you just pull things out of thin air and pretend that's the truth. You don't understand anything about power consumption, TDPs, iGPUs, or anything about chips beyond reading the sticker on the box.

      Delete
    17. Don't be so sure about that. Wrong, we are not above 4-core gaming just yet; that's not going to happen until the new consoles are released. Also, I don't like Intel that much either, for different reasons, but that's for another post. Yeah, you don't game on there, otherwise you would know how much IT SUCKS!!!! I sure do call you guys fanbois, because you act like AMD fanbois!!!

      Delete
    18. Says the guy screaming Intel around every corner, who has to make tons of accounts because he keeps getting blocked, and who uses made-up logic to make Intel sound better.

      Intel Fanboy

      You say that, yet most of the big-name games this year are optimized for 6 or more cores, which includes pretty much all the big PC titles, with the rest going quad core, just like the past dual-to-quad leap.

      Delete
    19. I see, AMD Fanboi!!!! I am not an Intel fanboi, sorry, IDIOT!!! Which games then? Most still use dual- or quad-core CPUs. I'll bet I don't play the games you will mention, Call of Duty fanboi!!!!!

      Delete
    20. So you're also a closet CoD fanboy along with a closet Intel fanboy? Because you're clearly an Intel fanboy, and the only person talking about CoD is you.

      Metro 2033 and Last Light, the Battlefields from Bad Company 2 through BF3 (which use all 8 threads when available), Tomb Raider, Crysis 3, and BioShock Infinite, to name a few major titles optimized for more than 4 cores. Most new game engines are now built to use at least 6 threads, and developers are doing this for future-proofing.

      Who cares what games you play; you can barely play any games on the POS of a system you have.

      Delete
  15. Hey HJH

    O.o

    http://www.umart.com.au/pro/products_listnew.phtml?id=10&bid=2&id2=14

    More specifically

    http://www.umart.com.au/pro/products_listnew.phtml?id=10&id2=14&bid=2&sid=137030

    We have FX-9370s in Australia now. Turns out 9590 is incorrect; it's actually 9370 and 9560 or something, instead of 9590 and 9650. Oh well, at least it makes a little more sense now.

    ReplyDelete
  16. It's AMD FX 9370/9590 - Piledriver for $329 and $849

    It's AMD FX 9450/9650 - Steamroller for $249 and $449

    The clocks on the Piledrivers are higher, but they carry a 220W TDP (vs 125W for the Steamroller FX 9450/9650).

    Sorry for the confusion

    ReplyDelete
    Replies
    1. ahhhhhh, ok. now more confused again.

      will there be a FX-8550? or have they abandoned that idea?

      We're all pretty sure that you're from AMD, but really, I don't care. unless it means I get a free CPU..... :P

      and that steamroller computing power....
      better than the most powerful piledriver for 100W less.
      I'll probably wait for Excavator, but holy crap, that improvement is impressive. At this rate, if they decide to go for a fifth generation...

      +20% performance over Intel's Core arch, maybe? Only if they manage to get to 22nm or smaller, though. Quite possible, if they get TSMC to manufacture their chips instead of GloFo.

      Delete
    2. How much faster is the FX 9450 expected to be compared to the FX 8350, in %? This question is aimed at HJH.

      Delete
  17. AMD Steamroller FX 9450 is 3.8 GHz / 4.2 GHz Turbo

    vs

    AMD Piledriver FX 8350 is 4.0 GHz / 4.2 GHz Turbo

    The 249 USD FX 9450 Steamroller FX will be roughly 18%-25% faster than a stock FX 8350. Overclocked to 5.0-5.4 GHz it will be roughly 30-35% faster than an FX 9590, placing it in overclocked Intel 3970X territory in multithread, and around a stock 4770K in single core.
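    The uplift claims above can be turned into rough numbers. This sketch is purely illustrative: the 18%-25% stock uplift is the post's own projection, and the FX-8350 baseline of 9126 is the CPU Mark score quoted elsewhere in this thread, not a measurement of the unreleased chip.

```python
# Rough projection of the claimed Steamroller FX 9450 uplift.
# Baseline score and percentages come from this post/thread, not benchmarks.
fx8350_score = 9126          # CPU Mark score quoted in the comments

# Claimed stock uplift over the FX-8350: 18%-25%
low, high = fx8350_score * 1.18, fx8350_score * 1.25
print(f"Projected FX 9450 stock score: {low:.0f} to {high:.0f}")
```

    If the projection held, a stock FX 9450 would land in the high-10000s to mid-11000s on that scale; treat it as speculation until real benchmarks exist.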

    ReplyDelete
    Replies
    1. NOT GOOD ENOUGH!!! It will still be the power-hogging CPU I know and loathe from AMD!!!! Not good enough for the high electric bills it will cause. Plus, Intel Haswell will still crush it in most tasks, including emulation. And Intel Broadwell will COMPLETELY OWN EVERYTHING AMD HAS OUT!!!!!

      Delete
    2. ehhhhhh it MIGHT be worth my money....
      I still want to wait a bit for excavator though.

      Delete
    3. Daaamn! Then I will surely get the FX 9450; the price seems just about right and the performance increase over the FX 8350 seems excellent.

      Delete
    4. It won't be as cheap as Intel's i5-4670K or the FX-8350. Sure, it's better than AMD's current solution, but it won't use any less power, so IT STILL SUCKS OUT LOUD!!!!!

      Delete
    5. "The 249 USD FX 9450 Steamroller FX"

      it's cheaper than a i5-4670k, at least in Australia

      Delete
    6. Intel Haswell is cheaper than AMD, but not by much this time. Steamroller FX will still suck because it uses the same amount of power as the current FX series, which already uses FAR TOO MUCH POWER!!!!!!

      Delete
    7. For 250? I would go straight for a lower-priced processor, but AMD for sure.

      Delete
    8. wait....
      are..
      what?

      I am thoroughly confused by jml12; either I do not comprehend, or he is being weird again.
      I say that a $250 chip is cheaper than a $270 chip.
      This is, I am sure, correct. Apparently not.

      The 4670k is actually on special at umart right now, so it'll probably go back up to $300 soon.

      Delete
    9. AMD uses far too much power for me to consider on the desktop. Their mobile APUs aren't all that bad, but they lack performance per watt due to weak IPC and a mediocre IGP.

      Delete
    10. Better than anything Intel has for the igp

      Delete
    11. Nope, Iris PRO beats the SHIT OUT OF EVERYTHING AMD has. As for Kaveri, Broadwell's IGP will improve by at least 50%, so Intel will still be FAR ahead of Kaveri. AMD HAS NO CHANCE OF WINNING!!!!!!

      Delete
    12. Iris Pro, the HD 5200, is not out yet, so that's all speculation. The HD 5000 is out and scores 604 in PassMark, which is still much lower than the 8670D in AMD's APUs. The only major differences will be a slightly higher clock speed and eDRAM, which will bump performance, but not enough to catch AMD's 8670D, which scores a cozy 913.

      http://videocardbenchmark.net/gpu.php?gpu=Intel+HD+5000&id=2552

      Proof right there suck it fanboyyyyy

      Delete
  18. 28nm Excavator will decimate Intel. The Excavator APUs will go up to 6 cores from 4, with TDP dropping from 100W to 65W for the flagship APU.

    ReplyDelete
    Replies
    1. AHAHAHAHAH... NOT GOING TO HAPPEN, FANBOI!!!! Really, that is all pure speculation and it's not going to happen, AMD FANBOI!!!!! AMD will still make power-hog APUs and thus isn't suited for me.

      Delete
    2. 28nm? I was expecting AMD to use Tri-Gate and at least 20nm in 2015.

      Why is AMD sticking to 28nm in 2015? They need to move to at least 22nm to offer some competition.

      Delete
    3. At 28nm Excavator is going to SUCK HARD!!!!! Because AMD is cheap and lazy, that's why, and if they do that THEY ARE FINISHED FOR SURE!!!! You're right, AMD needs to be at 20nm at least in order to compete.

      Delete
  19. Now I get it: AMDfx.blogspot.com does not ban jml12 since he is a perfect example of how *beep* and *beep* then *beep* a fanboy really is. Thanks jml12, people will hate Intel more and thus buy AMD instead.

    ReplyDelete
    Replies
    1. Heh. Not likely, I'm afraid, because of Intel's vast marketing $$$$. Most people don't even know AMD exists.

      Delete
    2. You're really stupid if you think people don't know who AMD is; they have been making chips since way before you were born.

      Delete
    3. That's not it. It's just that Intel is much better known, and 90% of PC geeks, me included, recommend Intel to friends and family, because Intel is much faster than AMD in 90% of tasks and ties it in almost everything else. The few cases where AMD does shine fall into the remaining 3%, so 7% of the time AMD ties Intel, while 90% of the time Intel CRUSHES AMD INTO THE GROUND!!!!

      Delete
    4. Just a question - how are you defining those tasks and "faster"?
      Faster for price? Nope.
      Faster for power? Depends on what it is.
      Faster per core? Intel.

      If you are comparing by cores, that's freaking ridiculous; if comparing by power, it's usually Intel; but if you're comparing by price, you are asking to be a laughing stock.

      Delete
    5. Single-core performance, actually. The price is almost the same, so Intel wins my $$ because of its high IPC, better emulation, and power usage. On most tasks INTEL WINS HANDS DOWN!!!! On price Intel isn't that bad, because they win at everything else, so it's worth it.

      Delete
    6. jm12, your numbers are so off it's hilarious...

      5 times the power consumption???? LOL... you are a joke

      http://www.extreme.outervision.com/PSUEngine

      EIGHT CORES + BARE MINIMUM GPU FOR MODERN GAMING

      AMD AM3+ + FX 8320 + 7750 GPU + 2 sticks of DDR3 + 2 fans + 1 dvd drive + 1 HD

      = 287W under load

      TWO CORES + SHIT GPU

      Intel LGA i3 2100T + 5570 GPU + 2 sticks of DDR3 + 2 fans + 1 dvd drive + 1HD

      = 203W under load

      The AMD rig crushes yours in every way possible and requires not much more power. It's 8 cores vs 2... the GPU has more than triple the performance of yours... and at idle the power consumption is DEAD EVEN!!!!

      According to jm12 the AMD rig above requires a MINIMUM PSU of 1000W!!!! KILLER MATH, MAN!!
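      The 287 W and 203 W tallies above come from summing estimated per-component load draws. As a rough illustration of how such a tally works, here is a sketch; the individual part figures below are hypothetical splits chosen so the totals match the numbers quoted above, not output from the linked calculator.

```python
# Illustrative per-component load tallies (hypothetical splits; only the
# 287 W / 203 W totals come from the comment above).
amd_rig = {"FX 8320": 125, "HD 7750": 55, "DDR3 x2": 10, "fans x2": 6,
           "DVD drive": 25, "HDD": 10, "board/other": 56}
intel_rig = {"i3 2100T": 35, "HD 5570": 43, "DDR3 x2": 10, "fans x2": 6,
             "DVD drive": 25, "HDD": 10, "board/other": 74}

print("AMD rig load:", sum(amd_rig.values()), "W")
print("Intel rig load:", sum(intel_rig.values()), "W")
```

      The takeaway is that the whole-system gap between the two builds is dominated by the CPU and GPU entries; the shared components contribute the same draw on both sides.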

      Delete
    7. Yeah, and sadly AMD was first on the market with ZeroCore (yeah, try looking that up) and Intel followed. He claims his rig only uses 85 watts, which is a joke, because he's only adding up TDPs, which is not how much power the parts actually draw.

      He's so outdated and thinks his rig is such a power saver, but in truth it's a poorly performing hog at this point.

      He claims Intel wins in most tasks, but the only chips that really beat AMD's are the E series, and they cost a mint. He's an idiot fanboy with no proof of anything, thinking anyone here would believe a word he says. I doubt even his family and friends believe what he says; they probably just buy the opposite of what he recommends because they know he's wrong.

      Delete
    8. BTW Jesse Lee, there's a PSU calculator that recommends power supplies based on components, TDP, etc. Your machine uses WAY more than 85 watts; TDP is not how much power it's drawing, idiot.

      http://www.extreme.outervision.com/PSUEngine

      Your machine, even guessing the lowest-end junk for the components you listed, shows over 150 watts.

      An APU build with 2 more cores, relying just on built-in graphics (which smokes your card), better RAM, and the same everything else will use just over 10 watts more and will destroy your rig. Use the calculator and see for yourself how full of crap you really are.

      Delete
    9. That power calculator is inaccurate at best; I used it years ago but it didn't tell me much that I needed to know. Oh, by the way, the 85 watts is a max-load reading with a Kill A Watt, you dumbass. Wrong, an APU build would be faster, but it would use twice the power that MY sucky rig uses. Wrong, it's half that, fool. Oh, and the FX-8350 uses at least 364 watts under full load while my sucky rig uses about 80 watts. I can't comprehend how stupid you people really are; it's mind-boggling.

      Delete
    10. Yea, would love to see proof of that; you're just adding up the TDP of your components and claiming that's how much power you're using. You're using over 150 watts with your rig; I've shown you proof in a previous post of how much your chip and GPU use under load, which your machine is always at because it takes everything it has to play games on low.

      364 watts is what the FX-8350 uses overclocked to 5.0 GHz. Keep pulling made-up facts out of the air; you're just looking more and more stupid by the moment.

      That calculator uses legitimate facts, results, and math to calculate things. It may not be 100% accurate because things are subject to change; however, it is way more accurate than your stupid comments.

      If you can't show proof of your claims, then your comments are pointless.

      Delete
    11. Your calculator is less accurate than you think my comments are. Here, you IDIOT: http://www.xbitlabs.com/articles/cpu/display/core-i3-2100t_11.html#sect0. SO what, it still uses way too much power. The calculator is flawed because it looks only at the TDP of the components in question; however, it's close to right about AMD's power-hogging parts. Damn, you are stupid, using old facts just in an attempt to make AMD look good when it SUCKS OUT LOUD!!!!

      Delete
    12. Your comments are so inaccurate, everyone on here knows you're full of it. The calculator does NOT rely on TDP, idiot; if it did, then once again, since you can't do simple math, 35 (i3) + 47 (HD 5570) is 82, which is roughly what you're labeling your supposed PC as running at under load, which is beyond false. The calculator gets its information from benchmark data fed into it on power usage under load and gives you an estimated power usage based on that, plus the PSU recommended to run said machine.

      You're an idiot; your machine under load uses in excess of 150 watts based on REAL benchmarking, not a burn-in test, which does not truly stress the chip; it only heats it up and tests it for problems with an overclock, idiot. That website did not do a benchmark/natural-usage test to see its power consumption.

      http://www.silentpcreview.com/article1202-page6.html

      An ACTUAL test of your chip under load: it uses an average of 50% light load and 50% heavy load to get a rough estimate of what the chip would use on average. In this case, it's a bit over 51 watts (which means your chip uses more than 51 watts under heavy load, i.e. GAMING). That means that each core is using around 26 watts under heavy load. Now, the AMD chip that you keep claiming uses significantly more power per core than yours does draws 213 watts with ALL 8 cores under heavy load. This is actually a disadvantage for the FX-8350, since that website did not do a half-and-half average like the site I showed you; it just posted its power consumption under 100% stress. Dividing 213 by 8 gives ~26.625 watts per core under full stress. What does that mean? It means that at stock clocks, my chip has the same power per core, within a slight margin of error.

      Now, mine is not as power-efficient overall because of the 8 cores, but it outperforms yours by a humongous margin and does not have to use all cores all the time like your chip is stuck doing constantly. Plus, my machine is OFF when not in use, unlike yours, which is on 24/7 wasting lots of power sitting around twiddling its thumbs.

      While you're trying to say AMD is super inefficient, it actually beats Intel in power per core, because an 8-core AMD chip has a TDP of 125 W, whereas the 6-core Intel chip has a TDP of 150 W and is clocked lower.
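      The per-core arithmetic above can be sketched in a few lines (a rough illustration only, using the 51 W and 213 W figures quoted from the linked reviews; splitting package power evenly across cores is a simplification, since uncore and memory controller power is shared):

```python
# Naive per-core load power: measured package power divided by core count.
# Figures quoted in the thread: ~51 W average load for the dual-core i3,
# ~213 W full load for the 8-core FX-8350 (from the linked reviews).

def watts_per_core(total_watts: float, cores: int) -> float:
    """Split measured package power evenly across cores."""
    return total_watts / cores

i3_per_core = watts_per_core(51, 2)   # 25.5 W per core
fx_per_core = watts_per_core(213, 8)  # 26.625 W per core

print(f"i3 per core: {i3_per_core:.2f} W, FX per core: {fx_per_core:.3f} W")
```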

      Delete
    13. AHAHAHAHAHAHAH... MY PC uses 85 watts when running id's RAGE. Uh, you forget that AMD's SUCKY 8 cores ONLY EQUAL FOUR OF INTEL'S ALMIGHTY CORES!!! Wrong, it has never gone past 85 watts unless I use the DVD-ROM drive, but that adds 10 watts AT MOST!!!! 35 (i3) + 47 (HD 5570) is 82, and that's correct; the PSU calculator IS WRONG!!!! Because AMD is super inefficient, and that's fact. TDPs don't matter when it comes to AMD because all of AMD's CPUs use two to three TIMES THE TDP IN POWER consumption.

      Delete
    14. More cores means better multi-threaded performance, which is why they do it, and Intel even does it with their 6-core series, so quit acting like there's no reason for multiple cores.

      You keep saying "wrong" even though facts are facts, and you're pretending your computer uses exactly the TDP it's rated at even though ALL chips use more than their rated TDP. You're trying to say AMD is the only one going beyond the TDP its chips are rated at, yet ALL chips do that; the TDP is just what the manufacturer sticks on to show a ballpark of power usage and heat dissipation, which is common knowledge.

      The PSU calculator has been around for a long time and is used by most enthusiasts who want to check power consumption. You're an idiot; if it were calculating TDPs, it would have put up 82, but it put up 150 because it's calculating the ACTUAL power usage of both the chip and GPU under full load.

      You're an idiot; Intel goes just as far beyond the TDP as AMD does, ALL CHIPS DO IT. Stop making shit up; there are charts for EVERY CPU showing this, and you're just looking dumber and dumber with every post.

      i7 3770k
      77Watt TDP
      Idle Power 99Watts
      Load Power 166Watts

      Much higher than a 77Watt TDP

      i5 3570k
      77Watt TDP
      Idle Power 97Watts
      Load Power 161Watts

      Again Higher than a 77watt TDP Not Enough?

      Phenom II X6 1100T
      125Watt TDP
      Idle Power 117Watts
      Load Power 236Watts

      Again Higher

      ALL CPUs, 99% of the time, go beyond their TDP rating. The TDP is mostly for fans, and to let manufacturers and people choose a cooling system.

      http://www.bit-tech.net/hardware/2012/04/23/intel-core-i7-3770k-review/8
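      The gap in the figures above can be tallied up like this (a quick sketch; note these are the whole-system wall readings quoted from the linked bit-tech review, so motherboard, RAM, and PSU losses are included, which is part of why they exceed the CPU-only TDP):

```python
# TDP vs. measured full-system load power, using the figures quoted above
# (whole-system wall readings from the linked review, not CPU-only draw).

chips = {
    "i7-3770K": {"tdp": 77, "load": 166},
    "i5-3570K": {"tdp": 77, "load": 161},
    "Phenom II X6 1100T": {"tdp": 125, "load": 236},
}

for name, c in chips.items():
    gap = c["load"] - c["tdp"]
    print(f"{name}: system load exceeds CPU TDP by {gap} W")
```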

      Delete
  20. The sad truth is... anyone with a PC has a minimum power supply of 350W, which is more than enough for the above rigs. You act like 280W IS TOO MUCH POWERRRRRRRR... you're pathetic.

    ReplyDelete
    Replies
    1. Yeah, I mean honestly, I know many people for whom power is a concern, but buying a setup like that is complete garbage and can't even be listed as a gaming machine. If people really want a low-power gaming rig, the best option honestly is an A10-6800K, mostly because you don't need a separate graphics card to play games at decent settings anymore (low to medium). His rig is a joke, and he's just talking this way because he thinks the world thinks like him.

      My Past 2 Rigs were more powerful than his current one, and both my laptop and previous laptop were more powerful as well.

      Delete
    2. Damn hjh, you are really stupid. BY my logic AMD would use 400 watts, and that's about right, while my PC uses 80 watts, which is also WAY TOO MUCH power. The A10-6800K IS NO BETTER than my sucky card. No, most of the world thinks LIKE YOU IDIOTS!!!!! Because 280 watts IS WAY TOO MUCH POWER, you half-wit!!!!!!

      Delete
    21. You're an idiot, a flat-out idiot; once again I'll prove you wrong with FACTS.

    This is the 8670D, which is the GPU contained inside the 6800K:
    http://www.videocardbenchmark.net/gpu.php?gpu=Radeon+HD+8670D&id=2543
    Score is 913

    Here is your ancient low end GPU
    http://www.videocardbenchmark.net/gpu.php?gpu=Radeon+HD+5570
    Score is 713

    913 > 713, since you have proven you can't do simple math.

    Your logic? You mean your made-up logic with no proof shown, just off the top of your head from what you believe. Now, who are we going to believe: the obvious troll who posts no proof, no logic, and makes things up out of thin air, or the benchmarks, video results, and posts of others on the internet with proof of their findings... Tough call.

    ReplyDelete
    Replies
    1. Looks like I am still right; even though it's better, it's only slightly better. Not worth it, and most certainly not worth the power usage increase that would come from using said configuration.

      Delete
    2. It receives a 5500 on CPUMark and a 913 on PassMark, whereas your rig scores a 2998 and a 713. That's significantly higher than your POS. Put in an SSD, DDR3-2133, and such, and that machine would use low power and perform amazingly for a chip that relies on everything in one component.

      The A10-6800K and A10-6700 both contain the same 8670D GPU at the same GPU clocks, while the 6800K has a 100-watt TDP (but it's unlocked and clocked over 4 GHz) and the 6700 has a 65-watt TDP.

      The performance of those chips with their on-board GPU smokes your machine by a huge margin, and they consume low amounts of power (the 6800K consumes an average of 153 watts under gaming load, whereas the 6700 consumes 115 watts on average in the Metro 2033 benchmark).

      So in other words, the A10 will consume roughly a slight bit more power than your machine while greatly outperforming it, whereas the 6700 will also outperform it and use MUCH less power. The proof is in the benchmarks and results.

      http://www.tomshardware.com/reviews/a10-6700-a10-6800k-richland-review,3528-11.html

      Delete
    3. Both the 6700 and the 6800K use FAR TOO MUCH power, but the Core i3-3220 doesn't; its GPU isn't as fast, but it uses a third of the power. You don't get it: AMD has a huge problem with CPUs using FAR more power than the TDP states, and until they get that fixed, they won't get my money. For a 65-watt APU, 115 watts IS UNACCEPTABLE!!!!!! Same goes for the 100-watt APU!!!!! Sure it's faster, but it dissipates more heat and uses 40 watts more than MY PC uses. I'm trying to get something with far better performance per watt than my PC, and the 6800K isn't any better than MY PC for performance per watt. AMD 6000-series APUs ARE NOT LOW POWER; they are HIGH power. As I said before, ARM has raised my standards for low-power computing. You call 153 watts low power; I call that high power.

      Delete
    4. Blame GloFo for that, not AMD itself... GloFo's mediocre performance has AMD on its knees.

      http://www.techpowerup.com/reviews/AMD/A10-6800K/5.html

      Roughly 80W for the 6800K at 32nm; the 6800K at 28nm would be either 32% faster (at the same power draw as now) or 30% more power efficient with the same performance, which would put it at 56W just by using 28nm.

      At 22nm, the 6800K would easily be 60% more powerful at the same power draw as the original, or about 65% more power efficient with the same performance, which would put it at about 28W just by jumping from 32nm to 22nm.

      Each new shrink gives an average of at least 25-30% more performance at the same power draw as the previous generation. It depends on what AMD/Intel chooses: more performance for the same power draw, or less wattage with the exact same performance (which is what Intel did with Haswell).
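      The iso-performance estimates above can be sketched as follows (a back-of-the-envelope illustration only; the ~30% and ~65% cumulative efficiency gains are the commenter's rule of thumb, not foundry data, and the ~80 W baseline for the A10-6800K at 32nm comes from the linked review):

```python
# Back-of-the-envelope power estimate at the same performance after a
# claimed cumulative efficiency gain. Baseline: ~80 W at 32nm (quoted above).
# Gains assumed in the thread: ~30% (32nm -> 28nm), ~65% (32nm -> 22nm).

def power_at_iso_perf(base_watts: float, efficiency_gain: float) -> float:
    """Power needed for the same performance after a fractional efficiency gain."""
    return base_watts * (1.0 - efficiency_gain)

print(f"32nm -> 28nm: {power_at_iso_perf(80, 0.30):.0f} W")  # ~56 W
print(f"32nm -> 22nm: {power_at_iso_perf(80, 0.65):.0f} W")  # ~28 W
```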

      AMD at 22nm or lower would be incredibly fast. Remember that the Phenom II X6 1100T has the exact same multi-threaded performance as the i7 960, but in single-threaded the i7 wins by about 5%; both CPUs are 45nm, with the i7's max TDP at 130W while the 1100T's max TDP is 125W. The difference? Price! The i7 960 retailed for about $380 while the Phenom II X6 1100T arrived at $260.

      i7 960:

      http://ark.intel.com/products/37151/ (Check Max TDP of 130W!)

      Phenom II x6 1100T:

      http://www.ebay.com/ctg/AMD-Phenom-II-X6-1100T-3-3-GHz-Six-Core-HDE00ZFBGRBOX-Processor-/109635247 (Max TDP of 125W!)

      So you see, at the same node they are pretty much at the same level of performance: Intel running hotter at lower frequencies because it uses bulk silicon, while AMD runs cooler at much higher frequencies because it used FD/PD-SOI.

      AMD does not have the same wallet as Intel; remember, Intel is about 13 times bigger and can indeed spend tons of billions just for the "lulz", while AMD has to use what it has, which is... GlobalFoundries... the cheap-ass mediocre foundry that loves to cash in on AMD and delay its shrinks by at least 2-3 years. So thanks, GlobalFoundries, if AMD ever dies.

      Intel can and would rather spend AMD's entire worth in billions bribing PC vendors to troll AMD to death, as they did with Dell, Acer, and Toshiba back in 2004.

      Give AMD the same shrinks and the same Intel budget and they will turn Intel into dust in 3 years.

      You would say that AMD does not have the same IPC and that AMD needs more frequency to match Intel, and that may be partially right; AMD gives similar performance for a lower price. At the end of the day, AMD is delivering almost the same overall performance as Intel for less money. Mind not the clocks, since it's similar to what they do against NVIDIA: GeForce CUDA cores are much more powerful than Radeon GCN cores, but AMD gives you more, less powerful shaders that work as well or better.

      You can't seriously expect AMD to deliver better performance on such outdated GloFo nodes at lower-than-Intel prices. They do fuck up from time to time because they get desperate; they have been nearly out of resources since the 2006 era and released horrible, lame CPUs (Phenom I, Zambezi, and the Vishera FX 9590) out of pure desperation... can't blame them... at least they are still afloat, and this last fiscal quarter they lost much less; next quarter they will surely get out of the red numbers for the first time in years.

      Delete
    5. But things are different this time around: AMD is making sure no one is there to bribe vendors, nor software developers crippling AMD with Intel compilers. Also, they made sure to grab the whole next console generation so it benefits their hardware; bring HSA and hUMA to the table, and AMD may have a win.

      I can see the Kaveri APUs competing easily against the i5 and getting somewhat close even at 32nm on NEWER APPLICATIONS (not the old ones) with HSA and hUMA enabled.

      Delete
    6. It's true that GloFo is fucking up AMD big time. But AMD won't be at 20nm anytime soon, thanks to poor planning. AMD lost less, sure, but that isn't going to dig them out of the HUGE hole they created. Yeah, AMD sure was desperate to get every console. So much so that they won't make much money off them. Kaveri will be outclassed in every way by Haswell i5 and especially by Haswell ULT!!!!! Kaveri can't hope to match Intel's performance per watt nor its IPC. Kaveri's IGP will be faster than Haswell's, but by the time Kaveri is fully released, Intel Broadwell WILL CRUSH ANY AND ALL HOPES that AMD has of competing. Broadwell will use even less power, and it will widen the gap so far that AMD will never catch up. Phenom I, Zambezi, and Vishera FX 9590 are all PURE GRADE A PIECES of DOG SHIT!!!!

      Delete
    7. Once again, you have no room to talk while on your GRADE A Piece of Dog Shit Machine.

      They make plenty of money off consoles; they didn't have to do much to win that war.

      The Core i3-3220 scores a 4220 on CPUMark
      Intel HD 4000 scores a 465

      The TDP alone for the i3-3220 is 55 watts for that pitiful performance, and since you claim all you care about is power consumption, that thing can barely do anything close to what you play at (which is so low as it is). Including your GPU, that's an extra 47-watt TDP (which isn't the amount of power it uses; it uses more, but since all you seem able to read are TDPs, I'll keep it simple).

      So your total for that build would be 102 watts just calculating the TDP (which is pointless, but again, that's all you can read).

      Now, the A10-6700 has a TDP of 65 watts and scores a 4983 on the CPU test and a 913 on the built-in GPU, which destroys your 5570. Not needing a separate GPU cuts the power consumption down significantly, and you're getting a real quad core.

      My point is proven by facts right there, using both your false logic and real logic.



      Delete
    8. The TDP of the i3-3220 would be lower if I didn't use the on-board IGP; then I could save about 10 watts and put that towards an actually good dGPU that would CRUSH the 8670D INTO NOTHING!!!! No, you AMD fanbois have no logic, just fanaticism. Meaning the Core i3-3220 with a dGPU is by far the best choice; AMD's IGP uses too much power, in fact almost 3 times as much as Intel's IGP. The Core i3-3220 is almost as fast as the A10-6700 and uses half the power. AMD's IGP takes an extra 47 watts, so I'd rather get Intel and then get a dGPU that uses about the same amount of power. Your point is invalid for the most part. Wrong: the i3-3220 uses 61 watts, while the 6700 uses 115 and the 6800K uses an unacceptable 153 watts.

      Delete
    9. Under load testing of both the iGPU and CPU, the i3-3220 uses 78 watts on a micro-ATX board, which can account for up to 20 watts lower power consumption, while its TDP is 55.

      http://www.bit-tech.net/hardware/2012/11/26/intel-core-i3-3220-review/7

      That's FACT

      Here's the review on SilentPCReview for the Richland APUs comparing power consumption and performance against the other on-die GPUs. For any person for whom power is a primary concern, the most demanding part of a machine in most cases is the GPU; on-die GPUs save a lot of energy in the long run, hence why things like ultrabooks or APU laptops don't have a separate one and get great battery life.

      The fact is the HD 4000 scores a 495 in PassMark while the 8670D scores a 913 in the same test, at a TDP difference of 10 watts over the i3, just comparing on-die GPUs; the CPU outperforms the i3 and the GPU has nearly double the performance. The TDP ratings are 55 for the i3 and 65 for the A10-6700, and it outperforms in a gaming-rig setup while using a difference of 10 watts (under stress it shows a max of 89 watts versus the i3's 78, both tested with the on-die GPU; since you're bad at math, that is a difference of 11 watts for a more powerful CPU and a GPU with double the performance).

      Since you want to say dropping the i3's iGPU would save 10 watts, OK, let's assume that and say under load it uses 68 (even though this is complete BS, but since you really wanna use your dumb logic). OK, then the GPU has to beat the 8670D and use roughly the same power.

      The GT 640 scores a 1300 on PassMark, which is higher than the 8670D, but it has a power consumption (according to NVIDIA) of 49 watts, bringing the total to 117 watts.

      The AMD 7750 uses 44 watts at peak, meaning the total would be 113 watts using the i3 without the on-die GPU, and the performance of the 7750 is 1626 (higher than the GT 640 in performance, with better power consumption).

      You keep pulling Intel's power consumption out of your ass but then post AMD's power consumption from a website, in true fanboy fashion, trying to hide the truth. You also compare the power consumption of the AMD chip using its on-board GPU while testing the i3's power without using the HD 4000.

      Just like how you claim your system only uses what the TDP says, yet everywhere else says it uses much more. An A10-6700 uses far less than your system with more than double the performance.

      Delete
    10. Yeah, I know the huge difference in power between micro-ATX and ATX. Uh, the 61 watts was from the same EXACT place, you dumb fanbois. TDP doesn't matter except on AMD chips, where it's actually right. That's just the TDP; it has nothing to do with POWER consumption, which is what I am counting, you moron. So what if the Intel i3-3220 and dGPU combo do use 120 watts, if the performance per watt is FAR GREATER than AMD's APUs? Wrong again, you moronic twit. My system uses far less power than AMD's power-hogging APUs. As I said, AMD WILL NEVER GET MY MONEY until they fix their power consumption problem. AMD's IGP is faster than Intel's HD 4000, but it uses TWO TO THREE times the power. Yeah, even the AMD 7750 and Intel i3-3220 is a far better combo than the AMD APU. 78 watts AT MAX LOAD, that's as high as it will really go. AMD uses TWICE the power, but it's not twice as fast. Those TDP ratings ARE WORTHLESS, as both INTEL AND AMD have proven in the past.

      Delete
    11. You constantly change your argument when I'm right... Intel idiot fanboy.

      OK, by your "definition" (using that term extremely loosely), the power consumption of the Intel HD 4000 was 10 watts, I believe you said (that's wrong, but let's use it anyway). Then you said that AMD's on-die GPU uses 2-3 times the power; let's put that in the middle at 25 watts to be even. So if the i3-3220 uses 78 under load, subtract 10 and you get 68. Now for the 6700, subtract 25 from the 89 and you get 64. So under load, by your definition, the 6700 without the on-board GPU uses 4 fewer watts and performs better than the i3.

      We're just using your flawed logic there.
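      That subtraction argument can be spelled out in a few lines (a sketch only; the 10 W and 25 W iGPU shares are the other poster's guesses from this thread, not measurements, and the 78 W / 89 W load figures are the review numbers quoted above):

```python
# CPU-only power estimate: take each package's measured load power and
# subtract an assumed iGPU share. The 10 W (HD 4000) and 25 W (8670D)
# shares are the thread's assumptions, not measured values.

def cpu_only_watts(package_load: float, assumed_igpu: float) -> float:
    """Estimated CPU-only load power, given an assumed iGPU share."""
    return package_load - assumed_igpu

i3_cpu = cpu_only_watts(78, 10)   # i3-3220: 68 W
a10_cpu = cpu_only_watts(89, 25)  # A10-6700: 64 W

print(f"i3 CPU-only: {i3_cpu} W, A10 CPU-only: {a10_cpu} W")
```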

      Delete
    12. AMD's GPU, I learned from one of the links you fanbois so kindly sent me, uses 41 watts; I was actually shocked by that. I see that you do a lot to make AMD look good when they don't. Wrong, it uses more than that. Wrong, the 6700 is only slightly faster, but the i3-3220 is a DUAL core, not a quad core. You forgot that Intel's 4 cores equal AMD's 8 cores. Sorry, I made up some numbers, but Intel uses more than that and AMD uses even more than Intel.

      Delete
  22. Since my processor is an FX-4100, first-gen, I was thinking of getting a third-gen processor. So, will my mobo support these new FX processors?

    Gigabyte Technology Co., Ltd. GA-78LMT-USB3 (Socket M2) AM3+ mobo.
    AMD 760G + SB710 Chipset

    If not, I will have to think about getting an FX-6350.
    Or a completely new PC with Kaveri or the next FX, maybe a 6-core variant.

    Topping it off with the GTX 860 when it comes out (unless AMD has something good).

    My laptop comes with the second-gen Trinity E2-1800, and it was dog-slow until I turned off stuff in Windows, which I feel like I shouldn't have to, and my old 2008 laptop with a 2.1GHz dual-core wasn't slow with everything turned on. This 1.7GHz chip should compete with a 2.3GHz chip from 2008, so these E-series are an insult. Anyway, I want to get at least the fourth gen. I am skipping one generation each time.

    ReplyDelete
  23. Umm, it's still going to be on the AM3+ platform, so as long as the vendor updates the BIOS to support it, you will.

    ReplyDelete
    Replies
    1. Okay, because I was hearing we need new mobos, or have to have a 990X mobo.
      Buying a new PC is not on my mind, but I will if I have to.

      Delete
    2. Steamroller FX will use AM3+ while Steamroller APU will use FM2/FM2+.

      I am starting to think there will not be any Excavator FX CPUs, only Excavator APUs.

      I hope AMD has plans for an FX-6450 or FX-8420; I may get one of these two and OC it as much as possible.

      Delete
    3. Well, the new chipset is the 1090 chipset, but as long as your board is AM3+ it will work, so long as the vendor updates the BIOS. As for Excavator, they are keeping that under wraps; they normally talk about the APU line before the FX line of chips, so that will be announced when the time is right.

      Delete
    4. Eta, you do know that AMD releases processors on FM2 sockets, right?
      So they can just release the APU and disable the GPU; there you go, a processor.

      Delete
  24. They don't actually do that; FM2 is totally different in design from AM3. They build the cores similarly, but everything around the cores is laid out differently when there's no GPU.

    ReplyDelete
  25. I am sick of people arguing in here; can you just stop?

    Jimmy Lee, jml12, or jml... stop going around forums and sites, stop with the trolling, and stop being dumb.

    Everyone... SHUT THE FUCK UP!

    You can discuss, debate, and "fight" for days... This is like a war of attrition; just shut up and move along. Ignore each other.

    Both camps are full of fools.

    ReplyDelete
    Replies
    1. And you are no different, but I am not a fool. How about you, AMD fanboi? This is fun, by the way. And NO, I will not do that, sorry MR DUMB Spaniard.

      Delete
  26. Since you are all arguing and some of you don't know the facts, it seems it's up to me to teach you all...

    Kaveri will not be compatible with the FM2 socket; Kaveri will be on the new FM2+ socket, and Trinity/Richland will be compatible with it. The FM2+ socket supports native USB 3.0 and PCI-E 3.0.

    Kaveri will have up to 4 Steamroller cores with up to 512 GCN 2.0 GPU cores. According to various leaks and information, Kaveri will be up to 25% faster than Trinity/Richland clock for clock; GCN 2.0 is a successor to GCN and will have considerably lower power consumption. It will support 2400MHz DDR3, possibly triple-channel memory.

    The APU codenamed "Carrizo" is planned to be 20nm and compatible with the FM2+ socket; it will have Excavator cores and 20nm GCN 3.0 or 4.0... Its maximum projected TDP is 65 watts, while they will be aiming for a maximum TDP of 45 watts. "Carrizo" will most likely support DDR4, since it is going to be released in 2015.


    ReplyDelete
    Replies
    1. Yeah, I'm just sick of this little Jesse kid; he has no idea what he's talking about, constantly trolls, and he keeps changing his argument/debate with fictitious facts he makes up on the spot. One minute he's counting TDPs and saying that's all that matters, the next he's saying Intel's TDP is right and AMD's is wrong, then he says TDP does not matter (which is the closest to a correct statement, but he's still wrong).

      HjH, it's your blog; block both of his accounts, please.

      On a different note, yeah, FM2+ and 1090 AM3+ are going to be an interesting generation. If a 25% increase clock for clock is true, that's a huge performance boost. I'm actually looking forward to those FX chips, because if they can still overclock to ~4.7-5.0GHz, that's going to be a beast!

      Has anyone caught word of any more release dates yet? Ivy Bridge-E is the only thing with a confirmed timeframe, and I want to map things out to compare chips.

      Delete
    2. There is a gap of 11% in favor of Richland compared to Trinity in IPC/overall CPU performance, and Richland is just a small refresh compared with Kaveri, so AMD pretty much did amazing magic with the Steamroller arch.

      I can guess AMD actually improved IPC by up to 45% from Vishera to Steamroller, hence the reason for the FX-9650 when the FX-8450 was supposed to be the flagship for the next generation.

      I still have no idea why they did not instead release a CPU with performance between the future Steamroller FX and the FX-8350; it would have been much better than the expensive nasty crap called the FX-9590 that only "selected" motherboards support. What on earth were they thinking?

      The new Piledriver CPU AMD promised in their roadmaps was not a new generation of the Piledriver arch as I thought, but instead the damn FX-9590...

      Sometimes I feel I should just move to Intel CPUs too and be done with it; I already dropped my Radeon GPUs and got a GTX 470 that works beautifully even though it is supposed to be a mid-range GPU, with no issues installing/uninstalling drivers nor latency issues.

      Delete
    3. AMD will only improve IPC by a measly 25%, I know that, and it's NOT enough. Trolling is fun, and some of you fanbois don't have your facts straight. The era of the high-power desktop IS OVER; stop living in the past. Because AMD is run by a bunch of MORONS!!!! Intel got the memo, but AMD did NOT. Low power is the future, but AMD is blind to that future; it only has one semi-low-power product, and that is Jaguar. Everything else they have uses too much power. Steamroller is nothing to get excited about; too bad for the fanbois. A 45-watt TDP in 2015 is way too much when almost everything Intel will have at that point will have much lower TDPs and power usage. Don't bet on triple-channel and/or GDDR5 memory, and without those AMD has no chance against Intel. Kaveri will use the same amount of power as Piledriver; TOO BAD FOR YOU!!!! Hence why Intel/NVIDIA is the best gaming combo money can buy.

      Delete
    4. Measly 25%? Are you an idiot? If 25% is measly, then the jump from Pentium 4 to Core 2 Duo is measly by your standards.

      Not enough? Oh, you are a retard; any improvement is better, and if you can improve faster than your competition, then you are outpacing them and catching up.

      Trolling is fun if you are an idiot, a parasite...

      Who is living in the past? Who? It is you... not us.

      You say AMD is run by a bunch of morons while those morons are cutting unneeded expenses and soon coming back from negative to positive cash flow; are they also morons because they release competitive products? Where is your logic?

      AMD got your so-called "memo"; hence they developed Bobcat and released it in 2011, and you most likely never heard of the Desna APUs for netbooks and tablets. The Acer Iconia W500 had a C-50 Desna dual core and was the most powerful Windows tablet. The C-50 consumed only 9 watts, in 2011.

      If AMD were blind, then they would not have developed and released the Desna APUs in 2011 that consume only 9 watts. If Jaguar were lame and semi-low-power, then the press and reviewers would say so, but their opinion is just the opposite of yours. Intel's GPUs use way too much power...

      You call everyone a fanboi, even yourself. Admit it!

      Steamroller could be a valid reason for people to be excited if the 25% (a P4-to-C2D-sized jump) clock-for-clock improvement compared to Piledriver is true; Haswell is only 5 to 10% at best, so if it's true, then AMD is catching up very fast.

      "A 45 Watt TDP in 2015 that is way too much when almost everything Intel will have at that point will have much lower TDP's and power usage." - Congrats for failing, again.

      You hope that I am betting on that, but I am not, you piece-of-shit fanboi who hopes and wants to believe that I am doing just what he believes I am doing, but I am not!

      AMD may have planned BGA Kaveris with GDDR5, but they probably scrapped it because it would create some controversy. Nobody likes BGA: no upgrade capability.

      AMD planned DDR4 for Kaveri, but since it won't be available until Q3/Q4 2013, they scrapped that, and the only move left to feed GCN 2.0 is triple-channel memory, even if they do not do that. The good side of GCN is that it is less dependent on memory bandwidth and is 15 to 50% faster.

      If Kaveri uses the same amount of power, it is not as bad as you want to believe, because it is going to have a much faster GPU and possibly a considerably faster CPU. 25% better performance per watt? That's great, maybe not for your spoiled redneck ass, a web terrorist who terrorises people who do not share the same opinion.

      When I get enough cash then I am going to sue you for breaking the law.

      Also, please stop creating new emails and accounts; you are such a social reject that it's pathetic to watch...

      Delete
    5. What law did I break, you idiot? I have free speech just like everyone else. Well, I can't say you are right about that. I see AMD is in the PAST; BGA is the future; screw upgradability because it's overrated, because most people replace their PCs anyway, and they do it about once every 3-5 years. Oh, by the way, a 45-watt TDP for AMD is way too much because most SoCs will have TDP ratings that are far less, including INTEL's!!!! Yeah, trolling is fun, so what's the big deal? Both ARM and MIPS will be in the game by then with SoCs that will CRUSH AMD to BITS, oh, and I forgot Intel, who will also have Skylake, which will make AMD look even worse by comparison. Also, Desna used WAY more than 9 watts; why do you think no one wanted it? No one has used AMD APUs in tablets so far because AMD's Desna sucked just as much power as Bobcat. Also WRONG: Intel's GPUs may use a lot of power, but AMD's use EVEN MORE POWER!!!!! Jaguar still sucks compared to Intel's Bay Trail and ARM's Cortex-A15 and is inferior to them both!!!! How about NOT, YOU SOCIAL REJECT!!!! You're the one who is pathetic to watch, AMD FANBOI!!!! I am not a fanboi!!! I don't like Intel or AMD; they both suck for different reasons.

      Delete
    6. http://www.youtube.com/watch?v=lse3J6LkGk8

      Leave us alone!

      Delete
    7. How about not, AMD fanboi!!!! And what point are you trying to convey with the DUMB YouTube video? You don't respond because I am right and you are wrong, AHAHAHAH!!!!!

      Delete
    8. The funniest thing here is all the contradictions you constantly trip over. You make shit up, then try to cover the mistake later by screaming more, and you just end up looking dumber.

      The best part is you claim power consumption is all you care about because of your electric bill, yet you leave your PC on 24/7.

      Yea you really care about saving electricity...

      Delete
    9. I am too lazy to turn off my PC, so yeah, AMD IS NOT AN OPTION FOR ME. Nah, I just don't want a high power bill, because if I get a gaming rig I have to pay for the electricity to run it, so it's not an option.

      Delete
    10. I wonder how many dollars you have wasted... rofl

      Look, your system consumes the same or more power than an A10 5700/6800K, and the A10 6800K outperforms your system in both CPU and GPU performance.

      Delete
    11. Eh, only about 200 dollars on the crappy combo that I use right now. I am currently waiting for the Apple Mac mini with INTEL'S ALMIGHTY HASWELL!!!!! Or something like it. And look here, I don't want the same performance per watt, I want much better performance per watt, and that's something AMD will never be good at. Intel Haswell is awesome at performance per watt and the same goes for ARM.

      Delete
  27. John Carmack talking about HSA and AMD:

    http://www.pcper.com/reviews/Editorial/John-Carmack-Keynote-Quakecon-2013/Carmack-Continued-Displays-Software-and-Everyth

    ReplyDelete
    Replies
    1. John Carmack is a fucking joke; nothing he says has any value anymore. HSA is just an excuse for AMD to have weak-ass CPUs and use their GPUs to pick up the slack. Oh, did you know he said that the Xbone and the PS4 have the same power, WHICH IS NOT TRUE IN THE LEAST!!!!

      Delete
    2. If he's a joke then you're a laughing stock, fanboy. He has far more credibility than you do.

      Delete
    3. No he doesn't, not in the eyes of real gamers, which you guys aren't.

      Delete
    4. Bill Gates named him a Genius a while back.

      Delete
    5. "Not in the eyes of real gamers which you guys aren't"

      Says the kid "gaming" (using that term lightly) on an i3 2100T and an HD 5570. Yea, you can't call yourself a gamer, so don't even talk.

      Exactly ETA, good point right there.

      Delete
    6. I never once said I was a gamer, I just play games, that's all. You can't call yourself a gamer either, because you believe John Carmack, the same person who made some of the worst FPSs in history. Bill Gates is a washed-up has-been.

      Delete
    7. It is kinda funny how an A10 6700, a 65 watt TDP APU, beats your CPU and GPU in performance per watt and per dollar :3

      Delete
    8. AHAHAH... If it does, it isn't by a whole lot. I would be far better off with an Intel and dGPU solution, or even a PS4 for crying out loud. Mobile ALWAYS has better performance per watt than desktop, so that's why I won't buy desktop chips for myself. Sure, mobile chips cost more, but they use less power and they produce less heat.

      Delete
  28. http://www.youtube.com/watch?v=Hiyjh_WFopw&feature=youtu.be&t=1h11m30s

    http://www.youtube.com/watch?v=o2bH7da_9Os&feature=youtu.be&t=2h42s

    ReplyDelete
    Replies
    1. Ahahahaha... These guys can't see the writing on the wall. AMD IS DOOMED, they are finished!!!! It doesn't matter if HSA takes off or not, AMD's fate is SEALED!!! These guys are talking like AMD has a chance, though it's not entirely AMD's fault that AMD can't make good enough chips to compete with Intel and ARM. 45 watt APU chips in 2015, that's a big joke; everyone else will have APU chips with MUCH lower TDPs and they will be MUCH faster than anything AMD's got.

      Delete
    2. These guys and all the others saw the writing on the wall, except they also saw a straw that could save AMD; otherwise they would not have positive thoughts about AMD.

      It doesn't matter if HSA takes off, you say? Programmers will adopt HSA and hUMA because it is far simpler to program for than current processors; you can offload so many things to the GPU, for example web browsing or AI pathfinding...

      AMD's fate is not sealed; if that were true then Intel's fate would have been sealed when AMD kicked their butt with the Athlon.

      They have insight into the semiconductor industry and you have none, so they know what they are talking about while you do not.

      If AMD "can't make good enough" chips then nobody would buy chips from them; where is your logic? Also, AMD is going to release an ARM-based server chip that is also set to be the first 64-bit ARM chip. Didn't you know that the ARM conglomerate is backing AMD on HSA? ARM chips will use HSA in 2014 and it will crush Bay Trail and its successor.

      Please, jml12... Shut the fuck up, stop embarrassing yourself.

      Nobody wants to believe the things you want to believe, because, plain and simple, they are really stupid.
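The zero-copy idea is the core of the hUMA pitch in the comment above: with a discrete GPU, data must be copied into device memory and the result copied back, while with a unified coherent address space the GPU works directly on the buffer the CPU allocated. A toy sketch of the two patterns in plain Python — the "kernel" is just a function standing in for GPU work, and names like `offload_discrete`/`offload_huma` are illustrative, not any real API:

```python
def offload_discrete(host_buf, kernel):
    """Discrete-GPU pattern: copy to device memory, run, copy back."""
    device_buf = bytearray(host_buf)   # host -> device copy
    result = kernel(device_buf)
    return bytearray(result)           # device -> host copy

def offload_huma(shared_buf, kernel):
    """hUMA pattern: CPU and GPU share one coherent address space --
    no copies, the kernel mutates the caller's buffer in place."""
    kernel(shared_buf)
    return shared_buf

def double_kernel(buf):
    # Stand-in for a data-parallel GPU kernel: double every byte.
    for i in range(len(buf)):
        buf[i] = (buf[i] * 2) % 256
    return buf

data = bytearray([1, 2, 3])
out = offload_huma(data, double_kernel)
assert out is data        # same object: no copy was made
print(list(out))          # [2, 4, 6]
```

The programming-model win the commenter is describing is exactly the difference between the two functions: with shared coherent memory you pass a pointer instead of managing two copies of the data.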

      Delete
    3. Yeah, because you guys are blind, and I have been down the same road you guys are on now, believing in AMD. That got crushed hard by Shitdozer, and AMD has not recovered since then. Wrong; the reason Intel's fate would never be sealed by an AMD victory is quite simple: Intel has a lot of $$$$ and AMD doesn't. If the same thing happened to AMD, which it has, AMD will never be able to recover; they just don't have the funding to do so. Doesn't matter, I don't need to be an insider to know that AMD is screwed!!!! Only some people buy them, including you dumb fanbois. Please, Luka Preradovic, SHUT THE FUCK UP AMD FANBOI, YOU'RE AN IDIOT!!!! AMD isn't doing as well as your blind eyes think they are. Oh, nobody wants to believe the stupid crap YOU SPEW out of your mouth, Luka. HSA and hUMA will never take off; as proof, AMD is forcing them into the PS4 and Xbone, because otherwise they would be finished already. AHAHAHAHAH... I am well aware that AMD is going to release ARM chips and that will be AMD's FINAL mistake. Here: http://semiaccurate.com/2012/10/30/amd-announces-an-arm64-server-soc-for-seamicro/. AMD's ARM cores will be vastly inferior to everyone else's, resulting in a drop in profit margins and thus putting AMD into a state of zero recovery.

      Delete
    4. "Shitdozer" was a new architecture and it was not that good; it was like the Pentium 4, but less of a flop.

      Blame Dirk Meyer for screwing AMD; he is the idiot that screwed it so hard that Rory Read has barely filled the holes.

      If AMD hadn't recovered then it would be bankrupt right now, except it is not. Rory Read restructured the company: he delayed things that had to be delayed, and he fired a lot of people while later hiring a lot of industry veterans.

      A lot of people recently joined AMD; a lot of ex-employees came back, and some veteran employees from other companies joined AMD. That would not happen if those people thought AMD had no chance to survive.

      AMD is not as screwed as it was when Dirk Meyer was the CEO; AMD is going to have positive cash flow by the end of this year. It will survive #dealwithit

      I am a dumb fanboi? Look at yourself. You say you want power efficiency while your system is less efficient than AMD's latest APUs. You idiot.

      You use my real first and last name while you hide like a coward behind an alias, and you dare to call me a fanboi while, compared to you, I am just a single grain of salt.

      I am blind? Look at yourself. I know that AMD is not doing well, but that does not mean they are doomed, for fuck's sake; by your logic Intel was doomed ever since the Athlon.

      Nobody? You mean a minority. Just fuck off, nazi.

      "HSA and hUMA will never take off as proof AMD is forcing into the PS4 and Xbone because otherwise they would be finished already." - Programmers and game developers asked for it; it is simpler to program for. And AMD is not forcing it: Microsoft and Sony wanted it, and AMD managed to fulfill their needs and requirements.

      AMD making an ARM chip is a mistake? Oh, you are a fool; a good company must be diversified. Also, those cores are not AMD's but ARM's 64-bit Cortex A57, and AMD acquired SeaMicro, so they will use Freedom Fabric on their ARM chip: clusters of chips in one small server. AMD will offer their solution to many companies and it could really, really take off.

      ARM is not as thin-margin as you expect; the chips are cheap and easy to produce while turning a great deal of profit if sold in mass quantities, and that is where SeaMicro and ARM chips come into play. Also, AMD's SeaMicro sells servers with Intel chips, so it's a win for AMD: no matter how it plays out, they still get some cash one way or another. ;)

      Delete
    5. All Rory Read did was make AMD suck even more!!!! None of AMD's CEOs were all that great, including the current one. I am no coward, sorry. AMD was desperate and bid as low as possible to get those contracts. ARM itself makes little money on each chip, and FYI all ARMv8 chips are 64-bit; also, AMD's ARMv8 is NOTHING MORE THAN A PLAIN VANILLA ARMV8 SOC. Everyone else will beat AMD in power usage and/or performance, so AMD is pretty much screwed from that angle. They are going into a world with at least a dozen agile companies; how do you think that's going to end? Not well for AMD. AMD is much too slow to compete with the high number of ARM SoC makers, not to mention MIPS is making a comeback. AMD's good-enough computing is coming back to bite it in the ass; SUCKS TO BE AMD right now. Nah, I'd rather NOT!!! AMD is paying them big $$$, so of course they are going to join, regardless of the pitiful state AMD IS IN RIGHT NOW. Heh, I somehow doubt AMD will survive with no good products. Also, another thing AMD screwed itself over with: the fact that AMD went with GDDR5 memory for the PS4 might be a good short-term boost, but it's going to hurt AMD several times over in the next few years.

      Delete
  29. Trolling is fun for sure, but only when it is smart trolling.
    Stupid trolling is shit, so by my logic jml12 must love shit, since he talks shit all the time.

    Sorry for my manners.

    ReplyDelete
  30. VR-Zone.com claims FX-8590 is the last FX:

    http://vr-zone.com/articles/hold-for-publication-why-its-a-big-deal-that-amd-is-delaying-kaveri/49389.html

    If this is true, it means no FX Steamroller, but I had already been thinking this would happen... oh well, as long as they still sell 4-8 cores on the FM2+ socket I don't care, but AMD scrapped all their plans for the 6-core Steamroller... Fucking GlobalFoundries, it will be the death of AMD at this rate.

    I hope they at least use fast DDR4 on Kaveri, or some sort of sideport memory à la Xbox One, in order to push performance.

    ReplyDelete
    Replies
    1. Damn... it is most probably true, since there are no more FX CPUs in any of their new roadmaps... RIP AM3+

      I just hope they do some nice strong 8 core APUs then.

      Delete