203 Comments

  • WorldWithoutMadness - Tuesday, May 30, 2017 - link

    18C/36T. Soon AMD will revise Threadripper up to 20C/40T, and we'd have the two giants fighting a game of +1/+2 against each other.
  • SunLord - Tuesday, May 30, 2017 - link

    AMD can probably move to 32C/64T any time they please, as Threadripper uses the same socket as the Naples/Epyc server CPU. The biggest limiter would be motherboards, since they are designed around the limits AMD imposed on the Threadripper platform, such as quad-channel memory and 44 PCIe lanes, so AMD would have to do some tweaking to make it work within those limits.
  • tuxfool - Tuesday, May 30, 2017 - link

    AFAIK, it isn't the same socket. Threadripper uses SP3r2 whereas Epyc uses SP3.
  • Samus - Tuesday, May 30, 2017 - link

    Much like Intel's HEDT CPUs use a different socket (LGA 2011 as opposed to 1151), it makes sense to have a different socket, especially if more PCIe lanes are being introduced.

    Thank you AMD for making Intel interesting again. LOL.
  • ddriver - Thursday, June 1, 2017 - link

    The difference is that unlike Intel, AMD is not going to introduce a new and incompatible socket revision for every CPU revision.

    Obviously the increased amount of I/O requires a new socket, so you are going to have one for mainstream, one for HEDT, and one for server; that is pretty much inevitable. What is achievable is socket longevity, an aspect in which Intel deliberately sucks big time.
  • JKflipflop98 - Saturday, June 3, 2017 - link

    You know, I see AMD fanboys all over the internet use this same line and it just befuddles me. Why on Earth do you even care about socket compatibilities? Why would you EVER buy a brand new CPU, then immediately castrate its performance across the board by shoving it into some old and outdated motherboard? If you're really that strapped for cash, why are you spending money on PC upgrades?
  • ddriver - Sunday, June 4, 2017 - link

    It is because you are technologically ignorant. Motherboards do not bottleneck CPUs. Even memory controllers are now integrated in the CPU, so how fast your memory is depends on the CPU, the mobo only provides the slot to plug it into.

    Intel is not deliberately rendering sockets obsolete to maximize performance, but to force people to purchase more mobos, thus more chipsets from them.

    A good mobo can be a $300-500+ investment. That's a significant amount of money to save, and it can enable you to get a substantially faster CPU or GPU for the money saved.
  • JKflipflop98 - Thursday, June 8, 2017 - link

    Amazing how many people here are completely clueless how electronics actually work. Thanks for providing an example there, ddriver. Your moronic posts never cease to entertain.
  • cpupro - Sunday, June 18, 2017 - link

    @JKflipflop98

    Explain why owners of expensive, high-end Intel motherboards (the X99 chipset, I think it was) were required to buy a new motherboard, while third-party motherboard manufacturers needed only a BIOS update to support the new revision of Intel CPUs on the same socket?
  • TheinsanegamerN - Monday, June 5, 2017 - link

    "Old and outdated" doesn't really apply anymore. The only difference between my P8Z77-V Pro and a brand new mobo is the number of USB 3 ports. Other than that, it does everything I need it to.

    It isn't the 90s, when mobo designs were leaping ahead.
  • Alexvrb - Tuesday, June 6, 2017 - link

    [In the near future:]
    Oh man, they just released a board with THREE M.2 slots! My old board with only TWO (one populated) is now old and outdated!
  • Iketh - Wednesday, June 7, 2017 - link

    You're all technologically ignorant. JKflipflop is most correct here, because even though what ddriver says is true, the CPU must still be designed and traced to work with an existing pin array instead of being created with a pin array that suits the new CPU architecture. It's not the motherboard anymore; it's the signaling and power routing inside the CPU that matters most.

    In other words, if JKflip had said "Why would you EVER buy a brand new CPU, then immediately castrate its performance across the board by forcing it to route power and signaling in a way that doesn't jive with its architecture?" he would have been correct.
  • theuglyman0war - Thursday, June 8, 2017 - link

    Still on X58 with an i7 980X, and to be honest I just keep upgrading my GPUs and resent the incremental CPU advancement. It is actually the chipset features I'm missing that keep my eyes wandering to DDR4, PCIe 3.0 lanes, and NVMe, not to mention the horrible SATA 3 speeds on my ROG Rampage III Extreme, which are hard to get around without feeling ghetto despite the Pascal Ti SLI.
    :(
    Them chipset features sure do add up after a while.
  • sharath.naik - Thursday, June 8, 2017 - link

    JKflipflop, Iketh, you are both brainwashed. If you're not, go ahead and explain how much more you need to pay for boot RAID options with X299. (Or did you not know you will have to pay up to $300 more to unlock features of an X299 motherboard?) If you did not know this, then yes, brainwashed is the only word that can be used for you two.
  • LithiumFirefly - Friday, June 9, 2017 - link

    What completely baffles me is why an Intel fanboy would defend buying into a new Intel high-end desktop line after the last one, X99. The X99 platform I bought only had six chips made for it; four of them are bonkers priced and the other two are gimped. The Broadwell-E update was a joke; the older Haswell chips overclocked way better, so they were faster than the newer stuff. Yeah, I'm definitely going to try the new Intel stuff after that. /s
  • melgross - Thursday, June 1, 2017 - link

    You can't just double the core count. Where are they going to put those cores? I assume that the silicon isn't just sitting there waiting for them.
  • mickulty - Saturday, June 3, 2017 - link

    All of AMD's high-end CPUs are based on the same 8-core die, "Zeppelin". Ryzen is one Zeppelin; Threadripper is two connected by Infinity Fabric on a multi-chip module; Naples is four, again connected by Infinity Fabric on an MCM. AMD could very easily put out a chip with more Zeppelins, although maintaining socket compatibility would mean losing some I/O capability.

    Interestingly, this means Ryzen has 32 PCIe lanes on the chip but only 16 are actually available on AM4. Presumably this has something to do with Bristol Ridge and Raven Ridge AM4 compatibility, since they have fewer lanes.
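    The die-multiplying that mickulty describes can be sketched in a few lines of Python. The per-die figures below are taken from the comment above plus commonly cited Zeppelin specs; treat them as illustrative assumptions, since shipping products expose less I/O than a naive multiply suggests.

    ```python
    # Per-die resources for AMD's "Zeppelin" building block.
    # Figures are illustrative assumptions based on the comment above.
    ZEPPELIN = {"cores": 8, "threads": 16, "ddr4_channels": 2, "pcie_lanes": 32}

    def mcm(dies: int) -> dict:
        """Naively multiply per-die resources by the die count on the package.
        Real packages pin out less I/O than this (e.g. AM4 exposes only a
        fraction of the die's PCIe lanes)."""
        return {key: value * dies for key, value in ZEPPELIN.items()}

    for name, dies in [("Ryzen (AM4)", 1), ("Threadripper", 2), ("Naples/Epyc", 4)]:
        print(f"{name}: {mcm(dies)}")
    ```

    The naive model gives 16 cores for a two-die Threadripper and 32 for a four-die Naples, matching the thread's numbers; the socket constraints mentioned above are exactly where the naive multiply breaks down.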
  • theuglyman0war - Thursday, June 8, 2017 - link

    Why not? Just make the socket bigger and increase my utility bill (or at least give me the option to suffer the power draw if I want to).
    Supposedly processing power is theoretically limited only by the size of the universe. :)
  • theuglyman0war - Thursday, June 8, 2017 - link

    isn't silicon just sand?
  • ddriver - Tuesday, May 30, 2017 - link

    AMD will not, and doesn't need to, launch anything other than 16 cores. Intel is simply playing the core count game, much like it played the MHz game back in the days of the Pentium 4. More cores must be better.

    But at that core count you are already limited by thermal design. So if you have more cores, they will be clocked lower, which kind of defeats the purpose.

    More cores would be beneficial for servers, where the chips are clocked significantly lower, around 2.5 GHz, hitting the best power/performance ratio by running de facto underclocked cores.

    But that won't do much good in a HEDT scenario. And AMD does appear to have a slight IPC/watt advantage, not to mention offering significantly better value due to a better price/performance ratio.

    So even if Intel were to launch an 18-core design, that's just a desperate "we got two more cores" that will do little to impress potential customers in that market niche. It will be underclocked and expensive, and even if it manages to take a tangible lead over a 16-core Threadripper, it will not be worth the money.
  • Strunf - Tuesday, May 30, 2017 - link

    "even if it manages to take a tangible lead against a 16 core threadripper, it will not be worth the money." In this market niche money means nothing... AMD needs a 10%+ performance advantage to even be considered, because Intel has much better brand value; if anything, the 16-core Threadripper is a desperate attempt by AMD to actually gain some traction in HEDT.
    About the thermal limit: yes, there's a wall, but with the new Turbo the two best cores of a CPU can be clocked higher than the rest and hence give you better single-thread performance when needed. This is the future, no doubt about it.

    You guys need to realize that just because AMD releases a product that is better on all metrics doesn't mean everyone will shift to AMD. Brand value counts, and in the case of CPUs the motherboard matters too; sure, AMD has some nice motherboards, but overall the Intel motherboards seem better furnished, albeit at a higher cost.
  • ddriver - Tuesday, May 30, 2017 - link

    Sure, intel's high prices are justified by several things:

    corporate brand loyalty
    amd's limited production capacity
    fanboyism

    But all in all, money is EVERYTHING; the whole industry cares primarily about one thing, and that's profit. There is absolutely no good reason to pay 100% more for 10% more, not unless someone else does the actual paying.

    Only an idiot would care about "brand value". Computers are supposed to do work, not make up for your poor self-esteem. Any intelligent person who needs performance would put his money where he'd get the most bang for the buck. Workstation-grade workloads lend themselves particularly well to multithreading, but also to clustering. So if you want more performance, the smart solution is to aim for the best price/performance product and get a lot of it, rather than getting the single most expensive product.

    AMD is not desperately trying anything. Its desktop line pretty much annihilated Intel's existing HEDT offerings at significantly lower price points. It is Intel that is desperately trying not to lose the HEDT market to AMD's mainstream offerings. They'd rather throw in a couple of extra cores, even if it makes zero sense, just to not disillusion their fanboys.

    I am not speaking from any brand loyalty standpoint; I have around 70 active systems and they all run Intel CPUs. I am, however, very happy and eager to diversify and replace most of them, which are aging 3770K chips, with something that offers higher performance and a better power/performance ratio.
  • Hxx - Tuesday, May 30, 2017 - link

    Well, first off, general comments have no place in the tech industry due to the variety of use cases and products. Folks care about brand value on certain items, say motherboard brands, but not so much on CPUs.
    Second, AMD did not annihilate Intel by any stretch of the imagination. Where do you guys get this info? Probably from wccftech.com. Anyway, their Ryzen release is solid, but they need CPUs with higher IPC, higher than Intel's, which they currently don't have.
    Third, I'm not sure what you mean by Intel and "desperate", but there is nothing desperate about this announcement. CPUs don't take 2 months to develop. It's not like Intel said in response to Ryzen "oh yeah? let's build a better CPU". These CPUs had been fully developed and were awaiting retail release; maybe Ryzen pushed Intel to prioritize this launch, but they were not built as a "response to Ryzen" by any means.
  • ddriver - Tuesday, May 30, 2017 - link

    Ryzen offered Intel's E-series level of performance at half the cost. That's twice the value. You don't need imagination, much less to stretch it, to realize that 100% better value is tantamount to annihilation. This is underscored by the fact that it was mainstream CPUs up against premium HEDT parts.

    And YES, it is desperation, because this product was never intended for HEDT; this is not a case of Intel holding a trump card just in case AMD finally decided to stop sitting on its hands. The 18-core chip was intended for server parts, and its arrival is timed exactly right to be directly caused by the Ryzen launch. Intel simply took a server part with some defective or disabled cores in order to gain TDP headroom to boost the clocks of the remaining cores higher. It is not like Intel sat down and said "let's design a whole new chip in response to Ryzen"; that would take significantly more time. They simply took a server part, crippled it a bit, overclocked it a bit, just so they could have a HEDT product with 2 more cores, and in doing so sacrificed the money they would have made on that chip just to save face, as it would have been significantly more expensive as a Xeon-branded product.

    Had AMD not launched Ryzen, Intel's current-gen HEDT would have capped out at 12 cores. The 18-core part is a last-resort, last-moment solution, and not too economically viable either. So yeah, it is desperation.

    But then again, expecting someone who cannot properly format a paragraph to have common sense might be pushing it...
  • ddriver - Tuesday, May 30, 2017 - link

    Keep in mind that had Intel not sacrificed Xeons to make that 18-core chip, its HEDT line would have been stuck at 12 cores, meaning Threadripper would have made Intel look like a second-class CPU maker in that segment.

    So yes, it is quite literally burning money to save face for intel.
  • Kjella - Tuesday, May 30, 2017 - link

    Burning money? Ever since Bulldozer started lagging behind Intel has been printing money like crazy, this is just a return to normal profit margins because AMD is back on the field. Intel made $10 billion profit last year, I'm sure they'll survive this horrible "loss".
  • ddriver - Wednesday, May 31, 2017 - link

    The "desperation" is not for their survival, they survived the netburst fiasco when their product was marginally inferior.

    The desperation is to not look like a second grade choice in the HEDT market, thus sacrificing a much more profitable die to save face.
  • rocky12345 - Thursday, June 1, 2017 - link

    "The "desperation" is not for their survival, they survived the netburst fiasco when their product was marginally inferior."

    Back in the Netburst days AMD was a lot better with what they had to offer. Heck, an AMD CPU running at 2000 MHz was able to keep up with or surpass a Pentium 4 at 3.2 GHz. It only got worse for Intel when dual-core Athlons came about and Intel had to make the Pentium 4 D, still running much, much higher clock rates just to stay in the game. Very few people seem to remember that Intel had a lot of bad years as well. The Pentium 4 series all sucked Donkey Nutz, nuff said.

    As others have said, if AMD had not released Ryzen, which competes nicely with Intel's HEDT platform at half the price, and then said "oh, we have a 16C/32T Threadripper as well", Intel would not be releasing the 18C/36T CPU right now. They would have kept that CPU in the Xeon line, where they make the big bucks; hell, that 18-core is probably a cut-down 20C/40T Xeon retrofitted to be an X-series chip. Anyway, all this is good for us consumers: we get more choice, and hopefully at a better price too.
  • ddriver - Friday, June 2, 2017 - link

    Don't forget that with AMD you get substantially better value. So even if the 18-core Intel HEDT chip is tangibly faster than the top-tier Threadripper, for $2000 AMD could get you a 32-core Epyc that will beat the 18-core in performance, and pretty much every other chip Intel has at any price point.

    The 18-core number is also interesting, as AMD's design is practically incapable of efficiently producing such a SKU. So even if Intel doesn't have the fastest single chip, they will still technically take the performance crown in HEDT, albeit with a server chip they shoehorned in, and with a unique core count that AMD cannot exactly match, even if they can significantly outmatch it.
  • Azethoth - Tuesday, May 30, 2017 - link

    Dude, you are missing an opportunity to really diss Intel here. Why just compare AMD to the last gen Intel chips from many years ago when you can go back decades!

    Compare to the pentium. Then you can claim that AMD annihilated Intel, scraped up the ashes then decimated those, then threw them in the microwave and nuked them before getting hookers to pee on the dust and leaving it out to blow around in the sun and wind!

    As it is your post is too weak to take seriously.
  • ddriver - Tuesday, May 30, 2017 - link

    You fail miserably at making a point. I am comparing the latest and greatest both companies had on the market at the time. Now back under the bridge with you!
  • Ranger1065 - Wednesday, May 31, 2017 - link

    ddriver, I always respect your perspective at Anandtech particularly when it differs from general opinion as, thankfully, it often does. I also admire the tenacity with which you stick to your guns. The comments section would certainly be an infinitely more boring and narrow minded place without you. Keep up your excellent posts.
  • fanofanand - Wednesday, May 31, 2017 - link

    +1
    He is the most entertaining person here, I love reading his take on things.
  • Ro_Ja - Thursday, June 1, 2017 - link

    Your comment is the reason why I scroll down this section.
  • Hxx - Tuesday, May 30, 2017 - link

    I don't think you understand the meaning of the word desperate, at least in this context. Maybe Intel redesigned their release schedule in response to Ryzen; who the f knows except their upper management, and that's irrelevant. In the end what matters is what the consumer gets and for what PRICE. If Intel were truly desperate, we would have at least seen a price cut in their current CPU lineup, and I'm not seeing that. These CPUs are also targeted at the enthusiast crowd and are nowhere near Ryzen's price point, so where's the desperation again?
  • rarson - Wednesday, May 31, 2017 - link

    The marketing alone, never mind the fact that Intel's cannibalizing their own sales selling HCC chips to consumers, reeks of desperation.
  • DC Architect - Tuesday, May 30, 2017 - link

    If you think CIOs give a damn about "brand loyalty" over profit margins then you are high. Also... 99.8% of the people using computers couldn't tell you what a motherboard IS or what components are in their "hard drive box", let alone have any loyalty to those brands. The guys making the call on these kinds of decisions couldn't give a rat's ass what the guy on the floor wants when they can increase the margins by 100% and only lose 1% in IPC.

    We're not talking about server CPUs here that are parsing huge databases 24/7. That 1-5% IPC loss matters a lot less for your Joe Schmoe user when you can tell the CEO that you only need half your normal operating budget this year for workstations.
  • Icehawk - Tuesday, May 30, 2017 - link

    Brand loyalty is HUGE in the corporate/server marketplace, it's foolish to think otherwise. Most large companies lock in with one or two vendors and buy whatever they offer that closest fits their needs or are able to get custom orders if the volume is enough. Never in my 19 years in IT have I seen or used a single AMD server, and only in a very few (crappy) laptops. Even back in the Athlon days we would stick with Intel as they are a known quantity and well supported.

    Hell where I work now they buy i7s for their grunts when an i3 would be fine - but it is easier on accounting to just deal with one SKU and easier for IT as well to only deal with a single configuration. The hardware cost differential can be offset by factors such as these.

    On non server side, I am really happy to see AMD doing better - I probably will go with the 7820 though as I do value single threaded a lot (gaming) and also do a ton of reencoding to x265 where more cores would really help.
  • theuglyman0war - Thursday, June 8, 2017 - link

    To be fair... I assume all those TBDs certainly do represent an "upcoming" response to Ryzen that we otherwise would not have seen, whatever degree the final form takes. And that is awesome.
    The healthy competitive market is officially TBD! :)
    Anyone with any reason is waiting for the dust to settle and for the market to correct itself with a consumer response, the way healthy markets should function.
  • alpha754293 - Friday, June 2, 2017 - link

    It's "funny" reading your comment only because so much of it is so wrong/unfounded on so many levels.

    I used to be a strictly AMD-only shop because they offered a much better economic efficiency (FLOP/$).

    Now, pretty much all of my new systems are all Intel because Intel is now better in terms of FLOP/$. (And also in just pure, brute-force performance).

    AMD really screwed the pooch when they went with the design that shares one FPU between two integer cores in each module, rather than having a dedicated FPU PER core (something the UltraSPARC Niagara T1 originally did, and which was then revised with the T2).

    The chip was choking/strangling itself. WHYYY Dirk Meyer (being a former EE himself) would allow that is beyond me.

    I pick it pretty much solely based on FLOP/$ (as long as the $ piece of it isn't SO high that I can't afford to pay for it/buy it).

    There ARE times when you might want or even NEED a super-high-powered, many-many-core system deskside, because then you can do a lot of development work locally, refining your model piece by piece without having to spend a great deal of time re-running the whole thing every time; once your model is set up, THEN you ship it off to the cluster and let the cluster go at it.

    If you are doing your debugging work on the cluster, you're just taking valuable cluster time away. (With some of my simulation work, the volume of data generated is in the TBs now, so having to send the data back and forth when you have "bad" data, say from a run that errored out, means you're literally just shovelling crap around, which takes time and doesn't do anything useful or productive.)

    re: your 70 active systems
    On the assumption that they're ALL 3770K chips, that's about 280 cores. You can probably get yourself a bunch of these:
    http://www.ebay.com/itm/2U-Supermicro-6027TR-HTRF-...

    to replace your compute farm.

    I would be willing to bet that between 2-4 of those can replace your entire farm and still give you better FLOP/$.
  • ddriver - Friday, June 2, 2017 - link

    "I would be willing to bet that between 2-4 of those can replace your entire farm and still give you better FLOP/$."

    Not really. Aside from the 3770Ks running at 4.4 GHz, most of the performance actually comes from GPU compute. You can't pack those tiny stock rackmount systems with GPUs. Not that 256 cores @ 2.9 GHz would come anywhere near 256 cores @ 4.4 GHz, even if they had the I/O to accommodate the GPUs.

    And no, Intel is NO LONGER better at FLOP/$. Actually, it may never have been, considering how cheap AMD processors are. AMD was simply too slow and too power-inefficient for me until now.

    And since the launch of Ryzen, AMD offers 50-100% better FLOP/$, so it is a no-brainer, especially when performance is not only so affordable but actually ample.

    Your whole post narrative basically says "Intel fanboy in disguise". I guess it is back to the drawing board for you.
  • Meteor2 - Saturday, June 3, 2017 - link

    Ddriver is our friendly local troll; best ignored and not fed.
  • trivor - Saturday, June 3, 2017 - link

    Whether you're a large corporation with a $billion IT budget and dedicated IT staff, or a SOHO (Small Office Home Office) user with a very limited budget, everyone is looking for bang for the buck. While most people on this site are enthusiasts, we all have some kind of budget to keep. Where do we find the sweet spot for gaming (the intersection of CPU/GPU for the resolution we want)? And more and more, having a fairly quiet system (even more so for an HTPC) is important.

    While some corporations might be tied to certain vendors (Microsoft, Dell, Lenovo, etc.), they don't necessarily care what components are inside, because it is the vendor that warrants the system.

    For pure home users, not all of these systems are for us. Ryzen 5/7, i5/i7, and maybe i9 are the CPUs and SoCs for us. Anything more than that will not help our gaming performance, or even other tasks (video editing/encoding), because even multi-core-aware programs (Handbrake) can't necessarily use 16-20 cores. The absolute sweet spot right now is the CPUs around $200 (Ryzen 5 1600/1600X, Core i5), because you can get a very nice system in the $600 range. That will give you good performance in most games and other home-user tasks.
  • swkerr - Wednesday, May 31, 2017 - link

    There may be brand loyalty on the retail side, but it does not exist in the corporate world. Data center managers will look at total cost of ownership. Performance per watt will be key, as well as the cost of the CPU and motherboard. What the corporate world is loyal to is the brand of server, and if Dell/HP etc. make AMD-based servers, then they will add them if the total cost of ownership looks good.

  • Namisecond - Wednesday, May 31, 2017 - link

    Actually, even on the consumer retail side, there isn't brand loyalty at the CPU level (excepting a very vocal subset of the small "enthusiast" community). Brand loyalty is at the PC manufacturer level: Apple, Dell, HP, Lenovo, etc.
  • bcronce - Tuesday, May 30, 2017 - link

    "But at that core count you are already limited by thermal design. So if you have more cores, they will be clocked lower. So it kind of defeats the purpose."

    TDP scales with the square of the voltage. Reduce the voltage 25%, reduce the TDP by almost 50%. Voltage scales non-linearly with frequency. Near the high end of the stock frequency, you're gaining 10% clock for a 30% increase in power consumption because of the large increase in voltage to keep the clock rate stable.
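    bcronce's numbers can be checked with a toy dynamic-power model, P ≈ C·V²·f, which is the standard first-order approximation for switching power. The capacitance constant and voltage/clock figures below are made up for illustration; only the ratios matter.

    ```python
    # Toy dynamic-power model: P ≈ C * V^2 * f.
    # C is an arbitrary constant; only the ratios below are meaningful.
    def dynamic_power(voltage: float, freq_ghz: float, c: float = 1.0) -> float:
        return c * voltage ** 2 * freq_ghz

    # Same clock, voltage reduced by 25% -> power drops by ~44% ("almost 50%").
    base = dynamic_power(1.20, 4.0)
    undervolted = dynamic_power(1.20 * 0.75, 4.0)
    print(round(1 - undervolted / base, 4))  # 0.4375
    ```

    This is why many lower-clocked cores can fit in the same TDP as fewer high-clocked ones: the V² term dominates, and high clocks demand disproportionately high voltage.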
  • ddriver - Tuesday, May 30, 2017 - link

    The paragraph next to the one you quoted explicitly states that lower clocks are where you hit the peak of the power/performance curve. Even to an average AT reader it should be implied that lowered clocks come with lowered voltage.

    There is no "magic formula" like, for example, the quadratic intensity decay of a point light source. TDP vs voltage vs clocks is a function of process scale, maturity, leakage, and operating environment. It is, however, true that the more you push above the optimal spot, the less performance you get for every extra watt.
  • boeush - Tuesday, May 30, 2017 - link

    "More cores would be beneficial for servers, where the chips are clocked significantly lower, around 2.5 Ghz, allowing to hit the best power/performance ratio by running defacto underclocked cores.

    But that won't do much good in a HEDT scenario."

    I work on software development projects where one frequently compiles/links huge numbers of files into a very large application. For such workloads, you can never have enough cores.

    Similarly, I imagine any sort of high-resolution (4k, 8k, 16k) raytracing or video processing workloads would benefit tremendously from many-core CPUs.

    Ditto for complex modelling tasks, such as running fluid dynamics, heat transfer, or finite element stress/deformation analysis.

    Ditto for quantum/molecular simulations.

    And so on, and on. Point being, servers are not the only type of system to benefit from high core counts. There are many easily-parallelizable problems in the engineering, research, and general R&D spheres that can benefit hugely.
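    The usual caveat to "you can never have enough cores" is the serial fraction of the job, which Amdahl's law captures. A quick sketch (the 95% parallel fraction is an assumed figure for illustration, not a measurement of any real build or render):

    ```python
    # Amdahl's law: speedup on n cores when a fraction p of the work parallelizes.
    def speedup(p: float, n: int) -> float:
        return 1.0 / ((1.0 - p) + p / n)

    # Even a 95%-parallel job tops out well short of a linear n-times speedup:
    for n in (4, 8, 16, 18):
        print(n, round(speedup(0.95, n), 2))
    ```

    For highly parallel work like batch compilation or raytracing, p is close enough to 1 that extra cores keep paying off, which is exactly the point being made; interactive or partly serial workloads saturate much sooner.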
  • ddriver - Tuesday, May 30, 2017 - link

    The problem is that the industry wants to push HEDT as gaming hardware. They could lower clocks and voltages and add more cores, which would be beneficial to pretty much anything time-consuming, like compilation, rendering, encoding, or simulations, as all of those lend themselves very well to multithreading and scale up nicely.

    But that would be too detrimental to gaming performance, so they would lose gamers as potential HEDT customers. Those buyers would go for the significantly cheaper, lower-core-count, higher-clocked CPU, and a higher-margin market would be lost.
  • Netmsm - Thursday, June 1, 2017 - link

    "AMD will not and doesn't need to launch anything other than 16 core. Intel is simply playing the core count game, much like it played the Mhz game back in the days of pentium4."
    Exactly ^_^ That's it.
  • n31l - Sunday, June 4, 2017 - link

    Not sure about that... I've just done the 'microcode' unlock of a 2695 v3 (X99 single-socket system). Even with such an old architecture, with all 14 cores fully stressed @ 3199 MHz, no core goes above 50C; they have plenty of room to move on clock rates 'if' they wanted.

    I think Intel is just trying to find out where Threadripper will fit within the market. Worst case, all they need to do is shift 'families' left: bring the E7 down to E5, and stop selling E5-16xx (i7 consumer parts) at a Xeon premium (especially now that the v4s are locked!! How I'll miss the E5-1620 @ 4.6 with ECC memory).

    Unfortunately, IMHO, Intel can only deal with AMD in a half-arsed manner; if they wanted to, they could kill AMD, but then they would be broken up for being a monopoly. Damned if you do, damned if you don't.

    Personally, I'd like to see the Nvidia and Intel cross-licensing 'officially' come to an end, and for Nvidia to revive the 'Transmeta x86' IP they bought (but weren't allowed to use due to the GPU licence agreement). Or maybe for Microsoft to extend what's happening with Windows on ARM and just let Nvidia loose amongst the pigeons... or, crazier still, as I believe Microsoft still has the 'golden' share option from the Xbox days, how about they buy Nvidia, make a custom 'Windows CPU', and take Google on head first before it's too late ;-)
  • theuglyman0war - Thursday, June 8, 2017 - link

    What do you consider a HEDT scenario that doesn't leverage moar cores?
    Workstation? A workstation creative who doesn't render interactively? Light-baking complex radiosity?
    If I did not scream at progress bars ALL DAY, I would have had no reason to upgrade for years now. I do not see that changing in my lifetime, and if I keep screaming at progress bars without relief I will eventually commit bloody criminal solutions, perhaps even against my poor suffering soul. Considering 90 percent of my progress bars get wrecked every time I add more advanced cores...
    I wonder, what is this HEDT market that does not leverage moar cores?
  • 3DVagabond - Monday, June 19, 2017 - link

    Nah. They'll just sell theirs for $999.
  • Chaitanya - Tuesday, May 30, 2017 - link

    Intel desperately scrambling for ideas.
  • mschira - Tuesday, May 30, 2017 - link

    $2000?
    Are they kidding? What world do they live in?
    I'd rather get a dual-socket AMD system for the same money, if I cared enough about that much multi-threaded performance.

    Thank God AMD got their act back together; Intel has gone completely insane.
    M.
  • nevcairiel - Tuesday, May 30, 2017 - link

    Considering how much the 10-core BDW-E already cost before, $2000 is actually lower than I would have expected.
  • Notmyusualid - Tuesday, May 30, 2017 - link

    @nevcairiel

    Agreed.
  • ddriver - Tuesday, May 30, 2017 - link

    Considering it is actually a Xeon they'd sell for $4000 had there not been the desperate need to save face, $2000, as expensive as it may be, is quite generous of Intel, you know... relative to their standards for generosity...
  • smilingcrow - Tuesday, May 30, 2017 - link

    Broadwell-EP 18 Core parts start at under $2,500 so Skylake-EP or whatever it will be called will likely offer more cores per dollar and that is the comparison to be made.
    Intel can make as many of them as they like so selling them at 'only' $2k to prosumers hardly undermines their business model as it's not as if they will end up in true Workstations/Servers anyway.
    I don't see that much financial upside or downside to Intel for HEDT parts over $1K as that's a small market.
    But AMD are putting downward price pressure on the sub $1K chips which will hurt more.
    Plus the 16 Core AMD part is likely usable in a Workstation/Server with it supporting ECC memory so that is another attack on Intel.
    So I think you have missed the mark in giving much emphasis to these re-positioned HCC chips, the play is elsewhere.
  • theuglyman0war - Thursday, June 8, 2017 - link

    I am hoping the $849 16-core Threadripper rumor is true. As awesome as that would be, it still comes down to benchmarks. A workstation you will be suffering with for 2 or 3 years because it owns now, vs. the hassle of upgrading the cheaper solution every year that doesn't quite own. Where that line lies with me usually depends on how impressed I am by the time saved rendering extreme complexity. Or how much those cores make my pipeline more interactive/productive. The more AMD forces Intel to cannibalize the Xeon line the better. Kind of bitter that AMD wasn't more HEDT relevant for a while now. Kind of wonder why they did not spend the bucks on architecture talent all these years, if that's all it took to jump-start things like the current excitement. And I haven't been excited like this in a long time.
  • Kevin G - Tuesday, May 30, 2017 - link

    $2000 isn't bad considering that the previous Broadwell-E only had 10 cores for roughly the same price. So in a generation, performance/$ will nearly double. Not a bad thing if you can use all the threads available.
  • mschira - Tuesday, May 30, 2017 - link

    Broadwell-E is a server CPU, very different thing.
  • T1beriu - Tuesday, May 30, 2017 - link

    Wow. You're clueless. :)
  • andychow - Tuesday, May 30, 2017 - link

    Broadwell-E is a high-end desktop CPU, not server class. Server processors use the Xeon branding.
  • bigboxes - Tuesday, May 30, 2017 - link

    You just lost your IT cred. Hang it up. LOL
  • theuglyman0war - Thursday, June 8, 2017 - link

    WORKSTATION
  • Makaveli - Tuesday, May 30, 2017 - link

    lol, are you actually surprised?

    Not you personally, but I've been seeing posts which look like they are from CS bro-gamer types who think AMD and Intel are going to release their high-core-count CPUs for $500...

    They'll just have to deal with sticker shock once the actual prices of these CPUs are released.

    These CPUs are for professionals and high-end users, and the price tags will reflect that. They already have products out for gamers; stick to those, they will match your wallet better.
  • nathanddrews - Tuesday, May 30, 2017 - link

    "Are they kidding? In what world do they live?"

    In a world where HEDT buyers paid $1,700 for a 10C/20T CPU. :-/
  • Cygni - Tuesday, May 30, 2017 - link

    They are likely living in the world where they beat AMD on IPC while still having a significant clockspeed advantage, I would imagine...

    I'm pulling for AMD to offer competition here, and I think they have forced Intel to react with more aggressive pricing and products than they otherwise would have, but the price of the XE is hardly a surprise. It's essentially a rebranded bigcore Xeon.
  • Visual - Tuesday, May 30, 2017 - link

    Interesting how the price differences for each extra two cores go - 210, 400, 200, 200, 300, 300. That early 400 gap in particular seems like a fairly bad deal. Is it because AMD has nothing at just over 8 cores and jumps directly to 16? Do we even know that about AMD yet?

    I wonder why Intel revealed prices before much of the other details are clear, and more importantly before info from AMD? It looks suspiciously like an attempt to hint to AMD what their prices should be in turn... like they are afraid AMD might undercut by too much if they just went with a blind bid. I wonder if AMD would accept this sort of public "pricefixing proposal"...
  • ddriver - Tuesday, May 30, 2017 - link

    They can do anything between 8 and 16 in the threadripper design, if the market should call for it.
  • ddriver - Tuesday, May 30, 2017 - link

    They have 4, 6, and 8 core Zen dies, but they gotta have some with 5 or 7 working cores. Those could be sold as 4- and 6-core parts with one disabled core, or they can be slapped on the same chip for 10- and 14-core products, no working cores wasted.
  • ilt24 - Thursday, June 1, 2017 - link

    ddriver..."They have 4 6 and 8 core zen dies"

    Actually AMD only has one 8-core Zen die. All of this year's Ryzen 7/5/3 chips are made from that die; they just disable cores to get the lower core counts. Threadripper is made from a pair of these dies in an MCM package, with some cores disabled for some SKUs. The EPYC processor will be made from 4 of these dies in a single package. If AMD for some reason wanted to match Intel's 18 cores they would need to make a three-die MCM chip with 6 cores disabled.
  • ddriver - Friday, June 2, 2017 - link

    "Cores" as in "active/working cores".

    I doubt they'll be making 3 die MCM solutions, that is too asymmetric. Also, they will be throwing I/O away, as the HEDT socket doesn't have the pins to facilitate it.

    It is unlikely that AMD will have an 18-core SKU in either HEDT or server. Threadripper will stop at 16. Epyc will start at 20, if the market calls for it. That's 4 modules with 5 active cores each. I doubt AMD will produce asymmetric designs. So that's 20, 24, 28, and 32 cores for Epyc, and half of that for Threadripper - 10, 12, 14, and 16 cores.

    Lacking an 18 core solution is not a big whoop, and most certainly not worth the R&D money.
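    The module arithmetic above can be sketched in a couple of lines (purely illustrative; the 5-8 active-cores-per-die range comes from this comment, not from any AMD roadmap):

```python
# Sketch of the symmetric MCM core counts described above, assuming each
# Zen die ships with 5-8 working cores and all dies in a package match.
# Purely illustrative arithmetic, not an AMD product list.

active_per_die = [5, 6, 7, 8]

epyc = [4 * n for n in active_per_die]          # 4-die MCM (server)
threadripper = [2 * n for n in active_per_die]  # 2-die MCM (HEDT)

print(epyc)          # [20, 24, 28, 32]
print(threadripper)  # [10, 12, 14, 16]
```

    Note how 18 never appears in either list: with symmetric dies, server counts step by 4 and HEDT counts step by 2.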
  • ilt24 - Friday, June 2, 2017 - link

    "I doubt they'll be making 3 die MCM solutions,"

    I agree, which was why I said "If AMD for some reason wanted..."

    "Epyc will start at 20"

    I think AMD will also have lower core count EPYC chips, as a good part of the market uses them. ...and a 16-core version made from a pair of dies will cost them quite a bit less than a 20+ core version made from 4 dies.
  • XabanakFanatik - Tuesday, May 30, 2017 - link

    The reason the price jumps from 8 to 10 cores like that is you are paying to move from the 28-lane SKUs to the 44-lane SKUs along with the extra cores. Once there, you are only paying for the increased core count.
  • theuglyman0war - Thursday, June 8, 2017 - link

    That's a tough sell... the 8-core's higher base clock and 44 lanes at $599 would have been the easy purchase. Now, if the 16-core Threadripper actually releases at the rumored $849, I can't see any silver lining for Intel except for 18-core bragging rights. The 10-core benchmarks would have to have some kind of Threadripper-16-core-beating magic to justify the $1000 price. All the 8-core Skylake has to do is embarrass the 1800X in benchmarks (and it could also have done so if Intel had just made that 8-core solution an i9 with 44 lanes).
    Not quite enough to be more awesome?
    The new larger L2 caching seems like a big wrench in the works, where direct comparison might easily fall apart if AMD does not have equal advances on Threadripper that trump Infinity Fabric woes.
    At the end of the day, benchmarks and support for AVX-512 etc. will take some time. I ain't close to pulling a trigger till I see. (Though I'm really excited to see!)
  • jjj - Tuesday, May 30, 2017 - link

    Intel reacts to Ryzen and they leave room for AMD to do up to 2x better lol.
  • alamilla - Tuesday, May 30, 2017 - link

    Those prices are HILARIOUS.
    C'mon Intel
  • jjj - Tuesday, May 30, 2017 - link

    I expect the cheapest 16-core Threadripper to allow users to fit the CPU plus mobo within $1k, so $799 for the CPU.

    Intel's play in recent years has been to push ASPs up and offset declining units but AMD has to find the balance between margins and share gains- Intel has no share left to gain. AMD doesn't have yield issues with MCM so they could easily go even lower than 799$ for 16 cores.
    What i am curious about is if AMD has any SKUs with less than 12 cores and how aggressive they get with those.
    And ofc let's see if AMD has a new revision for Ryzen.
    Doubt they launch Threadripper today, before Naples.
  • nevcairiel - Tuesday, May 30, 2017 - link

    I doubt you'll see a 16-core for less than $1000
  • alamilla - Tuesday, May 30, 2017 - link

    I'm placing my bets at $999 for the 16 core.

    What I'm getting at, and to some extent jjj, is that the cost of entry will still be significantly lower. I wouldn't be surprised to see lower tier X399 boards come in at $180-200.
  • mdw9604 - Tuesday, May 30, 2017 - link

    They are trying not to cannibalize their server market. Even though these are not server chips. AMD is making them earn their place in the chip world.
  • Socaltyger - Tuesday, May 30, 2017 - link

    All at once now... "Thank you AMD".
  • Gothmoth - Tuesday, May 30, 2017 - link

    +1
  • soydeedo - Tuesday, May 30, 2017 - link

    No kidding. I have to say I'm kind of glad that intel isn't looking to take losses to undercut AMD while they're still in a somewhat vulnerable position. We desperately needed this competition.
  • Gothmoth - Tuesday, May 30, 2017 - link

    the 18 core is a cpu for bragging rights.

    a few youtuber will get it from intel to promote it.
    a few 1% will buy it for bragging rights.

    and intel can still say they have the fastest enthusiast CPU.

    but how many mortals will buy it?
  • Gothmoth - Tuesday, May 30, 2017 - link

    just read the thread about the 10 core broadwell-e maybe being the best selling HEDT cpu.
    that surprises me.

    i live in europe. i am an engineer. my friends too have well-paid jobs.

    but i don't know a single person who would spend 1700$ (not to mention euro) on a CPU.

    so who is buying the huge amount of HEDT CPUs?

    is it such a well-received CPU for companies?
    would they not go for xeon systems anyway?
  • Kevin G - Tuesday, May 30, 2017 - link

    I suspect that the 10-core Broadwell-E sales were upgrades from lower core count Haswell-E systems. I suspect that very few people went out and built a system from scratch with the 10-core chip. Rather, they started low and then upgraded, since X99 motherboards were going to be around for a while (~3 years).
  • damianrobertjones - Tuesday, May 30, 2017 - link

    Capitals. Sentences need capitals my Man.
  • negusp - Tuesday, May 30, 2017 - link

    If you're going to be so condescending concerning @Gothmoth's use of (or lack of) capitals, at least use capitals correctly in your sentences.
  • theuglyman0war - Thursday, June 8, 2017 - link

    Kind of makes one wish AMD HEDT had been relevant before Intel artificially moved the CPU core-count price line that had been at $949 since Westmere. If it had been, the 18-core i9 would probably be $999 in August. Broadwell-E was going to be my upgrade. I felt betrayed. I assumed the market would react in kind. I suppose AMD was just a generation too late to save Intel from itself.
  • Notmyusualid - Tuesday, May 30, 2017 - link

    @ Gothmoth

    I paid for a 6950X, and have a 14C/28T Broadwell Xeon v4 too.

    Unless you *really* know what you are buying, i.e. you are in corporate IT already and understand the different SKUs, I'd advise against a Broadwell Xeon over a consumer part - or you'll be stuck with lower frequency operation, and there is little you can do about it. I think two gens back the overclockers had v2 & v3 Xeons unlocked with custom BIOSes and other madness, but not v4 (Broadwell), though I've been too busy to check back in months.

    From my initial reading, without any benchies, I think a 7920X might do for me. I'm already used to some programs not running on HCC without disabling cores. I'll take an LCC design next time.

    Just my 2c.
  • SanX - Tuesday, May 30, 2017 - link

    $2000 for 18 cores is about $111 per core.
    This is approximately 20x the production cost.

    It is always good for monopoly to be a monopoly.
  • Kevin G - Tuesday, May 30, 2017 - link

    Silicon is indeed cheap. The billion dollar fabs, not so much.
  • Notmyusualid - Tuesday, May 30, 2017 - link

    +1
  • ddriver - Thursday, June 1, 2017 - link

    It actually takes selling less than a quarter of a production run to make up for a production line's cost. They don't build a new fab for each and every process or chip flavor. Their margins speak louder than words; they are milking consumers viciously.
  • WoodyPWX - Tuesday, May 30, 2017 - link

    Thank you AMD, you forced Intel to finally do something! I'm tempted to buy an 8-core Ryzen to improve my compilation times, where I'm always CPU-bound (Core i7 4790K)
  • Silma - Tuesday, May 30, 2017 - link

    The most interesting tidbit of Gregory Bryant editorial is this:

    "8th Gen Intel® Core™ Processor: We will have more to say about the 8th Gen Intel Core processor in the future but it’s exciting to share that in the latest testing, we’re seeing a performance improvement of more than 30 percent over the 7th Gen Intel® Core™ processor"

    The comparison is at the same TDP.

    If this proves to be true and priced without excess, the next generation of Intel Core processors will be very interesting.
  • Ian Cutress - Tuesday, May 30, 2017 - link

    It'll be interesting to see what they've done with the microarchitecture to get such a gain.
  • Kevin G - Tuesday, May 30, 2017 - link

    I wonder how much of that performance gain is just higher clock speed or software optimization (AVX-512).
  • Meteor2 - Saturday, June 3, 2017 - link

    It's funny that they slipped that in; with the biggest performance jump in Intel chips in years just around the corner, it's a bad time to buy Kaby Lake.
  • lefty2 - Tuesday, May 30, 2017 - link

    " I wouldn’t be surprised if the 10-core $1721 part was the bestselling Broadwell-E processor. "
    I would. That price excludes it from the enthusiast market anyway (I know that from looking at distributor sales figures).
    I'm just wondering what this product is used for. Who needs 18 cores and is will to pay $2000 for it and what application are they running on it?
  • damianrobertjones - Tuesday, May 30, 2017 - link

    There doesn't have to be a need to achieve, "I'm da' Bomb and own this SUCKAZ!"
  • kinopro123 - Tuesday, May 30, 2017 - link

    Have you heard of the film and TV editing industry? It's not exactly small. And that's just one industry that requires high IPC + multithreading, where machines are an essential part of the workflow -- hence they pay for themselves. There are studio computers with a dual Xeon and seven GPUs -- no consumer could afford them, because they're not really for normal consumers.
  • smilingcrow - Tuesday, May 30, 2017 - link

    The Pro market tends to go Xeon for the fuller feature set, so the question is how many consumers will spend over a grand on a CPU?
  • Kevin G - Tuesday, May 30, 2017 - link

    The film/video market only goes Xeon if their workloads need more memory than what a single socket with non-registered memory can provide or the additional PCIe links for IO. The rest of the Xeon feature set is generally lost in that market segment.

    As for the real consumers, best probably to look back a little over a decade to see how well the Gallatin based Pentium 4 Extreme edition sold with its extra 2 MB of L3 cache. That's the last time Intel felt this threatened as they used a Xeon MP die for a consumer part. We're seeing this again with the middle Xeon die being brought into consumer form. Not a bad thing.
  • Namisecond - Wednesday, May 31, 2017 - link

    I would suggest "as many people who would spend over a grand on a GPU"
  • TEAMSWITCHER - Tuesday, May 30, 2017 - link

    The DESKTOP PC market is on its last legs. These products are not about what you can do with a computer ... but rather what kind of computer you can merely own.
  • Notmyusualid - Tuesday, May 30, 2017 - link

    Oh yeah, I wish I could trade in my 14C/28T to browse on my smartphone...
  • TEAMSWITCHER - Tuesday, May 30, 2017 - link

    The reality is ... that despite having a 14C/28T machine ... you will probably still browse more on your smartphone.
  • Notmyusualid - Wednesday, May 31, 2017 - link

    Nope - I despise browsing on my smartphone, but I do use it as a telephone / walkman / GPS mostly, if that is not too unusual. I'm not one of the 'head-down generation', and I'll try not to pick up my phone in restaurants, at meetings, in the pub, or when crossing the road.

    My 14C/28T is the one usually busy browsing (it's just SO smoooth), or when I'm on the road like now - running World Community Grid, mining Ethereum, or just being my UK-based Remote Desktop - so I can deal with bills / PayPal etc. without having to reset my account password every time, as is often required when I log in from overseas...

    AMD's Threadripper & Intel's new offerings are onto something here though - these large-core-count chips offer the smoothest computing experience you are ever going to find. Complex HTML pages are handled with ease... installs zip by, archives unpack quickly, boot times (save for the X99 BIOS) are nearly instant.

    I'll take that 12C/24T from Intel about this time next year.
  • Kevin G - Tuesday, May 30, 2017 - link

    I actually do that for work email. Not because my quad-core i7 based laptop is slow, but rather because Outlook wasn't written by programmers but by sadists who enjoy mass suffering. Some desktop applications are so flawed that going to another tool to save me the pain and hassle is more efficient.
  • Namisecond - Wednesday, May 31, 2017 - link

    You need to clean out your mailbox and the Exchange admins need to run maintenance on the databases. When you've got less than a 4GB .ost file, stuff works well. :p
  • theuglyman0war - Thursday, June 8, 2017 - link

    Last legs? The desktop market was a multi-billion-dollar market before internet saturation and it will still be one after all the housewives and kiddies are on phones 'n' consoles. Then it will resemble the HEDT market that used to cater to me and was not second-guessed for silly non-HEDT (non-workstation) concerns.
    Good riddance!
  • eddman - Tuesday, May 30, 2017 - link

    Absolutely not interested in these. Do they have any plans to release 6 and 8 core models for socket 1151?
  • extide - Tuesday, May 30, 2017 - link

    I think we will see 6 cores on next gen mainstream design which may indeed be 1151 compatible...
  • Teknobug - Tuesday, May 30, 2017 - link

    Well so much for having interest in the i9, pricepoint is way out of whack.
  • martinkrol - Tuesday, May 30, 2017 - link

    I am thinking the same. Since I do a lot of 3D work I am thinking of upgrading my i7 3930K to a dual Xeon of that same generation or just slightly newer. It will allow me to use a lot more RAM, and get double the threads, which will help in multi-tasking (working and rendering on the same box, for example). Nowadays an HP Z820 goes for fairly cheap and I can outfit that with 256GB RAM for around the same price as one of these new top-end i9 processors.
  • ltcommanderdata - Tuesday, May 30, 2017 - link

    Any word on whether Skylake-X is treated like the rest of Skylake by Microsoft/Intel and has official Windows 7/8.1 support? Or will Broadwell-E remain the fastest/last CPUs that have official multi-boot support for Windows 7/8.1/10?
  • PUN - Tuesday, May 30, 2017 - link

    AMD will ALWAYS undercut Intel on pricing, regardless of relative performance. Now they can compete with similar performance at a lower price, giving consumers a choice.
  • Bullwinkle J Moose - Tuesday, May 30, 2017 - link

    Time to stop bragging about who has the best A.I. and start putting it to use

    Which "Intel" core count is the best "Value" for Gamers / Media Production / General Business / etc

    On day 1 (when ALL core counts are finally available to the public), what is the optimum core count for software that is currently available in each category of usage

    Is 4/6/8 core the best value for home use?
    Is 14/16/18 core the best value for certain business case usage?
    Or would 10/12/14 be the best all around value for other business classes?

    Time to call Microsoft and ask what the top 20 software applications are for each usage case scenario for Windows 10 and see who has the best A.I. to figure this out

    Best value vs core count on day 1 availability could be VERY different from 6 month / 1 year and 3 year mark after launch, so keep us updated!

    Thanks
  • edzieba - Tuesday, May 30, 2017 - link

    For the vast majority of consumer desktop applications (gaming, office, web browsing) you're looking at a spread of workloads from single-threaded to just about benefiting from a 4th core. Outside of video encoding, 3DCG rendering, or data analytics, there aren't that many workloads that really benefit from throwing more cores at them; the handful that do are embarrassingly parallel, trivially parallelised, and have already moved over to GPGPU.

    We've heard the siren song of "We'll add more cores, and the parallelism will come!" many times over the last decade, and it has yet to happen.
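    The "embarrassingly parallel" case is the one that scales trivially with core count: each chunk of work is independent, so workers never need to coordinate. A minimal sketch, with a made-up stand-in workload and arbitrary chunk sizes:

```python
# Minimal embarrassingly-parallel example: each chunk is processed
# independently, so throughput scales with worker count until overhead
# or memory bandwidth dominates. The per-chunk work here is a stand-in.
from multiprocessing import Pool

def process_chunk(chunk):
    # Stand-in for real per-chunk work (encode, filter, transform, ...)
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    chunks = [list(range(i, i + 1000)) for i in range(0, 8000, 1000)]
    with Pool(processes=4) as pool:   # scale workers with available cores
        results = pool.map(process_chunk, chunks)
    print(len(results))  # 8 chunk results, computed in parallel
```

    Workloads that don't decompose into independent chunks like this gain little from extra cores, which is the point being made above.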
  • Gothmoth - Tuesday, May 30, 2017 - link

    i run 4 heavy apps in parallel most of the time... if i give each of them 4 cores it works very well.
    and yes, i do video editing and 3d rendering.

    people who run a lot of VMs will love more cores too.
    this is not for word processing... but there are enough people who can make use of 16 and more cores.
  • Threska - Friday, June 2, 2017 - link

    VMs are kind of the sneak that'll get more cores onto people's machines. In the enterprise VMs are something that's used a lot, but for desktop, not so much. But there are advantages to VMs that would benefit a desktop user, especially in this day and age, of a hostile internet, not to mention better software delivery.
  • theuglyman0war - Thursday, June 8, 2017 - link

    outside of billion dollar industries...
    ok?
  • Gothmoth - Tuesday, May 30, 2017 - link

    why is there basically no coverage of AMD at computex on anandtech?

    it's 10 articles about intel for one about AMD.

    where is an article about the displayed x399 boards?

    am i wrong in my perception?
  • fanofanand - Wednesday, May 31, 2017 - link

    Not wrong at all, I have seen just about nothing.
  • Meteor2 - Saturday, June 3, 2017 - link

    Not wrong. It's very weird how AMD have suddenly vanished from Anandtech's coverage. I don't think they even reported the Threadripper announcement a few weeks back.
  • Gothmoth - Sunday, June 4, 2017 - link

    good to know i'm not imagining this.

    seems like intel is paying anandtech a lot to not report much about AMD.
    back to the 90s and intel's shady tactics... so soon?

    and where are the critical voices about the crippled x299?

    linus on x299:

    https://www.youtube.com/watch?v=TWFzWRoVNnE
  • Drumsticks - Tuesday, May 30, 2017 - link

    If it's not too late, Ian, can you follow up with Intel on whether Skylake-X is soldered? Kaby Lake-X is, but it's at least KIND OF (read: not) understandable since they're based off of Kaby Lake. Are the Skylake-X parts different?
  • Bullwinkle J Moose - Tuesday, May 30, 2017 - link

    Yeah, I'd like to see someone justify an 18 core processor with software currently available

    If the performance of your graphics software falls off a cliff after 6 or 8 cores, then 18 cores will never be used

    or
    If you only use over 10 cores say once a month, you'd be much better off financially by keeping an older multicore around for the odd job when needed
  • dullard - Tuesday, May 30, 2017 - link

    How about someone who crunches numbers for a living but who cannot justify a $100,000+ server?

    http://www.ansys.com/Solutions/Solutions-by-Role/i...

    http://www.ansys.com/Solutions/Solutions-by-Role/i...
  • Gothmoth - Tuesday, May 30, 2017 - link

    VMs, streaming... there are a lot of multitasking cases where an 18-core makes sense.
  • Flunk - Tuesday, May 30, 2017 - link

    Not really, both of those cases would be better served by more, cheaper boxes. Unless you mean game streaming and then you don't need that many cores anyway (6 is more than enough, you just need 1 more than the majority of people).
  • Icehawk - Tuesday, May 30, 2017 - link

    In our IT environment, and many others, the goal is to reduce the physical number of boxes to a minimum and run as much as possible virtualized. Although if we are talking any reasonable desktop usage, yeah, this stuff is way overkill except for a few use cases.
  • Gothmoth - Sunday, June 4, 2017 - link

    in any sane environment that´s the goal.
  • Gothmoth - Sunday, June 4, 2017 - link

    more boxes?

    why should you be willing to buy more PSUs, more peripherals, and overall waste space when you can run it in one box?

    not to mention the maintenance of many systems instead of one... i guess you have no clue how the reality looks, sorry.
  • haukionkannel - Tuesday, May 30, 2017 - link

    Something interesting
    https://m.youtube.com/watch?feature=youtu.be&v...

    Intel is using bubblegum in all these processors. Maybe we will get upgraded tin versions next year at upgraded prices ;)
  • Morawka - Wednesday, May 31, 2017 - link

    Just wow... It's called lead-free solder, Intel!!! You can still use solder and be environmentally friendly (that's been Intel's excuse for the paste).
  • shady28 - Tuesday, May 30, 2017 - link

    Looks like a marketing stunt to me. I welcome the 6c/12t part, but most applications can't even effectively use 4c/8t processors. It is a complete waste for 99% of buyers and even the remaining 1% are likely to rarely see a benefit.
  • Maleorderbride - Tuesday, May 30, 2017 - link

    Your statement just betrays your ignorance and your lack of imagination. Computers are tools for quite a few people, so they will pay considerable sums for better tools which in turn earn them more money.

    Video editing and 3D work can and will use all cores. While I am not going to claim they are a large percentage of the market, they routinely purchase 8/10-core options. I have quite a few customers running X99 boards with a single E5-2696 v4 dropped in ($1400 on eBay) and it excels in some workflows.

    They do not "rarely" use these extra cores -- they are using them every single day, and that is the primary reason for purchase.
  • shady28 - Tuesday, May 30, 2017 - link


    Lol! The childish insults aside, you think those thoughts you regurgitated are new? Professional video editors make a tiny fraction of a tiny fraction of the market, and if they are smart they aren't using CPUs for much. Most people who profess this 'need' to do 3D video editing are playing anyway, not working. Like I already said, a fraction of a 1% use case.

    Common sense says Intel did not release these for the 0.1% of users who might be able to take advantage of it. They released it to make suckers of the other 99.9%. Your comments indicate they are once again succeeding.
  • Maleorderbride - Wednesday, May 31, 2017 - link

    Your post made a claim about 100% of the market. Obviously you over-claimed. You can't edit posts here, so your "like I said," followed by a watered down version of your post is just a transparent attempt to save your ego. Your assumptions about whether people who claim to be video editors are really "working" is irrelevant.

    As for blaming video professionals for even using a CPU, you obviously are unaware that some codecs are entirely CPU bound when transcoding, and that these professionals (DITs especially) are under pressure to complete transcodes as quickly as possible on location. Every other person there is waiting for them.

    Are many things GPU accelerated? Yes, but being "smart" has nothing to do with it. Sometimes one can use those 2x 1080 Ti's, but sometimes you need 18+ cores, or both. But I guess you got me, I'm a "sucker" if I buy the best tool for a job that makes money.
  • shady28 - Friday, June 2, 2017 - link

    First sentence in your post is a lie, or else your reading comprehension is challenged. My first post is just a few lines up; it said:
    "It is a complete waste for 99% of buyers and even the remaining 1% are likely to rarely see a benefit."
  • prisonerX - Wednesday, May 31, 2017 - link

    You use applications that are highly parallel everyday and you don't even know it. Maleorderbride is right: you're ignorant and unimaginative.
  • Meteor2 - Saturday, June 3, 2017 - link

    No shady28 is correct here. People who *truly* need HCC on desktop are a vanishingly small minority. This is about headlines and marketing.
  • Namisecond - Wednesday, May 31, 2017 - link

    Welcome to the 1%?
  • helvete - Friday, September 8, 2017 - link

    Have you ever tried to run more than one application at a time? /s
  • Bulat Ziganshin - Tuesday, May 30, 2017 - link

    i can give you details about avx-512 - they are pretty obvious from an analysis of skylake's execution ports. so

    1) avx-512 is mainly single-issue. all the avx commands that are now supported on BOTH port 0 & port 1 will become avx-512 commands supported on a joined port 0+1

    2) a few commands that are supported only on port 5 (these are various bit exchanges) will also be single-issued in avx-512, which still means doubled performance - from single-issued avx-256 to single-issued avx-512

    3) a few commands that can be issued on any of 3 ports (0, 1, 5), including booleans and add/sub/cmp - the so-called PADD group - will be double-issued in avx-512, so they will get a 33% uplift

    overall, ports 0 & 1 will join when executing 512-bit commands, while port 5 is extended to 512-bit operands. the joined port 0+1 can execute almost any avx-512 command except the bit exchange ones; port 5 can execute bit exchanges and the PADD group

    when going from sse to avx, intel sacrificed ease of programming for ease of hardware implementation, resulting in an almost complete lack of commands that can exchange data between the upper & lower parts of a ymm register. avx-512 was done right, but this means that bit exchange commands require a full 512-bit mesh. so, intel moved all these commands to port 5, providing a full 512-bit implementation, while most remaining commands were moved to ports 0 & 1, where a 512-bit command can be implemented as a simple pair of 256-bit ones

    looking at power budgets, it's obvious that a simple doubling of execution resources (i.e. support of 512-bit commands instead of 256-bit ones) is impossible. in the previous cpu generation, even avx commands increased energy usage by 40%, so it's easy to predict that extending each executed command to 512 bits would require another 80% increase

    of course, microarchitecture analysis can't say anything about commands absent from the avx2 set, so my guess is that predicate register manipulations will also go to port 5, just to make the microarchitecture a bit less asymmetric

    also it's easy to predict that in the next generations the first "improvement" will be to add FMAD capability to port 5, further doubling the marketing performance figures

    finally, their existing 22-core cpus already perform at more than a SP teraflop, but this time the teraflop will go into the HEDT class (while 10 broadwell cores at 3 GHz are only capable of 0.9 tflops)
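    The teraflop figure above can be sanity-checked with simple peak-throughput arithmetic. A minimal sketch: the function name is mine, the 3 GHz clocks are illustrative rather than measured AVX turbo frequencies, and the dual-512-bit-FMA Skylake-X configuration is an assumption based on the server parts:

```python
# Back-of-the-envelope peak single-precision throughput, to sanity-check
# the "0.9 tflops" figure above. Clocks and port configurations here are
# illustrative assumptions, not measured AVX frequencies.

def peak_sp_tflops(cores, ghz, fma_units, vector_bits):
    lanes = vector_bits // 32                 # 32-bit SP lanes per vector
    flops_per_cycle = fma_units * lanes * 2   # FMA counts as mul + add
    return cores * ghz * flops_per_cycle / 1000.0

# 10 Broadwell cores at 3 GHz, two 256-bit FMA units per core:
print(peak_sp_tflops(10, 3.0, 2, 256))   # 0.96 (the "0.9 tflops" above)

# 18 Skylake-X cores at 3 GHz, assuming ports 0+1 fuse into one 512-bit
# FMA and a second 512-bit FMA exists (as on the server configuration):
print(peak_sp_tflops(18, 3.0, 2, 512))   # 3.456
```

    The Broadwell line works out to 0.96 SP TFLOPS, matching the estimate in the comment; the Skylake-X figure is an upper bound that real AVX-512 clocks would pull down.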
  • zodiacfml - Tuesday, May 30, 2017 - link

    It was all AMD. Intel would rather let AMD starve for cash than lose profit.
  • Maleorderbride - Tuesday, May 30, 2017 - link

    Kudos to AMD for forcing Intel to do something interesting for a change!

    It is a bit of a low blow to gimp the 7820X with 28 PCI-e lanes though. It should still be great performance at that price, but there are some instances where I want all five PCI-e slots occupied.
  • Morawka - Tuesday, May 30, 2017 - link

    Intel can keep their dual-ring designs, no thanks. They won't overclock well at all; wait and see. I'm very disappointed in the pricing... I thought Intel would offer an 8-core for $350 this time around to match Ryzen, but nope.
  • NEGuy123 - Tuesday, May 30, 2017 - link

    24 Core Threadrippers in 2018 (MY PREDICTION)

I actually feel that if AMD can get to a 7nm process next year, we will easily see 24-core Threadrippers out next year. At 7nm, I feel AMD Ryzens will be 12 cores each (compared to the current 8).

AMD has said they are working on 7nm and that they will have a 48-core server part. Which tells me 12 x 4 = 48 cores.

All this tells us that Ryzen will move to 12-core dies. So, next year AMD can slap 2 of those together.

    This is my prediction for 2018

    GLAD TO SEE YOU BACK AMD!
  • Meteor2 - Saturday, June 3, 2017 - link

AMD won't be on 7 nm until 2019, as GlobalFoundries is doing it property.
  • Meteor2 - Saturday, June 3, 2017 - link

    Properly! Apple auto-correct really is bollocks.
  • jhoff80 - Tuesday, May 30, 2017 - link

    Do we know if any of the Skylake-X chips will include full HEVC / 10-bit decode/encode, or would that still remain Kaby Lake only? (I assume Kaby Lake only, but figured it was worth asking.) I might be looking to upgrade my Haswell-based media server which is struggling with software decode when transcoding.
  • extide - Tuesday, May 30, 2017 - link

I would expect none of the X299-based CPUs to support that -- but then again, they also require a video card, so you would get the support from that.
  • Timoo - Tuesday, May 30, 2017 - link

Looks like Intel is finally slashing its prices:
The 7800X looks like the successor to the 6800K, just $100 less expensive...
The sweet spot, as mentioned above, is indeed the 7820X, which looks like the successor to the 6900K, for a whopping $500 less.

That makes the i9-7900X a re-branded i7-6950X with its price almost cut in half. Just the i9-7920X seems to be new in the line. Again, with its price cut in half: where the 6950X costs almost $2k right now, suddenly they offer 2 extra cores for $600 less...

Seems like AMD did do something for the market after all: Intel cuts deep into its prices and starts spicing up its core counts. Since the margins for AMD are approx. 30-35% right now, it means Intel is lowering its margins on CPUs considerably...
  • Morawka - Tuesday, May 30, 2017 - link

Nobody seems to notice that the 7820X and every sub-$1K processor in the new Skylake lineup only comes with 28 lanes.

To get 44 lanes, you gotta spend a grand or more.
  • LuckyWhale - Tuesday, May 30, 2017 - link

It's so unconscionably idiotic for tech articles to quote the price of products as $1999. What the hell are you doing? Hello?!!! You are not Intel's marketing department.
  • catavalon21 - Tuesday, May 30, 2017 - link

Idiotic? Hardly. I don't believe Ian was encouraging buying these, but he is providing insight on something ... unexpected? We (this community) have long criticized Intel's lack of anything interesting. This is interesting. It may not be useful to many of us, but it's very interesting.

Complaining about a $2000 processor with these specs? The highest-end Intel enthusiast processors have often been expensive. Several enthusiast Extreme processors from a decade ago weighed in at $1000 and up at the time, with performance that blew chunks compared to this beast. https://en.wikipedia.org/wiki/Yorkfield_(microproc...

    If you think $1999 is too expensive for AT to preview, I wonder what you thought about the $5000 Dell monitor covered a while back. http://www.anandtech.com/show/11220/dells-32-inch-...
  • catavalon21 - Tuesday, May 30, 2017 - link

    If I could only copy and paste working links, then we'd have something.
  • wrkingclass_hero - Tuesday, May 30, 2017 - link

    Why is this announcement being reported on, but Threadripper never was? I even did a double check on this site and searched Threadripper and this is the only article that popped up.
  • catavalon21 - Tuesday, May 30, 2017 - link

    That's a big Twinkie.
  • Maleorderbride - Wednesday, May 31, 2017 - link

    Because AMD was literally on stage when you posted that. The article is up now.
  • ThreeDee912 - Wednesday, May 31, 2017 - link

    Uhm... I guess you need better searching skills.

    http://www.anandtech.com/show/11482/amd-cpu-update...
  • Meteor2 - Saturday, June 3, 2017 - link

    Threadripper was announced weeks ago, at the AMD Financial Analysts' day. And indeed Anandtech did not report it. https://arstechnica.co.uk/gadgets/2017/05/amd-ryze...

    It's hard to conclude anything other than that Purch has received/will receive a payment from Intel in return for only reporting Intel stories for a period.
  • BillR - Tuesday, May 30, 2017 - link

    I wouldn't get too excited, Amdahl's Law is asserting itself and you get diminishing returns by adding CPUs. It all comes down to the apps and how well and how many CPUs they can use effectively.
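The diminishing returns are easy to quantify with Amdahl's law itself. A minimal sketch of the classic formula (nothing app-specific assumed):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Upper bound on speedup when only a fraction of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Even a 95%-parallel workload gets nowhere near 18x on 18 cores:
print(round(amdahl_speedup(0.95, 18), 2))   # 9.73
```

So unless the app is almost perfectly parallel, the jump from 10 to 18 cores buys far less than the core count suggests.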
  • boozed - Tuesday, May 30, 2017 - link

    Was hoping for some more powerful LGA1151 CPUs
  • mdw9604 - Tuesday, May 30, 2017 - link

FYCK Intel. If AMD had not come out with Ryzen, they would still be sticking with 4-core desktop processors and 8 cores on the HEDT machines, and charging $1K-plus for them. They are trying to make sure AMD can't compete. I'm buying AMD; I am not continuing to support Intel's monopolistic x86 stranglehold.
  • Bullwinkle J Moose - Wednesday, May 31, 2017 - link

    Preferred Core / Turbo 3 needs another update for the upcoming Cannon Lake

Even if a single core could run at 4.8 GHz single-threaded continuously, while the second-best core might reach 4.7 and another 4.6, why not let the hot core cool off while temporarily boosting another core "above" its "continuous" max speed on single-threaded apps?

Cycle the cores to a max "temporary" clock speed of 5.2 / 5.1 / 5.0 GHz while the previous main core is cooling down.

    Turbo 4?
  • Ej24 - Wednesday, May 31, 2017 - link

It's worth noting that Ryzen 7 is akin to LGA115x: it's mainstream. Motherboards will cost half of what X299 boards will. There should be no comparisons made between AM4 and LGA2066; they're two different market segments. People keep making the comparison because of core counts, but it doesn't make sense. Intel's HEDT should be compared to Threadripper. AMD literally doubled our core count per dollar at the mainstream. Intel still hasn't.
  • SanX - Wednesday, May 31, 2017 - link

Billion is a very scary word for the unwashed. For a company with a 100+ billion market cap, it is small change.
  • Notmyusualid - Thursday, June 1, 2017 - link

    @ SanX

    I feel like 'the unwashed' this morning, I better move my @ss...

    :)
  • SanX - Wednesday, May 31, 2017 - link

I wrote this in response to the two trolls who think that the cost of the fab is not included in the price of the chips.

/* Anandtech, fix your obsolete discussion forum, which has no Edit function and slips posts out of their threads to the end if you use the Android Google browser with JS off.
  • close - Thursday, June 1, 2017 - link

    Dude, you're the one who calculated that:
    "$2000 for 18 cores is $100 per core.
    This is approximately 20x the production cost."

    And concluded that:
    "It is always good for monopoly to be a monopoly."

Don't be surprised that people take the p*ss out of you for what you write. You are the one who suggested the relationship between the production price per core and the retail price is somehow relevant. Why not "per transistor"? Or "per mm^2"?

You chose an irrelevant metric (price/core, when the CPU has additional components that you ignored), you ignored that there are many objective factors that make such a CPU more expensive (like yields, which are worse the bigger the chip), you assumed everything is linear and can be presented as a quick napkin calculation, and you tried to sell it. This isn't how any of this works, so now it's easy to question your understanding of these topics. Maybe you're too washed...
  • tamalero - Wednesday, May 31, 2017 - link

    140W TDP? jesus...
  • Hrel - Wednesday, May 31, 2017 - link

I can't believe they released this to consumers at all. What consumer pays $2000 for a CPU? Who is this for?

Hell, I had a hard time getting a Fortune 10 company to agree to pay more than $1000/CPU for the servers that ran their own network and directory.

I truly cannot imagine any consumer spending that much on a CPU. This baffles my mind.

Someone hit me up when Anandtech does a review of $200-$300 CPUs, as anything beyond that had better be for a fucking server.
  • Morawka - Wednesday, May 31, 2017 - link

Dude, you don't know how many rich kids and benchmarkers there are in the world. As Ian noted, the top-end Extreme Edition is always the best-selling CPU out of all of them. That was even true for Broadwell's 10-core $1750 CPU last year.
  • FwFred - Thursday, June 1, 2017 - link

    My company paid for a dual 10 core Ivy Bridge Xeon workstation for me and every one of my coworkers. We'd save money moving to the 18 core i9s on the next round.
  • PrazVT - Wednesday, May 31, 2017 - link

I totally whiffed on this announcement. I JUST ordered a 6800K / mobo / etc. that are coming in tomorrow. I wanted to upgrade from my Core i7 970 system (with a GPU upgrade from my old GTX 770 SLI setup to come later). Presumably I should return the CPU and mobo immediately and wait for the 7800X?
  • Notmyusualid - Thursday, June 1, 2017 - link

    It depends if you got a good deal on it...

    The 6800K will feel like night & day compared to your Gen 1 proc.

    But I feel another 20 to 30% IPC is heading our way...

    The question remains, was it a good deal? If the answer was yes, then continue.

Also, can you wait? This will be half a year away (I expect). If the answer is no, then continue with what you have.

You can push that 6800K hard; reward it with a liquid cooler. Even if it's just a closed-loop affair, get a 240mm.
  • Lolimaster - Thursday, June 1, 2017 - link

    Maybe simply go for the Ryzen 7 1700X?

    Way better cpu than the 7800X for the same money.
  • AGS3 - Thursday, June 1, 2017 - link

i9 is just a branding scheme to combat the Ryzen pressure. The i9s will just be high-end desktop parts that are Xeons with ECC removed and repackaged. It is nothing different from prior high-end desktop parts, which were labeled i7 with an X at the end. They do not have integrated graphics.
  • GeoffreyA - Thursday, June 1, 2017 - link

    It's a bit like the Pentium 4 Emergency Edition.
  • GeoffreyA - Thursday, June 1, 2017 - link

    Ryzen is very much like a new Athlon or Athlon 64.
  • Ro_Ja - Thursday, June 1, 2017 - link

AMD's got them by the balls. It's just a matter of how much profit Intel will get with these new products.
  • SaturnusDK - Thursday, June 1, 2017 - link

A late comment and no one will read it, but as I see it, even if Intel gets the 12-18 core models out the door (which still remains to be seen), they will most likely be hampered by an inefficient inter-core interconnect.

I didn't hear it mentioned anywhere, but if the Xeon processors are anything to go by, the high core count models will be realized by stitching two dies together with the interface used between sockets. And as anyone who has ever used a dual-Xeon workstation knows, two Xeons aren't anywhere near twice as fast as a single Xeon; the increase is about 50% on average.

Furthermore, it looks to be hampered by a very low TDP. So while Intel might win the core count battle on the strength of a paper-launched product, the performance crown might actually still be in AMD's hands. It'll certainly be extremely close in performance. Price, however, is likely to be twice as high as a comparable AMD product.

Also, 44 PCIe lanes only for the top-range products? And it caps at 44 lanes? What the hell are you thinking, Intel? That's just crippling any hope you might have had of ever being competitive with AMD in this market segment.
  • FwFred - Thursday, June 1, 2017 - link

    If you think the penalty paid by the dual ring bus in the HCC die is too high, you really won't like the AMD solution.
  • SaturnusDK - Thursday, June 1, 2017 - link

    The infinity fabric seems to be working fine with minimal scaling performance loss for the Ryzen chips already on the market so there's no reason to believe that extending the bus will incur a severe performance penalty.
  • rocky12345 - Thursday, June 1, 2017 - link

I've got to ask: Anandtech gives all of this love to Intel for releasing products we already expected, except for the 18C/36T CPU (thanks, AMD, for lighting a fire under Intel's butt again). What I am saying is there are at least three headlines for the Intel crap but one little byline for AMD's Threadripper. I like Anandtech and all, but AMD's release is way more important to the industry than this Intel release, because if it were not for AMD's new CPU line, Intel would have just once more released a ho-hum product with little extra to offer, probably at $500 or more above the prices they are now asking. Give credit where credit is due. You say new stuff in the industry does not excite you much anymore. Well, for me, and hopefully anyone else with a brain, the new AMD tech is more exciting than this rehashed Intel tech. Thanks
  • KalliMan - Friday, June 2, 2017 - link

There is a "small" mistake here. The price of the 1800X is now ~$429-449. You are comparing two CPUs that belong to completely different price ranges (the 1800X is $150-170 cheaper than the 7820X). And be sure that in multitasking it will be superior.
  • cekim - Friday, June 2, 2017 - link

To all those prattling on about how such processors have no market or purpose, I direct your attention to eBay... clearly you are wrong. The question is not whether there is a market for consumer HCC chips; the question is what that market is willing to pay for them.
  • alpha754293 - Friday, June 2, 2017 - link

    re: the whole AMD vs. Intel thing all over again

    I'm not worried about AMD as a threat at all.

    Their latest processor, on some workloads, still barely beats an Intel Core i5(!) or can only beat some of the mid-range Core i7s at best.

    I've long been an "AMD guy" because they used to be a value proposition - where you can get decent performance at a much lower price compared to the Intel counterparts.

But times have changed and that isn't really quite the case anymore. AMD CPUs really aren't that much cheaper compared to Intel's, but Intel's CPUs perform SIGNIFICANTLY better than AMD's (mostly because AMD went the way of the UltraSPARC Niagara T1 by having only ONE FPU shared across multiple ALUs) - and of course, the problem with THAT design approach is that, fundamentally, CPUs are massively glorified calculators.

And AMD chose to cripple their product's ability to do calculations.

People have a tendency to focus on IPC (as seen here). But really, you need both IPC AND FLOPS, and a MUCH BETTER metric to compare is FLOPs/clock (because it tells you about the processor's computational efficiency), which almost NO one writes about anymore.

I'm already running 16 cores across three systems and I just made the requisition for a 64-core system.

The "thing" that I have found/discovered with systems that have lots and lots of cores is that you REALLY WANT, should, and NEED ECC RAM: when you have the machine doing many things at once, ECC is a patch-style method that can help correct some of the memory errors that make the programs interfere with each other.

When I've launched 6 runs of a computationally intensive task at once, some of them have failed because my current systems don't have ECC registered RAM (and I am not sure the CPU knows how to handle memory coherency, given that the memory controller is on-die).

While it might be a welcome change on the ultra-high-end, extreme-enthusiast front, you can get a system that does a LOT more for a LOT less than these processors would cost by using server-grade hardware (albeit used - which, in my opinion, if it still works, why not? I don't see anything "wrong" with that).

A system using the new 16-core CPU is likely going to run you $3000-5000. The system that I just bought has 64 cores (four times as many) with 512 GB of RAM for the same price.
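The FLOPs/clock metric advocated above is just peak throughput normalized by core count and frequency. A minimal sketch, using illustrative numbers (an AVX2 part with two FMA units per core, not any specific SKU):

```python
def sp_flops_per_core_clock(peak_gflops, cores, clock_ghz):
    """Single-precision FLOPs retired per core per cycle at peak."""
    return peak_gflops / (cores * clock_ghz)

# A 10-core 3 GHz AVX2 chip peaking at 960 GFLOPS:
# 8 SP lanes * 2 FMA units * 2 FLOPs per FMA = 32 FLOPs/core/cycle
print(sp_flops_per_core_clock(960.0, 10, 3.0))   # 32.0
```

Comparing that per-core, per-cycle number across chips factors out clock speed and core count, which is exactly why it reveals computational efficiency.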
  • Meteor2 - Saturday, June 3, 2017 - link

    Literally TL;DR.
  • Lolimaster - Saturday, June 3, 2017 - link

If you mean lower-threaded workloads, then you need to look at the Ryzen 5 1400-1500X, which is 90% of an i7-7700 and is obviously "better" than the top-of-the-line Ryzen at "some workloads", meaning lower-threaded apps/games.

    $160-190, rip intel.
  • Gothmoth - Sunday, June 4, 2017 - link

So many words for trolling... You took the time to write so much, but when it comes to what you supposedly bought, you suddenly become unspecific... no letters and words to spell it out; you can only say "the system that I just bought".
  • twtech - Friday, June 2, 2017 - link

    So what are some common applications for this many cores? Rendering, compiling large C++ projects like Unreal 4 for example. It may not be huge, but there is a market for more cores, and Intel doesn't want AMD taking all of it.
  • slickr - Saturday, June 3, 2017 - link

So not only are they introducing fewer cores overall than AMD's 32C/64T Threadripper, they also cost a ton more than AMD's equivalents, require a new socket, and feature locked overclocking.

Intel really does have nothing. They announced their 14/16/18 core parts, but they have no info on them, meaning it was a last-minute thing and they would only be available in late 2017. They have nothing else to go against AMD, so they are making a move to trick people into thinking they have products coming soon when they don't.
  • Gothmoth - Sunday, June 4, 2017 - link

So true (Linus on X299):

    https://www.youtube.com/watch?v=TWFzWRoVNnE

It's Intel doing as little as possible to make as much money as possible...
and customers are stupid enough to defend that.

Yes, it's capitalism... but I would be stupid as a customer to like it.

AMD is offering me more at the moment, so I vote with my wallet.
  • Timoo - Saturday, June 10, 2017 - link

    A little late, but I think it is worth mentioning:
    Linus' review of the i9.
Ripped to pieces... https://www.youtube.com/watch?v=TWFzWRoVNnE
  • Gothmoth - Monday, June 12, 2017 - link

64 PCIe lanes for under 900 euros or 44 PCIe lanes for over 900 euros... hmm, let me think about it...
