Niceeee, keep up the great work Ian :)
In my opinion the entire Intel HEDT lineup is a joke. And the 9980XE: $180 more for literally just a bit over *half* the cores and threads. Sure it has better lightly threaded performance but surely that's not the intention of this processor, and surely it is not worth charging this insane 'Intel Tax' premium for it.
Intel is free to charge whatever they want for a device that I have zero intention of purchasing. Most professionals I know have stopped using desktop computers as their daily drivers. The Dell XPS 15 and Apple's 15" MacBook Pro seem to be the weapons of choice these days. These products surely have their uses, but in the real world, most users are happy to sacrifice absolute performance for mobility.
Must be a strange world you live in. Mobile won't ever be anything close to a desktop for daily tasks. I don't know any professional who has done that. They use mobile devices mainly to view items they did on the desktop, not for working.
Really? I work in software development (web, C++, OpenGL, and yes, our own ray tracing engine). We have one guy with a desktop; the rest of the developers use either an XPS 15 or a MacBook Pro, and one guy has a Surface Book. All were given a choice... this was the result.
Interesting story about how we got here... Windows used to be a requirement for developing browser plugins. But with the move to WebAssembly, we can now compile and test our plugin on the Mac just as easily as we do on Windows. While many fanboys will lament this change... I personally love it!
Yea, for code development only. Mobility has been the choice for that for years.
Not everyone is a coder though. Some need these desktops for rendering big animations, videos, etc. You're simply not going to do that in any meaningful way on a laptop.
Rendering and production work can indeed happen on laptop hardware. I won't argue that desktop hardware, with fewer limits on TDP and storage, isn't a faster way to accomplish the same tasks, but as Team noted, given a choice, a lot of people opt for mobility over raw compute power.
It's a big joke to use an XPS or MacBook GPU to do anything intensive. It's good for remote code editing, though (except MacBooks, with their absolutely terrible keyboards).
Define "intensive." Our software does real-time (WebGL) and photo-realistic (ray-tracing) rendering. I suppose that a Path Tracing engine would be MORE intensive. But the goal of our software is to be as ubiquitous as possible. We support the iPad and some Android tablets.
There's your answer: anything that runs on iPad and Android Tablets is not "intensive". I'll grant you that it's "intensive" compared to what we were doing on workstations a decade ago, and mobile is closing the gap... but a workstation today has 24-56 cores (not threads) at 5Ghz and dual NVidia 2080 GPUs. You can get a 12-core CPU and dual 1080 in the pinnacle gaming laptops but they don't have ECC or the certifications of a workstation. At best they have half to 2/3 the performance. If you're paying your engineers by the hour you don't want them sitting on their hands twice as long. But I can see how they might make that choice for themselves. You make an excellent point there, lol.
@linuxgeek, logged in for the first time in 10 years or more just to laugh with you for having cracked the case with your explanation there at the end!
I have an IBM ThinkPad 530 with an NVidia Quadro - in software development, unless you're into graphics, you don't even need more than integrated graphics - even more so for the average business person. Unless you are seriously into gaming or high-end graphics, you don't need a high-end GPU. Even gaming, as long as you are not into the latest games - lower-end graphics will do.
"Even gaming, as long as you are not into the latest games - lower-end graphics will do." This HEAVILY depends on your output resolution, as every single review for the last decade has made clearly evident.
Don't call it an IBM ThinkPad. It's disgraceful to associate IBM with the bastardization Lenovo has done to their nameplate.
Uhh, yeah, but no one WILL do it on mobility. Makes no sense.
You see .. there you are TOTALLY WRONG. Supporting the iPad is a MAJOR REQUIREMENT as specified by our customers.
Augmented reality has HUGE IMPLICATIONS for our industry. Try as you may ... you can't hold up that 18 core desktop behemoth (RGB lighting does not defy gravity) to see how that new Pottery Barn sofa will look in your family room. I think what you are suffering from is a historical perspective on computing which the ACTUAL WORLD has moved away from.
@TEAMSWITCHER - I think your comments are an unbalanced mix of fantasy and ideals. I think you're looking pretty superficially, even childishly, at the use of technology and at communicating with the objective world. Of course, a certain aspect of things can be done on a mobile device, but by its very essence it is just a mobile device, and therefore a casual, temporary solution. It will never be able to match the raw power of "static" desktop computers. Working in a laboratory for physical analysis, running numerous simulations of supersymmetric breakdowns of material identities, or transposition of the spatial-temporal continuum - it would be ridiculous to imagine doing that on a mobile device. There are many things I would not even mention.
For videos - as long as you have AVX2 (256-bit) you are OK.
AMD needs to beat Intel with AVX to be considered seriously for scientific apps (the 3D particle movement test).
All seven of our local development teams have long since switched from desktops to laptops. That conversion was a done deal back in the days of Windows Vista and Dell Latitude D630s and 830s. Now we live in a BYOD (bring your own device) world where the company will pay up to a certain amount (varies between $1,100 and $1,400 depending on funding from upper echelons of the corporation) and employees are free to purchase the computer hardware they want for their work. There are no desktop PCs currently and in the past four years, only one person purchased a desktop in the form of a NUC. The reality is that desktop computers are for the most part a thing of the past with a few instances remaining on the desks of home users that play video games on custom-built boxes as the primary remaining market segment. Why else would Intel swing overclocking as a feature of a HEDT chip if there was a valid business market for these things?
Yeah, because you don't do anything intensive with the jobs you have; of course you would use laptops or whatever mobile device. But the reality is most people would use desktops because simply faster to get stuff done, and more powerful.
BYOD, FYI, is not like that for most companies.
...and if you are doing anything intensive with laptops... that just means the company you work for is behind the curve and just being cheap, not forking out money for the right hardware.
There are over 250K people on the payroll. There ARE desktop PCs around, but they are few and far between. I'm not going to get into an extended debate about this because it won't change anyone's perspective, but I do believe you've got a slight misconception about the usefulness and flexibility of portable computer hardware. A simple look at the availability of desktops versus laptops should be enough to make it obvious, for most people, computer == laptop these days.
You're eliding the difference between "convenient and sufficient" and "as powerful as anyone needs".
I'll absolutely grant that if you're only going to have one system for doing your work and you move around a fair bit, then it absolutely makes sense to have that system be mobile, even if you lose a bit of edge-case performance.
For people doing /serious/ GPU grunt work something like an XPS 15 is going to provide between 1/2 and 1/3 of the power they could get with a similarly priced desktop. That compromise doesn't make any sense for someone whose job does not require mobility.
So sure, notebooks are better than ever for a large number of people. Doesn't make desktops and HEDT chips functionally irrelevant for businesses, though. If you can really use 18 cores for the work you're doing then being provided with an XPS 15 will be, at best, a sad joke.
Any laptop is essentially on a different planet than any of the processors covered in this review (doesn't matter if we are talking Intel or AMD).
1. If it is possible to do your work on a laptop (which I am myself at this very moment) then you (and me) are not the target audience for these CPUs. In fact, I'm not entirely sure why you even bother to read or comment on the story?
2. If you have to ask if you need it, you don't need it.
3. If you have to think more than about 1 second to make a decision between one of these and a laptop, then you don't need it.
4. If you do need one, then you already know that.
Most people don't need one, including me. I read these things because the technology is interesting and because I find it interesting what others might be doing. I don't really feel any need to insist that others need what I need and could not possibly need anything else.
So a differing opinion than yours should mean that someone not read an article or comment on it. That appears to be nothing more than a self-protective mechanism intended to erect a bubble in which exists nothing more than an echo chamber filled with your own beliefs. That's hardly a way to integrate new thinking, but I do understand that a lot of people fear change in the same way you do.
"But the reality is most people would use desktops because simply faster to get stuff done, and more powerful."
See, that's the problem with your reasoning. You assume that most people need power when they do not. The reality is that the majority of people who need to use computers for work do not need to do rendering or any kind of intensive task. So no, most people don't use desktops nor would they want to use desktops given the opportunity. They use laptops.
"Now we live in a BYOD (bring your own device) world where the company will pay up to a certain amount (varies between $1,100 and $1,400 depending on funding from upper echelons of the corporation) and employees are free to purchase the computer hardware they want for their work. There are no desktop PCs currently and in the past four years, only one person purchased a desktop in the form of a NUC. "
The Man's advantage to the Worker Bees using laptops: they're always 'on the job'. No time off. As close to slavery as it's legal to be. Some smart folks are truly stupid.
"The Man's advantage to the Worker Bees.." (just quoting because of the lack of continuing indents in Anandtech's 1990's-era comment system)
I think that's a bit of a stretch in our case. My division doesn't do on-call and we strictly prohibit our lower tier managers from tapping employees outside of their normal work hours. Even checking company e-mail outside of work hours is against posted (and enforced) policy. If we must, due to emergencies, they absolutely have to be compensated for the time regardless of whether or not they are hourly or salaried workers. I haven't seen an "emergency" that couldn't wait until the next day so that policy has not been put into use in at least the last five years. Computational mobility is no excuse to allow invasions into off-the-clock time and I for one won't allow it.
I hate to admit it but PNC is right. Super-high-powered desktops are an anachronism. If you need REAL horsepower, you build a server/compute farm and connect to it with thin-client laptops. If you are just doing software development, the laptop cpu is usually good enough.
This is especially true of single socket monsters like these HEDT chips. The only reason they exist is because gamers will pay too much for everything. It's nothing more than an expensive hobby, and like all hobbies at the top end is all "want" and very little "need". The "need" stops somewhere around 6 or 8 cores.
It's exactly the same as owning a Ferrari and never taking it to the track. You will never use more than 20% of the full capabilities of it. All you really need is a Vette.
PNC is not right at all, he's completely wrong. Unless your job requires you to walk around and type at the same time, using a laptop is a net loss of productivity for zero gain. At a professional workplace anyone who thinks that way would definitely be fired. If you're going to be in the same room for 8 hours a day doing real work, it makes sense to have a desktop with dual monitors. You will be faster, more efficient, more productive, and more comfortable. Powerful desktops are more useful today than ever before due to the complexity of modern demands.
What is your source for gamers being the primary consumers of HEDT?
Well, of course for programming it's OK. That is like saying you moved from a desktop to a phone for typing; typing hardly requires any power, lol. That has pretty much always been the case.
I think you are implying programming is not a CPU intensive task? Certainly it can be low intensity for small projects, but trust me it can also use as much CPU as you can possibly throw at it. When you have a project that requires compiling thousands or tens of thousands of files to build it ... the workload scales fairly linearly with the number of cores, up to some fuzzy limit mostly set by memory bandwidth.
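To put rough numbers on that, here is a back-of-the-envelope Python sketch using Amdahl's law; the 5% serial fraction is just an invented stand-in for link steps and that fuzzy memory-bandwidth limit, not a measured figure:
```python
# Back-of-the-envelope estimate of how a parallel compile scales with cores.
# The serial_fraction value is invented; it stands in for link steps,
# dependency ordering, and memory-bandwidth saturation.

def build_speedup(cores: int, serial_fraction: float = 0.05) -> float:
    """Amdahl's-law speedup for a build job spread across `cores`."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

if __name__ == "__main__":
    for cores in (4, 8, 16, 18, 32):
        print(f"{cores:>2} cores -> ~{build_speedup(cores):.1f}x faster than single-core")
```
Even with that pessimistic serial fraction, the jump from a 6-core laptop part to an 18-core HEDT chip is far from negligible on a big build.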
I also work in software development (games), and my experience has been completely the opposite. I've actually only known one programmer who preferred to work on a laptop - he bought a really high-end Clevo DTR and brought it in to work.
I do have a laptop at my desk - I brought in a Surface Book 2 - but I mostly just use it for taking notes. I don't code on it.
Unless you're going to be moving around all the time, I don't know why you'd prefer to look at one small screen and type on a sub-par laptop keyboard if there's the choice of something better readily available. And two 27" screens is pretty much the minimum baseline - I have 3x 30" here at home.
And then of course there's the CPU - if you're working on a really small codebase, it might not matter. But if it's a big codebase, with C++, you want to have a lot of cores to be able to distribute the compiling load. That's why I'm really interested in the forthcoming W-3175X - high clocks plus 28 cores on a monolithic chip sounds like a winning combination for code compiling. High end for a laptop is what, 6 cores now?
What utter nonsense. I've been working on large and complex C++ codebases (2M+ LOC for a single product) for over a decade, and compute power is an absolute necessity to work efficiently. Compile times on such a beast scale linearly with core count (if done properly), so no one wants a shit mobile CPU in their workstation.
Mobile has been this way for a decade - I got a new job working at home and everyone is on laptops. Today's laptops are as powerful as most desktops - work gave me a quad-core notebook; this is my second notebook, and the first one was from nine years ago. Desktops were not used in my previous job. A notebook means you can be mobile - for me that is when I go to the home office, which is not often - but I also bring the notebook to meetings and such.
I develop primarily in C++ and .NET.
Desktops are literally dinosaurs now, becoming part of history.
You are not working on big enough projects. For your projects, a laptop may be sufficient; but for larger projects, there is certainly a wide chasm of difference between the capabilities of a laptop and those of a workstation class developer system.
Today's laptops are not as powerful as desktops. They use slow mobile processors, and overheat easily due to thermals. If you're working from home you're still sitting in a chair all day, meaning you don't need a laptop. If your company fired you and hired someone who uses a desktop with dual monitors, they would get significantly more work done for them per dollar.
I wouldn't call them very "professional" when they are sacrificing 50+% productivity for mobility.
Anyone serious about work in a serious work environment* has a workstation/desktop and at least two UHD/4K monitors. Anything else is just kidding yourself into thinking you are productive.
I never said that we didn't have external monitors, keyboards, and mice for desktop work. However, from 25 years of personal experience in this industry I can tell you emphatically .. productivity isn't related to the number of pixels on your display.
Exactly - I work with a 15-inch IBM ThinkPad 530 whose screen is never used - but I have two 24-inch 1080p monitors on my desk at home. If I need to work from the home office, I hook up another monitor - always with an external monitor.
It is really not the number of pixels but the size of the workspace. I have a 4K Dell XPS 15 2-in-1 and I barely use the 4K on the laptop - I mostly use it hooked to an LG 38U88 ultrawide. I have the option to go 4K on the laptop screen, but in reality I don't need it.
I'd agree if you are talking about going from 15" 1080p laptop screen to 15" 4k laptop screen.
But, if you don't see significant changes in going from a single laptop screen to a 40" 4k or even just dual SD monitors - any arrangement that lets you put up multiple information streams at once, whatever you are doing isn't very complicated.
Maybe not necessarily the number of pixels. I don't think you'd be a whole lot more productive with a 4k screen than a 2k screen. But screen area on the other hand does matter.
From simple things like being able to have the .cpp, the .h, and some other relevant code file all open at the same time without needing to switch windows, to doing 3-way merges, even just being able to see the progress of your compile while you check your email. Why wouldn't you want to have more screen space?
If you're going to sit at a desk anyway, and you're going to be paid pretty well - which most developers are - why sacrifice even 20, 10, even 5% productivity if you don't have to? And personally I think it's at the higher end of that scale - at least 20%. Every time I don't have to lose my train of thought because I'm fiddling with Visual Studio tabs - that matters.
You're assuming that everyone who needs to use a computer for work needs power and dual monitors. That just isn't the case. The only person kidding themselves here is you.
Resolution and the presence or absence of a second screen are things that are not directly linked to increased productivity in all situations. There are a few workflows that might benefit, but a second screen or a specific resolution, 4k for instance versus 1080, doesn't automatically make a workplace "serious" or...well whatever the opposite of serious is in the context in which you're using it.
The idiocy is thinking that working off a laptop screen means you are being as productive as you can be.
The threshold for seeing tangible benefit from more visible workspace (when so restricted) is very low.
I can accept it if folks say they dock their laptops and work on large/multiple monitors - but I absolutely do not accept the premise that working off the laptop screen should be considered effective working. If you believe otherwise, you've either never worked with multiple/large screens or simply aren't working fast enough, or on something complicated enough, to have a worthwhile opinion on the matter! [IMO it really is that stark, and it boils my piss seeing folks grappling with 2x crap 20" screens in engineering workplaces while their managers squeeze to make them more productive and don't see the problem right in front of them.]
Dude, it depends entirely on what you are doing. A writer (from books to marketing) needs nothing beyond an 11" screen... I'm in marketing at a startup and for half my tasks my laptop is fine, writing in particular. And yes, as soon as I need to work on a web page or graphic design, I need my two screens and 6 virtual desktops at home.
I have my XPS 13 for travel and yes, I take a productivity hit from the portability, but only when forced to use it for a week. Working from a cafe once or twice a week, I simply plan tasks where a laptop screen isn't limiting, and people who do such tasks all day (plenty) don't NEED a bigger screen at all.
Hell, I know people who do 80% of their work on a freaking PHONE. Sales folks probably NEVER need anything beyond a 15" screen, and that only for 20% of their work...
Yea, have fun waiting a month to render 2 airport terminals from a point cloud. Most professionals I know still use and have a need for desktops. If you're asking why you need this CPU then it's not for you.
It's kind of obvious that you leave the desktop stationary for any sort of real work. My comment was about how the OP claims to rarely see any "professionals" use desktops anymore. Engineers, CAD workers, and rendering farms still use desktops because they're still the only thing that has any grunt behind it.
So while a laptop is still trying to render the airport's bathroom from the point cloud, the desktop has already done the rest of the airport and the next two point-cloud scanning projects.
Yes, I understand that, but most of the ones you find on forums like this are not actually professionals - they're hardcore gamers, a market where desktops still have a big share.
The ideal platform is a mobile platform that connects to desktop-style monitors at work and, if needed, at home. My work required two video ports on the laptop.
The ideal platform... depends on the task at hand. Crazy, I know. Some people need a small and light device, some need one they can hook multiple displays to, some need the one with the most raw power available...
That's fine if you don't need the power of a desktop. Your laptop in that case is essentially serving as a SFF desktop - and if that's good enough, great. You get all the benefits of a desktop, and all the portability advantages of a laptop in a single package.
I can see how that works for some forms of software development. For me personally, at work I can use all the CPU power I can get. Around 32 cores @ 5GHz would be perfect.
Render farms do not typically consist of desktop computers. The workloads you're discussing should indeed be processed elsewhere on something other than whatever machine is local to the user's desk and that elsewhere is typically not a collection of desktop-class PC hardware.
PNC--you made a good point earlier about thin clients and servers, but not everyone is corporate. Efficiency is moot if you can't afford to get your foot in the door.
I don't expect someone who isn't in the industry to know this, but small businesses and independent contractors do the majority of the transcoding, grading, editing, and effects that make it into every single thing you watch on TV or stream. These people often use powerful 1P desktops, and occasionally a single 2P system. Obviously large effects studios don't rely on 1P, but again, most people in this industry are not employed by one of the few giants.
It's clear that you don't have these needs, but to pretend that this entire marketing segment is for "gaming," "not necessary," or inefficient is gross oversimplification. Thin clients and servers are a great solution for certain operations of scale, while for others it is as tone deaf and ignorant as crying "let them eat cake."
These powerful 1P builds are our daily bread, and if you can get by with a laptop then that's great, but check your ignorance instead of speaking from it.
HStewart, I agree. My personal work setup is the same way with a laptop and multiple desktop monitors since I don't require the power that the Engineers and CAD workers need. Most of my users that aren't Engineers or CAD designers are the same way too.
PeachNCream, we just purchased a dedicated i9 for the workloads for our point-cloud rendering, but each individual engineer/cad worker still requires a beefy computer. Moving one point of our services to a dedicated computer does not stop the daily work on other projects.
You are not a professional then (and if you are a professional they made a bad hire), because unless your job requires you to walk around while typing, a laptop is absolutely useless in the professional world. A desktop will get work done significantly faster with no drawbacks. You're sitting in a chair all day, not out camping.
If you are so immature and clueless as to not recognize that most professionals - like anyone in sales or management, or 90% of other professions - have no need for more compute than a 15-watt Core i5 delivers, you should certainly not call others unprofessional.
It's about bragging or walking around with a puffed up chest. For some people, self-promotion through the ownership of overpriced, unnecessary computer hardware is an important element of filling up otherwise empty, meaningless lives.
It's weird, because you're doing that too, only inverted. Your entire argument seems to be that because it's good enough for you and your colleagues, it should be good enough for everyone.
Attack as a means of defense. You're implying I've listed a set of specifications that meet everyone's requirements in order to attempt to defeat an argument that contained no such implication because you can't find another way to discredit it.
>It's about bragging or walking around with a puffed up chest.
That's funny, because it seems the only one doing that is you! I sense an insecurity coming from you over the fact that you use a tiny laptop. Unless your job requires you to walk around and type at the same time, there is precisely zero reason for any professional to sacrifice power and productivity for useless mobility. You're going to be in the same room all day at work.
Setting conditions for what justifies or doesn't justify a mobile computer versus a desktop is just an attempt to create "rules" for the silly Calvinball-style game you're attempting to play. "Oh, you can ONLY use X if condition Y exists, so nyah nyah! I win you big meanie!" I know that typing when you're offended tends to limit the ability to think sensibly, but at least pause for a moment or two before you let your emotions get the better of you.
"intel tax" as in if you didn't want to upgrade you have to scrap motherboard/ram to upgrade anyways if already intel? Tell me more about that tax they charge.
That is a silly expression, much like people saying "G-Sync tax". It's not even a tax in that case; you are literally going to buy a monitor with it if you have Nvidia... you would actually pay MORE to buy a new GPU/monitor combination for the other option. lol
Your post engages in little more than sophistry. It's equivalent to claiming that house prices in London aren't any higher than they are in Sheffield because you, the buyer, already live in London. You might want to live there for various reasons, but you're still paying more money for something directly comparable because of that choice. Whether you think it's worth it is up to you, but to pretend it doesn't exist is just weird.
If something is a joke, it's Threadripper - the 2920X, 2970WX, and 2990WX for sure, with one exception: the 2950X. Having far more weak threads than the IF/RAM subsystem is capable of handling is a joke. Intel's HEDT, despite having fewer cores, is still just the better solution. The benchmarks - yet another series here on this site - are the best proof of that.
Intel has been milking us for years. I am still holding on to my 2500K overclocked to 5 GHz (their last great CPU), and my next CPU is going to be an AMD for sure. Nvidia has been doing the same lately; it's outrageous.
My 9900K @ 5.1 GHz is almost 3x faster than yours considering the IPC difference. And seriously, no one's using AMD GPUs anymore, either for gaming or GPU computing.
Actually, if, like me, you're rendering on Corona (or any CPU render engine in 3ds Max) daily, AMD has the best CPU. Period. The fact my 2990WX is also WAY cheaper than the next best thing from Intel is just an added bonus for my company.
He's using a different definition of "great CPU" from you. His includes price/performance ratio, yours doesn't. Insisting that your comparison is more valid than his doesn't make any more sense than him doing the same, so if you're going to mock someone's post, maybe avoid the same errors.
You're so ghetto you're using a 2500k from 2011? Stop posting and get a job so you can afford an upgrade. I guess it proves that Intel makes good chips though if you can wait this long to upgrade.
I seriously don't understand people who are so insecure about their choices that they need to mock random people on the internet for not overspending on their computer equipment. If your use case enables you to spend on the absolute best way past the point of diminishing returns, that's great for you! Be happy and maybe lay off the comment sections..?
No... all it really means is that, for the first time in the history of computing, software demands have allowed computing power to reach the level of "good enough" for a lot of users. Also, things are a lot more GPU dependent than they used to be. CPUs are less relevant.
I guess it depends on the i9-9820X. And I have a feeling it would be a similar story to the 2990WX vs. the i9-9980XE - AMD scoring wins in some benchmarks while Intel keeps the victory in others. Those who matter (actual buyers) will look at the benchmarks that matter to them, while fans will squeal that this or that benchmark is more important and therefore their favorite CPU is the best.
I would honestly get an EPYC platform over the TR 32 cores. However, at this point, you have a really particular workload that requires such capabilities.
It all depends on your needs, but true, Intel is not competitive at their price tags.
In what world are AMD cores garbage when they are more than competitive enough to push Intel into releasing the first significant changes in six years? Zen 2 cores are also here (with the new Epyc chips), and Ryzen 3rd generation will be launching within the next five months or so, which WILL have a higher IPC than Intel at that point.
The upgrade AMD really needs at this point is a software one - from Microsoft. The 2990WX performs pretty well when using Linux, but it struggles with most workloads in Windows. I hope that Zen 2's chiplets will do a little better in terms of memory access.
I think you mean that Intel's 1 or 2 cores often beat AMD's 1 or 2 cores. In benchmarks that are highly multithreaded, AMD beats Intel. Intel currently has a frequency advantage, so they win in the lightly threaded tests.
But my question is: in real life, do you need this many cores? I still think it's better to have single-threaded core speed. I say that even as a developer who uses multiple threads.
Depends on what your real life is, doesn't it? That's why reviews don't have one benchmark and the final paragraph of the conclusion here emphasizes that the right processor for you depends on your workload, even *without* considering the relative prices. For many workloads none of the CPUs tested here are appropriate; a 2400G would be the most efficient, cost-effective option.
Windows is garbage. Put these chips onto a real operating system (Linux) and you will see the actual performance they are capable of without Windows holding them back. See Phoronix.
Typo - you wrote "Intel will need to up its game here to remain competitive". Should have been "Intel will need to up its marketing here to remain competitive".
Marketing doesn't work in tech. Tech buyers aren't dumb. People want performance, and today that's Intel by far. On a per-core basis it creams the competitor.
Tech buyers know that shouting "performance" is meaningless out of context - and that includes a lot more than clock speed. For example price, power, cooling, cores, threading, features, platform, socket life... the list goes on. All conveniently ignored in a slogan like yours, which could have come from an Intel ad.
He's dropping classic lines from the "I am an empowered, smart individual and marketing doesn't work on me" playbook. I find it's usually a line trotted out by people on whom marketing works absolute miracles.
Rofl, and the second you look at the price tags, anyone with half a piece of common sense would realize that buying an i9-9980XE over a TR-2950X is absolutely freaking ridiculous! (Unless you simply NEED AVX-512 that is). Intel's flailing with Skylake.... again..., while AMD's near finished changing the game entirely with 7nm Zen 2, and it's all honestly pretty damn hilarious. Karma's a b**ch and all that lol.
@Ian: Thanks, good overview and review! Agree on the "iteration when an evolutionary upgrade was needed"; it seems that Intel's development was a lot more affected by its blocked/constipated transition to 10 nm (now scrapped), and the company's attention was also diverted by its forays into mobile (which didn't work out so great) and looking for progress elsewhere (the Altera acquisition). This current "upgrade" is mainly good for the extra PCIe lanes (nice to have more), but its performance is no better than the previous generation's. If the new generation of chips from AMD is halfway as good as promised, Intel will lose a lot more profitable ground in the server and HEDT space to AMD.
@Ian, and all: While Intel goes on about their improved FinFET 14 nm being the reason for better performance/Wh, I wonder how big the influence of better heat removal through the (finally, again) soldered heat spreader is? Yes, most of us like to improve cooling to be able to overclock more aggressively, but shouldn't better cooling also improve the overall efficiency of the processor? After all, semiconductors conduct more current as they get hotter, leading to ever more heat and eventual "gate crashing". Have you or anybody else looked at performance/Wh between, for example, an i7-8700 with the stock cooler and pasty glued heat spreader vs. the same processor with proper delidding, a liquid metal replacement, and a great aftermarket cooler, both at stock frequencies? I'd expect the better-cooled setup to have more performance/Wh, but is that the case?
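For what it's worth, the comparison I have in mind is nothing fancier than this little Python sketch; all the numbers below are invented placeholders, not measurements:
```python
# Toy perf-per-energy comparison for the stock vs. delidded question above.
# Scores, power draws, and runtimes are invented placeholders.

def perf_per_wh(score: float, avg_power_w: float, runtime_s: float) -> float:
    """Benchmark score per watt-hour of energy consumed during the run."""
    energy_wh = avg_power_w * runtime_s / 3600.0
    return score / energy_wh

stock    = perf_per_wh(score=100.0, avg_power_w=95.0, runtime_s=600.0)
delidded = perf_per_wh(score=103.0, avg_power_w=90.0, runtime_s=583.0)
print(f"stock: {stock:.1f} pts/Wh   delidded + better cooler: {delidded:.1f} pts/Wh")
```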
I think quite a few people are looking at these workstation-class CPUs to develop BI things, and it might be quite helpful to actually measure results with some SQL / NoSQL / BI suites. A bit more complex parallel SQL execution with locking could show some interesting differences between the NUMA Threadrippers and the Intels.
But then make sure it is realistic, not running in cache or such... A real DB suitable for these chips is terabytes, with merely the index kept in RAM... rule of thumb: if your index fits in cache, your database doesn't need this CPU ;-)
Hi, Ian. Did Intel officially announce that Skylake-X Refresh is manufactured on the 14++ node? The 9980XE stepping is the same as the 7980XE's - stepping 4 - so there is no change.
Sometimes the advantage of these processors with AVX512 versus usual desktop processors with AVX2 is crazy. The 3D particle tests fly like 500 mph cars. Which other tasks besides 3D particle movement also benefit from AVX512?
How about linear algebra? Does Intel MKL, which now seems to support these extensions, demonstrate similar speedups with AVX-512 when solving Ax = b, say, with the usual dense matrices?
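For anyone who wants to poke at that themselves, a minimal timing sketch is below; it assumes a NumPy build that happens to be linked against MKL (many Anaconda builds are), and whether AVX-512 kernels actually get dispatched depends entirely on that backend, so treat it as a sketch rather than a definitive test:
```python
# Minimal dense Ax = b timing sketch. Whether this hits AVX-512 depends on
# the BLAS/LAPACK backend NumPy was built against (e.g. MKL vs. OpenBLAS).
import time
import numpy as np

np.show_config()  # shows which BLAS/LAPACK backend is linked

n = 4000  # 4000x4000 doubles ~ 128 MB, large enough to spill the caches
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)

t0 = time.perf_counter()
x = np.linalg.solve(A, b)  # LU factorization + solve, multithreaded in MKL
elapsed = time.perf_counter() - t0

print(f"n={n}: {elapsed:.2f} s, residual {np.linalg.norm(A @ x - b):.2e}")
```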
@Ian: What's with that paragraph about the Mesh clocks on page 1? Mesh clock is 2.4 GHz stock on SKX, and there is no mesh turbo at all. You can check for yourself with AIDA64 or HwInfo. So does SKX-R have the same 2.4 GHz clock, or higher?
I'd like to request that you consider adding some DaVinci Resolve tests to your suite, as it would be helpful for film post-production professionals. There is a free license which has enough capability for professional work, and there is free raw footage available from Blackmagic's web site and 8K raw footage available from Red's. Thanks :)
Intel is playing with fire by doing incremental upgrades over and over again. Look no further than Apple's new iPads - their chips are better than what Intel has to offer in terms of price-power-efficiency. Apple is going to ditch Intel's processors very soon for most of its Mac lineup.
Microsoft Windows is known to suck hard when it comes to performance on NUMA architectures and particularly the TR2 processors. See Phoronix for analysis.
Why does Anandtech continue to post Windows-only benchmarks? They are fairly useless; they tell more about the limitations of Microsoft Windows than they do the processors themselves.
Of course, if you're a poor sap stuck running Windows for any task that requires these processors, I guess you care, but you really should be pushing your operating system vendor to use some of their billions of dollars to hire OS developers who know what they are doing.
I just bought a TR2 1950X for my software development workstation (Linux based) and I am fairly confident that for my work loads, it will kick the crap out of these Intel processors. I wouldn't know for sure though because I tend to read Anandtech fairly exclusively for hardware reports, dipping into sites like Phoronix only when necessary to get accurate details for edge cases like the TR2.
It sure would be nice if my site of choice (Anandtech) would start posting relevant results from operating systems designed to take advantage of these high power processors instead of more Windows garbage ... especially Windows gaming benchmarks, as if those are even remotely relevant to this CPU segment!
Totally agree; we have come to a time when benchmarks are not even accurately evaluating the product anymore. The big question is: how can we depict an accurate picture? Especially if the reviewer is not choosing the right ones for a real comparison.
Well, seeing the disparity with Phoronix's results raises major concerns for me.
It's indeed a surprise to me that those new 24C and 32C AMD processors (2920, 2970) are just worse in every respect than their 16C equivalents; in terms of perf/money and perf/power, just laughable. Linux changes a lot, but who uses Linux and for what purpose? Developers, I bet. But what makes me really angry is that nobody even tries to use KVM, Xen, VirtualBox, or VMware as a benchmarking tool to test usage as a small-company server. In my company a lot of Remote Desktop sessions are connected to the same server. Someone might think: who needs a good CPU for that? But that's only if you're not used to solving real-life problems, and those problems are things like importing big databases from obsolete programs, then filtering, fixing, and exporting to new ERP systems. This consumes a lot of time, so having a fast CPU is crucial. Most companies I know use an RDP server for that purpose, with typical cheap portable laptops given to workers. It would be nice to see AMD or Intel HEDT tested for such purposes, because anyone can potentially have 32 cores and anyone could benefit... but instead of that kind of test I keep seeing gaming. Gaming on HEDT?! WTF
All of these "my work doesn't have any desktop users" comments crack me up. Congratulations. Your work is not the entire world of computing in a professional space, much less prosumer space. Get over yourselves.
@Ian Cutress Your tests and review text are always a pleasure to read, thank you for the professionalism.
Questions related to the test suite (I know, everybody always wants something):
1. You are missing an Excel spreadsheet calculation (finance still uses a lot of these, and they can peg all cores near 100% and be incredibly CPU dependent). It would be nice to see, for example, an Excel Monte Carlo simulation in the suite (local data) - see the sketch after this list.
2. Alternatively, an R (language) based test for heavy statistical computation. Finding one that is representative of real-world workloads and strikes a balance between single-core IPC and many-core parallelisation might take some work. But this is one area where laptops just can't muster it, and CUDA/OpenCL acceleration often isn't available at all.
3. For Web / JS frameworks it is nice to see Speedometer and WebXPRT 3, but for some reason the V8 Web Tooling Benchmark is not there (https://v8.github.io/web-tooling-benchmark/ ). The old Kraken/JetStream/Octane are nice for reference, but they haven't been very representative of the real world for some time now (hence why they are abandoned).
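To make suggestion 1 a bit more concrete, here is a rough Python stand-in for that kind of embarrassingly parallel Monte Carlo load; the option parameters are invented, not taken from any real finance workbook:
```python
# Rough stand-in for an "Excel Monte Carlo" style workload: price a call
# option by simulation across every core. All parameters are invented.
import math
import random
from multiprocessing import Pool, cpu_count

S0, K, R, SIGMA, T = 100.0, 105.0, 0.02, 0.25, 1.0  # toy option inputs

def simulate_chunk(args):
    n_paths, seed = args
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        s_t = S0 * math.exp((R - 0.5 * SIGMA ** 2) * T + SIGMA * math.sqrt(T) * z)
        payoff_sum += max(s_t - K, 0.0)
    return payoff_sum

if __name__ == "__main__":
    workers = cpu_count()
    chunk = 4_000_000 // workers
    with Pool(workers) as pool:  # pegs every core near 100%
        sums = pool.map(simulate_chunk, [(chunk, seed) for seed in range(workers)])
    price = math.exp(-R * T) * sum(sums) / (chunk * workers)
    print(f"{workers} workers, {chunk * workers} paths -> estimated price {price:.3f}")
```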
Again thank you for this monumental work, the amount of tests is already superb!
For graphing results it would be so helpful to get a comparative price/perf results browser (pick your baseline CPU, pick workloads, CPUs on the graph as a function of price/perf). This would enable quick viewing of the efficient frontier in terms of price/perf for each workload, with the base CPU as an anchor.
Yeah, yeah, I know.... Just throwing this in here 😀
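Something along these lines is all I mean; the relative scores below are placeholder numbers and only the rough launch prices are real:
```python
# Sketch of the requested price/performance view. Relative scores are
# placeholders, not benchmark results; prices are approximate launch MSRPs.
import matplotlib.pyplot as plt

baseline = "i9-9900K"
cpus = {
    # name: (price_usd, relative_score) -- illustrative values only
    "i9-9900K":  (499,  1.00),
    "TR 2950X":  (899,  1.40),
    "TR 2990WX": (1799, 1.75),
    "i9-9980XE": (1979, 1.85),
}

base_score = cpus[baseline][1]
fig, ax = plt.subplots()
for name, (price, score) in cpus.items():
    ax.scatter(price, score / base_score)
    ax.annotate(name, (price, score / base_score))

ax.set_xlabel("Price (USD)")
ax.set_ylabel(f"Performance relative to {baseline}")
ax.set_title("Price vs. relative performance (placeholder data)")
plt.show()
```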
Which should lead to the conclusion that AMD Threadripper 2 is just a bad offer, except for the 2950X. It's the one and only AMD CPU worth mentioning - which means TR4 at 16C is a dead end. AMD offers overpriced CPUs on that platform, that is for sure. Overpriced because half of the cores are choking and absolutely useless. If 32C on 4 channels is too many cores per channel, imagine 3rd-gen Ryzen at 16C on dual channel... It will be a big disappointment for some people, I bet. Regardless of pricing, the Intel 9980XE is just great.
I have to say I'm a big fan of HEDT platforms, I built my last workstation in 2011 and it still serves me well 7 years later. But looking at this and the X299 offering I really don't see why anyone would bother.
Till Intel changes the way it builds high-core-count CPUs, it can't compete with AMD, and it will be even worse next year when AMD makes an already cheaper way to produce high-core-count CPUs even cheaper, to sick levels.
I'm actually more interested in the i7-9800X vs. the i9-9900K. I want to see how the overclocking compares to the i9-9900K before I jump in to X299.
I stopped reading when I saw 8K with a 1080. Most tests are just pointless; it would be more interesting with at least a 1080 Ti, or better a 2080 Ti. That would give the chips more room to run, to separate the men from the boys so to speak.
Video tests with Handbrake are stupid too. Does anyone look at the video after those tests? It would look like crap. Try SLOWER as a setting and let's find out how the chips fare, with bitrates of ~4500-5000 kbps for 1080p. Something I'd actually watch on a 60in+ TV without going blind.
More than I'd do, but the point is, SLOWER will give you far better quality (something I could actually stomach watching), without all the black blocks in dark scenes etc. Current 720p releases from NF or AMZN have gone to crap (700 MB files for H.264? ROFL). We are talking untouched, direct from NF or AMZN - meaning that is the quality you are watching as a subscriber, which is just one of the reasons we cancelled NF (agenda TV was the largest reason to dump them).
If you're going to test at crap settings nobody would watch, you might as well kick in Quick Sync with quality maxed and get better results, as MOST people would do if quality wasn't an issue anyway. option1=value1:option2=value2:tu=1:ref=9:trellis=3 and L4.1 with the encoder preset set to QUALITY - that's a pretty good string for decent quality with QSV. Seems to me you're choosing to turn off AVX/Quick Sync so AMD looks better or something. Why would any USER turn off stuff that speeds things up unless quality is an issue (guys like me)? Same with turning off the GPU in Blender etc. What is the point of a test that NOBODY would do in real life? Who turns off AVX-512 in Handbrake if you bought a chip to get it? LOL. That tech is a feature you BUY Intel for. IF you turn off all the good stuff, the chip becomes a ripoff. But users don't do that :) Same for NV: if you have the ability to use RTX stuff, why would you NOT when a game supports it? To make AMD cards look better? Pffft. To wait for AMD to catch up? Pffft.
I say this as an AMD stockholder :) Most useless review I've seen in a while. Not wasting my time reading much of it. Moving on to better reviews that actually test how we PLAY/WATCH/WORK in the real world. 8K... ROFLMAO. Ryan has been claiming 1440p was the norm since the 660 Ti. Then it was 4K not long after, for the last 5 years, when nobody was using that; now it's 8K tests with a GTX 1080... ROFLMAO. No wonder I come here once a month or less, and when I do, I'm usually turned off by the tests. Constantly changing what people do (REAL TESTS) to turning stuff off or down (video cards at reference speeds instead of OC out-of-the-box settings, etc.), etc. etc... Let's see if we can set up this test in a way nobody would do at home to strike down the advantages of anyone competing with AMD. Blah. I'd rather see where both sides REALLY win in ways we USE these products. Turn everything on if it's in the chip, GPU, test, etc., and spend MORE time testing resolutions we actually USE in practice. 8K... hahaha. Whatever. 13 fps?
"Ashes has dropdown options for MSAA, Light Quality, Object Quality, Shading Samples, Shadow Quality, Textures, and separate options for the terrain." Yeah, I'm out. Dropdown quality is against my religion and useless to me. I'm sure the other tests have issues I'd hate also; no time to waste on junk review tests. Too many other places don't do this crap. I bought a 1070 Ti to run MAX settings at 1200p (Dell 24in) in everything, or throw it to my lower-res 22in. If I can't do that, I'll wait for my next card to play game X. Not knocking AMD here, just Anandtech. I'll likely buy a 7nm AMD CPU when they hit, and they have a shot at a 7nm GPU for me too. You guys and Tom's Hardware (heh, you joined) have really gone downhill with irrational testing setups. If you're going to do 4K at ultra, why not do them all there? I digress...
Just curious, but how many of you AMD fanbois have ever been in a data center or been responsible for adjusting performance on a couple dozen VMware hosts running mixed applications? Oh wait... none. In the mythical world according to AMD's BS dept, a hypervisor/operating system takes the number of tasks running and divides them by the number of cores, and you clowns believe it. In the *real world*, where we have to deal with really expensive hosts that don't have LED fans in them and run applications adults use, we know that's not the truth. Hypervisor and operating system schedulers all favor cores that process mixed threads faster, and if you want to argue that, please consult with a VMware or Hyper-V engineer the next time you see them in your drive-thru. Oh wait... I am a VMware engineer. An i3 8530 costs $200 and literally beats any AMD chip made, running stock, in dual-threaded applications. Seriously... look up the single-threaded performance. More cores don't make an application more multithreaded and they don't contribute to a better desktop experience. I have servers with 30-40% of my CPU resources not being used, and just assigning more cores won't make applications faster. It just ties up my scheduler doing nothing and wastes performance. The only way to get better application efficiency is vertical, and that's higher per-core performance, and that's nothing I'm seeing AMD bringing to the table.
Niceeee , keep up the great work Ian ‘:)AshlayW - Tuesday, November 13, 2018 - link
In my opinion the entire Intel HEDT lineup is a joke. And the 9980XE: $180 more for literally just a bit over *half* the cores and threads. Sure it has better lightly threaded performance but surely that's not the intention of this processor, and surely it is not worth charging this insane 'Intel Tax' premium for it.TEAMSWITCHER - Tuesday, November 13, 2018 - link
Intel is free to charge whatever they want for a device that I have zero intention of purchasing. Most professionals I know have stopped using desktop computers for their daily drivers. The Dell XPS 15 and Apple's 15" MacBook Pro seam to be the weapons of choice these days. These products surely have their uses, but in the real world, most users are happy to sacrifice absolute performance for mobility.imaheadcase - Tuesday, November 13, 2018 - link
Most be a strange world you live on. Mobile won't ever be anything close to a desktop for daily tasks. I don't know any professional who have did that. They use mobile devices mainly to view items they did on desktop, not for working.TEAMSWITCHER - Tuesday, November 13, 2018 - link
Really? I work in software development (WEB, C++, OpenGL, and yes our own ray tracing engine) We have one guy with a desktop, the rest of the developers use either an XPS 15, a MacBook Pro, and one guy with a Surface Book. All were given a choice...this was the result.Interesting story about how we got here... Windows used to be a requirement for developing browser plugins. But with the move to Web Assembly, we can now compile and test our plugin on the Mac just as easily as we do on Windows. While many fanboys will lament this change .. I personally love it!
Endda - Tuesday, November 13, 2018 - link
Yea, for code development only. Mobility has been the choice for that for years.Not everyone is a coder though. Some need these desktops for rendering big animations, videos, etc. You're simply not going to do that in any meaningful way on a laptop
PeachNCream - Tuesday, November 13, 2018 - link
Rendering and production work can indeed happen on laptop hardware. I don't argue that desktop hardware with fewer limits on TDP and storage aren't a faster way to accomplish the same tasks, but as Team noted, given a choice, a lot of people opt for mobility over raw compute power.nerd1 - Tuesday, November 13, 2018 - link
It's a big joke to use XPS or Macbook GPU to do anything intensive. It's good for remote code editing though (except macbooks with absolute terrible keyboard)TEAMSWITCHER - Tuesday, November 13, 2018 - link
Define "intensive." Our software does real-time (WebGL) and photo-realistic (ray-tracing) rendering. I suppose that a Path Tracing engine would be MORE intensive. But the goal of our software is to be as ubiquitous as possible. We support the iPad and some Android tablets.linuxgeex - Wednesday, November 14, 2018 - link
There's your answer: anything that runs on iPad and Android Tablets is not "intensive". I'll grant you that it's "intensive" compared to what we were doing on workstations a decade ago, and mobile is closing the gap... but a workstation today has 24-56 cores (not threads) at 5Ghz and dual NVidia 2080 GPUs. You can get a 12-core CPU and dual 1080 in the pinnacle gaming laptops but they don't have ECC or the certifications of a workstation. At best they have half to 2/3 the performance. If you're paying your engineers by the hour you don't want them sitting on their hands twice as long. But I can see how they might make that choice for themselves. You make an excellent point there, lol.zeromus - Wednesday, November 14, 2018 - link
@linuxgeek, logged in for the first time in 10 years or more just to laugh with you for having cracked the case with your explanation there at the end!HStewart - Tuesday, November 13, 2018 - link
I have IBM Thinkpad 530 with NVidia Quadro - in software development unless into graphics you don't even need more than integrated - even more for average business person - unless you are serious into gaming or high end graphics you don't need highend GPU. Even gaming as long as you are not into latest games - lower end graphics will do.pandemonium - Wednesday, November 14, 2018 - link
"Even gaming as long as you are not into latest games - lower end graphics will do."This HEAVILY depends on your output resolution, as every single review for the last decade has clearly made evident.
Samus - Wednesday, November 14, 2018 - link
Don't call it an IBM Thinkpad. It's disgraceful to associate IBM with the bastardization Lenovo has done to their nameplate.imaheadcase - Tuesday, November 13, 2018 - link
Uhh yah but no one WILL do it on mobility. Makes no sense.TEAMSWITCHER - Tuesday, November 13, 2018 - link
You see .. there you are TOTALLY WRONG. Supporting the iPad is a MAJOR REQUIREMENT as specified by our customers.Augmented reality has HUGE IMPLICATIONS for our industry. Try as you may ... you can't hold up that 18 core desktop behemoth (RGB lighting does not defy gravity) to see how that new Pottery Barn sofa will look in your family room. I think what you are suffering from is a historical perspective on computing which the ACTUAL WORLD has moved away from.
scienceomatica - Tuesday, November 13, 2018 - link
@TEAMSWITCHER - I think your comments are an unbalanced result between fantasy and ideals. I think you're pretty superficially, even childishly looking at the use of technology and communicating with the objective world. Of course, a certain aspect of things can be done on a mobile device, but by its very essence it is just a mobile device, therefore, as a casual, temporary solution. It will never be able to match the raw power of "static" desktop computers.working in a laboratory for physical analysis, numerous simulations of supersymmetric breakdowns of material identities, or transposition of spatial-temporal continuum, it would be ridiculous to imagine doing on a mobile device.There are many things I would not even mention.HStewart - Tuesday, November 13, 2018 - link
For videos - as long as you AVX 2 (256bit) you are ok.SanX - Wednesday, November 14, 2018 - link
AMD needs to beat Intel with AVX to be considered seriously for scientific apps (3D particle movement test)PeachNCream - Tuesday, November 13, 2018 - link
All seven of our local development teams have long since switched from desktops to laptops. That conversion was a done deal back in the days of Windows Vista and Dell Latitude D630s and 830s. Now we live in a BYOD (bring your own device) world where the company will pay up to a certain amount (varies between $1,100 and $1,400 depending on funding from upper echelons of the corporation) and employees are free to purchase the computer hardware they want for their work. There are no desktop PCs currently and in the past four years, only one person purchased a desktop in the form of a NUC. The reality is that desktop computers are for the most part a thing of the past with a few instances remaining on the desks of home users that play video games on custom-built boxes as the primary remaining market segment. Why else would Intel swing overclocking as a feature of a HEDT chip if there was a valid business market for these things?imaheadcase - Tuesday, November 13, 2018 - link
Yah because you don't do anything intensive with the jobs you have, of course you would use laptops or whatever mobile. But the reality is most people would use desktops because simply faster to get stuff done, and more powerful.BYOD fyi is not like that for most companies..
imaheadcase - Tuesday, November 13, 2018 - link
..and if you are doing anything intensive with laptops..that just means company you work for is behind the curve and just being cheap and not fork out money for the right hardware.PeachNCream - Tuesday, November 13, 2018 - link
There are over 250K people on the payroll. There ARE desktop PCs around, but they are few and far between. I'm not going to get into an extended debate about this because it won't change anyone's perspective, but I do believe you've got a slight misconception about the usefulness and flexibility of portable computer hardware. A simple look at the availability of desktops versus laptops should be enough to make it obvious, for most people, computer == laptop these days.Spunjji - Tuesday, November 13, 2018 - link
You're eliding the difference between "convenient and sufficient" and "as powerful as anyone needs".I'll absolutely grant that if you're only going to have one system for doing your work and you move around a fair bit, then it absolutely makes sense to have that system be mobile, even if you lose a bit of edge-case performance.
For people doing /serious/ GPU grunt work something like an XPS 15 is going to provide between 1/2 and 1/3 of the power they could get with a similarly priced desktop. That compromise doesn't make any sense for someone whose job does not require mobility.
So sure, notebooks are better than ever for a large number of people. Doesn't make desktops and HEDT chips functionally irrelevant for businesses, though. If you can really use 18 cores for the work you're doing then being provided with an XPS 15 will be, at best, a sad joke.
Ratman6161 - Tuesday, November 13, 2018 - link
Any laptop is essentially on a different planet than any of the processors covered in this review (doesn't matter if we are talking Intel or AMD).1. If it is possible to do your work on a laptop (which I am myself at this very moment) then you (and me) are not the target audience for these CPU's. In fact, I'm not entirely sure why you even bother to read or comment on the story?
2. If you have to ask if you need it, you don't need it.
3. If you have to think more than about 1 second to make a decision between one of these and a laptop, then you don't need it.
4. If you do need one, then you already know that.
Most people don't need one, including me. I read these things because the technology is interesting and because I find it interesting what others might be doing. I don't really feel any need to insist that others need what I need and could not possibly need anything else.
PeachNCream - Wednesday, November 14, 2018 - link
So a differing opinion than yours should mean that someone not read an article or comment on it. That appears to be nothing more than a self-protective mechanism intended to erect a bubble in which exists nothing more than an echo chamber filled with your own beliefs. That's hardly a way to integrate new thinking, but I do understand that a lot of people fear change in the same way you do.Kilnk - Tuesday, November 13, 2018 - link
"But the reality is most people would use desktops because simply faster to get stuff done, and more powerful."See, that's the problem with your reasoning. You assume that most people need power when they do not. The reality is that the majority of people who need to use computers for work do not need to do rendering or any kind of intensive task. So no, most people don't use desktops nor would they want to use desktops given the opportunity. They use laptops.
FunBunny2 - Tuesday, November 13, 2018 - link
"Now we live in a BYOD (bring your own device) world where the company will pay up to a certain amount (varies between $1,100 and $1,400 depending on funding from upper echelons of the corporation) and employees are free to purchase the computer hardware they want for their work. There are no desktop PCs currently and in the past four years, only one person purchased a desktop in the form of a NUC. "The Man's advantage to the Worker Bees using laptops: their always 'on the job'. no time off. as close to slavery as it's legal to be. some smart folks are truly stupid.
PeachNCream - Tuesday, November 13, 2018 - link
"The Man's advantage to the Worker Bees.." (just quoting because of the lack of continuing indents in Anandtech's 1990's-era comment system)I think that's a bit of a stretch in our case. My division doesn't do on-call and we strictly prohibit our lower tier managers from tapping employees outside of their normal work hours. Even checking company e-mail outside of work hours is against posted (and enforced) policy. If we must, due to emergencies, they absolutely have to be compensated for the time regardless of whether or not they are hourly or salaried workers. I haven't seen an "emergency" that couldn't wait until the next day so that policy has not been put into use in at least the last five years. Computational mobility is no excuse to allow invasions into off-the-clock time and I for one won't allow it.
jjjag - Tuesday, November 13, 2018 - link
I hate to admit it but PNC is right. Super-high-powered desktops are an anachronism. If you need REAL horsepower, you build a server/compute farm and connect to it with thin-client laptops. If you are just doing software development, the laptop CPU is usually good enough.
This is especially true of single-socket monsters like these HEDT chips. The only reason they exist is because gamers will pay too much for everything. It's nothing more than an expensive hobby, and like all hobbies, at the top end it is all "want" and very little "need". The "need" stops somewhere around 6 or 8 cores.
It's exactly the same as owning a Ferrari and never taking it to the track. You will never use more than 20% of the full capabilities of it. All you really need is a Vette.
MisterAnon - Wednesday, November 14, 2018 - link
PNC is not right at all, he's completely wrong. Unless your job requires you to walk around and type at the same time, using a laptop is a net loss of productivity for zero gain. At a professional workplace anyone who thinks that way would definitely be fired. If you're going to be in the same room for 8 hours a day doing real work, it makes sense to have a desktop with dual monitors. You will be faster, more efficient, more productive, and more comfortable. Powerful desktops are more useful today than ever before due to the complexity of modern demands.
TheinsanegamerN - Wednesday, November 14, 2018 - link
What is your source for gamers being the primary consumers of HEDT?
imaheadcase - Tuesday, November 13, 2018 - link
Well of course for programming it's OK. That is like saying you moved from a desktop to a phone for typing. Typing requires hardly any power. lol That has pretty much always been the case.
bji - Tuesday, November 13, 2018 - link
I think you are implying programming is not a CPU-intensive task? Certainly it can be low intensity for small projects, but trust me, it can also use as much CPU as you can possibly throw at it. When you have a project that requires compiling thousands or tens of thousands of files to build it ... the workload scales fairly linearly with the number of cores, up to some fuzzy limit mostly set by memory bandwidth.
twtech - Thursday, November 15, 2018 - link
I also work in software development (games), and my experience has been completely the opposite. I've actually only known one programmer who preferred to work on a laptop - he bought a really high-end Clevo DTR and brought it in to work.
I do have a laptop at my desk - I brought in a Surface Book 2 - but I mostly just use it for taking notes. I don't code on it.
Unless you're going to be moving around all the time, I don't know why you'd prefer to look at one small screen and type on a sub-par laptop keyboard if there's the choice of something better readily available. And two 27" screens is pretty much the minimum baseline - I have 3x 30" here at home.
And then of course there's the CPU - if you're working on a really small codebase, it might not matter. But if it's a big codebase, with C++, you want to have a lot of cores to be able to distribute the compiling load. That's why I'm really interested in the forthcoming W-3175X - high clocks plus 28 cores on a monolithic chip sounds like a winning combination for code compiling. High end for a laptop is what, 6 cores now?
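As a rough illustration of the compile-scaling point above (a hypothetical C++ sketch, not from the review or any commenter; compile_one_unit, the unit count, and the per-unit working set are all made up): one worker per hardware thread pulls independent "translation units" off a shared counter, much like make -jN or ninja schedule compile jobs, and wall time should fall roughly with core count until memory bandwidth or I/O becomes the limit.

// Minimal sketch: simulate a parallel build by farming independent units to workers.
#include <algorithm>
#include <atomic>
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for compiling one file: pure CPU work with a little memory traffic.
static void compile_one_unit(std::vector<double>& scratch) {
    double acc = 0.0;
    for (std::size_t i = 0; i < scratch.size(); ++i) {
        acc += scratch[i] * 1.000001;
        scratch[i] = acc;
    }
}

int main() {
    const unsigned threads = std::max(1u, std::thread::hardware_concurrency());
    const int units = 512;                       // hypothetical number of source files
    std::atomic<int> next{0};                    // shared work counter (dynamic scheduling)

    auto start = std::chrono::steady_clock::now();
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < threads; ++t) {
        pool.emplace_back([&] {
            std::vector<double> scratch(1 << 20, 1.0);   // per-"file" working set
            for (int i = next.fetch_add(1); i < units; i = next.fetch_add(1))
                compile_one_unit(scratch);
        });
    }
    for (auto& th : pool) th.join();
    double secs = std::chrono::duration<double>(std::chrono::steady_clock::now() - start).count();
    std::printf("%u workers finished %d units in %.2f s\n", threads, units, secs);
    return 0;
}

In a real build the knob is just the job count (e.g. make -j), but the shape of the curve - near-linear, then flattening - is the same idea.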
Laibalion - Saturday, November 17, 2018 - link
What utter nonsense. I've been working on large and complex C++ codebases (2M+ LOC for a single product) for over a decade, and compute power is an absolute necessity to work efficiently. Compile times on such beasts scale linearly (if done properly), so no one wants a shit mobile CPU for their workstation.
HStewart - Tuesday, November 13, 2018 - link
Mobile has been this way for a decade - I got a new job working at home and everyone is on laptops - today's laptops are as powerful as most desktops - my work machine is a quad-core notebook, and this is my 2nd notebook; the first one was from nine years ago. Desktops were not used in my previous job. A notebook means you can be mobile - for me that is when I go to the home office, which is not often - but I also bring the notebook to meetings and such.
I develop in C++ and .NET primarily.
Desktops are literally dinosaurs now, becoming part of history.
bji - Tuesday, November 13, 2018 - link
You are not working on big enough projects. For your projects, a laptop may be sufficient; but for larger projects, there is certainly a wide chasm of difference between the capabilities of a laptop and those of a workstation-class developer system.
MisterAnon - Wednesday, November 14, 2018 - link
Today's laptops are not as powerful as desktops. They use slow mobile processors, and overheat easily due to thermals. If you're working from home you're still sitting in a chair all day, meaning you don't need a laptop. If your company fired you and hired someone who uses a desktop with dual monitors, they would get significantly more work done per dollar.
Atari2600 - Tuesday, November 13, 2018 - link
I wouldn't call them very "professional" when they are sacrificing 50+% productivity for mobility.
Anyone serious about work in a serious work environment* has a workstation/desktop and at least two UHD/4K monitors. Anything else is just kidding yourself into thinking you are productive.
TEAMSWITCHER - Tuesday, November 13, 2018 - link
I never said that we didn't have external monitors, keyboards, and mice for desktop work. However, from 25 years of personal experience in this industry I can tell you emphatically .. productivity isn't related to the number of pixels on your display.
HStewart - Tuesday, November 13, 2018 - link
Exactly - I work with a 15in IBM ThinkPad 530 whose screen is never used - but I have two 24in 1080p monitors on my desk at home - and if I need to go to the home office, I hook it up to another monitor - always with an external monitor.
It is really not the number of pixels but the size of the workspace. I have a 4K Dell XPS 15 2-in-1 and I barely use the 4K on the laptop - I mostly use it hooked to an LG 38U88 ultrawide. I have the option to go to 4K on the laptop screen but in reality I don't need it.
Atari2600 - Tuesday, November 13, 2018 - link
I'd agree if you are talking about going from a 15" 1080p laptop screen to a 15" 4K laptop screen.
But if you don't see significant changes in going from a single laptop screen to a 40" 4K or even just dual SD monitors - any arrangement that lets you put up multiple information streams at once - whatever you are doing isn't very complicated.
twtech - Thursday, November 15, 2018 - link
Maybe not necessarily the number of pixels. I don't think you'd be a whole lot more productive with a 4K screen than a 2K screen. But screen area, on the other hand, does matter.
From simple things like being able to have the .cpp, the .h, and some other relevant code file all open at the same time without needing to switch windows, to doing 3-way merges, to even just being able to see the progress of your compile while you check your email. Why wouldn't you want to have more screen space?
If you're going to sit at a desk anyway, and you're going to be paid pretty well - which most developers are - why sacrifice even 20, 10, even 5% productivity if you don't have to? And personally I think it's at the higher end of that scale - at least 20%. Every time I don't have to lose my train of thought because I'm fiddling with Visual Studio tabs - that matters.
Kilnk - Tuesday, November 13, 2018 - link
You're assuming that everyone who needs to use a computer for work needs power and dual monitors. That just isn't the case. The only person kidding themselves here is you.
PeachNCream - Tuesday, November 13, 2018 - link
Resolution and the presence or absence of a second screen are not directly linked to increased productivity in all situations. There are a few workflows that might benefit, but a second screen or a specific resolution, 4K for instance versus 1080p, doesn't automatically make a workplace "serious" or... well, whatever the opposite of serious is in the context in which you're using it.
steven4570 - Tuesday, November 13, 2018 - link
"I wouldn't call them very "professional" when they are sacrificing 50+% productivity for mobility."This is quite honestly, a very stupid statement without any real practical views in the real world.
Atari2600 - Wednesday, November 14, 2018 - link
Not really.
The idiocy is thinking that working off a laptop screen means you are being as productive as you can be.
The threshold for seeing a tangible benefit from more visible workspace (when so restricted) is very low.
I can accept it if folks say they dock their laptops and work on large/multiple monitors - but I absolutely do not accept the premise that working off the laptop screen should be considered effective working. If you believe otherwise, you've either never worked with multiple/large screens or simply aren't working fast enough, or on something complicated enough, to have a worthwhile opinion on the matter! [IMO it really is that stark, and it boils my piss seeing folks grappling with 2x crap 20" screens in engineering workplaces while their managers squeeze to make them more productive and don't see the problem right in front of them.]
jospoortvliet - Thursday, November 15, 2018 - link
Dude, it depends entirely on what you are doing. A writer (from books to marketing) needs nothing beyond an 11" screen... I'm in marketing at a startup and for half my tasks my laptop is fine, writing in particular. And yes, as soon as I need to work on a web page or graphics design, I need my two screens and 6 virtual desktops at home.
I have my XPS 13 for travel and yes, I take a productivity hit from the portability, but only when forced to use it for a week. Working from a cafe once or twice a week, I simply plan tasks where a laptop screen isn't limiting, and people who do such tasks all day (plenty) don't NEED a bigger screen at all.
Hell, I know people who do 80% of their work on a freaking PHONE. Sales folks probably NEVER need anything beyond a 15" screen, and that only for 20% of their work...
Atari2600 - Thursday, November 15, 2018 - link
I never said the non-complicated things need anything more than 1 small screen!
AdhesiveTeflon - Tuesday, November 13, 2018 - link
Yea, have fun waiting a month to render 2 airport terminals from a point cloud. Most professionals I know still use and have a need for desktops. If you're asking why you need this CPU then it's not for you.
HStewart - Tuesday, November 13, 2018 - link
I'd like to see you lug a desktop through an airport.
AdhesiveTeflon - Tuesday, November 13, 2018 - link
It's kind of obvious that you leave the desktop stationary for any sort of real work. My comment was about how the OP rarely sees any "professionals" use desktops anymore. Engineers, CAD workers, and rendering farms still use desktops because they're still the only thing that has any grunt behind it.
So while a laptop is still trying to render the airport's bathroom from the point cloud, the desktop has already done the rest of the airport and the next two point-cloud scanning projects.
HStewart - Tuesday, November 13, 2018 - link
Yes, I understand that, but most of the ones you find on forums like this are not actually professionals - they're hard-core gamers - and desktops still have a lot of that market.
The ideal platform is a mobile platform that connects to desktop-style monitors at work and, if needed, at home. My work requires two video ports on the laptop.
Lord of the Bored - Wednesday, November 14, 2018 - link
The ideal platform... depends on the task at hand. Crazy, I know. Some people need a small and light device, some need one they can hook multiple displays to, some need the one with the most raw power available...
twtech - Thursday, November 15, 2018 - link
That's fine if you don't need the power of a desktop. Your laptop in that case is essentially serving as an SFF desktop - and if that's good enough, great. You get all the benefits of a desktop and all the portability advantages of a laptop in a single package.
I can see how that works for some forms of software development. For me personally, at work I can use all the CPU power I can get. Around 32 cores @ 5GHz would be perfect.
PeachNCream - Wednesday, November 14, 2018 - link
Render farms do not typically consist of desktop computers. The workloads you're discussing should indeed be processed elsewhere, on something other than whatever machine is local to the user's desk, and that elsewhere is typically not a collection of desktop-class PC hardware.
M O B - Friday, November 16, 2018 - link
PNC--you made a good point earlier about thin clients and servers, but not everyone is corporate. Efficiency is moot if you can't afford to get your foot in the door.
I don't expect someone who isn't in the industry to know this, but small businesses and independent contractors do the majority of the transcoding, grading, editing, and effects that make it into every single thing you watch on TV or stream. These people often use powerful 1P desktops, and occasionally a single 2P system. Obviously large effects studios don't rely on 1P, but again, most people in this industry are not employed by one of the few giants.
It's clear that you don't have these needs, but to pretend that this entire market segment is for "gaming," "not necessary," or inefficient is a gross oversimplification. Thin clients and servers are a great solution for certain operations of scale, while for others it is as tone-deaf and ignorant as crying "let them eat cake."
These powerful 1P builds are our daily bread, and if you can get by with a laptop then that's great, but check your ignorance instead of speaking from it.
AdhesiveTeflon - Wednesday, November 14, 2018 - link
HStewart, I agree. My personal work setup is the same way, with a laptop and multiple desktop monitors, since I don't require the power that the Engineers and CAD workers need. Most of my users that aren't Engineers or CAD designers are the same way too.
PeachNCream, we just purchased a dedicated i9 for the workloads for our point-cloud rendering, but each individual engineer/CAD worker still requires a beefy computer. Moving one point of our services to a dedicated computer does not stop the daily work on other projects.
MisterAnon - Wednesday, November 14, 2018 - link
You are not a professional then (and if you are a professional, they made a bad hire), because unless your job requires you to walk around while typing, a laptop is absolutely useless in the professional world. A desktop will get work done significantly faster with no drawbacks. You're sitting in a chair all day, not out camping.
jospoortvliet - Thursday, November 15, 2018 - link
If you are so immature and clueless as to not recognize that most professionals - like anyone in sales or management, or 90% of other professions - have no need for more compute than a 15-watt Core i5 delivers, you should certainly not call others unprofessional.
bigboss2077 - Saturday, November 24, 2018 - link
ohh man I think you take too much weed.
PeachNCream - Tuesday, November 13, 2018 - link
It's about bragging or walking around with a puffed-up chest. For some people, self-promotion through the ownership of overpriced, unnecessary computer hardware is an important element of filling up otherwise empty, meaningless lives.
Spunjji - Tuesday, November 13, 2018 - link
It's weird, because you're doing that too, only inverted. Your entire argument seems to be that because it's good enough for you and your colleagues, it should be good enough for everyone.
PeachNCream - Tuesday, November 13, 2018 - link
Attack as a means of defense. You're implying I've listed a set of specifications that meets everyone's requirements in order to defeat an argument that contained no such implication, because you can't find another way to discredit it.
MisterAnon - Wednesday, November 14, 2018 - link
You look ridiculous trying to get out of admitting that you lost the argument. Interesting mental gymnastics though.
MisterAnon - Wednesday, November 14, 2018 - link
>It's about bragging or walking around with a puffed up chest.That's funny, because it seems the only one doing that is you! I sense an insecurity coming from you over the fact that you use a tiny laptop. Unless your job requires you to walk around and type at the same time, there is precisely zero reason for any professional to sacrifice power and productivity for useless mobility. You're going to be in the same room all day at work.
PeachNCream - Wednesday, November 14, 2018 - link
Setting conditions for what justifies or doesn't justify a mobile computer versus a desktop is just an attempt to create "rules" for the silly Calvinball-style game you're attempting to play. "Oh, you can ONLY use X if condition Y exists, so nyah nyah! I win you big meanie!" I know that typing when you're offended tends to limit the ability to think sensibly, but at least pause for a moment or two before you let your emotions get the better of you.
imaheadcase - Tuesday, November 13, 2018 - link
"intel tax" as in if you didn't want to upgrade you have to scrap motherboard/ram to upgrade anyways if already intel? Tell me more about that tax they charge.That is a silly expression, much like people saying "Gsync tax". Not even a tax in that case, you literally are going to buy a monitor with it if you have nvidia..you would actually pay MORE to buy a new gpu/monitor together for other option. lol
Spunjji - Tuesday, November 13, 2018 - link
Your post engages in little more than sophistry. It's equivalent to claiming that house prices in London aren't any higher than they are in Sheffield because you, the buyer, already live in London. You might want to live there for various reasons, but you're still paying more money for something directly comparable because of that choice. Whether you think it's worth it is up to you, but to pretend it doesn't exist is just weird.
Stasinek - Wednesday, November 21, 2018 - link
If something is a joke, it's Threadripper 2920, 2970, and 2990 for sure - with one exception, the 2950. Having far more weak threads than the IF/RAM subsystem is capable of handling is a joke.
Intel's HEDT, despite having fewer cores, is still just the better solution.
The benchmarks - yet another series here on this site - are the best proof of that.
Badelhas - Tuesday, November 13, 2018 - link
Intel has been milking us for years. I am still holding on to my 2500K overclocked to 5GHz (their last great CPU) and my next CPU is going to be an AMD for sure. Nvidia has been doing the same lately; it's outrageous.
Badelhas - Tuesday, November 13, 2018 - link
Oh, I forgot: GREAT REVIEW, Anandtech! Cheers. :)
imaheadcase - Tuesday, November 13, 2018 - link
Crazy, right, it's almost like you needed a performance product and bought it.
nerd1 - Tuesday, November 13, 2018 - link
My 9900K@5.1GHz is almost 3X faster than yours considering the IPC difference. And seriously, no one's using AMD GPUs anymore, either for gaming or GPU computing.
benedict - Tuesday, November 13, 2018 - link
No one cares how fast your CPU is, and plenty of people who are not obscenely rich use AMD GPUs.
nerd1 - Tuesday, November 13, 2018 - link
He said the 2500K is Intel's last great CPU and I showed him a counterexample.
AMD is indeed good if you just need many cores on a budget, but that's it.
Aephe - Tuesday, November 13, 2018 - link
Actually, if, like me, you're rendering on Corona (or any CPU render engine in 3ds Max) daily, AMD has the best CPU. Period. The fact my 2990WX is also WAY cheaper than the next best thing from Intel is just an added bonus for my company.
Spunjji - Tuesday, November 13, 2018 - link
He's using a different definition of "great CPU" from you. His includes price/performance ratio, yours doesn't. Insisting that your comparison is more valid than his doesn't make any more sense than him doing the same, so if you're going to mock someone's post, maybe avoid the same errors.
Arbie - Tuesday, November 13, 2018 - link
How do AMD's GPUs relate to the HEDT CPU market being discussed here? And seriously, I can't see any point to your remarks.
nexuspie - Tuesday, November 13, 2018 - link
You're so ghetto you're using a 2500k from 2011? Stop posting and get a job so you can afford an upgrade. I guess it proves that Intel makes good chips though if you can wait this long to upgrade.
LordanSS - Tuesday, November 13, 2018 - link
Still rocking a 3770K. Not going to pay the "Intel price" for 4 cores and just 20% more IPC than I have.
Zen 2, that'll be my swap.
Spunjji - Tuesday, November 13, 2018 - link
I seriously don't understand people who are so insecure about their choices that they need to mock random people on the internet for not overspending on their computer equipment. If your use case enables you to spend on the absolute best way past the point of diminishing returns, that's great for you! Be happy and maybe lay off the comment sections..?
Kilnk - Tuesday, November 13, 2018 - link
No... all it really means is that for the first time in the history of computing, software demands have allowed computing power to reach the level of "good enough" for a lot of users. Also, things are a lot more GPU dependent than they used to be. CPUs are less relevant.
duploxxx - Tuesday, November 13, 2018 - link
It is quite obvious. From a general performance/price/power perspective the TR2 2950X is the one to get. Forget all the uber expensive Intel junk.
qap - Tuesday, November 13, 2018 - link
I guess it depends on the i9-9820X. And I have a feeling it would be a similar story to 2990WX vs i9-9980XE - AMD scoring in some benchmarks while Intel keeps the victory in others.
Those who matter (actual buyers) will look at the benchmarks that matter to them, while fans will be squealing that this or that benchmark is more important and therefore their favorite CPU is the best.
eva02langley - Tuesday, November 13, 2018 - link
I would honestly get an EPYC platform over the 32-core TR. However, at this point, you have a really particular workload that requires such capabilities.
It all depends on your needs, but true, Intel is not competitive at their price tags.
nexuspie - Tuesday, November 13, 2018 - link
These benchmarks show that the 9980's 18 cores often BEAT the 2990wx's 32 cores. AMD cores are garbage.
Targon - Tuesday, November 13, 2018 - link
In what world are AMD cores garbage when they are more than competitive enough to push Intel into releasing the first significant changes in six years? Zen 2 cores are also here (with the new Epyc chips), and Ryzen 3rd generation will be launching within the next five months or so, which WILL have a higher IPC than Intel at that point.
twtech - Thursday, November 15, 2018 - link
The upgrade AMD really needs at this point is a software one - from Microsoft. The 2990WX performs pretty well when using Linux, but it struggles with most workloads in Windows. I hope that Zen 2's chiplets will do a little better in terms of memory access.
coder543 - Tuesday, November 13, 2018 - link
I think you mean that Inte's 1 or 2 cores often beat AMD's 1 or 2 cores. In benchmarks that are highly multithreaded, AMD beats Intel. Intel currently has a frequency advantage, so they win in the lightly threaded tests.
coder543 - Tuesday, November 13, 2018 - link
Intel's*
AnandTech and Twitter both need an edit button.
HStewart - Tuesday, November 13, 2018 - link
But my question is, in real life do you need this many cores? I still think it's better to have single-threaded core speed. I say that even as a developer who uses multiple threads.
GreenReaper - Wednesday, November 14, 2018 - link
Depends on what your real life is, doesn't it? That's why reviews don't have one benchmark, and the final paragraph of the conclusion here emphasizes that the right processor for you depends on your workload, even *without* considering the relative prices. For many workloads none of the CPUs tested here are appropriate; a 2400G would be the most efficient, cost-effective option.
twtech - Thursday, November 15, 2018 - link
Yes. I could use about twice this many. I also need good single-core clockspeed as well. That's why I'm eagerly anticipating the 3175X launch.
bji - Tuesday, November 13, 2018 - link
Windows is garbage. Put these chips onto a real operating system (Linux) and you will see the actual performance they are capable of without Windows holding them back. See Phoronix.
twtech - Thursday, November 15, 2018 - link
It's nice if you have that option. Most of the time the software you run only works on certain operating systems and you don't have much choice.
Lolimaster - Thursday, November 15, 2018 - link
In benchmarks where the 2950X at 1K will score similarly anyway? What's the point?
AMD's top dogs are on another whole level; if you've got the workload for them, take them, otherwise the 2950X.
Arbie - Tuesday, November 13, 2018 - link
Typo - you wrote "Intel will need to up its game here to remain competitive". Should have been "Intel will need to up its marketing here to remain competitive".
eva02langley - Tuesday, November 13, 2018 - link
Unfortunately, they are selling everything. I would be happy if they were not selling anything.
nexuspie - Tuesday, November 13, 2018 - link
Marketing doesn't work in tech. Tech buyers aren't dumb. People want performance, and today that's Intel by far. On a per-core basis it creams the competitor.
Arbie - Tuesday, November 13, 2018 - link
Ironically stated in pure marketing-speak.
Tech buyers know that shouting "performance" is meaningless out of context - and that includes a lot more than clock speed. For example price, power, cooling, cores, threading, features, platform, socket life... the list goes on. All conveniently ignored in a slogan like yours, which could have come from an Intel ad.
Spunjji - Tuesday, November 13, 2018 - link
He's dropping classic lines from the "I am an empowered, smart individual and marketing doesn't work on me" playbook. I find it's usually a line trotted out by people on whom marketing works absolute miracles.
Kilnk - Tuesday, November 13, 2018 - link
I've been reading your comments and I love your style.
Arbie - Tuesday, November 13, 2018 - link
"there’s no point advertising a magical 28-core 5 GHz CPU ... if only one in a million hits that value."Sure there is: to confuse the market and draw attention away from the competition. As at Computex in June.
twtech - Thursday, November 15, 2018 - link
How about 4.5 GHz?
eva02langley - Tuesday, November 13, 2018 - link
So many refreshes, and so little supply on the shelves.
jospoortvliet - Friday, November 16, 2018 - link
Takes only 9 weeks to be delivered, I suppose? And that is just the promise - delays likely.
Cooe - Tuesday, November 13, 2018 - link
Rofl, and the second you look at the price tags, anyone with half a piece of common sense would realize that buying an i9-9980XE over a TR 2950X is absolutely freaking ridiculous! (Unless you simply NEED AVX-512, that is.) Intel's flailing with Skylake.... again..., while AMD's nearly finished changing the game entirely with 7nm Zen 2, and it's all honestly pretty damn hilarious. Karma's a b**ch and all that lol.
benedict - Tuesday, November 13, 2018 - link
Agreed, the 2950X offers the best value in the HEDT segment.
Cellar Door - Tuesday, November 13, 2018 - link
The best part is that an i7 part (9800X) is more expensive than an i9 part (9900K). Intel is smoking some good stuff.
DigitalFreak - Tuesday, November 13, 2018 - link
You're paying more for those extra 28 PCI-E lanes.
Hixbot - Tuesday, November 13, 2018 - link
And much more L3. It's also interesting that HEDT is no longer behind in process node.
Hixbot - Tuesday, November 13, 2018 - link
And AVX-512.
eastcoast_pete - Tuesday, November 13, 2018 - link
@Ian: Thanks, good overview and review!
Agree on the "iteration when an evolutionary upgrade was needed"; it seems that Intel's development was a lot more affected by its blocked/constipated transition to 10 nm (now scrapped), and the company's attention was also diverted by its forays into mobile (didn't work out so great) and looking for progress elsewhere (Altera acquisition). This current "upgrade" is mainly good for extra PCI-e lanes (nice to have more), but its performance is no better than the previous generation's. If the new generation of chips from AMD is halfway as good as promised, Intel will lose a lot more profitable ground in the server and HEDT space to AMD.
@Ian, and all: While Intel goes on about their improved FinFET 14 nm being the reason for better performance/Wh, I wonder how big the influence of better heat removal through the (finally, again) soldered heat spreader is? Yes, most of us like to improve cooling to be able to overclock more aggressively, but shouldn't better cooling also improve the overall efficiency of the processor? After all, semiconductors conduct more current as they get hotter, leading to ever more heat and eventual "gate crashing". Have you or anybody else looked at performance/Wh between, for example, an i7-8700 with the stock cooler and pasty glued heat spreader vs. the same processor with proper delidding, a liquid metal replacement and a great aftermarket cooler, both at stock frequencies? I'd expect the better-cooled setup to have more performance/Wh, but is that the case?
Arbie - Tuesday, November 13, 2018 - link
The "Competition" chart is already ghastly for Intel. Imagine how much worse it will be when AMD moves to 7 nm with Zen 2.zepi - Tuesday, November 13, 2018 - link
How about including some kind of DB test?
I think quite a few people are looking at these workstation-class CPUs to develop BI things, and it might be quite helpful to actually measure results with some SQL / NoSQL / BI suites. Somewhat more complex parallel SQL executions with locking could show some interesting differences between the NUMA Threadrippers and the Intels.
GreenReaper - Wednesday, November 14, 2018 - link
It's a good idea, Phoronix does them so in the short term you could probably look there.
jospoortvliet - Friday, November 16, 2018 - link
But then make sure it is realistic, not running in cache or such... A real db suitable for these chips is terabytes, merely keeping the index in ram... rule of thumb: if your index fits in cache your database doesn't need this CPU ;-)
FunBunny2 - Tuesday, November 13, 2018 - link
I guess I can run my weather simulation in Excel on my personal machine now. neato.
at8750 - Tuesday, November 13, 2018 - link
Hi, Ian.
Did Intel officially announce that Skylake-X Refresh is manufactured on the 14++ node?
But the 9980XE stepping is the same as the 7980XE's.
The stepping is 4; there is no change.
SanX - Tuesday, November 13, 2018 - link
Sometimes the advantage of these processors with AVX-512 versus usual desktop processors with AVX2 is crazy. The 3D particle tests fly like 500 mph cars. Which other tasks besides 3D particle movement also benefit from AVX-512?
How about linear algebra? Does Intel MKL, which now seems to support these extensions, demonstrate similar speedups with AVX-512 on solutions of Ax=B, say, with the usual dense matrices?
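For anyone wanting to poke at the Ax=B question directly, a minimal sketch using MKL's LAPACKE interface is below (illustrative only; the problem size, seeding, and timing choices are assumptions, and the exact compile/link flags come from MKL's link-line advisor). MKL dispatches AVX2 or AVX-512 kernels at runtime, so the same binary can be timed on a 9980XE and on an AVX2-only chip.

// Minimal sketch: solve a dense A x = b with LAPACKE_dgesv from Intel MKL.
#include <mkl.h>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    const lapack_int n = 4096;                       // hypothetical problem size
    std::vector<double> A(static_cast<size_t>(n) * n);
    std::vector<double> b(n, 1.0);
    std::vector<lapack_int> ipiv(n);

    std::mt19937 rng(42);
    std::uniform_real_distribution<double> dist(-1.0, 1.0);
    for (auto& v : A) v = dist(rng);
    for (lapack_int i = 0; i < n; ++i)               // make it comfortably non-singular
        A[static_cast<size_t>(i) * n + i] += n;

    double t0 = dsecnd();                            // MKL wall-clock timer
    lapack_int info = LAPACKE_dgesv(LAPACK_ROW_MAJOR, n, 1,
                                    A.data(), n, ipiv.data(),
                                    b.data(), 1);    // ldb = nrhs in row-major
    double t1 = dsecnd();

    std::printf("dgesv n=%d info=%d time=%.3f s\n", (int)n, (int)info, t1 - t0);
    return 0;
}

dgesv is an LU factorization plus solve, dominated by level-3 BLAS, so it is the kind of dense kernel where AVX-512 can show up clearly, subject to the usual AVX-512 frequency-offset caveats.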
TitovVN1974 - Friday, November 16, 2018 - link
Pray look up Linpack results.
SonicAndSmoke - Tuesday, November 13, 2018 - link
@Ian: What's with that paragraph about the Mesh clocks on page 1? Mesh clock is 2.4 GHz stock on SKX, and there is no mesh turbo at all. You can check for yourself with AIDA64 or HwInfo. So does SKX-R have the same 2.4 GHz clock, or higher?
Tamerlin - Tuesday, November 13, 2018 - link
Thorough review as always.
I'd like to request that you consider adding some DaVinci Resolve tests to your suite, as it would be helpful for film post-production professionals. There is a free license which has enough capability for professional work, and there is free raw footage available from Blackmagic's web site and 8K raw footage available from RED's.
Thanks :)
askmedov - Tuesday, November 13, 2018 - link
Intel is playing with fire by doing incremental upgrades over and over again. Look no further than Apple's new iPads - their chips are better than what Intel has to offer in terms of price-power-efficiency. Apple is going to ditch Intel's processors very soon for most of its Mac lineup.
bji - Tuesday, November 13, 2018 - link
Microsoft Windows is known to suck hard when it comes to performance on NUMA architectures, and particularly the TR2 processors. See Phoronix for analysis.
Why does Anandtech continue to post Windows-only benchmarks? They are fairly useless; they tell more about the limitations of Microsoft Windows than they do about the processors themselves.
Of course, if you're a poor sap stuck running Windows for any task that requires these processors, I guess you care, but you really should be pushing your operating system vendor to use some of their billions of dollars to hire OS developers who know what they are doing.
I just bought a TR2 1950X for my software development workstation (Linux based) and I am fairly confident that for my work loads, it will kick the crap out of these Intel processors. I wouldn't know for sure though because I tend to read Anandtech fairly exclusively for hardware reports, dipping into sites like Phoronix only when necessary to get accurate details for edge cases like the TR2.
It sure would be nice if my site of choice (Anandtech) would start posting relevant results from operating systems designed to take advantage of these high power processors instead of more Windows garbage ... especially Windows gaming benchmarks, as if those are even remotely relevant to this CPU segment!
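For what it's worth, the NUMA behaviour being argued about here is easy to poke at on Linux. A minimal sketch with libnuma follows (illustrative only; the node index and buffer size are arbitrary, and it assumes the libnuma headers are installed and you link with -lnuma). On a 2990WX, where two of the four dies have no directly attached memory, where pages land relative to the threads that touch them is exactly what separates the Linux and Windows results people keep citing.

// Minimal sketch: query NUMA topology and place a buffer on a specific node.
#include <numa.h>
#include <cstdio>

int main() {
    if (numa_available() < 0) {
        std::printf("NUMA not available on this system\n");
        return 0;
    }
    int nodes = numa_max_node() + 1;
    std::printf("NUMA nodes visible to the OS: %d\n", nodes);

    // Allocate 64 MiB on node 0 and touch it so the pages are actually faulted in there.
    const size_t bytes = 64u << 20;
    char* buf = static_cast<char*>(numa_alloc_onnode(bytes, 0));
    if (!buf) { std::printf("allocation failed\n"); return 1; }
    for (size_t i = 0; i < bytes; i += 4096) buf[i] = 1;
    std::printf("allocated and touched %zu bytes on node 0\n", bytes);

    numa_free(buf, bytes);
    return 0;
}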
bji - Tuesday, November 13, 2018 - link
Erp I meant 2950X, sorry typo there.
Shaky156 - Wednesday, November 14, 2018 - link
I have to agree 110%: gaming benchmarks are more GPU/IPC related than a multicore CPU benchmark.
Somehow it seems AnandTech could be biased.
eva02langley - Wednesday, November 14, 2018 - link
Totally agree, we have come to a time when benchmarks are not even accurately evaluating the product anymore. The big question is how can we depict an accurate picture? Especially if the reviewer is not choosing the right ones for a real comparison.
Well, seeing the disparity from Phoronix raises major concerns for me.
Stasinek - Wednesday, November 21, 2018 - link
It's indeed a surprise to me that those new 24/32C AMD processors (2920, 2970) are just worse in every respect than their 16C equivalents. In terms of perf/money and perf/power, just laughable.
Linux changes a lot, but who uses Linux, and for what purpose?
I bet developers, but what makes me really angry is that nobody even tries to use KVM, Xen, VirtualBox, or VMware as a benchmarking tool for the purpose of testing usage as a small-company server. In my company a lot of Remote Desktop sessions are connected to the same server.
Someone might think - who needs a good CPU? But that's only because they aren't used to solving real-life problems, and those problems are things like importing big databases from obsolete programs, filtering, fixing, and exporting to new ERP systems. This consumes a lot of time, so having a fast CPU is crucial. Most of the companies I know use an RDP server for that purpose, with typical cheap portable laptops given to the workers. It would be nice to see AMD or Intel HEDT tested for such purposes. Since anyone can potentially have 32C, anyone could benefit.. but instead of that kind of test I keep seeing gaming.. gaming on HEDT?! WTF
pandemonium - Wednesday, November 14, 2018 - link
All of these "my work doesn't have any desktop users" comments crack me up. Congratulations. Your work is not the entire world of computing in a professional space, much less prosumer space. Get over yourselves.halcyon - Wednesday, November 14, 2018 - link
@Ian Cutress
Your tests and review text are always a pleasure to read, thank you for the professionalism.
Questions related to the test suite (I know, everybody always wants something):
1. You are missing an Excel spreadsheet calculation (Finance still uses a lot of these and they can peg all cores near 100% and be incredibly CPU dependent). It would be nice to see, for example, an Excel Monte Carlo simulation in the suite (local data) - a rough sketch of that kind of workload follows this comment.
2. Alternatively an R (language) based test for heavy statistical computation. Finding a one that is representative of real world workloads and strikes a balance between single core IPC and many core parallelisation might take some work. But this is one area where laptops just can't muster it and CUDA/OpenCL acceleration isn't often available at all.
3. For a Web / JS framework test it is nice to see Speedometer and WebXPRT3, but for some reason the V8 Web Tooling Bench is not there (https://v8.github.io/web-tooling-benchmark/ ). The old Kraken/JetStream/Octane are nice for reference, but they haven't been very representative of the real world for some time now (hence why they are abandoned).
Again thank you for this monumental work, the amount of tests is already superb!
For graphing results it would be so helpful to get a comparative price/perf graphed-results browser (pick your baseline CPU, pick workloads, CPUs on the graph as a function of price/perf). This would enable quick viewing of the efficient frontier in terms of price/perf for each workload, with the base CPU as an anchor.
Yeah, yeah, I know.... Just throwing this in here 😀
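On point 1, a rough sketch of the kind of embarrassingly parallel Monte Carlo job meant there is below (purely illustrative, not an actual Excel or finance benchmark - a toy pi estimate, but a payoff simulation distributes across cores the same way, which is why it pegs every core).

// Rough sketch: static partitioning, one RNG stream per thread, no sharing until the end.
#include <algorithm>
#include <cstdio>
#include <random>
#include <thread>
#include <vector>

int main() {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const long long samples_per_worker = 20000000;
    std::vector<long long> hits(workers, 0);

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            std::mt19937_64 rng(1234 + w);                 // independent per-thread stream
            std::uniform_real_distribution<double> u(0.0, 1.0);
            long long local = 0;
            for (long long i = 0; i < samples_per_worker; ++i) {
                double x = u(rng), y = u(rng);
                if (x * x + y * y <= 1.0) ++local;
            }
            hits[w] = local;                               // each thread writes its own slot
        });
    }
    for (auto& t : pool) t.join();

    long long total = 0;
    for (auto h : hits) total += h;
    double pi = 4.0 * double(total) / double(samples_per_worker * workers);
    std::printf("%u threads, pi ~= %.6f\n", workers, pi);
    return 0;
}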
KAlmquist - Wednesday, November 14, 2018 - link
These benchmarks also show the 16-core TR 2950X beating the 18-core i9-9980XE in some cases.
KAlmquist - Wednesday, November 14, 2018 - link
My previous comment was a reply to nexuspie's observation that, "These benchmarks show that the 9980's 18 cores often BEAT the 2990wx's 32 cores."
Stasinek - Wednesday, November 21, 2018 - link
Which should lead to the conclusion that AMD Threadripper 2 is just a bad offer, except for the 2950. It's the one and only AMD CPU worth mentioning - which means TR4 at 16C is a dead end.
AMD offers overpriced CPUs on that platform, that is for sure.
Overpriced because half of the cores are choking and absolutely useless.
If 32C on 4 channels is too many cores per channel, imagine Ryzen 3 16C on dual channel..
It will be a big disappointment for some people, I bet.
Regardless of pricing, the Intel 9980 is just great.
Stasinek - Wednesday, November 21, 2018 - link
Is that what you wanted to say?
crotach - Wednesday, November 14, 2018 - link
I have to say I'm a big fan of HEDT platforms. I built my last workstation in 2011 and it still serves me well 7 years later. But looking at this and the X299 offering, I really don't see why anyone would bother.
Lolimaster - Thursday, November 15, 2018 - link
Till Intel changes the way it builds high-core-count CPUs they can't compete with AMD, and it will be even worse next year when AMD makes an already cheaper way to produce high-core-count CPUs even cheaper, to sick levels.
Gasaraki88 - Thursday, November 15, 2018 - link
I'm actually more interested in the i7-9800X vs. the i9-9900K. I want to see how the overclocking compares to the i9-9900K before I jump in to X299.
TheJian - Friday, November 16, 2018 - link
I stopped reading when I saw 8K with a 1080. Most tests are just pointless; it would be more interesting with a 1080 Ti at least, or better a 2080 Ti. That would give the chips more room to run when they can, to separate the men from the boys so to speak.
Vid tests with Handbrake are stupid too. Does anyone look at the vid after those tests? It would look like crap. Try SLOWER as a setting and let's find out how the chips fare, and bitrates of ~4500-5000 for 1080p. Something I'd actually watch on a 60in+ TV without going blind.
Release groups for AMZN, for example, release 5000-bitrate L4.1, 5-9 ref frames, SLOWER, etc. NFO files reveal stuff like this:
cabac=1 / ref=9 / deblock=1:-3:-3 / analyse=0x3:0x133 / me=umh / subme=11 / psy=1 / psy_rd=1.00:0.00 / mixed_ref=1 / me_range=32 / chroma_me=1 / trellis=2 / 8x8dct=1 / cqm=0 / deadzone=21,11 / fast_pskip=0 / chroma_qp_offset=-2 / threads=6 / lookahead_threads=1 / sliced_threads=0 / nr=0 / decimate=0 / interlaced=0 / bluray_compat=0 / constrained_intra=0 / bframes=8 / b_pyramid=2 / b_adapt=2 / b_bias=0 / direct=3 / weightb=1 / open_gop=0 / weightp=2 / keyint=250 / keyint_min=23 / scenecut=40 / intra_refresh=0 / rc=crf / mbtree=0 / crf=17.0 / qcomp=0.60 / qpmin=0 / qpmax=69 / qpstep=4 / ip_ratio=1.40 / pb_ratio=1.30 / aq=3:0.85
More than I'd do, but the point is, SLOWER will give you far better quality (something I could actually stomach watching), without all the black blocks in dark scenes etc. Current 720p releases from NF or AMZN have gone to crap (700MB files for h264? ROFL). We are talking untouched, direct from NF or AMZN. Meaning that is the quality you are watching as a subscriber, which is just one of the reasons we cancelled NF (agenda TV was the largest reason to dump them).
If you're going to test at crap settings nobody would watch, might as well kick in quicksync with quality maxed and get better results as MOST people would do if quality wasn't an issue anyway.
option1=value1:option2=value2:tu=1:ref=9:trellis=3 and L4.1 with encoder preset set to QUALITY.
That's a pretty good string for decent quality with QSV. Seems to me you're choosing to turn off AVX/Quicksync so AMD looks better or something. Why would any USER turn off stuff that speeds things up unless quality (guys like me) is an issue? Same with turning off gpu in blender etc. What is the point of a test that NOBODY would do in real life? Who turns off AVX512 in handbrake if you bought a chip to get it? LOL. That tech is a feature you BUY intel for. IF you turn off all the good stuff, the chip becomes a ripoff. But users don't do that :) Same for NV, if you have the ability to use RTX stuff, why would you NOT when a game supports it? To make AMD cards look better? Pffft. To wait for AMD to catch up? Pffft.
I say this as an AMD stock holder :) Most useless review I've seen in a while. Not wasting my time reading much of it. Moving on to better reviews that actually test how we PLAY/WATCH/WORK in the real world. 8K...ROFLMAO. Ryan has been claiming 1440p was norm since 660ti. Then it was 4k not long after for the last 5yrs when nobody was using that, now it's 8k tests with a GTX 1080...ROFLMAO. No wonder I come here once a month or less pretty much and when I do, I'm usually turned off by the tests. Constantly changing what people do (REAL TESTS) to turning stuff off, down, (vid cards at ref speeds instead of OC OOTB settings etc), etc etc...Let's see if we can set up this test in a way nobody would do at home to strike down advantages of anyone competing with AMD. Blah. I'd rather see where both sides REALLY win in ways we USE these products. Turn everything on if it's in the chip, gpu, test, etc and spend MORE time testing resolutions etc we actually USE in practice. 8k...hahaha. Whatever. 13fps?
"Ashes has dropdown options for MSAA, Light Quality, Object Quality, Shading Samples, Shadow Quality, Textures, and separate options for the terrain."
Yeah, I'm out. Dropdown quality is against my religion and useless to me. I'm sure the other tests have issues I'd hate also; no time to waste on junk review tests. Too many other places that don't do this crap. I bought a 1070 Ti to run MAX settings at 1200p (Dell 24in) in everything, or throw it to my lower-res 22in. If I can't do that, I'll wait for my next card to play game X. Not knocking AMD here, just Anandtech. I'll likely buy a 7nm AMD CPU when they hit, and they have a shot at a 7nm GPU for me too. You guys and Tom's Hardware (heh, you joined) have really gone downhill with irrational testing setups. If you're going to do 4K at ultra, why not do them all there? I digress...
spikespiegal - Saturday, November 24, 2018 - link
Just curious, but how many of you AMD fanbois have ever been in a data center or been responsible for adjusting performance on a couple dozen VMware hosts running mixed applications? Oh wait...none. In the mythical world according to AMD's BS dept, a hypervisor/operating system takes the number of tasks running and divides them by the number of cores running, and you clowns believe it. In the *real world*, where we have to deal with really expensive hosts that don't have LED fans in them and run applications adults use, we know that's not the truth. Hypervisor and operating system schedulers all favor cores that process mixed threads faster, and if you want to argue that, please consult with a VMware or Hyper-V engineer the next time you see them in your drive-thru. Oh wait...I am a VMware engineer.
An i3 8530 costs $200 and literally beats any AMD chip made running stock in dual-threaded applications. Seriously....look up the single-threaded performance. More cores don't make an application more multithreaded and they don't contribute to a better desktop experience. I have servers with 30-40% of my CPU resources not being used, and just assigning more cores won't make applications faster. It just ties up my scheduler doing nothing and wastes performance. The only way to get better application efficiency is vertical, and that's higher core performance, and that's nothing I'm seeing AMD bringing to the table.
Michael011 - Wednesday, December 12, 2018 - link
The pricing shows just how greedy Intel has become. It is better to spend your money on a top-end AMD Threadripper and motherboard.