Core Wars: AMD's Threadripper Takes On Intel's i9

May 31, 2017
By: Rob Enderle

Boy, it’s starting to feel like summer in the tech market because the competitive fights are really heating up. And the most amazing part is that all this drama is happening in the desktop space, a market both vendors had largely written off as dead a little over five years ago. But gaming and VR desktops are starting to look interesting again, and Intel is playing catch-up because AMD was the first to realize that the whole “mobile is taking over the world” thing was wrongheaded.

Let’s look at this latest core war as AMD’s Threadripper takes on Intel’s teraflop i9, eventually.

Timing

Now, be aware that AMD’s part is expected in a few weeks, while Intel’s is due more toward the end of the year, making Intel’s announcement largely a paper launch at the moment. Still, it is designed to stall the market so that people hold off buying until they see Intel’s part tested. The problem with that timing is that kids have summers off from school, and that is exactly when they’d prefer to build or buy a new computer and have it to play games. AMD’s Threadripper (I love that name) should arrive late, but within that window, while Intel’s i9 won’t arrive until later in the year. You can’t expect people to wait for something better when they want to game now, particularly young adults, who are known for thinking tactically and wanting instant gratification.

Consuming Cores

About a decade ago, Intel sent me an 8-core machine, and as I worked and played games, I would bring up the performance monitor to find that I rarely lit up three cores, let alone anything over six. But apparently a lot has changed, because I’m writing this on an 8-core i7 and all eight cores are almost constantly lit up, running at about 30 to 50 percent capacity. Plus, when I play a current-generation strategy game, I can often peg the cores, suggesting I could now actually use more than eight even for what I do. However, the folks who will really like the extra headroom are those who work with visual tools: photo editing, video editing, 3D creation, computer-aided design, engineering, and graphic arts. You and I may like the cores, but the folks using those tools need them.
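If you want to run the same kind of check I describe above without staring at a performance monitor, here is a minimal sketch in Python using the third-party psutil library. The library choice and the 25 percent "busy" threshold are my assumptions for illustration; the article itself just eyeballs the Windows performance monitor.

    # Minimal sketch: sample per-core CPU utilization, similar to
    # watching the performance monitor. Requires: pip install psutil
    # (psutil and the 25% threshold are illustrative assumptions).
    import psutil

    # Measure each core's utilization over a one-second window.
    per_core = psutil.cpu_percent(interval=1, percpu=True)

    for core, pct in enumerate(per_core):
        bar = "#" * int(pct / 5)  # crude text bar, up to 20 chars
        print(f"core {core:2d}: {pct:5.1f}% {bar}")

    busy = sum(1 for pct in per_core if pct > 25)
    print(f"{busy} of {len(per_core)} cores above 25% load")

Run it while gaming or rendering and you can see for yourself whether your workload actually lights up all of your cores or leaves most of them idle.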

Core Wars

AMD launched with 16 cores, and Intel countered with 18, but AMD also announced a 32-core processor (named Epyc) for enterprises, suggesting it has plenty of core headroom left to match, and possibly raise, Intel by year end when the i9 actually ships. So this fight is far from over, and we are likely to see a return to something like the MHz wars of the 1990s, but this time fought with cores instead of clock speed.

3D XPoint

Now, there is a technology that plays on the periphery of this fight: 3D XPoint memory. Currently Intel has it, branded Optane, and AMD doesn’t, though AMD could get access through Micron, which shares rights to the technology with Intel. This is the wild card that could be a kingmaker, but for now it is being used mostly to speed up drives, not replace them. Once it moves to being an SSD replacement, AMD should be able to use it as well as Intel does, making it an important, but potentially fleeting, advantage depending on how things play out. Optane/3D XPoint is substantially faster than flash, performing closer to DRAM, and the visible performance improvement it has showcased is impressive (though I have yet to use it myself).

Wrapping Up: More Excitement to Come

The core war is only the tip of the iceberg when it comes to battle royales in the PC space. This initial fight is over desktop computers, but it will quickly move to laptops, and overlaying it will be another battle royale between AMD and NVIDIA over graphics technology. AMD is also launching a new line of graphics cards, and NVIDIA has been pumping up its cards at an incredible pace.

In a fight like this, the only people who truly win are us, because from gaming to VR to actually getting work done that pays the bills, our next computer is likely to be incredible regardless of whose technology is in it!
