One of the most popular processors of the last decade has been the Intel Core i7-2600K. The design was revolutionary: it offered a significant jump in single-core performance and efficiency, and the top-line processor was very overclockable. With the next few generations of processors from Intel being less exciting, or not giving users reasons to upgrade, the phrase 'I'll stay with my 2600K' became ubiquitous on forums, and is still used today. For this review, we dusted off our box of old CPUs and put the Core i7-2600K through our 2019 benchmarks, both at stock and overclocked, to see if it is still a mainstream champion.

The Core i7 Family Photo

If you want to see all of our Core i7 benchmarks for each one of these CPUs, head over to


Why The 2600K Defined a Generation

Sit in a chair, lie back, and dream of 2010. It was a year when you looked at that old Core 2 Duo rig, or Athlon II system, and decided it was time for an upgrade. You had seen Nehalem arrive, and the Core i7-920 was a handy overclocker that kicked some butt. It was a pleasant time, until Intel went and gave the industry a truly disruptive product whose nostalgia still rings with us today.

The Core i7-2600K: The Fastest Sandy Bridge CPU (until 2700K)

That product was Sandy Bridge. AnandTech scored the exclusive on the review, and the results were almost impossible to believe, for many reasons. In our results at the time, it was far and away a leap ahead of anything else we had seen, especially given the thermal monstrosities that Pentium 4 had produced several years prior. Built on Intel’s 32nm process, the redesign of the core was a turning point in x86 performance, one whose like has not been felt since. It would be another eight years before AMD had its ‘Sandy Bridge’ (or perhaps more appropriately, a 'Conroe') moment with Ryzen. Intel managed to stand on the shoulders of its previous best product and score a Grand Slam.

In that core design, Intel shook things up considerably. One key component was the micro-op cache, which stores recently decoded instructions so that when they are needed again they can be fed into the pipeline already decoded, rather than wasting power going through the decoders a second time. For Intel with Sandy Bridge, and more recently for AMD with Ryzen, the inclusion of the micro-op cache has done wonders for single-threaded performance. Intel also set about improving its simultaneous multi-threading, which it has branded HyperThreading for generations, by making more of the core dynamically allocated between threads, rather than statically partitioned and potentially losing performance.
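The effect of a micro-op cache can be sketched in a few lines. This is purely an illustration of the concept: the cache structure, instruction encodings, and counters below are invented for the example and bear no relation to Intel's actual hardware design.

```python
# Conceptual sketch of why a micro-op (uop) cache saves decode work.
# Everything here is a toy model invented for illustration.

class UopCache:
    """Tiny lookup table: instruction address -> previously decoded uops."""
    def __init__(self):
        self.entries = {}
        self.hits = 0
        self.decodes = 0

    def decode(self, addr, instruction):
        # Stand-in for the expensive x86 decode stage.
        self.decodes += 1
        return ("uops-for:" + instruction,)

    def fetch(self, addr, instruction):
        # On a hit, the already-decoded uops are replayed directly,
        # skipping the decoders entirely (the power saving in question).
        if addr in self.entries:
            self.hits += 1
            return self.entries[addr]
        uops = self.decode(addr, instruction)
        self.entries[addr] = uops
        return uops

cache = UopCache()
# A hot loop body executed 1000 times only pays the decode cost once per instruction.
loop_body = [(0x10, "add eax, 1"), (0x13, "cmp eax, ebx"), (0x15, "jl 0x10")]
for _ in range(1000):
    for addr, ins in loop_body:
        cache.fetch(addr, ins)

print(cache.decodes)  # 3: each instruction decoded only once
print(cache.hits)     # 2997: every later iteration hits the uop cache
```

In a real core the win is in power and front-end bandwidth rather than a hit counter, but the principle is the same: hot loops stop exercising the decoders.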

The quad-core design of the highest-end processor of the family on launch day, the Core i7-2600K, became a staple through Intel’s next five generations of the architecture, all the way through Ivy Bridge, Haswell, Broadwell, Skylake, and Kaby Lake. Since Sandy Bridge, while moving to smaller process nodes and taking advantage of lower power, Intel has been unable to recreate that singular jump in raw instruction throughput, instead delivering incremental 1-7% increases year on year, using the power budget to enlarge operational buffers, add execution ports, and extend instruction support.

With Intel unable to recreate the uplift of Sandy Bridge, and with the core microarchitecture defining a key moment in x86 performance, users who purchased a Core i7-2600K (I had two) stayed on it for a long time. So much so, in fact, that a lot of people expecting another big jump became increasingly frustrated – why invest in a Kaby Lake Core i7-7700K quad-core processor with a 4.5 GHz turbo when the Sandy Bridge Core i7-2600K quad-core processor could still be overclocked to 5.0 GHz?

(Intel’s answer was typically power consumption, and new features like PCIe 3.0 for GPUs and storage. But that didn’t sway some users.)

This is why the Core i7-2600K defined a generation. It had staying power, much to Intel’s initial delight and subsequent frustration when users wouldn’t upgrade. We are now in 2019, and when Intel finally moved beyond four cores on the mainstream, users who could stomach the cost of DDR4 either upgraded to a new Intel system or went down the AMD route. But how does the Core i7-2600K hold up to 2019 workloads and games; or perhaps even better, how does the overclocked Core i7-2600K fare?

Compare and Contrast: Sandy Bridge vs. Kaby Lake vs. Coffee Lake

Truth be told, the Core i7-2600K was not the highest-grade Sandy Bridge mainstream desktop processor. Months after the 2600K launched, Intel pushed a slightly higher-clocked 2700K into the market. It performed almost the same, and overclocked to a similar degree, but cost a bit more. By that time, the users who had made the jump were on the 2600K, and it stuck with us.

The Core i7-2600K was a 32nm quad-core processor with HyperThreading, offering a 3.4 GHz base frequency and a 3.8 GHz turbo frequency, with a listed 95W TDP. Back then, Intel’s TDP was more representative: in our recent test for this article, we measured an 88W peak power consumption when not overclocked. The processor also came with Intel HD 3000 integrated graphics, and supported DDR3-1333 memory as standard. Intel launched the chip with a tray price of $317.

For this article, I used the second i7-2600K I purchased back when they were new. It was tested at both its out-of-the-box frequency, and an overclocked frequency of 4.7 GHz on all cores. This is a middling, conservative overclock – the best chips managed 5.0 GHz or 5.1 GHz in a daily system. In fact, I distinctly remember my first Core i7-2600K hitting 5.1 GHz all-core, and up to 5.3 GHz, during an overclocking event in the middle of the Peak District one winter, with a room temperature around 2°C, where I was using a strong liquid cooler and 720mm of radiators. Unfortunately I crippled that chip over time, and now it won’t even boot at stock frequency and voltage. So we have to use my second chip, which wasn’t so great, but is still a good representation of an overclocked processor. For these results, we also used overclocked memory, at DDR3-2400 C11.

It’s worth noting that since the launch of the Core i7-2600K, we have moved on from Windows 7 to Windows 10. The Core i7-2600K doesn’t even support AVX2 instructions, and wasn’t built with Windows 10 in mind, so it will be interesting to see how this plays out.
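Since AVX2 support is one of the clearest feature-level gaps between these generations, a quick way to verify it on a Linux box is to look at the CPU feature flags the kernel exposes. The parsing helper below is our own, demonstrated against abbreviated sample flag strings rather than a complete real-world list.

```python
# Check for a CPU feature flag, e.g. avx2 (which Sandy Bridge lacks).
# On Linux, /proc/cpuinfo lists flags as a space-separated string.

def has_flag(flags_line: str, flag: str) -> bool:
    """Check for an exact feature flag in a space-separated flags string."""
    return flag in flags_line.split()

# Abbreviated sample flag excerpts for illustration (not complete lists):
sandy_bridge_flags = "fpu sse sse2 ssse3 sse4_1 sse4_2 avx aes"
haswell_and_later  = "fpu sse sse2 ssse3 sse4_1 sse4_2 avx avx2 aes fma"

print(has_flag(sandy_bridge_flags, "avx"))   # True
print(has_flag(sandy_bridge_flags, "avx2"))  # False: Sandy Bridge predates AVX2
print(has_flag(haswell_and_later, "avx2"))   # True

# On a live Linux system:
# with open("/proc/cpuinfo") as f:
#     flags = next(line for line in f if line.startswith("flags"))
#     print(has_flag(flags, "avx2"))
```

Splitting on whitespace matters here: a naive substring check would report `avx` as present merely because `avx2` is in the list, and vice versa.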

The Core i7-7700K: Intel's last Core i7 Quad Core with HyperThreading

The fastest and latest (final?) quad-core processor with HyperThreading that Intel released was the Core i7-7700K, which falls under the Kaby Lake family. This processor was built on Intel’s improved 14nm process, runs at a 4.2 GHz base frequency, and a 4.5 GHz turbo frequency. The 91W rated TDP, at stock, translated to 95W power consumption in our testing. It comes with Intel’s Gen9 HD 630 Graphics, and supports DDR4-2400 memory as standard. Intel launched the chip with a tray price of $339.

The Intel Core i7-7700K (91W) Review: The New Out-of-the-box Performance Champion

At the same time as the 7700K, Intel also launched its first overclockable dual-core with HyperThreading, the Core i3-7350K. During that review, we overclocked the Core i3 and compared it directly to the out-of-the-box Core i7-2600K, trying to answer the question of whether Intel had managed to make a dual-core reach similar performance to its old flagship processor. While the i3 had the upper hand in single-threaded and memory performance, the two fewer cores ultimately made most tasks heavy work for the Core i3.

The Core i7-9700K: Intel's Latest Top Core i7 (now with 8 cores)

Our final processor for testing is the Core i7-9700K. This is not the flagship of the current Coffee Lake generation (that would be the Core i9-9900K), but it offers eight cores without HyperThreading. Going for the 9900K, with its doubled thread count, is just a little overkill, especially as it still carries a tray price of $488. By contrast, the Core i7-9700K is ‘only’ sold in bulk at $374, with a 3.6 GHz base frequency and a 4.9 GHz turbo frequency. The 95W TDP falls foul of Intel’s definition of TDP, and in a consumer motherboard the chip will actually consume ~125W at full load. Memory support is DDR4-2666 as standard.

Upgrading an Overclocked Intel Core i7-2600K: Comparison CPUs

              Core i7-2600K   Core i7-2600K   Core i7-7700K   Core i7-9700K
                              at 4.7 GHz
Released      Jan 2011        Jan 2011        Jan 2017        Oct 2018
Price (1ku)   $317            $317            $339            $374
Process       32nm            32nm            14nm            14nm++
uArch         Sandy Bridge    Sandy Bridge    Kaby Lake       Coffee Refresh
Cores         4 plus HT       4 plus HT       4 plus HT       8, no HT
Base Freq     3.4 GHz         4.7 GHz         4.2 GHz         3.6 GHz
Turbo Freq    3.8 GHz         -               4.5 GHz         4.9 GHz
GPU Gen       6               6               9               9.5
GPU EUs       12              12              24              24
GPU Freq      1350 MHz        1350 MHz        1150 MHz        1200 MHz
DDR Support   DDR3-1333       DDR3-2400       DDR4-2400       DDR4-2666
PCIe          2.0 x16         2.0 x16         3.0 x16         3.0 x16
AVX           Yes             Yes             Yes             Yes
AVX2          No              No              Yes             Yes
Thermal       Solder          Solder          Grease          Solder
TDP           95 W            N/A             91 W            95 W

The Core i7-2600K is stuck on DDR3 memory, has PCIe 2.0 rather than PCIe 3.0 support, and although not tested here, isn’t built for NVMe storage. It will be interesting to see just how close the overclocked results are to the Core i7-7700K in our tests, and how much of a direct uplift is seen moving to something like the Core i7-9700K.
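The PCIe 2.0 versus 3.0 gap can be put in numbers from the per-lane signalling rates and line encodings; the small calculation below shows the theoretical x16 bandwidth of each. The helper function is our own illustration, not part of any library.

```python
# Rough theoretical bandwidth of a PCIe 2.0 x16 slot (what the 2600K offers)
# versus PCIe 3.0 x16, from per-lane transfer rates and line encodings.

def pcie_x16_gbytes_per_s(gt_per_s: float, payload_bits: int, total_bits: int) -> float:
    """GB/s for 16 lanes: transfer rate * encoding efficiency, 8 bits per byte."""
    lane_gbps = gt_per_s * payload_bits / total_bits  # effective Gbit/s per lane
    return lane_gbps * 16 / 8

pcie2 = pcie_x16_gbytes_per_s(5.0, 8, 10)     # PCIe 2.0: 5 GT/s, 8b/10b encoding
pcie3 = pcie_x16_gbytes_per_s(8.0, 128, 130)  # PCIe 3.0: 8 GT/s, 128b/130b encoding

print(round(pcie2, 1))  # 8.0 GB/s
print(round(pcie3, 2))  # 15.75 GB/s
```

So a single PCIe 3.0 x16 slot carries roughly twice the data of the 2600K's PCIe 2.0 x16, partly from the faster signalling and partly from the far more efficient 128b/130b encoding.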

Pages In This Review

  1. Tackling the Core i7-2600K in 2019
  2. Sandy Bridge: Inside the Core Microarchitecture
  3. Sandy Bridge: Outside the Core
  4. Test Bed and Setup
  5. 2018 and 2019 Benchmark Suite: Spectre and Meltdown Hardened
  6. CPU Performance: System Tests
  7. CPU Performance: Rendering Tests
  8. CPU Performance: Office Tests
  9. CPU Performance: Encoding Tests
  10. CPU Performance: Web and Legacy Tests
  11. Gaming: World of Tanks enCore
  12. Gaming: Final Fantasy XV
  13. Gaming: Civilization 6
  14. Gaming: Ashes Classic
  15. Gaming: Strange Brigade
  16. Gaming: Grand Theft Auto V
  17. Gaming: Far Cry 5
  18. Gaming: Shadow of the Tomb Raider
  19. Gaming: F1 2018
  20. Power Consumption
  21. Analyzing the Results
  22. Conclusions and Final Words
Comments

  • Death666Angel - Sunday, May 12, 2019 - link

    I've done some horrendous posts when I used my phone to make a comment somewhere. Mostly because my phone is trained to my German texting habits and not my English commenting habits. And trying to mix them leads to sub par results in both areas, so I mostly stick to using my phone for texting and my PC and laptop for commenting. But sometimes I have to write something via my phone and it makes a beautiful mess if I'm not careful.
  • Death666Angel - Sunday, May 12, 2019 - link

    Well, laptops and desktops (with monitors) are in a different category anyway, at least that's how I see it. :-)
    I work with a 13.3" laptop with a 1440p resolution and 150% scaling. It's not fun, but it does the job. The advantage of the larger screen real estate with a 15" or 17" laptop is outweight by the size and weight increase. I've also done work on 1024x768 monitors and it does the job in a pinch. But I've tried to upgrade as soon as the new technology was established, cheap and good enough to make it worth it without having to pay the early adopter fee or fiddle around to get it to work. Even before Win7 made it a breeze to have multiple windows in an orderly grid, I took full advantage of a multi window and multi program workflow for research, paper/presentation writing, editing and media consumption. So it is a bit surprising to see someone like Ian, a tech enthusiast with a university doctorate be so late to great tech that can really make life easier. :D
  • Showtime - Saturday, May 11, 2019 - link

    Great article. Was hoping to see all the CPUs tested (my 4770K), but I think it shows enough. This isn't the 1st article showing that lesser CPUs can run close to the best CPUs when it comes to 4K gaming. Does that look to change any time soon? I was thinking I should upgrade this year, but would like to know if I should be shooting for an 8 core, or if a 6 will be a decent enough upgrade.
    Consoles run slower 8-core procs that are utilized more efficiently. At some point won't PC games do the same?
  • Targon - Tuesday, May 14, 2019 - link

    There is always the question about what you do on your computer, but I wouldn't go less than 8 cores(since 4-core has become the base on the desktop, and even laptops should never be sold with only 2 cores IMO). If you look at the history, when AMD wasn't competitive and Intel stopped trying to actually innovate, quad-core was all you saw on the desktop, so game developers didn't see a reason to support more threads(even though it would have made sense). Once Ryzen came out with 8 cores, and Intel finally responded, you have to expect that every game developer will design with the potential that players will have 8+ core processors, so why not design with that in mind?

    Remember, a program that is properly multi-threaded in design will work on lower-core processors, but will scale up well when processors with more cores are being used. So going forward, quad-core would work, but 8 or more threads WILL feel a lot better, even for overall use.
  • CaedenV - Saturday, May 11, 2019 - link

    This was a fascinating article! And what I am seeing in the real world seems to reflect this.
    For the most part, the IPC for general use has improved, but not by a whole lot. But if doing anything that hits the on-chip GPU, or requiring any kind of decrypt/encrypt, then the dedicated hardware in newer chips really makes a big difference.
    But at the end of the day, in real-world scenarios, the CPU is simply not the bottle neck for most people. I do a lot of video ripping (all legally purchased, and only for personal use), and the bottleneck is squarely on the Blu-Ray drive. I recently upgraded from a 4x to a 10x drive, and the performance bump was exactly what was expected. Getting a faster CPU or GPU will not help there.
    I do a bit of video editing, and the bottle-neck there is still almost always in storage. The 1gbps connection to the NAS, and the 1GBps connection to my RAID0 of SSDs.
    I do a bit of gaming at 4k, and again the bottleneck there is squarely on the GPU (GTX1080), and as your tests show, at lower resolution my chip will be slower than a new chip... but still faster than the 60-120fps refresh of the monitor.

    The real reason for an upgrade simply isn't the CPU for most people. The upgrade is the chipset. Faster/more RAM, M.2 SSDs, more available throughput for expansion cards, faster USB/USB-C ports, and soon(ish) 10gig Ethernet. These are the things that make life better for the enthusiast and the normal user; and the newer CPUs are simply more capable of taking advantage of all the extra throughput, where Sandy Bridge would perhaps choke when dealing with these newer and faster interfaces that are not available to it.
    All that said; I am still not convinced to upgrade. Every previous computer was simply broken, or could not do something after 2-3 years, so an upgrade was literally necessary. But now... my computer is some 8 years old now, and I am amazed at the fact that it still does it all, and does it relatively quickly. Without it being 'broken' it is hard to justify dropping $1000+ into a new build. I mean... I want to upgrade. But I also want to do some house projects, and replace a car, and do stuff with the kids... *sigh* priorities. Part of me wishes that it would break to give me proper motivation to replace it.
  • webdoctors - Saturday, May 11, 2019 - link

    Great timing, I've been using the same chip for 7 or 8 years now and never felt the need to upgrade until this year, but I will upgrade end of this year. DDR4 finally dropped in price and my GTX1070TI I think is getting throttled when the CPU ain't overclocked.
  • atomicWAR - Saturday, May 11, 2019 - link

    Gaming at 4K with a i7 3930K @ 4.2ghz (4.6ghz capable when needed) with 2 GTX 1080s...I was planning a new build this year but after reading this I may hold off even longer.
  • wrkingclass_hero - Sunday, May 12, 2019 - link

    I've got a 3930K as well. I was planning on upgrading to Threadripper 3 when that comes out, but if it gets delayed I may wait a bit longer for a 5nm Threadripper.
  • mofongo7481 - Saturday, May 11, 2019 - link

    I'm still using a Sandy Bridge i5-2400 overclocked to 3.6 GHz. Still playing modern stuff @ 1080p and it's pretty enjoyable.
  • Danvelopment - Sunday, May 12, 2019 - link

    I think the conclusion is slightly off for gaming, from what I could see it's not that the newer processors were only better higher resolutions, it's that the newer systems were better able to keep the GPU fed with data, resulting in a higher maximum frame rate.

    So at lower resolutions/quality settings, when the GPUs could let loose they could achieve much higher FPS.

    My conclusion from the results wouldn't be to keep it for higher res gaming, but to keep it for gaming if you're still using a 60Hz display (which I am). I bet if you tuned quality settings for all of the GPUs to run at 60 FPS your results would sit pretty close at any resolution.

    I'm currently running an E5-2670 for my gaming machine with quad channel DDR3 (4x8GB) and a 1070. That's the budget upgrade path I'd probably recommend at 60Hz.
