49 Comments

  • Gamers Nexus

    November 6, 2022 - 7:26 am

    Watch our Intel i5-13600K CPU review here: https://www.youtube.com/watch?v=todoXi1Y-PI
    One correction in this review: We just noticed the power efficiency chart contains a charting error (I accidentally plotted the power efficiency for a different test for the 13900K, but the chart is for Blender — so the numbers are correct but they're for the wrong software with the 13900K). This error doesn't change any of our conclusions nor does it change that the CPU does draw 300+/- Watts when under full load, to be clear about that, so power consumption is right. This is the only error we're aware of and, again, the numbers are right but the 13900K result for the Blender chart is different. That particular chart should be between 32.5-37.0, depending on which set of results you take for the math and if you calculate it warm or cold. This means it is still one of the least efficient entries on the chart; however, the 12900K is actually slightly less efficient than the 13900K here, not more. Sorry for that mistake! No excuses. Just too much data too fast, and I'll try to get a few hours ahead of the review cycle next time so I'm not operating so tired. My fault and thanks for your understanding! – Steve
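
    For anyone curious how that 32.5-37.0 range can fall out of the same data: below is a minimal sketch of one way a power-efficiency figure can be derived from a Blender run. The renders-per-kWh metric, the 300 W package power, and the "cold"/"warm" render times are illustrative assumptions for this sketch, not GN's actual formula or measurements.

        # Minimal sketch: deriving an efficiency figure from a Blender run.
        # The metric (renders per kWh) and all inputs are assumptions for
        # illustration, not GN's actual methodology or data.

        def renders_per_kwh(render_time_s: float, avg_power_w: float) -> float:
            """Completed renders per kilowatt-hour of CPU package energy."""
            energy_kwh = avg_power_w * render_time_s / 3_600_000  # W*s -> kWh
            return 1.0 / energy_kwh

        # Hypothetical ~300 W all-core load with a "cold" and a "warm" run,
        # showing how the figure shifts depending on which result you feed in.
        for label, seconds in [("cold run", 325.0), ("warm run", 370.0)]:
            print(label, round(renders_per_kwh(seconds, 300.0), 1))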

  • SysGhost

    November 6, 2022 - 7:26 am

    Remember when we laughed at AMD for running hotter than everything else?
    Oh how the turns have tabled…

  • Al Michael

    November 6, 2022 - 7:26 am

    Why is the box so big? You can probably fit an entire motherboard in the Intel CPU box.

  • Pragmatic Tornado

    November 6, 2022 - 7:26 am

    I'm cooling my 13900KF with a Noctua NH-D15 with no apparent issues, getting around the same Cinebench R15 multi-core score as a random review site did with an AIO cooler. I could've bought the most expensive, best AIO around if I wanted, but I don't trust having liquids in my PCs.

    As a bonus, my PC is helping my heat pump warm up the old-ass house I live in.

  • KTMs 1

    November 6, 2022 - 7:26 am

    This makes no sense. Isn't the TDP at max 140 watts for the 13900K? How did you get 300+ watts?

  • Maria Stevens

    November 6, 2022 - 7:26 am

    Glad I don't need anything new for quite a while. Between my 98 SE, XP, 7, and Linux machines, I've got all my entertainment covered. Which is hilarious because my daily driver (Linux) is a 7600T with a GT 1030 D5.

  • yumyumhungry

    November 6, 2022 - 7:26 am

    Looks like I might finally have to part with my trusty noctua NH-D15 air cooler.

  • PowerRanger83

    November 6, 2022 - 7:26 am

    Just out of interest, why do you no longer plot temperature vs. clock speed in your latest CPU reviews?

  • hypercube33

    November 6, 2022 - 7:26 am

    "If you have one of these…" I feel attacked being on a Ryzen 5 1600 ๐Ÿ™‚

  • Mo7amed 7afez

    November 6, 2022 - 7:26 am

    Tech Quotes made a video about the problem, with English subtitles explaining it.
    #Respect_ME_PC_Community

  • Feby

    November 6, 2022 - 7:26 am

    Under what version of Win 10/11 were these tested?

  • WAR WAR

    November 6, 2022 - 7:26 am

    How many watts at idle for the 13900K and 7950X?

  • Gjermund Skogstad Lingås

    November 6, 2022 - 7:26 am

    I miss a temperature chart comparison.

  • ToonNut1 ToonNut007

    November 6, 2022 - 7:26 am

    13900KF and Z690 don't work properly! MSI Tomahawk behaving weird! The Cinebench score is half what it should be just from raising the ring by +1.

  • Petr Kinkal

    November 6, 2022 - 7:26 am

    I looked at the results for Adobe Premiere on the Puget Systems site (the article is "Adobe Premiere Pro: 13th Gen Intel Core vs AMD Ryzen 7000"; can't link because YouTube) and they have very different results. Is there any explanation for that?

  • Konziu

    November 6, 2022 - 7:26 am

    Can you try a contact frame with 13th gen?

  • scorntooth

    November 6, 2022 - 7:26 am

    Steve at 0.25x playback speed on YT is best Steve. I guarantee it.

  • Ellie's fun and games

    November 6, 2022 - 7:26 am

    What's wrong with eleventh gen Intel CPUs?

  • Eng-Firas

    November 6, 2022 - 7:26 am

    If you play games 🎮, buy a PS5; it will only use 218W. Save your money 💰.

  • jimster1111

    November 6, 2022 - 7:26 am

    The i9-13900K and the RTX 4090 together can pull close to 1,000 watts. As an electrician, I can tell you your average 15 amp circuit trips when you go over 1,500 watts average load. So in my opinion, make sure to save often if you're gaming on a large TV that's connected to the same circuit.
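
    Rough circuit-headroom math behind that point, as a sketch: it assumes a 120 V North American branch circuit and the common 80% continuous-load guideline, and the PC/TV wattages are illustrative guesses rather than measurements.

        # Sketch of the breaker math. Assumes a 120 V, 15 A branch circuit and
        # the 80% continuous-load guideline; component wattages are made up.
        BREAKER_AMPS = 15
        LINE_VOLTS = 120

        nominal_w = BREAKER_AMPS * LINE_VOLTS   # 1800 W nameplate limit
        continuous_w = 0.8 * nominal_w          # ~1440 W sustained guideline

        pc_w = 900   # hypothetical gaming PC (CPU + GPU + rest of system + PSU losses)
        tv_w = 250   # hypothetical large TV on the same circuit

        total_w = pc_w + tv_w
        print(f"limit: {nominal_w} W nominal, {continuous_w:.0f} W continuous")
        print(f"load: {total_w} W, headroom: {continuous_w - total_w:.0f} W")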

  • J

    November 6, 2022 - 7:26 am

    Please add some Rocket Lake CPUs for comparison. They exist and should be included.

  • Shmeh Fleh

    November 6, 2022 - 7:26 am

    Having some real flashbacks to the Pentium 4 days. Intel seems to forget, after a decade or so, how to make a chip that isn't also a hot plate.

  • Andrew McKeon

    November 6, 2022 - 7:26 am

    LOL, this was great.

  • Ilia Gofman

    November 6, 2022 - 7:26 am

    I've got a conspiracy theory for you: current GPUs and CPUs purposely use more electricity, increasing energy use through inefficiency to drive dependence on fossil fuels like oil for energy production.

  • Chris Vog

    November 6, 2022 - 7:26 am

    Also, with this power consumption, it will be interesting to see whether the returns on mining with this hardware are worth it.

  • Tainen1

    November 6, 2022 - 7:26 am

    What version of windows did you use for gaming benchmarks? Are the 7950x numbers suffering from poor win11 scheduling?

  • ThinkingBetter

    November 6, 2022 - 7:26 am

    Give me some 3nm silicon from Intel with higher processing efficiency, thanks.

  • Bradley Ulis

    November 6, 2022 - 7:26 am

    The crash moment pleases me

  • Grithertime

    November 6, 2022 - 7:26 am

    How many times did Steve say "We are still processing the data."

  • xAWESOMEPOSSUMx

    November 6, 2022 - 7:26 am

    I'm rounding that up to 420 FPS, nice. Then we round down from 423 to 420… nice.

  • qwer55555555

    November 6, 2022 - 7:26 am

    The amount of power that modern PCs require is insane. 300W for just the CPU? Whaaat?

  • Super Sai

    November 6, 2022 - 7:26 am

    0:25: someone tell me, are we supposed to read that pop-out on the left? It's on screen for barely 3 seconds.

  • NagoyaR

    November 6, 2022 - 7:26 am

    For years to come, Intel, AMD & Nvidia should really focus on efficiency instead of just getting more FPS. We can't have PCs drawing 2,000W at some point.

  • Freedom 101%

    November 6, 2022 - 7:26 am

    I'm currently running an i9-12900T on a Gigabyte B660 Gaming X motherboard.

    I want to upgrade to an i9-13900K CPU.

    Is a BIOS update to the F5 BIOS version enough?

  • Ankur Singhal

    November 6, 2022 - 7:26 am

    Lol, 13900KS + RTX 4090 Ti = "UNLIMITED POWER", by Emperor Palpatine!!!

  • rubbersoul420

    November 6, 2022 - 7:26 am

    Man the lighting looks so much better on this video than it usually does.

  • Kenneth Dias

    November 6, 2022 - 7:26 am

    So how hot does it get? Can you bake a cake with it?

  • Vitaly Kravchenko

    November 6, 2022 - 7:26 am

    44W single-core power consumption for the 13900K versus 31W for the whole 10-core cluster in the M1 Pro/Max. Meanwhile, the single-core performance of the 13900K is 20% faster than that of the M1 Pro/Max. Truly impressive!
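
    A back-of-the-envelope version of that comparison, using only the figures quoted in the comment (44 W single-core for the 13900K, 31 W for the whole M1 Pro/Max performance cluster, roughly 20% higher single-core performance), none of which are verified here:

        # Naive perf-per-watt comparison from the numbers quoted above.
        # Note the asymmetry: 31 W covers the whole Apple performance cluster,
        # while 44 W is a single Intel core, so this is deliberately rough.
        m1_perf = 1.0                # normalize M1 Pro/Max single-core perf
        intel_perf = 1.2             # ~20% faster per the comment

        intel_ppw = intel_perf / 44.0
        m1_ppw = m1_perf / 31.0

        print(f"13900K:     {intel_ppw:.4f} perf/W")
        print(f"M1 Pro/Max: {m1_ppw:.4f} perf/W")
        print(f"ratio: {m1_ppw / intel_ppw:.2f}x in Apple's favor")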

  • Madalin

    November 6, 2022 - 7:26 am

    This brings back memories of the 2000s Pentium 4 versus Athlon XP fight. The good old "more power and heat" trick used again to beat AMD… It's only a temporary solution designed to mask the fact that they can't deliver a new, better architecture (yet).

  • Nik

    November 6, 2022 - 7:26 am

    Was there any coverage on the 13700K?

  • The Void

    November 6, 2022 - 7:26 am

    I would love to see a video hosted by Andrew and Mike where they talk about what they look for when new hardware is being announced and when you guys do any testing, seeing as they are on the (post-)production side of this whole operation and probably have some interesting insights that aren't necessarily about gaming. On that note, I do want to highlight how much I appreciate you incorporating benchmarks like Blender and Adobe products 🙂

  • James296

    November 6, 2022 - 7:26 am

    I guess you could say Intel tried to bulldoze with a raptor driving. Maybe they should take a zen moment and design a new architecture.

  • TheSpanjaMan

    November 6, 2022 - 7:26 am

    Too much. A few years back I thought stuff would be more efficient.

  • ashesh parmar

    November 6, 2022 - 7:26 am

  • Chillazar

    November 6, 2022 - 7:26 am

    I fucking hate this kind of competition that's so oblivious to power consumption. I'd much rather see small gains at lower or equal power consumption compared to last gen.

  • Matthew Bartlett

    November 6, 2022 - 7:26 am

    This "just brute force it with more power" is a terrible trend. In a time when power generation is not growing massively and we're looking to be more efficient and less wasteful, this is insane.

  • AL LA

    November 6, 2022 - 7:26 am

    Why are the 0.1% Lows so low? I'm scared now that I have one on the way.

  • Chris G

    November 6, 2022 - 7:26 am

    Kudos to whoever set up the lighting on the set! I had to check my playback settings when the video started to see if I set the resolution too high lol. Please keep it. It looks great!

  • Rocco Ciccone

    November 6, 2022 - 7:26 am

    Please do some virtualisation tests. Intel's chips with E-cores are a massive pain in the butt for that.

Comments are closed.