Historically, Dell has addressed the higher end of the gaming desktop market with its Alienware-branded machines, which are frequently built around unlocked CPUs and advanced graphics cards. Meanwhile, for those who wanted a Dell-branded gaming PC without the Alienware premium, the company has offered its Inspiron and XPS lines, though there's a large gap between the premium XPS and the basic Inspiron. So, looking to bridge that gap with a line of gaming-centric yet still reasonably affordable desktops, at this year's Gamescom the company is introducing its first Dell G-series desktops. Taking their name from Dell's popular G5 gaming laptops – which fill much the same role on the laptop side – these new machines are intended to be Dell's gaming-focused desktops for the wider market.

The Dell G5 desktop (model 5090) is based on Intel’s 9th Generation Core processors, paired with AMD’s Radeon RX 5700-series or NVIDIA’s GeForce GTX 1660 Ti or RTX 2000-series graphics cards. In its top-of-the-range configuration, the Dell G5 can pack Intel’s Core i9-9900K processor, NVIDIA’s GeForce RTX 2080 GPU, 64 GB of DDR4-2666 memory, a 1 TB M.2 PCIe SSD, a 2 TB hard drive (or two of them), a Killer Wi-Fi 6 AX1650 network card, Gigabit Ethernet, and more.

Dell emphasizes that its compact G5 desktop is fully user-upgradeable, so owners will be able to easily install a new graphics card or add more storage when they need to. Meanwhile, since the machine uses a motherboard based on Intel’s H370 chipset, it does not support CPU overclocking, unlike Alienware-branded computers. The lack of overclocking support also means that Dell can stick with a (relatively) conservative 480 Watt power supply for the system, as there's no need for extra overclocking headroom in the power delivery design. Overall, this is enough for a Core i9-9900K paired with one of NVIDIA's GeForce RTX 2080 video cards, but it is likely a factor in why we don't see an RTX 2080 Ti here.

Unlike many gaming desktops these days, Dell’s G5 will not come with liquid cooling, instead relying on proven air cooling with heat pipes. Keeping in mind that the platform does not support CPU overclocking, air cooling should be sufficient. Meanwhile, those who would like the CPU and GPU to hit their maximum boost clocks more often can set appropriate thermal profiles in the Alienware Command Center software.

Dell’s G5 desktops will be available starting August 19. Prices will start at $629, with more advanced configurations coming in at higher prices.


Source: Dell


  • blppt - Tuesday, August 20, 2019 - link

    "you are forgetting one point. the ryzen cpus... use quite a bit less power then the 9900k, which it needs to use to get you that performance. cap an intel cpu at the power useage intel states.. and your performance.... goes down the toilet."

    The point being that you CAN reach those speeds with the 9900K/KS OOB, whereas it is a very iffy proposition to overclock a 3700/3900 to 4.7/4.8/5.0---and you may also be sacrificing stability to do so. Going back a ways to the highly flawed BD/PD line, the 9590, for example, failed not because it was a 220W monster, but because it was a 220W monster that couldn't beat the equivalent i7s of the time whilst using that much electricity. The 9900K/KS can be better in gaming OOB than the equivalent Ryzen 3xxx, albeit drawing more electricity.
    Reply
  • Korguz - Tuesday, August 20, 2019 - link

    not arguing that.. but.. is the extra power usage, better cooling ( which intel's cpus do not come with a cooler at all ) and higher price tag worth it ?
    ryzen 3000 series is pretty much on par with intel on IPC at the clocks it runs at, so how much worse will intel look if ryzen 3000 were able to hit the same clocks ?

    with PD/BD people ridiculed amd for its power usage.. but now.. that intels power usage is worse.. it seems most people are ok with it.. how does that work ??
    Reply
  • blppt - Tuesday, August 20, 2019 - link

    "with PD/BD people ridiculed amd for its power usage.. but now.. that intels power usage is worse."

    Well, Intel's best consumer chip, (i guess the 9980XE?), doesn't draw a TDP of 220W like the 9590 did. Granted, it doesn't show up well against the much newer 3900X, which has a 60W lower TDP and generally benches higher on single-thread performance, but the 9980XE also has 6 more physical cores to power. The 9590 vs the 4770K/4790K was a difference of around 110W for 4 much weaker physical cores to power.

    The non-9590 PD/BD parts were particularly pathetic in single core/thread performance whilst eating tons of wattage, meaning they were extra bad for the time of their release.

    Not that Cinebench R15 is the be-all/end-all, but in the BEST case scenario (9590 boosting to 5ghz), it would score a very low ~130 on the single core test, whereas the equivalent intel, either 4770K or 4790K, would be in the 160s/170s, using far, far less power. And that was the best of PD/BD---turboing theoretically to 5ghz on one core.

    Trust me, I had a 9590 for a couple of years; it was a blast furnace that didn't really do anything as well as my 4790K, but it was fun trying to find a cooling system that could tame the beast.

    Intel is falling behind on efficiency, agreed, but nowhere near as bad as BD/PD could be. Or Intel's own blast furnace, the Prescott P4.
    Reply
  • Korguz - Tuesday, August 20, 2019 - link

    blppt
    " Well, Intel's best consumer chip, (i guess the 9980XE?), doesn't draw a TDP of 220W like the 9590 did. " actually, maybe it does, and maybe more then 220 watts : https://www.anandtech.com/show/13544/why-intel-pro... not sure if this relates to intels latest cpus though. intels TDP ratings, seem more like a minimum then a max.
    Reply
  • blppt - Tuesday, August 20, 2019 - link

    "actually, maybe it does, and maybe more then 220 watts : "

    If you want to go there, so did the 9590 IRL. I'm going by TDP ratings.
    Reply
  • Korguz - Wednesday, August 21, 2019 - link

    :-) yes but the difference was.. amd advertised it as such, didnt they ? intel, says theirs is a 95 watt cpu, quite the difference, correct ?
    Reply
  • blppt - Wednesday, August 21, 2019 - link

    "yes but the difference was.. amd advertised it as such, didnt they ?"

    They advertised it (the 9590) as a 220W TDP cpu and IRL it could use more than that. Sounds a lot like your example for Intel, no?

    I'll grant you this: the 9590 was in the end a laughably bad idea for a cpu, and really a niche product, not indicative of the mainstream parts, although the mainstream models suffered from similar issues (just not to this extreme) of pathetic single thread performance and higher power consumption than the equivalent Intel cpus of the time.

    If memory serves, the power consumption had a lot to do with BD/PD and its derivatives being stuck on 32nm for the longest time (intel had been at 22nm and below since Ivy Bridge). A die shrink might have helped with the power consumption, but the entire BD/PD line was crippled by its poor single thread performance. We saw that hitting the "5ghz ceiling" as it were still left the 9590 far behind Intel in this crucial field. A move to 22nm may have lowered TDP and allowed mainstream parts to be sold at 5ghz, but that still wouldn't have done much to solve BD/PD's performance flaws.

    Intel, amusingly, seemingly is starting to suffer from the same issues AMD did back then---stuck on 14nm for the longest time, with 10nm only just arriving. The difference now is that Intel is still competitive in single thread/single core performance (or slightly better), and the 'power consumption per performance' excess is currently not as bad as things were with the 9590.
    Reply
  • Korguz - Wednesday, August 21, 2019 - link

    it does sound similar to my example :-) never paid attention to the 9590 as it wasnt a cpu i was interested in, but after doing a little search.. dang.. it sure did suck power from the wall depending on what it was doing, didn't it ? but the question still stands: amd got flak and ridicule for its power consumption then ( stated, or actual ), but it seems intel isn't getting the same flak or ridicule. go figure
    Reply
  • blppt - Thursday, August 22, 2019 - link

    Yes, I agree with most of that, but what you aren't grasping here is that while AMD was sucking more power (and in the case of the 9590, a LOT more power), they were also coming up VERY short in everyday performance, especially in single thread situations, which gaming and general mainstream computing tasks back then (and to this day) put a great premium on.

    And the thing BD/PD was supposed to be great at (highly multi-threaded, optimized workloads) had very little application in mainstream tasks at the time, and the 9590 wasn't in actuality appreciably faster than my 4790K in those few applicable highly-threaded tasks anyway.

    So, while yes, Intel has now become the 'piggy' of power draw, their performance is still competitive and in some cases superior. That's why the BD/PD era and Intel's current issues aren't ridiculed equally---it's not totally similar.

    For there to be a true parallel here---AMD would have had to make a quantum leap *past* Intel in single thread performance (they haven't), and intel's flagship cpu would have had to draw something along the lines of 200+% of the wattage of AMD's flagship, as the 9590 did vs the 4770K/4790K (it doesn't).
    Reply
  • Korguz - Friday, August 23, 2019 - link

    blppt
    actually, i am.. while intel uses more power, it's at least being used for something, unlike the BD/PD cores :-)

    i think intel should be ridiculed, especially when a cpu is listed as using XX watts, but when in use, to get the performance intel claims, it uses 50-100 watts more. hypothetically, if i didn't know much about computers and went out, bought a new comp, and spent the extra money on a cooler ( cause intel doesn't include one ) rated for a certain wattage, only to find the comp didn't have the performance it's supposed to, and then find out the cooler i bought wasn't sufficient, i would be pretty pissed
    Reply
