There is barely any difference in gaming between the two (13700k and 13900k).
Look at the 1440p and 4k graphs here:
[Intel Core i7-13700K Review - Great at Gaming and Applications - Performance Summary & Performance per Dollar | TechPowerUp](https://www.techpowerup.com/review/intel-core-i7-13700k/25.html)
13600k has more OC headroom, too. An OC'd 13600k performs better than a stock 13700k for gaming, although you'll never notice the difference outside of benchmarks.
I noticed there are a lot of people on this sub that don't value $ at all.
Just because you can pay it, doesn't mean you should. lol
It actually has to make a difference.
Right???? I also don't get the logic: 'You spent so much on this! I'm sure you can spend some more.' Nobody ever says, 'Darn! You spent so much on the important stuff, you can definitely save some on xyz since it gives you the least benefit.'
It wouldn't matter that much (for gaming) if it were an i7 or an i9 13th gen really... what matters is the upgrade from a 5800x in the first place. It's not even that much of an upgrade, more of a sidegrade.
Yes lol it can... manage... a 4090. You need an actually old CPU to get considerably decreased performance. Think 9th gen intel core i / 2nd gen AMD Ryzen.
13600k would pair nicely with a 4090 as well. That i5 is a beast. We are back to the days where an i5 is just fine and going to i9 isn't that much of a difference. Go look on GN if ya want proof. Hell it is as good or better than last gen i9.
Probably because his 6800 XT was hooked up to it too, so to cool both the GPU and CPU he had two radiators. He didn't get a waterblock for the new card and kept using the existing loop setup; no point removing a radiator.
A bit overkill with the GPU. When you've got that much money to spend, you might as well go for an i9-13900K.
Anyway I'm just glad that Intel stopped being complacent. Even if you won't be buying an Intel processor, the added competitive pressure should keep AMD in check. As a Ryzen 7900X owner I can assure you these chips are aggressively tuned out of the box.
The first time I turned the PC on, horrible coil whine came from the GPU. After 30 seconds it was gone.
I'm running with the case closed. On Battlefield 2042 after 30 minutes: 60°C GPU, 70°C junction temp, 2805 MHz, 360 W on average, fans at 1200 RPM. I can barely hear them.
Enjoy the new rig.
To be honest, my initial thought was why you found it necessary to upgrade from an already very capable gaming PC.
It's just a bit odd on a forum where 90% of the posts are people complaining that Nvidia products are too expensive lol
It's your money and you do you. But that was honestly my first thought.
Lol, I'm sorry...I didn't mean to come across as aggressive.
I'll be upgrading my PC very soon as well because even with the recently inflated prices, PC gaming is a relatively affordable hobby compared with most other "adult" hobbies.
I hope you enjoy the PC!
There's no Team Red. There's no Team Green. You paid money for a product, a tool that performs a task. Look beyond the tribalism. Also, congrats.
Came here to say exactly that. Stop this team BS. Buy the product that suits your needs best. This goes for all walks of life, from smartphones to GPUs and CPUs to clothing and furniture.
I wanted an AIB 79XTX but could only find a 4090 in stock, so that's what I got. A bit overkill for my needs, but not disappointed nonetheless.
So you did exactly what I advocate for: you looked at the market and got what seemed like the best thing for yourself. You did not base your purchase decisions on some weird affiliation to a brand that doesn't give a flying toss about you in return.
They absolutely don't care about me, nor do I care about them. I wanted performance; raster is what I'm after, so the XTX was the better deal for that, but I could still afford the 4090, so I did.
No don't FUCK WITH MY TEAM I GO HARD FOR MY COLORS. WHAT FLAG YOU FLYIN?! RED?! GREEN?! Me personally...... Team green in perpetuity
Coming from a dude with a 1070
Yes. It's showing its age, but it holds up okay today and has served me well for the last 6 years.

I'll probably be replacing it with a 4070 Ti, but I'm also considering a 6800/6900 XT, a 3070 Ti, or a used 3080. I'll figure it out some time after the 4070 Ti release, when prices settle.

My Ryzen 5 3600 will be a bit of a bottleneck for 1440p 95 Hz, but I'll do a slow sequence of upgrades as I go along. I have a Gold-certified 750 W power supply, the MSI MPG A750GF. It should be adequate for any of the above-mentioned GPUs.

My motherboard will probably be the next thing I upgrade after the GPU; it's only got PCIe 3.0. Then probably a beefier AM4 CPU. Or maybe I do mobo/CPU in one shot if I decide to switch over to Intel or AM5 instead. We'll see.
I was honestly just kidding around. I don't think your CPU will be too much of a bottleneck, tbh.

I had a 5600G with a 4090 at 1440p and I was pushing the full 165 FPS my monitor was capable of. I run a 5800X3D now and I never bottleneck my GPU; it always hits 100% utilization before the CPU does, with both the 5600G and the 5800X3D.

I intend to go higher resolution, which will make the bottleneck concern even more negligible for future games. I will eventually upgrade the mobo, CPU, and everything else to go AM5 or Intel next gen, but for now my GPU is gnarly and the CPU is keeping up pretty effortlessly.

I loved my 1080 when I had it; kept it for 4 years.
Yeah. The GTX 10-series cards had surprisingly good longevity; it was such a huge leap from the 9-series. The same can be said about the 4090, a huge leap over the 3090. I bet you'll be able to make do with yours for a long while. Also, nice pairing with that 5800X3D, I'm jelly. The improvement to 1% lows must make it so buttery smooth. Cheers.
Improvement to 1% low? And thanks man
I'm oversimplifying, but here's the basic idea. Say you get a "steady" average of 165 FPS, hitting that 99% of the time. The average can be misleading: there can still be a lot of volatility in your FPS, going up and down, while you still see an average of 165. You want to minimize that volatility, and the 1% low is a good metric for it. It's the 1% of the time that the FPS drops momentarily, for example, to 100 FPS. That can cause hitches and micro-stuttering. Not everyone notices it; it's difficult to perceive, or easy to ignore. But if you're a high-refresh-rate gamer, you probably will: it'll be smooth most of the time, but sometimes noticeably less smooth, intermittently. Ideally, you want the 1% low to be as close to the average FPS as possible to minimize this problem.

The 3D V-Cache in your 5800X3D is supposed to increase the average FPS in many games. On top of that, many have reported an improvement to the 1% lows as well. Basically, you should perceive fewer hitches and micro-stutters, i.e. buttery-smooth gameplay.

Similarly, there are also the 0.1% lows, which the 3D V-Cache also improves. Same idea, just a smaller window of time where the FPS dips.
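To make the idea above concrete, here's a minimal Python sketch of how these metrics can be computed from per-frame render times. Exact definitions vary between benchmarking tools (the function name and the sample numbers are just for illustration); this version treats the 1% low as the FPS equivalent of the slowest-1% frame time:

```python
def fps_metrics(frame_times_ms):
    """Return (average FPS, 1% low, 0.1% low) from per-frame render times in ms.

    The 1% low here is the FPS at the 99th-percentile (slowest 1%) frame time;
    benchmarking tools differ slightly in how they define this.
    """
    times = sorted(frame_times_ms)  # ascending: slowest frames at the end
    avg_fps = 1000.0 / (sum(times) / len(times))
    # Indices where the slowest 1% / 0.1% of frames begin
    p99 = times[min(len(times) - 1, int(len(times) * 0.99))]
    p999 = times[min(len(times) - 1, int(len(times) * 0.999))]
    return avg_fps, 1000.0 / p99, 1000.0 / p999

# 990 smooth frames at ~6 ms (165 FPS) plus 10 hitches at 10 ms (100 FPS):
frames = [1000 / 165] * 990 + [10.0] * 10
avg, low1, low01 = fps_metrics(frames)
# avg stays close to 165, but the 1% low drops to 100 - that's the stutter
# the average hides.
```

The point of the example: a handful of slow frames barely move the average, but they dominate the 1% low, which is why the 1% low is the better smoothness metric.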
Hey man I just learned a lot thank you. I think I see that setting on my Nvidia panel, it has like 99% fps or something like that.
I have a 4080 and an i9-13900K. I do not have stutters; the FPS goes from 65 to 180, so it is not stable. My Legion 5 with a 3070 did have stutters; I put 64 GB of Kingston Fury in it and no more stutters.
Team grinch with you
He is right tho.
Muricans love announcing their side affiliation.
And we wonder why GPUs are priced the way they are.
Yeah because people work and are able to afford it
That was different five years ago?
lol , fair enough
Yeah, it was. They realized demand increased and people were willing to pay more.
Demand is almost at an all-time low now.
I could go buy a 4090 right now. But I won't give AMD or Nvidia my money. Thanks though. Glad you work. I also do.
an unnecessary upgrade but cool nonetheless.
I disagree. The 4090 is the best performance without compromise. The 6800 XT has previously been great value, but it's the definition of compromise.

I'm personally looking forward to the RTX Remix mods that will hit once the tool premieres. Considering AMD's performance in Portal RTX, those cards won't be able to cut it.
The CPU was barely an upgrade tho.
The 13700K is an absolute beast and punches above its weight class even vs. Zen 4 CPUs.

I personally went from a 12700K to a 13700K. I found more frames in certain games, such as Tarkov. But the real reason for my upgrade was that the high-speed, low-latency RAM I had purchased was not stable with my 12700K (6400 MHz CL32). It's perfectly stable and error-free with the 13700K.

I'd say the bar for the value of an upgrade is relative to that specific buyer alone. Some upgrade every few years; others love to build and upgrade more frequently. Either is OK, because PC gaming and building as a hobby is fun and brings people joy!
Well I cannot argue with that as you are completely right. Hell I upgraded from an i7 9700k to an i9 9900k then a 5800x3d not that long after.
If you don't mind me asking, how is a 6800 XT a compromise compared to the last-gen cards? The only thing it's really missing is RT; other than that, it's pretty on par with the 3080/Ti.
You said it: RT. If you choose to save money and ignore RT, you're compromising.

When RTX Remix games/mods begin to appear, the owner of a 6800 XT will be missing out on a lot of great, free content because of that compromise. Not to mention RT-enhanced future releases.
Yeah, I took that into consideration, but honestly, by the time RT starts to replace rasterization I'll probably be looking to upgrade anyway. Honestly, it was DLSS that almost sold me on team green, but I only play at 1440p, and this card doesn't have an issue doing 1440p 144 Hz in most titles.
Lisa Su is rapidly approaching your location
Green and blue hmmm... Team Cyan?
More curious about your thoughts.
Truly, how are you liking the switch? and is there anything we can learn from you
Nice rig. It looks simple and clean.
When you walk away, you don’t hear me say…
Please, oh baby, don't go...
Fans on the bottom are overkill if you have fans blowing from the top or side. Cool air will naturally be pulled in.
Neat
More like team red to seafoam green or teal.
Well, a good choice. Not sure the CPU upgrade was crucial at this point, but the 4090 has no competitors.
There is a massive difference in gaming between the 13700k and 5800x especially in ray tracing.
How can a cpu make a noticeable difference in ray tracing performance? I thought all the calculations were done on the gpu.
Very easily, and the fact that you're upvoted shows how harmful the Tech Andy culture is. Ray tracing requires the CPU for coordination: a lot more draw calls, building the BVH structure... Tired of people repeating the lie that "CPU doesn't matter for gaming if you're at 4K ultra".
I mean, it's obvious that at a certain point a CPU won't be able to keep up with RTX calculations, but I'm sure the 5800X isn't bottlenecking any GPU in a game with ray tracing on unless it's run at like 240p or something. I can't see a 13700K adding more than a couple frames of performance. I'd understand if it was a 3600 or something.

Edit: If the GPU has more headroom, the 13700K will obviously push more frames. I'm talking about scenarios where the GPU is fully utilized.
Keep in mind that this thread is about a fucking 4090.

Here's a benchmark with a 3090 Ti and without 13th-gen CPUs, which had a good leap in clocks over 12th gen. The 3090 Ti is A LOT slower than the 4090, and 12th gen is definitely not as fast as 13th gen either.

https://static.techspot.com/articles-info/2520/bench/2160p-High-p.webp

Notice how the 12900K is already 9% faster than the 5800X. Now extrapolate that to a 4090 + 13700K. No hard numbers, and I don't care to spend time looking for a benchmark with that specific configuration, but it's guaranteed that the delta is much, much higher.

Oh, and wait, maybe at 1440p it would be different? You're right, it is. Holy shit.

https://static.techspot.com/articles-info/2520/bench/1440p-High-p.webp

Source: https://www.techspot.com/article/2520-spiderman-cpu-benchmark/
You’re acting like there aren’t games that will be bottlenecked by a 4090. The original statement was pretty broad anyways.
Re-read, I linked a benchmark.
OK, that's one great example, linked to limitations in memory bandwidth. Nothing surprising about 12th-gen Intel with extremely fast memory outperforming last-gen Ryzen in a heavily CPU-bound scenario.
Insane that people down vote you. 5800x is noticeably slower than 13700K across the board and especially in gaming.
They can downvote all they want, doesn’t change actual facts.
What is the use case for your new system? Was the 5800X not fast enough? Have you considered the 5800X3D? Do you game in 4K(+)?
Samsung Odyssey Neo G7, 165 Hz, 4K. Yes I do.

[https://www.reddit.com/r/nvidia/comments/zzada9/comment/j2amcis/?utm_source=share&utm_medium=web2x&context=3](https://www.reddit.com/r/nvidia/comments/zzada9/comment/j2amcis/?utm_source=share&utm_medium=web2x&context=3)
A 5800X3D would honestly have been enough at 4K.
Would have been better*, saved money and ran cooler.
I made the same switch. Biggest thing I miss is the Radeon software.
What software feature(s) exactly? Describe what they did.
I understand the switch to the 4090 GPU. I don't understand the switch to a dead Intel socket and CPU instead of upgrading to an X3D later on the future-proof AM5 platform.
My brother had a connection with an Intel supplier; he got me everything fairly cheap.
What future proof? For three years?
Intel changes sockets every few minutes. AMD changed theirs after, what, 4 CPU generations?
Many, including me, upgrade every 5 years or so, so it is irrelevant.

Also, Intel now gives you the option to carry over a DDR4 kit and upgrade to DDR5 with the same CPU once value and performance improve. I bought a 13600K, kept my DDR4, and paired it with a cheap B660 that costs half as much (or less) as an AM5 board. Then in a year or two I can pick up a discounted DDR5 Z690/790 and a new memory kit that will be both faster and cheaper than current DDR5... This is a far better upgrade path IMHO.
That's exactly why it's relevant. People who bought 1000- or 2000-series Ryzen 5 years ago can upgrade to the 5600X and get modern CPU performance for pennies, or go for the 5800X3D to get gaming performance comparable to the top-end CPUs out there.

Whereas if you went for an Intel 7600K/8600, you're screwed.

It'll be the same with AM5: in 4 years you'll be able to drop in a Ryzen 10800X or whatever they'll call it.
And the opportunity cost of doing so now is buying an overpriced board with DDR5 that is still very much in the process of maturing.

As for early AM4 adopters, fair enough, though a friend of mine who bought a second-gen Ryzen with a first-gen mobo had so many issues after updating the BIOS that he turned sour on PC gaming as a whole. I wonder how a 5800 would work on his ancient mobo. :)
So? 4 generations of AM4 yet they still can't fix teething issues on that platform.
It works perfectly, what are you talking about?
USB drop outs still ain't fixed....
Are you going to watercool the gpu??
I did it with my $830 6800 XT. I'm not doing that with a $2500 RTX 4090. Maybe at the end of the warranty.
Well if you ever get the itch, the 4090 is glorious with a waterblock on it. 25 to 30C reduction in all temps, and so much smaller!
Kinda pointless from a performance perspective though
Yeah I agree, the cooler on it already is overkill to get optimal performance even when overclocking.
For quiet
For performance, yeah, I agree. But it would look so great and match the rest of the build.
2500 ??
Canada / NZ / Australia
I'm curious, since it doesn't void the warranty, why that matters at all. These newer GPUs produce so much heat that it seems backward to water-cool the CPU and not the GPU.
The 4090 coolers are so massive and overkill. I don't see a reason to watercool.
It does void warranty for pretty much every country in the world
Not in the US.
Lucky
Dang, you bought the 4090 for $2500? If you don't mind me asking, do you regularly use that kind of power and see yourself needing it? Or is it the kind of thing where you want the best money can buy just because you have the money? No hate, just wondering. Lately I've found myself thinking about people paying these outrageous prices and how many people NEED that kind of juice lol.
I got the Neo G7. You tell me. At 4K:

Doom Eternal RT: 6800 XT 60 FPS, 4090 200 FPS
Evil West: 6800 XT 70 FPS, 4090 220 FPS
Dying Light 2: 6800 XT 70 FPS, 4090 130 FPS
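Those figures work out to roughly a 2-3x generational uplift. A quick Python sketch using just the FPS numbers quoted above (the dict layout is mine):

```python
# FPS figures quoted above at 4K: (6800 XT, 4090)
results = {
    "Doom Eternal RT": (60, 200),
    "Evil West": (70, 220),
    "Dying Light 2": (70, 130),
}

for game, (old_fps, new_fps) in results.items():
    # Relative uplift: how many times faster the 4090 ran each game
    print(f"{game}: {new_fps / old_fps:.2f}x")
```

So the uplift ranges from under 2x (Dying Light 2) to over 3x (Doom Eternal with RT), which is why the RT-heavy titles make the strongest case for the upgrade.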
Why not the Neo G8, ultrawide for gaming is great and it even has a higher refresh rate. Albeit personally I'd go for an oled 1440p over 4k.
Hardware Unboxed didn't like the scan lines on the G8 and always preferred the G7 over the G8. I hate scan lines too. 165 Hz is more than enough for me.
> neo g7

Freaking Samsung, most of their high-refresh-rate VA monitors have "scan lines". I have them on my G7 (the 1440p 240 Hz one), and with CRU and a custom resolution you can make them go away as you lower the refresh rate (about 185 Hz is where mine go away). So frustrating; doesn't seem like Samsung cares.
Nice build but the fps were alright for singleplayer games lol.
to be fair 60ish fps feels choppy once you're used to 100+, especially in fps games he listed like doom/dying light 2
Ohhh I see, definitely an insane upgrade from the 6800.

I see it's a mix of: you can definitely take advantage of the card with your nice monitor, and of course you have the money to blow on the card lol. I was gonna buy a 3090 Strix for $900, but I'm gonna just get the 3070 Strix I got a good deal on and wait and see what the GPU market does next year.
I hope you didn't pay $2500, the MSRP is $1600.
I'm from the Middle East; that's the price here. I wish I was in the US.
Damn that sucks. I'm in UK they are easily available £1600-1700.
Stealth mode in the BIOS and get the side panels on... Nice kit!
Seeeexy! Installed my 4090 today!!! Shit's insane, mate!
Nice build
4090 is insane! Congrats! Enjoy the build
Beautiful system
I want team red and team green. 7900x and 4090... They are playing very nicely together 😉
Congratulations with 4090. I just got 3090 this November from 1080 Ti.
Intel and Nvidia are the only way to be. Unless hurting for money, then AMD all the way.
My thoughts are 'it's expensive. It's fast. It's a brag post.'
90% of the "build" posts are "brag posts".
Yep, and that's completely fine. I enjoy seeing them. Just found the "your thoughts..." a bit silly :D

But since you didn't take offense (none was meant), my actual thoughts are that you should try, if the build allows it, to get better color harmony (unless the colors aren't static and this is just a momentary shot) :)
Looks dope. I've always had intel and Nvidia combos. It's just what I am comfortable with. Enjoy.
One thought is: why the 13700K?

The platform is dead, and you could have had better performance out of the box with the 7950X, or better, a temporary 7600X, leaving you with a monumental upgrade to the 7950X3D in about two months. Or even better, wait for the X3D like the rest of us. I'm doing that... a few months isn't really important, after all.

Perhaps people are not sold on the "5+ years" promise. But AM5 will shine when the X3Ds come. Intel will have a really bad time during 2023.

I hope they also do a die revision for the X3D with a stronger IMC for DDR5 faster than 6400, but that probably won't happen yet. Not that it will really matter for the 3D, thanks to the V-Cache.
The 7950X is significantly more expensive and the platform is more expensive. If he wanted better, the 13900k would have been the best choice.
AMD preys on FOMO customers, like you. Kudos to them they've been successful at it as you've proven.
Team Red = AMD, Team Blue = Intel, Team Green = Nvidia
13700K with a 4090 ... why!? you spent a shit ton of $ to pair with an i7
We now live in a world where people think a brand new 16 core 24 thread cpu is “settling”
The 13900k has higher stock clocks and the same boost TDP - better performance for similar power hunger. We live in a world where people skip their ability to think.
You’re a perfect example of your last sentence.
And how did you define this?
Why are you talking about settling on CPUs when this person is running a 4090, probably at 4k. CPU bottleneck isn't a worry unless you are playing at very high frame rates.
The 4090 is capable of running more FPS at 4k than most CPUs can prepare. And I said only that the 13900k has higher stock clocks than the 13700k, and those slightly better clocks will improve that FPS. Yours and, as I see, a lot of other people's assumption that "4k is obviously GPU bound" doesn't hold anymore with the 4090.
Or low resolution. At 4k OP is perfectly fine... I was under the impression that the 13700k outperforms the 13900k in most cases.
Well low res would imply high frames imo
Oh ok. I never thought of it that way. Either the frame rate gets so high that the CPU can't keep up, or lower resolution just leans more on the CPU than the GPU. I don't know the actual reason.
Very seldom is a CPU the bottleneck for games. Now, if you wanna max out a 240 Hz monitor playing esports titles, you are going to need a nice CPU. If you are just playing AAA games at high fidelity at 1440p or 4k, then the GPU is the main worker. Since the GPU probably won't reach higher than 90 frames, the CPU is largely not relevant as long as it's generically modern. For me, I just look at what's the current gen architecture, look at what's the most popular on Newegg, and that will suffice.
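The back-and-forth above boils down to a simple model: the slower of the two chips sets the frame rate, and the GPU's share of the work grows with resolution. A toy sketch of that reasoning (all numbers are hypothetical, not benchmarks):

```python
# Toy model of the CPU/GPU bottleneck argument in this thread.
# All figures below are made-up illustrations, not real benchmark data.

def effective_fps(cpu_fps: float, gpu_fps_at_1080p: float,
                  width: int, height: int) -> float:
    """Frame rate is capped by whichever side is slower.

    Rough assumptions: CPU throughput is resolution-independent,
    while GPU throughput scales inversely with pixel count.
    """
    pixels = width * height
    base_pixels = 1920 * 1080
    gpu_fps = gpu_fps_at_1080p * base_pixels / pixels
    return min(cpu_fps, gpu_fps)

# Hypothetical card doing 300 FPS at 1080p, CPU preparing 160 FPS:
print(effective_fps(160, 300, 1920, 1080))  # 160.0 -> CPU-bound at low res
print(effective_fps(160, 300, 3840, 2160))  # 75.0  -> GPU-bound at 4k
```

Under this model, dropping resolution quadruples the GPU's output until the CPU becomes the limit, which is why low-res/high-FPS play is where the CPU matters.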
I'm not the only one on 3DMark with a 13700k and 4090 :)
I use a 13600k with a 4090 at 4K. Even that is more than enough power and a massive boost from my previous 9700k.
But it might matter even more here. Spending so much on the top motherboard and top GPU, then saving $200 on the CPU.
I saw results of a 4090 paired with an Intel 4790k.
There is barely any difference in gaming between the two (13700k and 13900k). Look at the 1440p and 4k graphs here: [Intel Core i7-13700K Review - Great at Gaming and Applications - Performance Summary & Performance per Dollar | TechPowerUp](https://www.techpowerup.com/review/intel-core-i7-13700k/25.html)
Wow. That article really shows I should be buying a 13600k instead (gaming only). Hmmm
13600k has more OC headroom, too. An OC'd 13600k performs better than a stock 13700k for gaming, although you'll never notice the difference outside of benchmarks
Do it. I cool mine with a $60 AK620 lmao
Btw I'm on a Neo G7, 4k 165 Hz
Lol what????
I noticed there are a lot of people on this sub that don't value $ at all. Just because you can pay it, doesn't mean you should. lol It actually has to make a difference.
Right???? I also don't get the logic "you spent so much on this! I am sure you can spend some more". Nobody ever says "darn! You spent so much on the important stuff, you can definitely save some on xyz as it gives you the least benefit".
Hilarious
It wouldn't matter that much (for gaming) if it were an i7 or an i9 13th gen really... what matters is the upgrade from a 5800x in the first place. It's not even that much of an upgrade, more of a sidegrade.
Bro, I've built on X570 so many times, I just wanted a new motherboard and my brother suggested the Z790, that's all. I didn't care; the main thing is the 4090.
5800x can manage 4090?
Yes lol it can... manage... a 4090. You need an actually old CPU to get considerably decreased performance. Think 9th gen intel core i / 2nd gen AMD Ryzen.
He probably heard of bottlenecking and feared for his GPU to suffocate and die or some shit.
I swear idiots think a slight bottleneck will make their PC explode
You have a 13700k paired with a weak ass 3070. Shhh
13600k would pair nicely with a 4090 as well. That i5 is a beast. We are back to the days where an i5 is just fine and going to i9 isn't that much of a difference. Go look on GN if ya want proof. Hell it is as good or better than last gen i9.
Is your liquid cooling using 2 radiators? One on top one on bottom?
Probably because his 6800 XT was hooked up to it too, so to cool both GPU and CPU he had 2 radiators. He didn't get a waterblock for the new card but kept using the existing loop setup; no point removing a radiator.
Really good, though custom water cooling is overkill.
I moved to water cooling only because my RX 6800 XT won the silicon lottery. On water I got it to 2804 MHz and a 23983 graphics score.
Custom water cooling looks cool, but I can never stop thinking about the day that the pipe starts leaking all over the GPU and motherboard.
A bit overkill with the GPU. When you got that much money to spend you might as well go for an i9-13900K. Anyway I'm just glad that Intel stopped being complacent. Even if you won't be buying an Intel processor, the added competitive pressure should keep AMD in check. As a Ryzen 7900X owner I can assure you these chips are aggressively tuned out of the box.
Nvidia shill /s
Nice build, did the 16pin cable touch the glass when you closed the side panel?
The next step is to watercool the beast!
How's the fan noise and coil whine on your TUF? Open case vs closed?
The first time I turned on the PC, horrible coil whine came from the GPU; after 30 seconds it was gone. I only run with a closed case. On Battlefield 2042 after 30 min: 60°C GPU, 70°C junction temp, 2805 MHz, 360W on average, fans at 1200 RPM, I can barely hear them.
So no annoying coil whine with closed case?
No, but it's luck. In Newegg reviews some of them got coil whine with their 4090 TUF.
Would have saved money upgrading to a 5800X3D, unless you do multi-threaded work.
My X570 was half broken and running on a mod. I will not buy an AM4 board in 2022/2023.
Enjoy the new rig. To be honest, my initial thought was why you found it necessary to upgrade from an already very capable gaming PC. It's just a bit odd on a forum where 90% of the posts are people complaining that Nvidia products are too expensive lol It's your money and you do you. But that was honestly my first thought.
I didn't know the situation here was that bad. Believe me, I'll keep everything to myself from now on.
Lol, I'm sorry...I didn't mean to come across as aggressive. I'll be upgrading my PC very soon as well because even with the recently inflated prices, PC gaming is a relatively affordable hobby compared with most other "adult" hobbies. I hope you enjoy the PC!
Lol it's ok bro. Happy upgrading. Thank you :)
I would have just sold my 5800X and bought a 5800X3D; would have been way cheaper, similar performance.
Lisa Su is gonna slap you for this, beware!
I assume you are just waiting for 4090 water blocks to come out, right? It would be criminal to leave it air cooled in a build like this.
I did pretty much the same jump, just upgraded to a 5800X3D instead. Your performance should be through the roof, hope you enjoy your rig!