hiktaka

The best thing about the 6500 XT, if AMD really does have a lot of stock to sell, is that it will calibrate the current ridiculous fucking **used** GPU prices. There is no doubt that many few-years-old models are plain better than the 6500 XT itself *if* the used market price is tamed down a bit.


brandonopolis

I built my first gaming PC in many years right when the first shortage started happening. Newegg had one GTX1070 left for sale, the Rock Edition. I got it for $379 three or more years ago. I just checked Amazon and it looks like I could get $600-700 for it... Insane. My plan was to upgrade every other generation, but now I'm praying my 1070 holds out for another year or two...


chickensmoker

same here, i'm stuck with my r9 290. they're still going for like £200, and i've got a 760 i can use in the meantime, so i'm very tempted to sell it. £200 for a normal gpu from 2013 is actually a profit in my case (bought it used a few years ago), so maybe when this all dies down i could get a 1600 series or something to take its place lol. it'd be a gamble though, i kinda need at least a decent gpu for my work, so unfortunately it's probably a gamble i won't be able to take :( if you can though, i definitely think selling a 1000 series rn is worth it, because you can almost definitely profit from it in the short term and put that money away to get a genuinely beefy gpu once (if) this all blows over


Lobsterzilla

im sitting on a rx480 8gb and praying it holds on.


chickensmoker

Kinda jealous ngl, I'm currently playing RDR2 on lowest settings at around 30 fps lol. I'd kill for an 8GB card rn, even one as old as yours! It should hold up okay though if it's from a decent manufacturer and has been treated right; mine's a Windforce and it's still going strong at nearly 9 years of age. An RX 400 series card should be healthy enough, fingers crossed


Lobsterzilla

Definitely!


patriotmd

Love my Sapphire 580, but it's definitely showing its age with new games. AC:Valhalla for one.


triniumalloy

I have an RX 580 that is starting to act squirrely. I am watching prices for the new ones like a hawk.


BostonDodgeGuy

I bought a 6600XT today for $540. It's still shit, but it looks like it might be getting better slowly.


A-nom-nom-nom-aly

Strip it and repaste/pad it... I did that with my Sapphire RX 580 (8GB) and temps dropped 11°C, and all the throttling and weirdness stopped. That card is still running in my mum's system I built for her a couple of years ago.

Almost tempted to swap it out for the R9 280X I have in the drawer and sell it, as it's £300+ for a 2nd hand one in the UK right now (more than I paid for it in 2017). Then splurge on a 3080 or 6800 XT card... and sell my 5700 XT, which is worth £600+ right now. I reckon my actual cost outlay would be in the £300-400 range for a new card.

But then I come to my senses and think fuck that... I'm not contributing to this shitshow


mraowbot

If you’re selling your existing cards, you’re contributing to supply. Refusing to sell because “you don’t want to contribute to the madness” is actually worse for everybody. You’re sitting on what could be productive for somebody for no reason, at your own loss (of either price or performance).


A-nom-nom-nom-aly

But I'd be contributing to the madness by selling my cards at vastly inflated prices... I would, in effect, become the very thing I object to... a scalper of some kind.

If I wanted to be kind... I'd sell the R9 280X for a sensible price and leave the other cards in their current systems until such a time as a normal upgrade cycle deems that I replace them.

For me the path was the 280X, then the RX 580 and now the RX 5700 XT... an average of 2-3 years at a time when new cards were released every year. We're now on an average 2-year release cycle with a tiny refresh in the middle.


mraowbot

Then sell it at less than vastly inflated prices to somebody you trust to make gaming use of it. A 5700 XT is still a solid gaming card after all, and somebody upgrading from a 1060 would be grateful. I see your edit, and will agree that literally letting a 280X sit in a drawer when it could be doing practically anything else is a waste. Rescuing a buyer from getting gouged for a 1050 Ti helps everybody but the scalper.


troublesome58

>squirrely

How?


kastaldi

same here with a 2019 sapphire rx580 nitro 8gb... if it should fail, I've got no plan b


bitfugs

I bought an XFX R280 and it broke, so they sent me a 470. It died 6 months later and they sent me a 570. That one broke another 3 months later and they sent me another 570. Now XFX only gives a 2 year warranty.


onsVad

3 broken GPUs in a row? Might be a fault on your side, like a bad PSU for example?


ayunatsume

Bad/old/cheap PSUs show up commonly with people that have defective cards. Dirty power, dood.


g0d15anath315t

Yup, no way that many cards go bad in a row. I'm running a 6 year old 980 Ti on a 10 year old Corsair HX650, and a nagging part of my mind is telling me to spend the $100 on a new quality PSU just in case. If my Ti craps itself now, I won't really know if it's the ancient PSU or the old GPU. Don't want to plug in some new $700 card and have it get fried by a bad PSU...


brandonopolis

I just had to check my account for order status. I ordered this card on 05/22/2017 and I fully expected to have a nice upgrade by now... *edit I meant an affordable upgrade. Sell my card for half, not double, and then buy a 3xxx or 4xxx...


spaceraverdk

Went from a r9 290x to a Vega 64, waiting for a 6900 xt when prices drop. Or delay till the next gen.


[deleted]

This is literally why I just decided to buy a gaming laptop. 3rd gen Ryzen 5 and a 1660 Ti. Underpowered as hell but still cheaper than building a rig.


brandonopolis

It's sad that this is the case today, because my build was to replace a broken gaming laptop (MSI GX740), and back then it was cheaper to build a desktop even with the rising GPU prices.


Jovial4Banono

I thought I'd hold off on buying this generation, but I was able to sell my 5700 XT and buy a 6700 XT straight across. Maybe if you lucked out on a Newegg Shuffle or something, sell it, yah know?


juanmvallejo

I’d recommend selling it now and getting a 6700xt, it’ll be worth the upgrade


gtorelly

I sold my 1070 for 25% more than I paid when I bought it new, 5 years ago, got a 5700G and had some money left over... I don't even game that much, and when I do it's not AAA games, usually classics...


scritty

My RX480 8GB died a while back and anything with 8GB is now so ridiculously expensive that I'm just going GPU-less until things stabilize. Sucks to have a vive and no ability to use it tho :/


Aos77s

Where do you see $600-700? The market is averaging low 400s; I base that off the sold cards on eBay. Either way, selling a used card for full value is crazy.


brandonopolis

Amazon, and to be fair I'm not trying to buy one so I'm not looking for the best deal.... https://imgur.com/a/K1bRrBB


Spirit117

Amazon for GPUs during the shortage has been a bad indicator of the actual price; the only cards still in stock there are priced so high no one will buy them, even in this market. Looking at sold listings on eBay and r/hardwareswap is definitely the best way to find a card's "true" value.


bubblesort33

I'm afraid the used GPU prices might instead calibrate the retail price of this card. If a used 5500 XT 4GB is still $350, then I'd imagine this thing will at least meet it halfway. I think the extra cheap models will be sold out in the first 12 hours for anything under $270, and all that will be left is overbuilt Strix and Red Devil variants. And then, just like with the 6600 XT, they'll stop making the cheap models altogether.


hiktaka

The availability is the key here. 6nm Navi 24 with 4GB is a small card, so hopefully the quantities are enough to flood the low end market.


LickMyThralls

There are still tons of shortages for a lot of things, all the way down to workers, so prices are going to be high until the choke alleviates. I find it very unrealistic to expect prices to improve during a period where shortages exist to such a degree basically down the whole chain.


hiktaka

dGPUs are not a lot harder to manufacture than motherboards or phones or laptops, yet the prices of those aren't that outrageous. The GPU shortage is, by and large, mainly caused by crypto. Scalpers and the component shortage are side issues.


Doubleyoupee

Not if retailers scalp the 6500XT and put it at the same high price


hiktaka

Retailers aren't scalpers. If the availability of a SKU is good, they will price-compete with each other to sell as many as possible, ASAP. Tech products become obsolete rather quickly, and stores actually want high sales volume and gain nothing from stockpiling. As good as Alder Lake is, why doesn't anyone try to scalp the 12400F?


retiredwindowcleaner

>There is no doubt that many few-years-old models are plain better than the 6500XT itself if the used market price is tamed down a bit.

if they are better, then they are also better for mining, which means they will not tame down in price. that's the whole point of designing the 6500XT the way it is


hiktaka

That's why the 6500 XT availability is the game changer (or so we hope).


Biased24

I bought an RX 580 2 years ago I think? Maybe 2 and a half, for what I thought was too much. A year later (last year), I was building a PC for my partner and went to get the same GPU; it was literally 3 times the price I'd bought it for a year prior. Fucking mental.


Vicestab

Corporations will talk through both sides of their mouths, depending on the direction of the wind on any given day. That's why they said that 4GB was not sufficient for modern gaming last year (when they had cards with more VRAM than Nvidia), and now they're basically saying the opposite this year (when they're releasing a card with less VRAM). Their words are more than just meaningless or innocuous; they're like sweet little whispers, meant to hypnotize you and make you feel better about your own purchasing decisions, so that you don't have to think too hard about how the corporate world actually functions and how it arrives at its decisions.


DOugdimmadab1337

Haha RX 580 8 gig goes BRRRRRRRRRRR


szczszqweqwe

Unfortunately it's also good for mining. I was quite happy with my 470 8GB until it failed and I saw the current prices of Polaris cards.


thefpspower

I have a 570 8GB and it was great for mining, last year at least. Very efficient; it's no wonder they were so popular for it.


DukeVerde

...Beautiful.


Romanars

Agreed, and note that increased resolution in video games will use more vram.


nero10578

That's a different story though. This is a much weaker card, probably used at 1080p, so 4GB isn't as big of a limitation as the GPU core itself. It's more upsetting that they're removing the video encoders and AV1 decoding.


Skull_Reaper101

4GB is a limitation. I have a 1050 Ti, and in modern games like CoD I can't run anything above medium because there's not enough VRAM. Not to mention the 64-bit bus on the 6500 XT is half of my 1050 Ti's (128-bit) and could cause some severe bottlenecks and stutters.


Mataskarts

Honestly, my 4GB RX 580's limit only started showing up VERY recently. The first game where 4 gigs became more limiting than the GPU itself was Half-Life: Alyx (an absolute minimum of 6 is required); since then I've only played Halo Infinite's campaign, which also maxed out my VRAM to the brim and started dropping frames real hard on the open map. Other than those 2, I haven't played a game where the 4 gigs is a limit, for now. And I predominantly play at 1440p and in VR. Your issue seems to lie with the 1050 Ti being a very weak card in the first place; it's targeted at 1080p low/medium in the first place >_>


nero10578

That's probably mostly to do with the weak GPU in the first place.


ThePot94

Man, as much as I truly dislike this product (6500 XT), you're comparing 7Gbps with 18Gbps VRAM, which even on a 64-bit bus interface outperforms the 1050 Ti's GDDR5 in bandwidth. Add a little extra coming from the 16MB of Infinity Cache and eventually SAM, and you'll find these two cards deal with those 4GB in different ways. All of this assuming you're in the 6500 XT's ideal scenario, with PCIe 4.0, etc... which is out of any logic for an entry level graphics card.


simpl3y

You also need to consider bandwidth. 6500 XT has an effective bandwidth of 305GB/s, 1050 ti has 112 GB/s


ArseBurner

> You also need to consider bandwidth. 6500 XT has an effective bandwidth of 305GB/s, 1050 ti has 112 GB/s

Pretty sure that's incorrect and the 6500 XT's memory bandwidth is only 144.0 GB/s. Source: [TPU](https://www.techpowerup.com/gpu-specs/radeon-rx-6500-xt.c3850)

A better comparison I think would be the RX 570 4GB, which while having the same amount of RAM was cheaper ($169), faster, had more bandwidth (225 GB/s), and perhaps more importantly has the video encode/decode engine intact. Source: [Also TPU](https://www.techpowerup.com/gpu-specs/radeon-rx-570.c2939)


simpl3y

Yea, you are right. I checked AMD's website and it has an effective bandwidth of 305GB/s at the top, while the spec sheet at the bottom says 144. https://www.amd.com/en/products/graphics/amd-radeon-rx-6500-xt I blame the article I read to get that info.


ukieninger

The effective bandwidth takes the Infinity Cache into consideration, but I don't have a clue how they calculate it. That's maybe why there are two different values around.


Taxxor90

They calculate it by testing the average hit rate of the cache at a given resolution, i.e. how often the GPU can actually draw data from the cache directly instead of needing to use the VRAM. Say it has a hit rate of 50%, so half of the time the data can be served from cache; then the effective bandwidth of the VRAM is doubled. With 305 effective and 144 standard, the hit rate seems to be ~53%.
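A rough sketch of that relationship (my own simplification, not AMD's published methodology; it assumes every cache hit fully avoids a VRAM transfer):

```python
# Approximate effective bandwidth given an Infinity Cache hit rate.
# Only misses touch VRAM, so the same VRAM throughput services
# 1 / (1 - hit_rate) times as many requests.
def effective_bandwidth(vram_bw_gbs: float, hit_rate: float) -> float:
    return vram_bw_gbs / (1.0 - hit_rate)

print(effective_bandwidth(144.0, 0.53))  # ~306 GB/s, close to AMD's 305 figure
```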


Shaggi72

128-bit GDDR5 @ 7 Gbps: 112 GB/s
64-bit GDDR6 @ 18 Gbps: 144 GB/s
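Those numbers fall out of the usual peak-bandwidth formula, (bus width in bits / 8) x (per-pin data rate in Gbps). A quick check of the arithmetic:

```python
# Peak memory bandwidth in GB/s from bus width (bits) and per-pin data rate (Gbps).
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbs(128, 7.0))   # 112.0 GB/s -> GTX 1050 Ti (GDDR5)
print(mem_bandwidth_gbs(64, 18.0))   # 144.0 GB/s -> RX 6500 XT (GDDR6)
```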


ThunderClap448

I mean, the bus is as much of an issue here as it is on the 6900XT. People said, and I quote "if you think 6900 XT can have 16GB of VRAM and a 256 bit bus, you're a moron" to me. Let's wait and see if it actually makes a difference.


Skull_Reaper101

Yeah. Though I think the PCIe 3.0 x4 limitation might hit harder. But we'll know when the card launches.


ThunderClap448

Definitely gonna be the issue, similar to the 5500 XT iirc. Though the more I look at it, the more this makes sense: https://www.youtube.com/watch?v=0V3aBGp6kSc&t=424s And, as a comment below pointed out, the biggest reason the 3050 will be snatched by miners while the 6500 won't is the VRAM size; ETH DAG files are too big. Other coins do exist, but they're much smaller in market share. We'll see though.


Skull_Reaper101

Yeah


chic_luke

Boggles my mind that upcoming AMD laptop iGPUs will have AV1 decoding but not AMD's latest dedicated GPU.


angel_eyes619

haha so accurate


BobSacamano47

How are they saying the opposite? This is a shit tier card. It's not intended for modern gaming.


pag07

I think even AMD doesn't consider it a GPU pointing towards the future. More like a GPU to mend the cracks.


Vivorio

The 4GB is on purpose, to avoid mining. Claiming they'll say anything just to sell makes no sense if what they're doing here actually makes sense.


AK-Brian

Task failed successfully.


MaxGokue

i7 2600K, are you sure bro? With a 1080? How is that CPU holding up? I might look one up if it's cheap.


exaltare

AMD: Our card requires PCIe 4.0, has 4 GB of VRAM, and has a 64-bit interface. We promise that's only to stop miners. It's not to save on costs or exploit the market. You won't miss HEVC encoding or AV1 decoding. It's still a good gaming card that's worth every cent of $200 USD.

Narrator: It wasn't.


double0cinco

The missing encoding features are the most annoying thing imo. Interested to see PCIe 3 vs 4 benchmarks though.


ThePot94

I would say the missing **AV1 decoder** is way more annoying. Only a niche of people would encode/stream with this GPU, while literally everyone would benefit from AV1 for YouTube/browsing. It's like an already old GPU.


eetsu

Encoders are ASICs that take up space. I wonder if AMD decided it's best to leave out the encoder to achieve a smaller die size and get the most out of their N6 wafers. Decoders are essential for battery life, and since the 6500M uses this chip, that's probably why they left in the decoder. At the end of the day I'd like to know how small this chip is...


double0cinco

Does laptop Navi 24 include the encoders? I haven't seen the spec sheet for those in that detail. If so, it's possible these desktop parts are binned dies that had deficient encoders. Either way, whatever the reason they decided to do this, it's annoying to me. Most people probably won't notice, but I was hoping to eventually use a low end card for Steam streaming in my server. So it's just annoying I can't pick one up for that purpose.


Defeqel

Laptop Navi 24 had them removed from the product page, but Rembrandt has them (at least decoders).


eetsu

I don't think so. The 6500M is based on Navi 24 and only has decode for H.264 and H.265. No encoders and no decoder for AV1 (which is a pretty complicated codec based on what I've seen with software render times; I assume a decode ASIC would be pretty complex). Hence my theory is that these have been left out to reduce die size.

GPUs that use Navi 24: https://www.techpowerup.com/gpu-specs/amd-navi-24.g965

6500M specs: https://www.amd.com/en/products/graphics/amd-radeon-rx-6500m

6300M: https://www.amd.com/en/products/graphics/amd-radeon-rx-6300m

6400: https://www.amd.com/en/products/graphics/amd-radeon-rx-6400

6500 XT: https://www.amd.com/en/products/graphics/amd-radeon-rx-6500-xt

The 6300M, 6500M, 6400, and 6500 XT all seem to have the same xcoder setup, and *typically* we see the same xcoder configuration across all SKUs that use the same chip. I'd be extremely surprised, honestly, if they had an encoder ASIC on this chip but decided to disable it for all known SKUs. It wouldn't make any sense, as I'd assume "hyper scaler" miners would be using CLI or RDP anyway for remote management, not Parsec. To me, the most logical conclusion is that it's there to reduce wasted die space.

AMD has no problem keeping Zen 3 in stock, which uses ~80 mm² chiplets for the main CCD. I wonder if they have tried to get this thing as close to ~100 mm² as possible. (Yes, I'm aware that miners are not buying anywhere near as many CPUs as GPUs.) Navi 14 was 158 mm², had a larger bus (128-bit), had the encoders and decoders, and was on N7. Navi 24 is on N6, which really only improves density, has half the bus width (64-bit), 4x PCIe 4.0 lanes (half again), and cuts out the encoders and the AV1 decoder (which sucks for laptops, but if you're only watching YouTube on your desktop it shouldn't matter that it's done in SW, especially if you have a high core count CPU).

EDIT: I also see no mention of a VP9 decoder, which if you're a laptop buyer is pretty bad. AV1 will likely become more of an issue in the future, and your battery life will suffer once YouTube uses it more than today and other sites adopt it as well. If this thing has no VP9 decoder, the 6500M and the 6300M are worthless, because you can't watch YouTube without draining your battery like no tomorrow... (unless AMD is hoping the iGPU will take care of the decoding?)

EDIT2: I should also state the obvious: the 5500 XT (Navi 14) had 22 CUs, and the 6500 XT has 16 CUs. I believe the N24 chip has only 16 CUs, which means this really should be an all-around smaller chip than N14 (which I think has 24 CUs?). It also shows that AMD must be more confident about yields this time around.

EDIT3: I was right! Die size is 107 mm²! AMD did try to get this thing down to ~100 mm²! https://www.techpowerup.com/gpu-specs/amd-navi-24.g965


void_nemesis

Both the current Intel and AMD iGPUs can do VP9 decoding, and some (iirc, not 100% sure) can do AV1 (Alder Lake can, not sure about Ryzen 4/5/6000). I think they're hoping that hardware acceleration will target the iGPUs, which it should by default unless you're using an external screen that connects only to the dGPU.


GeronimoHero

All CPUs can do it as a software decode. Alder Lake does have it as a hardware decoder, but I'm pretty sure it's only in the models with an iGPU. Zen 3 does not have an AV1 hardware decoder. On the Nvidia side, all of the 3000 series does AV1 hardware decoding, and even the GDDR5 1030 has VP9 hardware decoding.


eetsu

Usually, the encoders/decoders are part of the i/dGPU, not the CPU cores. Zen 3 SKUs that aren't APUs don't have them for that reason. ADL has them because the iGPU supports it, not because the ASICs are part of the Golden Cove or Gracemont cores (that's also why the F models don't have it).

When anyone says they're doing something "in software" it means they're not using any hardware accelerators, just bog-standard code running on the CPU. It's not ideal for decoding because it eats up CPU cycles that could be used for other tasks, and for laptops software decoding being a multi-core CPU workload results in much shorter battery life than using an ASIC built for decoding. The same is true for encoding, but decoding matters more to your average Joe. On desktop, hardware decoding is nice because it frees up the CPU for other things. Say you like watching Twitch or YouTube in the background while you game; it would be nice if those background tasks had a minimal impact rather than using up half your CPU, which would for sure affect gaming performance.

For YouTube, N24 isn't awful if it can do VP9. You can get extensions to force either H.264 (not ideal, but you can if it's necessary) or VP9. The question is: in how many years will it be necessary to watch AV1 content (i.e. YouTube no longer provides H.264 or VP9 renders)? When that time hits, you'd either have to put up with software decoding or jump to another card (and hope the GPU crisis is over by then).
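If you want to put a number on the CPU-cycle cost being described, here's a rough way to measure it (a Linux-only sketch; it assumes ffmpeg is installed, "clip.webm" is a placeholder for any local VP9/AV1 file, and "vaapi" is swapped for whatever hwaccel your platform exposes):

```python
# Compare CPU time spent decoding a clip in software vs. with a hardware decoder.
# os.times() children_* fields account for the finished ffmpeg child process (POSIX).
import os
import subprocess

def decode_cpu_seconds(extra_args):
    before = os.times()
    subprocess.run(
        ["ffmpeg", "-v", "error", *extra_args, "-i", "clip.webm", "-f", "null", "-"],
        check=True,
    )
    after = os.times()
    return (after.children_user + after.children_system
            - before.children_user - before.children_system)

sw = decode_cpu_seconds([])                      # software decode, all on the CPU
hw = decode_cpu_seconds(["-hwaccel", "vaapi"])   # offloaded to the decode ASIC, if present
print(f"software: {sw:.1f}s CPU, hardware: {hw:.1f}s CPU")
```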


clicata00

In laptops it would be the iGPU that handles most video decoding.


LeiteCreme

I don't think so; it relies on the iGPU to do that, which is more efficient.


g0d15anath315t

Looks like the idea of Navi 24 was to lean on the iGPU for encode/decode and use the discrete GPU purely for improved gaming performance. Really the only way it makes sense, as all laptop CPUs come with iGPUs and most/all of those typically have some sort of encode/decode support. When N24 got leveled up to discrete card status, it looks weird because plenty of desktop systems lack an iGPU.


double0cinco

Ok, this makes sense. Regardless of how unsuitable it is for my use case, I hope they make a ton and saturate the market as much as possible, to hopefully put downward pressure on prices at least.


fatherfucking

It doesn't, because they don't need them in a laptop dGPU when modern laptop CPUs all have iGPUs with encoders and decoders. It's essentially redundant space on the die for the majority of systems.


Veelhiem

As someone looking for a recent GPU for a dedicated recording rig (1440p 60hz 21:9), this card having the encoder would be the best thing ever. Perfect world scenario? Probably the Quadro RTX A2000, 1650 Super or potentially the 3050.


megasmileys

I mean, tbh I'm sick of seeing cards with amazing-value MSRPs that don't exist cos of scalpers. I want a GPU with a shit price right from MSRP.


madn3ss795

You're assuming cards with good MSRPs can actually be sold for that price. Last year XFX already revealed that AMD sold the GPU to AIB partners at a price so high that selling the complete card at MSRP is impossible.


geonik72

yeah, the RX 570 had an MSRP of $160 I think, and we all know how that went


fliphopanonymous

> Last year XFX already revealed that AMD sold the GPU to AIB partners at a price so high that selling the complete card as MSRP is impossible.

Can you link a source for that? I've looked but can't find anything other than Hardware Unboxed making a similar claim based on conversations they had with PowerColor when reviewing the Red Devil 6800 XT. I can't find anything definitive about the costs the AIB partners pay for each; just some "they told us this" stuff from secondary sources. Which, to be clear, is the same thing that's been said for NVIDIA AIB partners and Ampere.

Without a source it all reads like a lot of blame passing and tomfoolery. Personally I don't believe the board partners any more than I do AMD and NVIDIA. Both NVIDIA FE and AMD reference cards are priced vastly under the partner versions of the same cards (with some extremely minor exceptions). It's entirely possible AMD and NVIDIA are taking margin hits on FE/reference designs to "compete" with each other, and that the actual pricing is better reflected by the partners. It's also entirely possible that the partners are spewing BS and taking significant wins on margin due to the shortage.


Defeqel

See everything that is not an RX 6800 XT or RTX 3080. Seriously, the MSRPs of all of them are bad; the market prices are just way worse.


eeeponthemove

I read somewhere on reddit that it was only ever intended to be a laptop GPU from the beginning.


Put_It_All_On_Blck

Yeah, this is absolutely marketing FUD. The card is simply bad any way you look at it, the only reason it exists is because people are desperate for GPUs and will buy anything for way more than it should cost. Notice how AMD has done NOTHING to make their other cards less attractive to miners or get them into the hands of gamers. This is PR just to try and make them look better for peddling an AWFUL GPU.


TheLegend84

True, but at least they don't sell cards wholesale to cryptominers


ResponsibleJudge3172

Who says they don't? Or that other(s) do?


LightningZ71

That's what pisses me off. This is a low end card. It's going to be going into systems that don't have any iGPU, like Ryzen 1000, 2000, Intel F series chips, and maybe some low end 3000X systems. NONE OF THOSE SYSTEMS have these encoders/decoders. It's also going to go into lower end APU systems like the 2200G, 3200G, 2400G, 3400G, which, while having encoders, are also just PCIe 3, so VRAM shortfalls hurt even more on this card than on the 5500 that had an x8 interface.

"Well, what about PCIe 4 systems!" Who is gonna buy this that could afford to build a PCIe 4.0 system?!?! That's B550, X570, and higher end Intel 11th gen or newer. There are precious few people who bought that and didn't already get a better card than this during their build/purchase.

This card is intentionally bad, not just for mining, but also for gaming in the FUTURE, as games only want more and more VRAM at lower resolutions. The chip is probably fine for mobile, which was its intended purpose.


zenoen

Right now an RX 580 at $200 would at least get many people up and running if their cards have died, like my dad. Would you rather have a card or not, because good cards can't be bought at MSRP. This might actually hit MSRP.


48911150

>if their cards have died, like my dad.

I'm sorry for your loss


JC_D3NTON

Sorry about your dad...


capn_hector

yeah, if they really wanted to stop mining they'd do a mining limiter like Nvidia. Yeah, Nvidia isn't really serious about it, but that doesn't mean it couldn't be done - even with open source drivers, AMD could still have the card firmware look at the memory bus and throttle when it sees processes with mining-like memory patterns (100% random unaligned accesses, 100% memory bandwidth utilization, etc). Cutting VRAM isn't a serious attempt to stop mining; it's a serious attempt to cut costs, because they know anything will sell right now and they're taking advantage.
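Purely to illustrate the kind of heuristic being described here, a hypothetical sketch (nothing AMD or Nvidia has published works this way as far as I know; the counter names and thresholds are invented):

```python
# Hypothetical "mining-like access pattern" throttle, as described above.
# The inputs would come from memory-controller performance counters;
# all threshold values here are made up for illustration.
def should_throttle(bw_utilization: float, random_access_ratio: float,
                    sustained_seconds: float) -> bool:
    MINING_BW = 0.95       # near-100% memory bandwidth utilization
    MINING_RANDOM = 0.90   # overwhelmingly random, unaligned accesses
    MIN_DURATION = 60.0    # sustained far longer than typical game workloads
    return (bw_utilization > MINING_BW
            and random_access_ratio > MINING_RANDOM
            and sustained_seconds > MIN_DURATION)

print(should_throttle(0.99, 0.97, 600.0))  # True  -> clamp memory clocks
print(should_throttle(0.70, 0.40, 600.0))  # False -> looks like a game
```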


Charcharo

I am against locking down capabilities on GPUs like that.


aerokozmofotointer

Guys, let's be honest, it's a crappy card.


Darkomax

Yeah, I'm sure it being crappy at mining isn't just a collateral effect of it being crappy at everything.


szczszqweqwe

Let's be honest, in this lower price bucket it's either this or a 1030.


mista_r0boto

Crappy and available is better than nothing. It will add supply to the market - both oem supply and bare cards. That’s a good thing.


PoopnEvryDay

Yeah, and I'm not buying it. That said, if it can game and it's available it'll sell well.


ShadowRomeo

They made it so unappealing that even legit potential gamers are disgusted by it though...


drtekrox

Without hardware video decode, I don't even want it as a host card in a VM box.


Defeqel

it does have hardware decode, just no AV1


Flaktrack

I had read it does not have h.265 either though?


LectorFrostbite

>Radeon RX 6500 XT is bad ~~at cryptocurrency mining on purpose~~


carnewbie911

You can uncross the last bit. It's bad on purpose.


danij3l__

This is full of BS... so now 4GB is an advantage? Same as the 64-bit memory interface? How does 4x PCIe help? How about no video encoder? How about no AV1 decoder? Why is it $200? I would rather unclog drains for the rest of my life than work in marketing. Disgusting.

"We have really optimized this one to be gaming-first at that target market," Smith said. "And you can see that with the way that we configured the part. Even with the four gigs of frame buffer. That's a really nice frame buffer size for the majority of AAA games"

"For starters, there isn't an 8GB version of the 6500 XT. The 6500 XT also uses a 64-bit memory interface, which is exceedingly rare in modern GPUs—you'll sometimes see it in low-end dedicated laptop GPUs like the GeForce MX 450, but discrete GPUs released within the last couple of generations have mostly stuck to 128-bit memory interfaces at a minimum."

GTFO


TehWildMan_

> how does 4x PCIE help? how about no video encoder? how about no AV1 decoder? This *really* smells like a mobile chipset being repackaged into a desktop GPU.


danij3l__

It shouldn't be $200. This is just greed.


stealer0517

Welcome to the current GPU market. And honestly just about every market for the last 2 years. Until GPU-based crypto mining dies off again and transport/supply issues get sorted out, we won't have anything appealing for a good long time.


chapstickbomber

GPU crypto will never die. It is a proven solution to the Byzantine generals problem and costs only commodity electronics and electricity, both of which are essentially zero cost in the long run, while consensus is extremely valuable.


ocelotking

The market wasn't nearly this bad during the dogecoin mining craze a few years back. While doge didn't have this long of a profitable opportunity period, ether is already phasing out proof of work. The bigger issue is likely the shortage lasting so long, which results in normalizing these absurd costs and practices. I'm not the biggest Nvidia fan for how they limit drivers and such, but at least the LHR concept made sense for selectively pressuring out miners. AMD is (or claims to be) cutting off their nose to spite their face. I have to restate that Nvidia pissed me off in the past: I've never had to get an unlocked driver for an AMD card to do multiple video transcodes, and I also wasn't a fan of the LHR concept's limitations. But LHR at least makes sense in that it does what it is intended to do.


chapstickbomber

It's literally a silicon fingernail and two G6 chips. That's what it takes to attack this market on price.


chithanh

What other card that you can buy today for $200 is better, then?


Son_of_El_Duce

It's a turd, AMD knows it's a turd, and they're trying their hardest to polish said turd.


danij3l__

Well ... would be nice if they walked away with some brown on their hands ... but it does not work that way nowadays.


Vicestab

You're getting downvoted for understanding PR and marketing, by people who don't understand PR and marketing. The irony must go over their heads, surely.


danij3l__

arrow up or down means nothing to me :)


skylinestar1986

>how does 4x PCIE help?

Imagine if it were capped to PCIe 2.0 x4 like the 4700S desktop kit.


bubblesort33

Because it's a 6400 XT that needed to be rebadged, because the marketing department saw the price. So they knocked the 4 up to a 5.


asterics002

It's bad at everything on purpose


chefanubis

"We purpusefuly designed it wrong, as a joke"


ballsack_man

It's not just bad at crypto, it's bad at everything. It's too expensive for low-budget buyers, and anyone looking to spend upwards of $300 is probably looking at a 3060 or 3060 Ti. Remember: this card is AIB-only, so the final price won't be the MSRP AMD announced but whatever price the AIBs set. And if that's not bad enough, this card only has 4GB of VRAM. It's pretty sad to see GPUs crippled by their memory capacity. In a couple of years, that card might still be capable of 60fps at 1080p, maybe even 1440p, if... it can fit the game in its memory. This product looks like it's not really targeted at anyone; it's just AMD making a stance against mining and using the card as a statement.


TalkWithYourWallet

Even if it is to stop mining (which it isn't; it's a way to cut costs and increase margin), this card will age like milk and basically force a near-future upgrade. It's the perfect card from a business standpoint.

Tbh, even looking at it from a mining angle, this is not a win: you're choosing between a card with gimped performance that you can probably buy and cards with good performance that you probably can't buy. Neither way is a win.

This card is the antithesis of 'if you can make a GPU, it will sell'. Anyone looking at this for gaming, I implore you to just look at a console; they perform better and will age better than this card, and the system will cost less than a 6500 XT PC.


fatherfucking

Low end cards have never aged well anyway. The cards that age the best will always be high end cards, because they have the most raw power. Cards that can do 4K high settings now will likely still be capable of gaming at 1080p low-medium settings in 7-10 years. The main reasons you'd want one of these are as a stopgap, an HTPC, or if you're building on a strict budget.


TalkWithYourWallet

True, except this is an exceptionally poor low-end solution that is commanding a price premium (4-year-old midrange performance for the same price, and by all accounts worse today, because you can't buy an 8GB version and you lose encoding, etc.).

The stopgap solution is a console for gaming, not the 6500 XT; it's too expensive for what it is. People who say they need a dual work/gaming PC can just buy a cheap used office PC and still be in a better position with that plus a console.


zenoen

It was pretty obvious what their intentions were here at $200 MSRP. Now we just have to see if scalpers can keep up with a card made on a die as small as this. The 6600 XT only stayed at retail for the first couple of days. This also doesn't stop anyone from price gouging, but if we can just finally get a 1080p card back on the market, that would be nice. Because Nvidia just made another miner card, and we can all tell that by looking at it; no way their GPU goes for less than $500.


chapstickbomber

Miners hate 64-bit cards because they mean more power cables, more risers, more config, etc. It kills the power efficiency once you factor in capital and labor.


szczszqweqwe

Yup, I guess after a few days the 3050 will stay at about $350 and the 6500 XT will be at about $250, maybe $300. If AMD keeps the volume of the 6500 XT high, its price will fall nearer to MSRP in Q2.


JTTigas

I just accepted that I will no longer be a PC gamer until I finish school and get a job. 1650s are going for 400€ here... And my 390X is collecting dust on my desk after suddenly dying on me...


Defeqel

Once you get a job, you value money too much to spend it in this market (unless you can make money with that investment).


Defiant-Nobody-1713

You could probably get £80 for your 390X. Enough for a 1030...


BlatantPizza

A 1030 is worse than a 5600g apu


Defiant-Nobody-1713

5600G is not £80


worldbuilder95

Good


ebrandsberg

I'm honestly thinking that this chip may end up providing an opportunity for a fanless single slot card for my servers.


Farkas979779

You can mine RVN with 4 GB of VRAM, AMD.


3kliksphilip

[It's so bad, it's good](https://youtu.be/0V3aBGp6kSc?t=424)


hujan86

But will it still get gobbled up by scalpers though?


acko1m018

I think even scalpers do not want this gpu.


[deleted]

[removed]


rubberducky_93

The CES slide claimed it's faster than the 1650 and RX 570, so 1660/580 performance is more likely.


bubblesort33

RX 590, RX 5500, or GTX 1660. It has 10% more teraflops than a 5500 XT, but maybe less effective bandwidth. People shit on it for having a 64-bit bus, but it does also have 18Gbps memory, which is 29% faster than last generation. Add in the fact that it has the same amount of L3 cache per CU as the 6600 XT, one-to-one, and all that might mean it actually has the same effective bandwidth in gaming as a 5500 XT, although it might not appear like it at first. As long as you stick to 1080p, and certainly if you stick to 720p and upscale using FSR or XeSS, it should be fine. It just sucks getting an RX 590-like performance card some 3.5 years after launch for almost the same MSRP.


looncraz

AMD's plan is to inundate the market with as many 6500 XT GPUs as they can manage. It's a pretty good idea that might bring over a lot of future users if the experience is good for those users. I foresee their poor OpenGL performance on Windows being a top issue, though.


SolidHayterForever

What even uses OpenGL anymore? Vulkan has pretty much taken that spot.


canned_pho

Minecraft Java Edition. [The most popular game currently in the entire world and in history...](https://en.wikipedia.org/wiki/List_of_best-selling_video_games) Over 30% of users are running modded Minecraft as well, and the best, most popular servers are Java-only. The Windows DX12 Bedrock Edition runs much, MUCH better, but it ain't popular and lacks good servers and mods.


Janmm14

One reason is that Microsoft made many decisions against non-featured servers: no thumbnail in the server list, shorter MOTD, people needing to reverse engineer the protocol... Also, the user interface of Bedrock on PC is shit. You want the classic Java-like ingame UI? You get ridiculously small (width) menus.


Entr0py64

Funny that a minecraft modder wouldn't mod the renderer, as there are performance replacements.


Demy1234

Yup. Sodium and OptiFine make the game run far better on both AMD and Nvidia GPUs.


dbaaz

A lot of 3D software still does. Blender is pretty much crippled on an AMD GPU, which is why I have been vendor locked to NVIDIA the past few years. My last AMD GPU was an R9 280x :/


looncraz

Far more than you'd ever imagine... Vulkan is the buzzword of the day, and being derived from AMD's Mantle gives AMD an advantage on that front, but OpenGL is still one of the most commonly used frameworks for games... and AMD has piss-poor performance with it (on Windows). X-Plane 11 is the biggest one that impacted me directly: the native Linux version is amazingly playable and smooth, but the Windows version was terrible when using AMD.


Tzavok

I don't think the card is bad; I think the price is bad. 200 USD for this card isn't right. But if the stock is decent, it could bring down the prices of used GPUs, which would be nice.


juancee22

There is literally nothing on the market for that money. A 1650 is at $350 or more, and it's worse than this GPU.


Tzavok

That's my point: all prices are bad, but this card could make other cards go down in price. If this card actually ends up around $200-250, then a 1650 couldn't be sold for $350.


dynozombie

so my 5700 xt goes up in value now?


GoatzilIa

Where have you been? Last time i checked, you can sell your 5700xt for around 900 USD.


eeeponthemove

In Europe too?


MrHyperion_

700-750


WeabPep

€800-850 on the Dutch marktplaats.nl (website for reselling cards). (If you merely search for '5700xt' you get bounties as well, plenty of people have put out a €750 bounty for a 5700xt but some of these bounties are weeks old.)


0Spryth0

Sold mine a couple days ago for around 860€ ($970), EU. I could probably have gotten more if I waited, not sure... sold it on day 1 after dropping the price to 860€.


shendxx

Yes, it's even bad for content creators; there's no encoder, which has been a standard feature since GCN.


knomore-llama_horse

Can they make them all bad at crypto mining please.


Kionera

That would mean investing resources into making an LHR-style limiter like Nvidia, which clearly doesn't deter miners anyway, or limiting all cards to 4GB of VRAM, which gamers would then just complain about, like they're doing now.


Demy1234

Yeah, miner developers keep working at breaking LHR, and it works to varying degrees, with a lot of people getting over 80% of the full mining performance out of the card. It's a waste of resources for AMD to try and implement something similar, especially seeing as NVIDIA cards are being scalped and marked up at insane prices even when they are LHR.


BarKnight

Is it also bad at gaming and video on purpose?


lucasdclopes

It is also not very good at gaming on purpose.


Roph

You mean RX 6500XT is bad on purpose*


PuzzleheadedNote3

The card looks like shit, but to be honest, given that Nvidia releases more expensive, higher-margin cards rather than producing more of what the majority of people want, what AMD is doing makes sense. At least it's a way to address the massive shortages on the lower end. Trying to build right now is damn near impossible unless you're willing to pay a truckload more than you should.


killchain

... and yet it's still sold for 2x MSRP. It's getting ridiculous. Edit: I was thinking about the 6600 when I wrote this (since the 6500 isn't on sale yet), but it's probably going to be the same.


whosbabo

I really don't understand the hate this entry level GPU is getting. The whole supply chain has jacked up prices. Gas, groceries, everything is more expensive. Even TSMC has said they raised prices. If this GPU really sells for $199 (which it should, given how useless it is for crypto), it's a win-win. Because if you try buying, say, an RX 580, it's $300+ for a used card on eBay. So for the first time in two years, we will actually have a GPU for $200 capable of 1080p gaming. We will see about the actual price of the 3050, but if it's anything like the 3060, it will actually cost $500.


st0neh

Why does it also include worse codec support than GPUs from 10 years ago though?


SatanicBiscuit

Well, the moment you decide to remove its ability to encode/decode AV1 and H.265, you've made it universally useless.


drtekrox

For desktop. It's a mobile part, and in that world the lack of hardware video codecs is fine, since it'll be sitting alongside an iGPU that will handle all of that. VCN 3.0 == 4 RDNA WGPs in size, so removing it saves a LOT of space and is smart for a mobile GPU. This part should never have been sold on desktop though, and AMD deserve the flak they'll cop for it.


mabhatter

Video codecs would be THE thing to add on a cheap card. Everyone looking for a cheap card would want codecs to hardware accelerate YouTube and Zoom. Those don't need lots of GPU, just the hardware codecs built in.


A-nom-nom-nom-aly

Bad at cryptomining... bad at gaming... Only reason it will sell is because buyers are desperate for any kind of GPU


hellorobby

Is it possible that it's cheaper to buy a prebuilt piece of shit with a good card, or to buy a decent prebuilt and part it out except for the card, to get a fair price on the card itself?


bubblesort33

It used to be the case. The prebuilt market has caught on, and it's not that affordable anymore.


MandyKagami

It is bad on everything else too for the price.


zakats

The PR blow-back (completely foreseeable btw) probably wasn't worth pulling these from being allocated for laptops.


Final-Rush759

Bad at everything on purpose


Death2RNGesus

Looks like an attack on the used card market more than an anti mining card.


pecony

If it can run on PCIe 3.0 x8 at least, I believe it will be a very successful product. I have no qualms with 4GB of VRAM; hell, the RX 570 4GB is still the go-to option for most gamers. (I'll be honest, they should never have ceased production of that series.)


antiname

If this card had the same MSRP as the 1050, there would probably be a lot fewer complaints.


nmkd

It's also bad at everything else.


rasmusdf

It's just bad and overpriced all round - because why not.


tema3210

I really want miners who use GPUs to get condemned to 1024 years of mining something other than cryptocurrency. Use goddamn ASICs!


GeekOfAllGeeks

It's like they turned Clippy into a GPU.


voidspaceistrippy

This is PR speak for, "We realized we had enough wafers to make this, and we know you'll buy it, so here's this POS. At least it's cheap.. at MSRP™"


riderer

as far as i am concerned, it's bad at everything


Successful-Willow-72

"Radeon Rx6500 XT is bad on purpose" there, thats more like it


Hito_Z

This statement will have to be backed up. Even worse if it ends up being just "RX 6500 XT is bad".


DisgustingDiver

There are many better GPUs out there now.


Cheesybox

Short version: wait for benchmarks before bitching about this card nonstop. Long version: I have a 4GB R9 290. It's fine. I can play a decent number of games on it at 1440p low-medium and get at least 60 frames. Admittedly Doom Eternal was a struggle, but that was also CPU-locked on my end, and I still managed to get 60-70 fps at 1080p on low with like 60% resolution scaling. Any "esports" titles will run just fine (I can get 160ish FPS in CSGO at 1440p. Haven't played LoL in years, but that ran at 140+ maxed out at 1440p also). If this stays around $200-250, then someone who genuinely *needs* a GPU (upgrading from integrated/GT790), can get something brand new with a warranty, new drivers, etc. and not have to hope for the best with a 4+ year old used card. If you've already got a good system, this is a cheaper option than overpaying for a marked up PS5/Xbox (those are going for $700+ usually). But no, we're not getting "good" cards at anything approaching a reasonable price right now. No, there is no end in sight. Maybe another year if we're lucky. Your options are: * Keep running your existing build if you have one * Drop serious money on a scalped card that's genuinely good * Build a system with this at (hopefully) MSRP * Find a new hobby Yes it fucking sucks, but there's nothing we can do about it.


Putins_Pinky

It's definitely good at outputting to one HDMI and one DisplayPort device.


drtekrox

Not really, there are better, far cheaper solutions for that, those have hardware video decoding and use less power too.


mgzaun

Isn't it bad at everything?


gen_angry

If it's actually sold at MSRP, I'll take it. It's SOMETHING at least: a suitable card that offers performance above integrated graphics while waiting for the better stuff to settle down. My nephew is looking to build a PC around the 1300 CAD range soon; this would be suitable for him for the time being. Hopefully it doesn't just paper launch and get added to the list of scalped cards out there.