Competition is good for us. I hope these get the driver love they need.


Intel has been making integrated graphics since forever. They can't be as far behind as some say they are.


They say that themselves: older DirectX titles are lacking. If you're fine with that, you've got a deal. That being said, I've heard that DXVK translation can boost performance in those titles significantly.


I like that they are pricing these based on the Tier 3 game performance. Hopefully this becomes a good universal option soon, instead of being niche to just modern games.


I can't imagine they're super far behind either. They're titans in the CPU market and, like you said, already implement their own integrated graphics. It's a multi-billion-dollar tech giant. They absolutely have the budget, resources, and connections to make a proper GPU. It might take a bit of time since a proper dedicated GPU is newer territory for them, but it would be dumb to assume they couldn't figure it out in a reasonable amount of time.


Right? You'd think they'd have some architecture that's within their ability. I've had high hopes for these cards solely as direct competition, and I'm actually surprised they didn't try this sooner, pre-crypto. I understand they were probably trying to get in on that market cash cow, but hindsight. For real tho, they should have been doing this years ago. Imagine how it would be today if they'd released a GPU in the mid-2000s or early '10s? Daayum. Market competition would be way different.


I really hope Intel gets their shit together with GPUs... we could really use a third company to provide some more competition.


Numbers show that people don't buy AMD even when they compete with or beat Nvidia on price. Competition is dead because people want Nvidia so the prices will continue to rise.


Mindshare goes a long way. Over here, the average person trusts whatever the salesperson gets them to buy. Intel i7/Nvidia sticker with no mention of the particular model (with the inclusion of the generation if you're lucky) on a crap-tier ultrabook? Must be the latest and greatest! Heck, prior to Ryzen, I'd be hard-pressed to find people who even knew what AMD/Radeon was over here. I'd be more than happy to see Intel and AMD knock Nvidia off its crown. Prices are atrocious, especially when retailers price gouge to hell and back here.


It fluctuates. 15 years ago AMD and NVidia were neck and neck. AMD did quite well in competition. There was a period where AMD couldn't keep up and NVidia ran away for a whole set of generations. Hard to re-equalize again after that.


That, and NVIDIA pretty much cornered the GPU programming and AI market with their walled-garden programming environment. From CUDA to Tensor cores, the competition can't offer what NVIDIA has: a wide range of software supporting NVIDIA-exclusive features. As long as this remains, AMD and Intel could give cash prizes for taking their GPUs and people would still buy NVIDIA.


the unfortunate reality despite Radeon being at its best in years


5700xt gang still pushing 140fps


I mean, I've got a 5500 XT and I'm able to hit 165fps in a good selection of the games I play. I swapped a GTX 980 Ti for an RX 5500 XT; some would say it's a downgrade, but between the issues I don't have to put up with on Linux and the lower power draw (130W TDP compared to 250W) in this current energy climate, it's a win for me.


That card is a beast. Going on close to 3 years with mine now.


OneAPI has proven to be powerful so far


OneAPI to rule them all and in the darkness bind them....


This. Adobe and a whole lot of other software manufacturers work in tight integration with Nvidia and Intel. Some software, like Radiant Photo, won't even work on an AMD CPU!


Similar to how some graphics render engines like Redshift/Octane only work with Nvidia/CUDA. AMD support is on the way for Redshift at least, so there is some hope.


I use Blender for a few work things, and my 2070S is miles faster than my coworker's 6950 for rendering. Nvidia did their work at integrating well with multiple workflows. AMD either didn't care or didn't have spare engineers to work with companies. If I just gamed, AMD would be a no-brainer.


This is what has kept me reluctantly in team green


Also doesn't help that even when AMD has competed on the hardware side, their software has been extremely weak, so the entire user experience was not ideal.


Kids, smh, we was buying AMD cards when they were still ATI.


*Cough* Whippersnappers... I was buying GFX cards when the GPU was an add-in that you slotted in with a Bypass from your REGULAR VGA card.


What kind of voodoo is that???


No need to scream like a Banshee about it.


I still say ATI sometimes and the kids at work look at me like I'm crazy


I still use ATI in my rig. And yes, I might be crazy, but that has nothing to do with my GPU


Same, but in my retro rig. My HD3850 AGP is ballin


9700pro for the win


My first proper card was an ATI Rage 32mb of RAM, it was a beast!


AMD GPU mindshare is still fucked because driver issues still plague them a bit. RDNA 2 mostly fixes what RDNA 1 went through, and it's only getting better. I do hope RDNA 3 is flawless as AMD gets more money for their GPU division and hopefully a better driver team as well. Nvidia is pushing their luck every release, and Lovelace is no different. A lot of users woke up due to Lovelace's bad price-to-performance, and it's not even crypto prices lol. Hopefully Intel makes it big, but they will be faced with driver issues just like AMD. Baby steps... and soon Nvidia will fall and face a crashing reality check for their consumer GPU division (if it happens).


The spread of "AMD DRIVER BAD!" is part of the problem. I've been on a solid team red build since 2016-ish and have had zero driver stability problems. AMD drivers are not that bad, and if you know anything about proper PC maintenance, you won't have an issue at all.


Yeah, that's the point. I switched two years ago from Nvidia to an AMD GPU, a 5700 XT; not a single issue so far. Drivers and hardware working like a charm, and it's a beast. Man, I love this GPU, will never consider overpriced team green GPUs again.


Me too. Since 2015, and I've had only one driver issue, which was in 2015. Nothing since.


I’m going AMD for my next build (the EVGA thing was a dealbreaker for me), but I fully expect to have problems. Every third generation or so, I try AMD, and it always has issues, usually driver issues. It’s not FUD. It’s not like everyone just randomly buys the more expensive card for no reason.


The EVGA thing is a real bummer. I haven't been team green so much as I've been team EVGA for quite some time. Their customer service is amazing, even more so considering the trash service of their competitors.


That's the thing. There are no major issues with the AMD drivers. Sure, you get bugs from time to time, same as with Nvidia or any driver really, but they aren't bad. Yet people parrot this line over and over; I bet they haven't even used an AMD GPU in like 10 years.


I have generally had a better time with AMDGPU over Nvidia ever since ~Vega 64. Especially on linux where I don't have to install bullshit proprietary kernel modules.


On Linux AMD is way better, but for gaming purposes I had a terrible experience with the Vega 64. The damn thing kept crashing (screen goes black, then 10–20 seconds later it comes back up, having closed my game). I tried everything I found online as a solution: undervolting, overclocking, changing settings drastically, reducing video quality. Of course it's not like all AMD GPUs are like that, and most likely I'd just been unlucky, but I'm scared to go near them again.


That wasn't my experience with a Vega 64 at all. I had a Sapphire reference card. It was stable at stock speeds and with a mild overclock, and I never had any driver issues except with RivaTuner Statistics Server, which caused problems with my 3070 too. Sorry you got a bad card; you should try AMD again next time you upgrade!


Strange, I had no such issues with the Vega 64. Heck, I'm even doing the most dangerous thing (CFX dual Vega 64, but one card is Asus and the other is Gigabyte, and the two cards have wildly different clock speeds), and games that do support CFX, like GTA V and Elite: Dangerous, run buttery smooth. I've owned the Asus card for 5 years and the Gigabyte for 3 at this point. Solid as a rock to this day.


Higher end AMD GPUs straight up can't be had in my country right now. 6900XT went to $550 and boom, gone.


I'm normally an AMD guy; I use a Ryzen CPU. But CUDA is why I buy Nvidia. I'm a data scientist and need to have access to cuDNN. AMD is trying to enter the market with ROCm, but they don't support consumer-level cards. I can go buy any Nvidia card and start doing deep learning immediately. That's an issue if you want the open source community to support your ecosystem.


> That's an issue if you want the open source community to support your ecosystem.

Ironic, since CUDA is closed source and basically the result of Nvidia successfully privatizing academia. I'd like to see Intel and/or AMD support an open source alternative. AMD definitely can't do it alone though.


This is why I also bought an Nvidia GPU. For gaming, though, I would say that at this point ray tracing feels like a gimmick, so AMD GPUs are completely fine.


People want AMD to be competitive so that they can buy an nVidia product for a lower price


Bingo. This is what history tells us. If AMD is competitive this generation, you will see Nvidia lower prices a bit and AMD will lose out again.


Brand image takes some time to change. Some may still associate AMD with "cheap & bad-drivers". Just having the crown of "most powerful gpu" in the lineup will sell some lower tier cards just because of association, even if there are better value options. Similar to sport car brands selling overpriced mediocre bicycles.


I LOVE how competitive the CPU market is rn, and I'm glad AMD stepped up. But I mean, when was the last time AMD was truly competitive with Nvidia without ANY compromises (especially at the higher end)? Getting the Nvidia card has been the safe bet for quite some time, especially if you aren't up to date on tech news. For the last few generations it's been "oh, this is a great card... if you don't care about RTX or DLSS," or "oh, this is a great card, but the drivers are 50-50," or "this card is good, but not quite as good as Nvidia for streaming." And that's even IF they have something in the high-mid to high end in the first place.


Yep. Not to mention that atm Nvidia is miles ahead in applications outside of gaming, mostly due to their very much proprietary and walled-off CUDA. Going red or blue would mean losing GPU support for my 3D scanning software, molecular dynamics software (or bending over backwards to try to regain some functionality via OpenCL) or image recognition software.


If I want to render something, my options are: 1. NVIDIA. 2. A 20%+ performance loss and a small selection of engines, all less viable, slower, with worse DCC integration, worse support, worse documentation, and less active communities than their CUDA counterparts. 3. CPU, if I'm Disney running a 40,000-core farm.


I tried for so long, but my RX 5700 XT crashed a lot, and my new RTX 3070 hasn't yet.


This, so fucking much. I gave AMD a chance, bought a 5700 XT, and only had issues with it. RMA'd it and got a different one: same problem. Different PC: same problem. Ended up getting rid of it for an Nvidia GPU that, surprisingly, had zero issues.


Ex 5700 XT user here. I was having huge problems with drivers from day one. I jumped back over to Nvidia when the 'latest' driver update broke being able to 3D render; every driver I tried broke something in some way on my machine, be it not being able to run certain games or not being able to 3D render like I mentioned previously. Never had such problems with Nvidia drivers.


My 5700 XT would crash the whole PC if I tried to watch a YouTube video, and it kept going on for weeks till I figured out to turn off hardware acceleration. The driver fix for this came with a bug that would frequently just give a black screen until I unplugged the monitor from the GPU and plugged it back in again. By 2021 I had found one version of the driver that actually worked and never updated again. I jumped to a 3070 Ti recently for the peace of mind, plus, as a software engineer, the lack of features got really annoying. Funniest shit is, when they released the Adrenalin drivers, which fixed many bugs with my 5700 XT, it broke the shit out of my other PC's RX 580 to the point of being unusable.


Exact same situation here. My 2070S has been perfect.


I had an RX 580 for the longest time and frequently had problems. So when I went to build a new PC two years ago, I went with the RTX 2080 Super, and well, I haven't had a problem since. I hate the idea of supporting a company like Nvidia, but AMD has just disappointed me too often.


AMD is still not competitive in the creative market especially not for 3D.


Their hardware works well enough, however software makers don't certify AMD cards at the same rate as Quadro/RTX cards. Like PolyWorks (Innovmetric) has everything from the K420 to the A6000 certified, but not a single AMD card. I'm hoping Apple keeps stuffing AMD cards in their Mac Pros just so it forces the industry to certify more AMD stuff. I'm also now hoping Intel has enough industry clout to have their cards certified like right at release.


> Apple keeps stuffing AMD cards in their Mac Pros just so it forces the industry to certify more AMD stuff.

They won't. They'll release an Apple Silicon one eventually.


It doesn't help that in many applications AMD GPUs just aren't as good. For example, professional artists use NVIDIA because it's much better in 3D applications, rendering scenes, etc.


Also dlss 2 and now 3


Doesn't mean much until we get comparison benchmarks but hey at least this is affordable so that's a good sign.


I think they're saying it's supposed to be around a 3060 or 3060 Ti, but as you said, we need benchmarks.


It's priced cheaper than any 3060/Ti variant I can find... So long as it can match them for performance, it's a win for Intel. Here's hoping.


As far as I understand, it's theoretically better because the 3060/Ti's clock speed is lower than Arc's, but it has the same memory interface as the 3060 Ti. I believe they've found a niche where it performs (or at least is supposed to perform) like a 3060 Ti but costs even less than a 3060. Here's hoping that saving money won't mean settling for something worse.


You can’t compare clock speed for different architectures, it’s entirely meaningless


Absolutely 100% fact - but people haven't understood this since the 90s.... not gonna start now.... BIGGER NUMBER = MOAR BETTER


> BIGGER NUMBER = MOAR BETTER

That's why I'm so good at golf


But consider this also: the 3060/Ti would have better support compared to Arc in the long term. I'm not saying you shouldn't get it, just that you should do sufficient research based on your use case.


The discount might make up for that. At least in the real world, most 3060s cost like 400 dollars. If Intel sticks to their MSRP, then gamers on a budget might be fine with worse drivers if they can get the same performance for $100+ less.


Why is there a difference in support longevity? I haven't been following too close


I'd actually trust Intel's long-term driver support over Nvidia's (and maybe even AMD's) easily. Intel's Linux drivers have been amazing for every product line they've launched so far, and their driver support has always been long. It's the short-term support that will most likely be mediocre at best, but watching Arc reviews from day 1 versus now at least shows that they're trying to fix the dumpster fire they released.


That's what I was thinking. Intel has an extremely long track record of great driver support. It would be weird if they messed that part up.


Did you forget that there's much more than clock speed? Architecture, cache, drivers, etc. Clock speed can only be used to compare against something from the same generation with similar specs.


Not bad. Still lagging behind, but it's the first time Intel is jumping in; I wouldn't expect parity with Nvidia and AMD right away.


I think it's perfect, because only a few games need 4000-series levels of performance. I have a 1070, and it's run everything I've ever wanted on medium to ultra settings. The affordable pricing and decent performance make me seriously consider upgrading to it.


You should check on driver stability too


That's what I'm concerned about regarding these GPUs. I still remember GN's video on Arc's issues with drivers.


I'm pretty sure it was Steve that also said that, of all the bugs they reported, a majority got fixed within what felt like 2 to 3 weeks. Definitely have to watch how the drivers are on release, but I think most people playing the most popular games will be just fine.


1070 Ti here, amen to that. Sort of considering upgrading to something I can just drop into my existing build as-is, getting 4 or 5 more years out of it before I upgrade to like a 60-series or whatever is out when I'm older and richer.


> before I upgrade to like a 60-series or whatever is out when I'm older and richer

*and with no time or desire to game anymore*

Cherish those years when you still feel like you want to game every day. :3


I'm pretty old and still game every day. I don't get this sentiment.


Yeah, especially if it's cool and quiet and the power demands are reasonable, I've got a few systems that would be fine with that spec.


As someone on a 1060 3GB from 2017, I'll gladly switch to Intel if it's comparable to 3060ti. Also, fuck Nvidia.


1060 club. I'd snap up a proper 4070 if it had been released at a normal price. The A770 might be what I end up with.


For $200 we used to be able to get pretty good deals with the GTX 960/1060, etc. Now $400+ will be the bare minimum. If inflation were solely determined by GPU prices, many of us would have starved to death already, me included.


You know, if they can get the drivers solid and the encoders do what they claim, that's not too bad for $329. I would consider it, probably not for a high-end gaming rig, but for a more multi-use computer.


Intel was never going to hit the high-end segment that Nvidia occupies, but if they can take hold of the low to mid range, they can work their way up and make Nvidia sweat in a way AMD can't, since Intel has the pockets to feature-match Nvidia.


I hope it goes well enough for Intel to keep trying. Anything that has the potential to quash Nvidia's delusions that it's a monopoly is worth it.


Yeah. We need to go back to the era where the market has a dozen players and GPUs were cheap as chips. It hasn’t been the same since S3 was finally officially killed off by VIA. Even Matrox is just an AMD OEM/AIB Partner now.


Matrox. Now that's a name I haven't heard in a long time.


Still available on server boards for low-effort VGA video output.


How about Riva TNT?


Hercules Stingray 128. (Though that might have been based on an existing GPU.)


[*I want to go back, even just a little.*](https://i.imgur.com/k3oXRN6.png)


I fuckin felt that, right in the depths of my soul.


I’m ready for the voodoo 17


3D accelerator cards reloaded.


I still have my voodoo2. That bugger was game changing. Now to find a motherboard with an AGP slot.


Voodoo2s were PCI.


Really.... That's old age for you.


Indeed. Put the Riva TNT in the AGP 4x slot and then two Voodoo2s in SLI in PCI. That was the dream alright.


A reminder that Matrox is still alive and kicking, even if they're operating in a different "market." I caught them at ISE 2022 and had a few looks at what they offer; it's mostly stuff targeted at AV companies, TV multimedia, and the like.


I was so proud of my Matrox mystique. Even more so when I added the m3d.


Intel doesn't need this to do well to keep trying. They can sink millions of dollars without even feeling the breeze of it. And they will. Besides, if nobody buys Arc, which I guess will be the case, they'll simply slap 'em in some prebuilts and call it a day.


Maybe, also in the Enterprise market?


Like datacenter? NVidia garnered major income on the gaming side thanks to scalpers and miners. Their datacenter grew a ton, as well. They've got most of the enterprise/datacenter market cornered using their hardware and software, and are making bigger waves than in the past for AI workloads. Their acquisition of Mellanox was smart, too. JH needs to feed his addiction to spatulas and fine leather jackets somehow.


The problem is that they need to be committed enough to do poorly for several years. Whether or not they have the money, the real question is if their leadership has the balls, and tbh my bet is on no.


It'd be a stupid and pointless business decision if they expected this to be profitable, or to get any more than 30-40% of their money back. The potential isn't in selling the first-gen product; it's in capturing a piece of the market and holding on to it. So if they kill this thing in the short term, that'd be the dumbest decision I've seen from them, I think.


Intel also has nearly unending money to fund this


I'm aware. The money isn't the worry, it's the shareholders.


I don't think a tape-out costs just "millions of dollars." It should be multiple orders of magnitude higher, I'd think.


I think it was Steve from GN who reported an Intel exec telling him they can and will invest hundreds of millions to get into the GPU market. So yeah, whatever the price, it's unlikely to scare Intel off.


> They can sink millions of dollars without even feeling the breeze of it. And they will

There were rumors of Intel cancelling the entire project last month, so where do you get this wisdom from?


Even if it doesn't go well (I don't personally think it will have much of an impact), Intel is in this for the long haul. They aren't dropping out this generation if nobody buys.


The thing is that Nvidia probably don't see Intel as competition when the ENTRY LEVEL product in their upcoming lineup costs $900. They're clearly not interested in fighting over breadcrumbs in the budget and lower midrange market anymore. If the performance is there Intel might be able to steal some RTX 3050 and 3060 sales, to which Nvidia would probably reply "whatever".


Granted, I don't remember where I heard this, and it was a few years back, but I heard the majority of Nvidia's/AMD's profits come from the low and mid-range segments. Hopefully these can sell well enough to keep new generations coming; Intel OEM partners will likely buy them up.


^ This is completely true. Nvidia isn't pricing the 4000 series like this to sell 4000 series cards; they're doing it to sell 3000 series cards, because they have a huge supply of them and don't want to lower the prices. So Nvidia absolutely cares about the budget and lower midrange; they just don't want to lower the 30 series budget and midrange cards below the post-mining-crash prices. Always remember: the 4000 series prices are a ploy to sell the 3000 stock without losing margins on them.


Listen to Jensen's speech to the board. That's exactly what they're trying: keeping demand high by lowering the supply of the 3000 series and keeping the price of the 4000 series high. According to the speech, they want to keep this up at least until the end of the year.


I'm in the market to build my next PC, and every instinct is telling me to just buy all the components, keep my current graphics card, and make a decision about a new card in late '22 or early '23.


You're smart. Do that.


I'm thinking that the "RTX-4080 12GB" isn't the entry level; there's probably going to be a 4050. It'll probably be a 3060 with a different BIOS, but...still.


It's all ridiculous... I remember I bought my 1060 for £280, and that was the third-best card on the market. How they jumped to £900 as the starting price is beyond me.


So, it's a decent mid-tier card, does raytracing, handles everything pretty damn well AND is cheap? Hot damn


It looks like a really good buy if you don't want/need a new card (like if you already have RDNA 1/2 or RTX 2000/3000) but you want AV1 encoding, tbh.


It's about the same price as an RX 6650 XT, so hopefully it outperforms it; otherwise you'd be paying the same for a worse experience, given the current bugginess of Intel's drivers.


Tbh, I feel like supporting Intel on this one. But I do wish they'd released this when GPUs were scarce.


I feel like they were hit with supply chain issues while trying to break ground in a new market. Getting the components you'd need would be hard, as manufacturers will favor their long-standing customers over new ones.


I want to support Intel, but what they did to HDFury makes me think twice about their GPUs. Can they promise me that it won't be locked down with excessive DRM to the point that I won't be able to capture and stream from it?

For context: a company called HDFury used to make HDMI-to-component-video adapters for early HDTV adopters. Intel sued them and made them stop producing the device because it bypasses HDCP, which it has to, since most people get the device to watch Blu-ray discs on their old HD CRT TVs, which they swear have better color reproduction than LCDs. But the side effect of that is that people can hook it up to a component HD video capture device and make HD rips.

If Intel is that zealous about HDCP, which they had a hand in creating, how do I know I'm in control of what I can and cannot do with my GPU? That they won't end up dictating what games I can and cannot stream? And I know there are game companies out there who don't like their games being streamed, Square Enix and Atlus in particular, who have imposed these restrictions on the PS4 versions of their games (which the PS4 streaming function obliges: it won't stream games that tell it not to).


Watch Intel create hardware DRM for games lmao


What would this card be roughly equivalent to, at a first guess? As in, “at least as good as a and hopefully as good as a….”


I think they've said it's around a 3060 or 3060 Ti, but again, that's probably just speculation and we need benchmarks to actually see how it turns out.


Marketed against RX 6600 and RTX 3060 but depends on pricing


The hope, from the early things I've seen so far, is around the 3060 Ti.


I'm really curious what Intel ARC's Linux compatibility is going to be like. Intel contributes a lot to the kernel, and Intel drivers tend to be exceptionally good on Linux. If it's any good, I could see a lot of Linux users going 100% team blue.


I've never had Linux hardware compatibility issues from any of my Intel stuff, nor from my AMD stuff, Nvidia on the other hand...


I've been meaning to do the noose meme with "everybody's mad at Nvidia." Linux users: "First time?"


Just be sure to Photoshop Linus Torvalds in there somewhere, maybe Richard Stallman too if you're feeling edgy.


Photoshop doesn't work on Linux, did you mean GIMP them in?


Or PhotoPea, if you like the PS interface


It's gotten way better, but yeah, Nvidia just has something out for Linux, and it makes me very unhappy.


I'm starting to get real tired of my jank-ass Nvidia experience on Arch. Performance in games is solid, but everything else is simply not great. I'd like to see what AMD pulls out with RDNA 3, but I could see a switch to Intel GPUs in the future if the support is good.


A 16GB GPU that performs better than a 3060 Ti for only $329! This is an incredible buy, ty Intel.


I think the $329 is for the 8GB variant of the A770; the 16GB variant might be more expensive.


Tbh, 8GB seems perfect for this class of card. I could see an argument for 10-12, but 16 would absolutely be overkill, for gaming use at least.


I'm actually going to get one :). My gaming rig is an all-team-red build (5600X + 6800 XT), but my son's is an AMD 3700X and an old 1070 I had laying around. I would 100% buy one on day 1 just to get one, try it out, and support competition in the GPU market.

A couple things I recognize:

1. I could probably get a used 3060 or 3060 Ti for close to that price that has better drivers; I don't care. I've been waiting for Intel to step up, and I'm putting my money where my mouth is.
2. It's possible that the games he plays will see weird issues, possibly crashing. If it's so bad that the stuff he plays is unplayable (Valorant, Division 2, Borderlands 3, Genshin, Minecraft), I'll drop the 1070 back in and wait for better drivers.

I fully understand that I could buy this and it may not end well. I am willing to take that risk.


Using son as beta tester... Love the idea


YOU GOT IT! He'll be happy because more frames, and I'll be happy that I have something in a legit gaming PC that's not Nvidia or AMD. I think the last GPU I had that wasn't either of those two was an S3 Savage 4. I had them all though... the original Nvidia NV1, 3dfx, Rendition, Matrox... I don't want to think about some of the early shit GPUs I paid good money for.


I'm with you there. I'm gonna pick up the A770 to ensure Intel keeps investing. Got the A380 as well.


The plane?


Ironically, the word "beta" literally means son in Hindi (pronounced "bay-ta").


Without looking it up, and only learning from Ms. Marvel, I think it's a term of endearment for one's own child, not just a son.


Hindi speaker here: it's both. It was originally used for a son, but now it's used for both.


Make sure your son's system supports Resizable BAR (or "Smart Access Memory" as AMD calls it), Arc is sort of designed around it at a hardware level. AFAIK Intel intends to work on making it less necessary with their future GPU lineups though.


If you watched LTT's review, you'll remember they said this card performs close to a 3070 (Ti?) in DX12 games and close to a 3060 Ti in DX11. So if you can get your games running smoothly, this thing should be a beast!


Yeah. When I was running the 3700X, I had it paired with a 3060 Ti and was getting 50-60 FPS on a 34" UW 1440p monitor in games at ultra quality settings.


I wanna get one 'cause it's kinda historic, first of its kind, and also to support Intel.


Remember that on the Intel GPU you need to set up Resizable BAR. It was pretty much a requirement for the Intel cards in those early reviews.


Two months earlier and it would have been game-changing. Well, I hope their next gen will be on par with Nvidia's and AMD's best cards. Competition is always good for the consumer, especially after the shit Nvidia pulled.


Nvidia's lowest-tier GPU for the next lineup is still $900. Intel only has to beat the 3000 series, which it still can.


*Currently announced* lowest-tier GPU. No cards have been assigned a 70 or 60 name. You're missing my wording: I know the low-memory 4080 is being called a 4070, but not by Nvidia.


The 4070 has been announced. They're just calling it a 4080 because $900 is an insane amount of money for a mid-tier chip.


It's closer to a 4060ti: https://www.reddit.com/r/pcmasterrace/comments/xk2nsf/the_4080_12gb_and_4080_16gb_have_vastly_different/ipbt3d3/


The actual die itself points to it being a 4060. It's a 104 chip. We're acclimatised to Nvidia's shit.


That dude got some things wrong though. He didn't notice that the 3090 is still a cut-down die (the 3090 Ti is the max), and that the 4090 is also a cut-down die. I made a slightly easier-to-read graph as a post that shows the full dies too. I also looked at the in-house tests and guessed that the 12GB 4080 is about 55% of the performance of a top-die card (61% of the 4090, reference in the comments of my post; the 4090 is probably about 90% of the top die, and 61% × 90% ≈ 55%). Basically, we got a 4060 Ti, a 4070, and a 4080 Ti.
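For what it's worth, the arithmetic behind that 55% estimate checks out; a quick sketch (the 61% and 90% figures are that commenter's own numbers, not official specs):

```python
# Chained relative-performance estimate: if card A is x% of card B,
# and card B is y% of the full die, then A is (x * y)% of the full die.
ratio_4080_12gb_to_4090 = 0.61  # commenter's read of Nvidia's in-house tests
ratio_4090_to_full_die = 0.90   # commenter's guess for the 4090 vs. full die

share_of_full_die = ratio_4080_12gb_to_4090 * ratio_4090_to_full_die
print(f"{share_of_full_die:.0%}")  # → 55%
```

So the "about 55% of the top die" figure is just those two ratios multiplied; it's only as good as the two assumed inputs.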


That's why I worded it the way I did. No card has been assigned a 70 or 60 name.


Named or not, that lower-tier 4080 is UNQUESTIONABLY the 4070...


But lower-tier cards will be released; a named 4070 and 4060 will come. And no, it is not unquestionably a 4070. It has the memory bus of a 60-tier card, lol.


They pretty much confirmed in the presentation that their current lineup is 4090, 4080, 3080, 3070, 3060.


They always release top tier cards first, they make more money. Lower tier ones will come.


They have only released the top end first since the 970/980, with the lower-end variants not being revealed till later.


They don't need to be comparable to the top end; that's a very small minority among the minority that upgrades. They just need solid entry-level cards, comparable to the XX50s and XX60s.


I think I’ll be getting one of the intel cards for AV1 in my server. I can’t currently support Nvidia with their prices and practices. I hope this brings even more competition to the GPU market and more options for consumers.


I might actually get one if it helps Intel build into the GPU market. Their involvement will force Nvidia to be more price competitive especially with how powerful Intel CPUs can be as a baseline comparison. A major move would be if EVGA partners with Intel (assuming the EVGA CEO reconsiders his position on the GPU market).


I hope EVGA partners with them. At least they'd be able to set their own terms from the beginning with their Nvidia experience in mind.


Sensibly priced unlike Nvidia. AMD hopefully won't increase prices by too much (but that's unlikely imo)


Before the recent NVIDIA shitshow, which made it 100% clear they're happy to take a big ol' dump on consumers so that their margins are maintained, I was planning to get an RTX 4080 for a new build. Now I'm considering just getting a mid-tier build with an Arc A770 and waiting to see what unfolds over the next 18 months in the GPU space. As a consumer, I really hope Intel does well in the GPU market and gets more and more competitive. It is absolutely what we need right now.


I have been waiting for a 4070, but fuck that; I'm not even going to wait to see what a 4070 is now that they've put a 4080 badge on the 12GB card and slapped it with an $899 price tag. I'm going for an AMD card for the first time in 15 years.


I mean hey, you can get a 4070 for a super low price of $1200 USD. Great price right?


New AMD cards in November as well I believe


Please, Intel, please, you can make it!!


I don't know the full history of Intel, but damn, they took forever to make a GPU (assuming this is their first).


They've made some low-end stuff in the past, not really for gaming, more for general-purpose office work. AFAIK, Arc is their first attempt at anything approaching gaming-tier.


Intel's past is not too pleasant, tbh. GPU-wise, they've been trying to branch into the market for a while, mainly in laptops. They sell some pretty powerful APUs. This would be their first discrete graphics card. Hopefully it's a good one.


The first one was the i740 in the '90s. It was crap.


That and they look elegant as fuck


People back in the day: AMD is our friend. People after this photo: Intel is our friend.


Extremely fair price. Good on Intel.


Why is Tony Hawk giving a presentation on Intel GPUs?


Maybe I'll buy one and use it on my test bench PC. Got no need for it as my RTX 3080 is fine. But honestly, would be cool to have.


Now, we have a true red vs blue situation


As a collector, I'm gonna be buying one of these at launch! I have a 3060 Ti currently, so I might not daily it, we'll see, but I will get one. I'm a big supporter of competition in any market, and I'm happy to welcome a promising new competitor.

When I first heard about Arc, I swore to myself that I would be buying one, and I've saved up over this time for one. I plan on submitting all of the bug reports; I really, really want to see this succeed and will do my best to make it succeed. I think Intel will be able to stay competitive with Nvidia and AMD, not necessarily because they'll be faster, but because they have features and power draw ahead of the competition. I'm looking forward to what they bring.


Now THAT is a competitive price! Nice one, Intel!