As we continue to eagerly await a Western release for Intel’s Arc desktop graphics cards, it seems fairly clear that, in terms of performance, the top model from the series will roughly match something like Nvidia’s current-gen 3070.
Not bad for sure, but not overly encouraging either, given that Nvidia’s 40XX series is (probably) only 2-3 months away. Along with the Radeon 7000 series, this will completely change the landscape and make Intel’s entries look woefully behind the curve despite being effectively brand new!
This is, of course, just a generalisation. Entry-level GPUs, when priced well, often prove to be the biggest financial and commercial successes. Just look at how the Nvidia GTX 1060 has topped the Steam Hardware Survey for 6 years for evidence of that!
Put simply, Intel joining the GPU market feels like a positive step for consumers, if for no other reason than to give AMD and Nvidia, if not fresh competition, then at least something to think about. – Following a report via TechSpot, however, one major tech research analyst has shockingly claimed that Intel might be best to dip out of the GPU market as soon as possible!
Jon Peddie, of Jon Peddie Research, has claimed that despite Intel not yet having a single Arc graphics card on the Western market, the company may be best advised to simply get the launch/releases out of the way and then withdraw from the GPU scene as quickly as possible.
Shocking claims for sure, but what on Earth could possibly lead to such a suggestion? Well, Jon Peddie believes that staying in the GPU market is almost certain to be a huge financial black hole for Intel, particularly as it attempts not only to carve out a slice of the market share pie (which usually requires quite aggressive initial pricing) but, more so, to catch up to Nvidia and AMD, who both have nearly 30 years of research behind them!
With Intel currently in the middle of what appears to be a cost-cutting drive (due to huge losses reported over the last couple of years), it’s felt that when it comes to balancing the books, graphics card development might be seen as simply too much of an expense with little to no prospect of a profit within the mid-term forecast.
And while this might all sound totally crazy given that Arc is seemingly on the verge of a Western release, don’t necessarily rule it out. As Jon Peddie rightly says, Intel has a strong history of knowing when to cut its losses, and it could undoubtedly find someone willing to buy the whole division from it (possibly even Nvidia, which has a long history of buying out its competition) and effectively write it off as a failed experiment.
Overall, I think Intel will probably be around the GPU market for a while. If they don’t find any success after their 3rd or 4th generation, though, would I consider it unlikely that they’d sell up? Nope. Not at all!
What do you think though? – Let us know in the comments!