When Intel announced it was entering the GPU market with its own Xe platform, all seemed well: it was fully scalable, which promised great performance at very competitive prices (especially for those who can’t afford to buy a new card every year).
It was first reported that Intel was aiming for performance on par with the Nvidia RTX 3070, and then the RTX 3060, when Tom Petersen (Intel) told us that “you’re going to get significantly better performance than the 3060”. Specifically, Intel promised up to 17% better performance, depending on the specific model. Petersen also said that prices would be lower than the RTX 3060’s, without specifying by how much. To be fair, what we have here is not a card from Intel itself but from a third-party manufacturer, which is always more expensive.
The A770 should be “a bit faster” than the A750 thanks to a few extra cores: 32 Xe cores versus 28, to be exact. There are different versions of each card: although the model we tested is the one with 16GB of VRAM, there is also an 8GB model.
As I’m sure many of you have heard, there were a series of issues (to put it mildly) when the cards were first released. It’s hard to say whether this delayed the release of third-party graphics cards, but Acer’s Predator BiFrost, the upgraded version of the A770, should be the best they can offer us before moving on to their next B-series graphics cards.
The Xe HPG platform promises a lot, as it features AI-enhanced upscaling, machine learning, and ray tracing. Still, when they release a graphics card targeting RTX 3060-level performance, I don’t like seeing Acer and Intel use terms like “high performance gaming” or “high performance graphics”. These cards came out when the RTX 3060 was mid-range, the RTX 4000 series had already been announced, and AMD’s RX 7000 series (also aiming high) was only a few months away.
In theory, everything looks good: 16 GB of GDDR6 VRAM, 32 Xe cores and therefore 32 ray tracing units, 512 vector processing units, a 2,400 MHz maximum frequency, an AV1 media codec and a standard 2×8-pin connector, with power consumption as low as 225 watts. It also has a 256-bit memory interface and 560 GB/s of bandwidth, with a memory speed of 17.5 Gbps. Overall it looks better than the RTX 3060 it’s being compared to; spec-wise it’s much closer to the RTX 3060 Ti, which has only 8GB of VRAM and less memory bandwidth. One novelty of this A770 is that it can both encode and decode AV1 video content.
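The quoted 560 GB/s bandwidth figure follows directly from the other two memory specs. As a quick sanity check (a sketch using only the numbers above: bus width in bits times per-pin data rate, divided by 8 bits per byte):

```python
# Peak memory bandwidth = bus width (bits) * per-pin data rate (Gbps) / 8 bits per byte
bus_width_bits = 256   # A770 memory interface
data_rate_gbps = 17.5  # GDDR6 per-pin speed

bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8
print(bandwidth_gb_s)  # 560.0, matching the quoted spec
```

The same formula gives the RTX 3060 Ti’s 448 GB/s (256-bit bus at 14 Gbps), which is where the “less memory bandwidth” comparison comes from.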
As for the price, it depends on where you look. In stores in my area, the A770 BiFrost ranges from €450 to €507. The RTX 3060 is up to 20% cheaper, even in its 12GB version, and the RTX 3060 Ti is available for the same price as the cheapest BiFrost. To be fair, other A770 cards besides the BiFrost can be found at my local store for the same price as an RTX 3060, but not for less (as we were promised). Still, we’re talking about almost half the cost of an AMD RX 7900 XT or RTX 4070 Ti.
So what are you paying for?
First, a very different cooling system. The reference design uses a combination of a vapor chamber and a 70mm AeroBlade fan with metal turbine blades, in true blower style like older GPUs. Here, it is joined by an arc-shaped 92mm FrostBlade 2.0 fan that looks much more like a traditional GPU cooler and provides stable airflow through its metal mesh shroud.
The card is of fairly good quality for its price. The FrostBlade fan is semi-white, semi-translucent plastic to let the RGB lighting shine through, while the rest of the board is solid metal. The chassis is made to add rigidity to the assembly, but also to ensure good air circulation.
This card uses the Acer Bifrost Utility application software. It works well and is easy to use, but it may be simpler than I would like. It also comes with a USB stick with the drivers, but we installed them directly from Acer’s software because the initial drivers for the A770 weren’t… optimal.
The card draws just 209.83 watts at full load, with an idle temperature of 28 degrees and a maximum of 70 degrees. That’s a little odd, as it’s some 40 watts less than the specs indicate. It seems to be because they prioritized quiet operation: I couldn’t gauge the card’s noise well because my case fans were louder than it was, which earns it points for being quiet.
Let’s get to the results. The test system was an Intel Core i9-13900K on an Asus Wi-Fi Gaming Plus motherboard, with 32GB of G.Skill DDR5-6400 memory and all software stored on NVMe 4.0 drives.
3DMark – Synthetic tests
- Time Spy: 14678
- Time Spy Extreme: 7304
- Speed Way: 2407
- Port Royal: 7413
Total War: Warhammer 3
- 1080p: 50.4
- 1440p: 38.7
- 4K: 23.3
Cyberpunk 2077 Ultra – without ray tracing / with Ultra ray tracing
- 1080p: 66.09 / 43.68
- 1440p: 52.01 / 32.32
- 4K: 30.67 / 18.3
Far Cry 6
The Division 2
Assassin’s Creed Valhalla
- 1080p: 93.9
- 1440p: 80.3
- 4K: 53.2
Red Dead Redemption 2
- 1080p: 86.49
- 1440p: 59.96
- 4K: 41.77
Considering the results and comparing them to my review of the RTX 3060 Ti from two years ago, a card with only half this one’s VRAM, a few things become clear.
The promise that this graphics card performs better for a lower price does not hold. First, it’s too expensive; second, the performance is actually worse. I expect a brand-new graphics card to perform at least 15% better than an older, cheaper one. On the contrary, its performance is about 15% lower than the RTX 3060 Ti’s on average, although some titles like The Division 2 show a 30% increase in FPS over the old Nvidia card. I had no expectations for ray tracing or 4K, but this graphics card sells itself as the perfect choice for 1080p and 1440p gaming, and it doesn’t live up to that.
It’s not cheap at all: it’s priced on par with a very decent mid-range graphics card, yet it still struggles to run five-year-old games at 1440p. Even at 1080p it struggles in some cases, collapsing completely in the CPU-intensive Total War.
How is that possible? It has 16 GB of VRAM and a completely new microarchitecture. Based on the 3DMark numbers, it should far outperform the RTX 3060 Ti. It has the raw processing power it needs. The only answer my ignorant mind can come up with is that it’s the drivers’ fault. They’ve been causing trouble from the start, and they seem to be holding the card back even more than expected. The card clearly has the processing power; you just can’t use all of it.
It’s a shame, because Acer has done a great job with the cooling, but I suspect Intel’s drivers are still not up to snuff. So unless you really need AV1 encoding and decoding, you’ll have to look elsewhere for a graphics card. The trouble is the huge price gap to the cheapest latest-generation AMD and Nvidia cards, which cost more than double right now.
Brent Dubin, known as the Gaming Giant among Globe Live Media staff, is the chief gaming reporter for Globe Live Media. Having attended all the major gaming events around the world, he is sure to bring you exactly the gaming updates you are looking for.
Work Email: [email protected]