Over the past couple of months, we have seen plenty of low- to mid-range GPUs from AMD, Nvidia, and the new challenger, Intel. The Intel Arc A380 and the GTX 1630 are two of the most debated low-end GPUs on the market right now. It is high time we took a close look at both cards' specs, price, and performance and settled this dispute.
Today, we will compare these two graphics cards and discuss their pros and cons to help you decide which one you should settle for. Let’s get started!
Specs and Price
Priced at around 140 dollars, the Intel Arc A380 has 6 gigabytes of GDDR6 memory on a 96-bit memory interface. With a base clock speed of 2000MHz, the card packs 8 Xe cores and 8 ray tracing units at a TDP of 75 Watts. With a retail price of about 170 dollars, the GTX 1630 has 4 gigabytes of GDDR6 memory on a 64-bit interface. Its base clock speed is 1740MHz, which can boost up to 1785MHz. Notably, this card has no ray tracing cores, unlike its Intel counterpart.
So, on paper, the Arc A380 offers more at a noticeably lower cost.
Performance
According to testing by the YouTuber “For Gamers”, the Intel Arc A380 averaged about 98 FPS in Overwatch, whereas the 1630 managed about 90 FPS. Similarly, the A380 got 89 FPS in Warzone while the 1630 did about 82. Other tested games included Fortnite, Battlefield 2042, and COD Vanguard, where the A380 again had the upper hand over the GTX 1630.
So performance-wise, too, the Arc A380 defeats the GTX 1630.
At a time when plenty of underwhelming low- to mid-range cards are showing up, the Arc A380 is a blessing. At a lower price, it outcompetes Nvidia’s GTX 1630 while delivering decent performance. The GTX 1630, on the other hand, disappointed us a bit: it is more expensive than the Arc A380 yet delivers inadequate performance.
Nonetheless, it is good to see Intel stepping up in the graphics segment with its Arc series cards. The price is right and the performance is decent, which makes the Arc A380 a solid 1080p option. Performance will likely go further than what we have shown here once Intel finishes optimizing its graphics driver.