No kidding... Intel is playing catch-up with Nvidia in the AI space, and a big reason for that is that their offerings aren't competitive. You can get an Intel Arc A770 with 16GB of VRAM (released in October 2022) for about $300, or an Nvidia 4060 Ti with 16GB of VRAM for ~$500, which in practice is twice as fast for AI workloads (see: https://cdn.mos.cms.futurecdn.net/FtXkrY6AD8YypMiHrZuy4K-120... ). This is a huge problem because in theory the Arc A770 is faster! Its theoretical performance (TFLOPS) is more than double that of an Nvidia 4060 (see: https://cdn.mos.cms.futurecdn.net/Q7WgNxqfgyjCJ5kk8apUQE-120... ). So why does it perform so poorly? Because everything AI-related has been developed and optimized to run on Nvidia's CUDA. Mostly, this is a mindshare issue.

If Intel offered a workstation GPU (i.e. not a ridiculously expensive "enterprise" monster) that developers could actually use, with something like 32GB or 64GB of VRAM, it would sell! They'd sell zillions of them! In fact, I'd wager they'd be so popular it'd be hard for consumers to even get their hands on one, because it would sell out everywhere. It doesn't even need to be the fastest card. It just needs to offer more VRAM than the competition. Right now, if you want to do things like training or video generation, the lack of VRAM is a bigger bottleneck than the speed of the GPU.

How does Intel not see this‽ They have the power to step up and take over a huge section of the market, but instead they're just copying (poorly) what everyone else is doing.
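To put rough numbers on why VRAM, not raw speed, is the wall for training: here is a minimal back-of-envelope sketch using the usual rule-of-thumb byte counts for mixed-precision Adam training (assumed figures, not measurements, and activations are ignored).

```python
# Rough sketch: VRAM needed just to hold a model's training state with Adam
# in mixed precision. Byte counts are the common rule-of-thumb assumptions:
# fp16 weights + fp16 gradients + fp32 master weights + two fp32 Adam moments.

def training_vram_gb(params_billions: float) -> float:
    bytes_per_param = 2 + 2 + 4 + 4 + 4  # = 16 bytes per parameter
    return params_billions * 1e9 * bytes_per_param / 1e9

for size in (1, 3, 7, 13):
    print(f"{size}B params -> ~{training_vram_gb(size):.0f} GB before activations")
```

Even a 7B-parameter model lands around ~112 GB of optimizer and weight state before a single activation is stored, which is why more VRAM matters far more than more TFLOPS for this kind of workload.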
The parent comment requested a non-enterprise, consumer-grade GPU with tons of memory. I'm sure there is no market for this. However, server solutions could have some traction.
If they can sell the board with more RAM for more than their extra RAM costs, or can sell more GPUs total but the RAM itself is priced essentially at cost, then it's not a cost center.
If I were willing to drop $4k on that setup, I might as well get the real Nvidia offering. The hobbyist market needs something priced well under $1k to make it accessible.
When has an APU ever been as fast as a GPU? How much cache does it have, a few hundred megabytes? That can't possibly be enough for matmul, no matter how much slow DDR4/5 is technically addressable.
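One way to see the bandwidth argument concretely: large-model token generation is roughly memory-bandwidth-bound, so throughput scales with bytes per second rather than with compute. A minimal sketch, using ballpark bandwidth figures I'm assuming here (not specs for any particular part):

```python
# Sketch: if generating one token requires streaming roughly the whole model's
# weights from memory, tokens/sec is capped by memory bandwidth.
# Bandwidth numbers are ballpark illustrations, not exact hardware specs.

model_gb = 14  # e.g. a ~7B-parameter model stored in fp16

bandwidth_gb_s = {
    "dual-channel DDR5 (typical APU)": 80,
    "GDDR6X (typical discrete GPU)": 1000,
}

for name, bw in bandwidth_gb_s.items():
    print(f"{name}: ~{bw / model_gb:.0f} tokens/s upper bound")
```

Under those assumptions the APU tops out around a handful of tokens per second regardless of how much DDR you bolt on, while the discrete card's ceiling is an order of magnitude higher.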
What is obvious to us is an industry standard to Product Managers. When was the last time you saw an industry player upset the status quo? Intel has not changed that much.
But Nvidia doesn't want to make consumer compute cards because those might steal market share from the datacenter compute cards they are selling at 5x markup.
Nvidia does sell consumer compute cards; they're just sold at datacenter compute prices: https://www.newegg.com/p/pl?d=nvidia+quadro

Nvidia's approach to software certainly deserves scrutiny, but their hardware lineup is so robust that I find it hard to complain. Jetson already exists for low-wattage solutions, and gaming cards can run Nvidia's datacenter drivers on headless Linux without issue. The consumer compute cards are already here; you just aren't using them.
> Intel definitely seems to be doing all the right things on software support.

Can you elaborate on this? Intel's reputation for software support hasn't been stellar; what's changed?
It's not so simple... The way GPU architecture works, the GPU needs as-fast-as-possible access to its VRAM; the "overflow memory" for a GPU is your PC's RAM. Adding a secondary memory controller and ordinary DRAM to the card itself would only be a trivial improvement over just using the PC's RAM. Case in point: GPUs often don't even saturate the PCI Express lanes available to them; most cards (even top-of-the-line ones like Nvidia's 4090) lose little performance when limited to about 8 lanes of bandwidth. That's why some newer GPUs are being offered with M.2 slots so you can add an SSD (https://press.asus.com/news/asus-dual-geforce-rtx-4060-ti-ss... ).
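To put rough numbers on why spilling to system RAM (or an M.2 drive) is no substitute for on-card VRAM, here is a minimal sketch comparing the bandwidth of each path; the figures are order-of-magnitude assumptions on my part, generation-dependent and not exact specs for any specific card.

```python
# Sketch: rough bandwidth hierarchy as seen from the GPU, in GB/s.
# All figures are order-of-magnitude illustrations, not exact specs.

paths_gb_s = {
    "on-card GDDR6X VRAM": 1000,
    "PCIe 4.0 x16 to system RAM": 32,
    "PCIe 4.0 x8 to system RAM": 16,
    "NVMe SSD in an M.2 slot": 7,
}

for name, bw in paths_gb_s.items():
    print(f"{name:30s} ~{bw:5d} GB/s")
```

Anything that has to cross the PCIe link runs at a small fraction of VRAM speed, which is why putting more memory on the card beats any clever spill-to-host scheme.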
Intel definitely seems to be doing all the right things on software support.