GPU for audio

Discussion in 'PC' started by 2poor2, Feb 9, 2021.

  1. 2poor2

    2poor2 Kapellmeister

    Joined:
    Jul 13, 2014
    Messages:
    286
    Likes Received:
    49
    Sorry, don't know where this should be created... please move it if needed.

    For over a decade, I've been asking myself if this was possible. Now, there is no doubt.

    If we take a graphics chip from 2005 or later, even a 'low-end' model at 150 to 250 $/€, it has tons of processing power. By 2015-2020, high-end graphics cards had become so powerful that their processors are bigger and more complex than traditional CPUs, with tens of billions of transistors.
    Then, those graphics cards can come with 6... 8... 12... or even 24GB of ultra-fast GDDR6X on the Nvidia flagship RTX 3090. That memory has waaaaay more bandwidth than our 'old' DDR4 (though, to be fair, not lower latency: GDDR trades latency for throughput).

    Then, there is the usage: tons of people who have a computer to make music also have that great game they love to play, like some simulation a la Flight Simulator, and/or they do all kinds of tasks that benefit from a good or very good graphics card: live streaming... video editing... Photoshop (colour accuracy...)... 4K/8K playback... programming (a powerful card might be needed to drive a 2/3/4-monitor setup at high refresh rates), etc. etc.

    Basically, everybody today has a powerful graphics card (even a 150€ one will be many times more powerful than a high-end card from 10+ years ago), anything from a simple integrated chip... to a super powerful 500€ card... to a 1000-2000€ insane monster.

    Well, everybody knows that hundreds of millions of those cards are, at this very moment, being used to mine cryptocurrency. That alone says something about the power these cards have.

    So, until today, all we could read about the GPU was "our plugin uses your GPU acceleration to display the interface at 60/120/xxx frames per second".
    And that's all.

    Basically, we have tons of power... and who knows... a simple recent AMD RX 65xx card or an Nvidia RTX 3060/80/90 could be 2... 5... or even 10+ times more powerful than a current 6- or 8-core CPU...
    People have that 300-1500€ monster... sleeping, doing nothing except displaying a simple image on the monitor... so much untapped power...

    Until recently, I thought "well... there must be a reason why nobody is using that for audio... maybe it simply can't be done"...

    ... and a few days ago, there was a post about a Swiss company that has created a GPU-based solution: they have coded a bunch of plugins that run on the GPU, and they want to create some sort of marketplace...
    Their solution uses whatever GPU the user has, and people can simply add more graphics cards, like adding a 2nd or 3rd UAD DSP...
    The system can even run over a classic external connection, with 1 ms latency...
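    To put that "1 ms latency" claim in perspective, here is a quick back-of-envelope check (plain Python, my own illustration, not from the company): 1 ms of audio is only a few dozen samples, so the whole GPU round trip has to finish inside roughly one such buffer to stay real-time.

```python
# Back-of-envelope: what a 1 ms processing latency means at common
# audio sample rates. 1 ms of audio corresponds to sample_rate / 1000
# samples, so the GPU round trip (upload, process, download) has to
# finish well inside one such buffer to keep playback glitch-free.

def samples_per_millisecond(sample_rate_hz: int) -> float:
    """Number of audio samples that fit into 1 ms at a given sample rate."""
    return sample_rate_hz / 1000.0

for rate in (44_100, 48_000, 96_000):
    print(f"{rate} Hz -> {samples_per_millisecond(rate):.1f} samples per ms")
```

    So at 48 kHz, 1 ms is a 48-sample window: a very tight budget for shuttling buffers across the PCIe bus and back.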

    So, that answers the question. YES, we can use a gpu to render audio, run vst plugins, etc etc !
    But 2 questions remain:
    1- why only now ?
    2- why can't others do it ?

    To answer #1... maybe Nvidia/AMD recently released some kind of APIs that allow coding a VST plugin for the GPU much like for a normal CPU...?
    And #2... maybe the difference in computing power is so huge compared to a normal CPU... that a high-end graphics card with its GDDR6/6X memory could be powerful enough to run hundreds of VST plugins and tens of thousands of instances... it would make CPUs and DSP cards look so bad that it would disrupt the whole audio business (audio PC builders... UAD charging a ton while their DSP can only run a few instances, when a similar plugin could run 5000 times on a GPU...)...

    And this without any kind of optimization. Further help from Nvidia and AMD in their drivers could probably deliver even more power.

    Steinberg, despite having coded a 64-bit engine from scratch, seem to have forgotten that people tend to buy CPUs with as many cores as they can afford... and, just a couple of years from now, many people could be buying PCs with 128 or 256 cores...
    If they already have all kinds of trouble with 12 or more cores today... what is it going to be like with 64... 128... 256 CPU cores...?

    Could GPU audio be(come) the future?
    While CPUs were made to do all kinds of things, GPUs are basically just a bunch of tiny processors that crunch math for breakfast.

    Maybe it is super easy to code audio plugins for a GPU?
    At least with the Swiss solution, users can simply add more cards to get more power... ultra scalable, it seems...

    Maybe devs don't want to have to support N versions of their plugins... VST2... VST3... AAX... AU... 32-bit... 64-bit... GPU AMD... GPU Nvidia... GPU Intel... 100% GPU... hybrid GPU/CPU...?

    Then... could these GPUs be used to run VST instruments?
    What about running a full DAW (e.g. Reaper) on a GPU?

    Could it become like video games?
    At the beginning, games would use 100% of the CPU and only use the graphics card for a few tasks.
    Today, a PC game will use 100% of the GPU, even at 4K 120 Hz, while the CPU barely reaches 10 or 15% usage.

    Maybe one day a DAW will run 100% on a GPU, with all the plugins, VSTis, etc... and the CPU will only be used for some particular tasks...?

    Now that this Swiss solution exists, and many audio people have become aware that it is possible to run VST plugins on a graphics card, maybe we will start seeing a dev here or there release a GPU VST plugin... and the whole industry will follow suit...?

    What do you guys think ?
     
    • Interesting x 4
    • Like x 2
    • Winner x 1
  3. Pipotron3000

    Pipotron3000 Audiosexual

    Joined:
    Mar 13, 2013
    Messages:
    1,230
    Likes Received:
    611
    The problem is that you need to write native GPU code.
    So you need to compile your plugins specially for this task.
    And you need to compile them for a "regular" CPU too.

    There are "winner" use cases, like video editors.
    Acustica Audio tried the Nebula/CUDA way.
    But there are far too many troubles for most devs.

    This idea has been around for years. But as CPU power increased very fast, most people don't care about the gain a GPU can bring.
    Apart from some "special", power-hungry plugins, like Nebula.
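    A tiny sketch of that "compile it twice" point (plain Python standing in for both builds; no real GPU API is used, and the names are my own): the DSP math itself is trivial, but it has to exist once as a sequential CPU loop and once as a per-sample kernel body that a GPU would run on one thread per sample.

```python
# Illustration only: the same gain effect expressed two ways.
# A real GPU build would compile gain_kernel with CUDA/OpenCL and
# launch one thread per sample; here plain Python stands in for it.

def gain_kernel(sample: float, gain: float) -> float:
    """Per-sample body: what each GPU thread would compute."""
    return sample * gain

def process_cpu(buffer: list[float], gain: float) -> list[float]:
    """CPU build: one sequential loop over the whole buffer."""
    return [s * gain for s in buffer]

def process_gpu_style(buffer: list[float], gain: float) -> list[float]:
    """GPU-style build: conceptually one kernel invocation per sample."""
    return [gain_kernel(s, gain) for s in buffer]

buf = [0.1, -0.25, 0.5, -1.0]
assert process_cpu(buf, 2.0) == process_gpu_style(buf, 2.0)
print(process_cpu(buf, 2.0))
```

    Both paths must stay bit-identical and be maintained in parallel, which is part of why most devs don't bother.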
     
    • Like x 1
    • Agree x 1
  4. ptpatty

    ptpatty Platinum Record

    Joined:
    Dec 20, 2011
    Messages:
    307
    Likes Received:
    151
    Location:
    USA (East Coast)
    Isn't that what braingines.com is all about?
     
  5. Obineg

    Obineg Platinum Record

    Joined:
    Dec 7, 2020
    Messages:
    631
    Likes Received:
    218
    The first attempt to do this happened in 2002 - it was an IR reverb - but the problem is indeed that it is quite difficult to program things like that. There are not only some major differences in the architecture, there is also a major difference between the processing of video and audio streams.

    If you want to do this, you have to do almost everything yourself; it is not like just clicking a few buttons in Xcode, where basic knowledge of C or C++ lets you create AU plug-ins.

    And while it is something completely different, the original UAD-1 cards also used a kind of "GPU" chip as well (they later switched to SHARC).
     
    • Interesting x 2
    • Like x 1
  6. taskforce

    taskforce Audiosexual

    Joined:
    Jan 27, 2016
    Messages:
    1,958
    Likes Received:
    2,056
    Location:
    Studio 54
    Agreed. If I may add, though: just like CPUs, GPU power has also increased very fast. Modern GPUs incorporate thousands of execution units, sort of like CPU cores but much more specialised in the workloads they can process. A current example: a GeForce RTX 3080 has ~8700 CUDA cores. Now if - as @ptpatty said - Braingines have got it right, it might be revolutionary. Abbey Road Studios confirm the Braingines tech works. I have requested a trial myself; I think my GTX 1080 Ti will suffice. I also have a GTX 1050 Ti, a Radeon 5700 XT, an RX 580 and a lot more, and I can't wait to test those plugs, tbh. It's rare for me to get excited by plugin software, but this is intriguing.
    Btw, @Obineg got it right, I still have my PCI UAD-1, man. It's powered by an MPACT2 chip: UAD licensed the tech from ATI (now AMD Radeon Technologies), which had acquired it from a company called Chromatic Research. The chip is indeed a vector processor @ 125 MHz that can do graphics, video decoding/playback and audio. Afaik, the original MPACT chip could even act as a modem, lol. So it was a GPU, but its downfall was that it lacked seriously in 3D performance, although it could do excellent 2D, no problem.
    Modern GPUs do not differ that much from that old "accelerator" chip, as they were called back in the day. After all, both are mathematical co-processors. The difference is that today's GPUs surpass the vast majority of CPUs in raw processing power (although they have a more limited range of instructions), and you can stack more than one of them in a plain desktop PC.
    Now, Braingines claim a 1 ms latency for their "GPU AUDIO" powered plugs. They also seem open to licensing their tech to anyone who wants to develop their own plugs.
    It remains to be seen, and of course heard.
    Man, if that sht works, UAD is fkd... unless of course they buy Braingines, and then we can - sort of - kiss goodbye any low-priced GPU-powered plugs.
    Still very excited about this. I will let you guys know as soon as I demo these plugs.
    Cheers :)
     
    • Interesting x 4
    • Like x 2
    • Agree x 1
    • Useful x 1
  7. Paul Pi

    Paul Pi Audiosexual

    Joined:
    Oct 18, 2016
    Messages:
    591
    Likes Received:
    545
    Location:
    London
    Maybe if Steinberg augmented their VST3 feature set to automatically allocate a % of GPU resources for use as virtual VST cores (or, better still, let the DAW user decide the %), that would be of immediate, practical benefit to the many users with currently underutilised GPUs. For me, the Braingines model (multiple GPU cards) sounds just about as expensive as VEP etc. and currently inferior, 'cos VEP & AudioGridder* do all that Braingines does but DON'T require VST developers to entirely rewrite their (many) plugins. Besides, the networked-PC model also allows for distribution of other resources, such as data storage & PCI bandwidth.

    * Actually, I do seem to recall reading that some VSTs need updating to work within AudioGridder.
     
    Last edited: Feb 9, 2021
  8. ballinthejack

    ballinthejack Noisemaker

    Joined:
    Jan 27, 2018
    Messages:
    32
    Likes Received:
    3
    Tripping out on this BRILLIANT POTENTIAL!
     
  9. curtified

    curtified Platinum Record

    Joined:
    Feb 3, 2015
    Messages:
    415
    Likes Received:
    268
    • Interesting Interesting x 1
    • List
  10. mrpsanter

    mrpsanter Audiosexual

    Joined:
    Mar 28, 2014
    Messages:
    1,640
    Likes Received:
    797
    Very promising, but it means that developers will have to rewrite all their plugins from the ground up.

    VST4, maybe?
     
  11. vsuper

    vsuper Ultrasonic

    Joined:
    Aug 17, 2019
    Messages:
    55
    Likes Received:
    38
    GPUs are designed to get data from the CPU, render it and show it on screen; they are not really designed to send data back to the CPU. This is the problem described by developers. There are latency issues that are not solved.
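    A rough model of why that readback hurts real-time audio specifically (my own illustration; the overhead figure is a hypothetical assumption, not a measurement): each buffer pays a roughly fixed transfer-and-launch cost, and with the small buffers real-time audio needs, that fixed cost eats a large share of the time budget.

```python
# Toy model: share of a real-time buffer's time budget consumed by a
# fixed GPU round-trip overhead (upload + launch + readback).
# The 0.5 ms overhead below is an illustrative assumption.

def overhead_fraction(buffer_samples: int, sample_rate_hz: int,
                      round_trip_overhead_ms: float) -> float:
    """Fraction of the buffer's duration spent on the GPU round trip."""
    buffer_ms = 1000.0 * buffer_samples / sample_rate_hz
    return round_trip_overhead_ms / buffer_ms

for n in (64, 256, 1024):
    frac = overhead_fraction(n, 48_000, 0.5)
    print(f"{n:5d} samples -> {frac:.0%} of the budget spent on transfer")
```

    The point: at large buffers the overhead is negligible, but at the small buffers musicians actually track with, it dominates unless the transfer path is heavily optimised.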
     
    • Agree x 3
    • Like x 1
  12. Olymoon

    Olymoon MODERATOR Staff Member

    Joined:
    Jan 31, 2012
    Messages:
    4,205
    Likes Received:
    2,990
    Which company?
     
  13. Scarlett

    Scarlett Member

    Joined:
    Dec 24, 2020
    Messages:
    48
    Likes Received:
    15
    I think you are a dreamer, like me and a lot of people around the world.
    By this I mean: you are right, why only now? etc.

    I guess it would mean all developers would have to create GPU versions of their software, and maybe that's just too much of a hassle.
    Maybe if someone created a "translator", the transition would be a breeze?

    Anyway, I think if you wanna see the change, at the end of the day you either wait for it to happen (if ever) or do it yourself. :dunno:
     
  14. flush with your foot

    flush with your foot Platinum Record

    Joined:
    Apr 12, 2017
    Messages:
    337
    Likes Received:
    231
    Location:
    Geneva
    Me too, I have requested a trial. I live in Switzerland. Always happy to read you!!!
     
  15. Obineg

    Obineg Platinum Record

    Joined:
    Dec 7, 2020
    Messages:
    631
    Likes Received:
    218
    Exactly.

    A graphics card would be perfect to run 1 or 2 plug-ins, but not dozens. This is what makes it uninteresting for developers.
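    The scaling problem can be sketched with another toy calculation (figures are illustrative assumptions of mine, not benchmarks): if every plug-in dispatches its own small GPU kernel, each dispatch pays a fixed launch/synchronisation cost, so dozens of plug-ins can blow the real-time budget even when the math itself is nearly free.

```python
# Toy model: cumulative per-kernel launch cost vs. one buffer's budget.
# LAUNCH_MS is a hypothetical per-dispatch cost, chosen for illustration.

def total_launch_cost_ms(num_plugins: int, launch_overhead_ms: float) -> float:
    """Total fixed dispatch cost if each plug-in launches its own kernel."""
    return num_plugins * launch_overhead_ms

BUFFER_MS = 1000.0 * 128 / 48_000   # a 128-sample buffer at 48 kHz (~2.67 ms)
LAUNCH_MS = 0.05                    # hypothetical per-kernel dispatch cost

for n in (2, 10, 60):
    cost = total_launch_cost_ms(n, LAUNCH_MS)
    verdict = "fits" if cost < BUFFER_MS else "blows"
    print(f"{n:3d} plug-ins: {cost:.2f} ms of launch cost "
          f"({verdict} the {BUFFER_MS:.2f} ms budget)")
```

    Batching many plug-ins into fewer, larger dispatches is the obvious counter-move, but that is exactly the kind of re-architecting most plug-in devs won't take on.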
     
  16. No Avenger

    No Avenger Moderator Staff Member

    Joined:
    Jul 19, 2017
    Messages:
    6,487
    Likes Received:
    4,551
    Location:
    Europe
    This is so nice of you. :winker:
     
  17. No Avenger

    No Avenger Moderator Staff Member

    Joined:
    Jul 19, 2017
    Messages:
    6,487
    Likes Received:
    4,551
    Location:
    Europe
    • Useful x 2
    • Like x 1
  18. taskforce

    taskforce Audiosexual

    Joined:
    Jan 27, 2016
    Messages:
    1,958
    Likes Received:
    2,056
    Location:
    Studio 54
    Although, as you correctly spotted, GPUs are much slower sending data back to the CPU than receiving data from it, they do excel at parallel processing, far more than a CPU. Now, I am not an expert per se, but it has become obvious in recent years that GPU hardware design and architecture is not the main cause of this slow readback path; it is mainly a software issue.
    To explain a bit more: APIs like DirectX or OpenGL create a one-way pipeline from the CPU to the GPU to the monitor. They are designed to maintain and optimise this pipeline so that 2D and 3D imaging and video is artifact-free and smooth, with as high a frame rate as possible. Etc. etc.
    So while, for instance, MS DirectX implies pretty much one-way communication, there are ways of sending data back to the CPU. OpenGL's Transform Feedback and Pixel Buffer Objects both send data back to the CPU, and very efficiently.
    Nvidia's CUDA and the OpenCL standard (backed by AMD) do not restrict programmers in how they use those thousands of small cores in a GPU, nor do they force strict one-way communication to achieve certain results.
    Thus, to put it as simply as I can, it is possible for compute APIs to interact/inter-operate with graphics APIs so that vertex data, for instance, can be processed through the GPU's internal memory instead of offloading it to system RAM. So, imho (and feel free to correct me if you know better), devs need to innovate. Braingines claim they have. If I was to guess, we'll soon see if any or all of their claims stand true.
    You guys take care :)
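    The "keep the data on the GPU between steps" idea above can be sketched in plain Python (no real GPU API; the DeviceBuffer class is a hypothetical stand-in for GPU-resident memory that simply counts transfers): chain several processing stages on-device and pay for exactly one upload and one readback, instead of a CPU round trip after every plug-in.

```python
# Conceptual sketch: chain three "plug-ins" on a pretend device buffer.
# Transfers are counted, not performed; kernels run as plain Python.

class DeviceBuffer:
    """Pretend GPU-resident audio buffer; host<->device transfers counted."""
    transfers = 0

    def __init__(self, samples):
        DeviceBuffer.transfers += 1          # host -> device upload
        self._samples = list(samples)

    def apply(self, kernel):
        """Run a per-sample kernel in place; no host transfer needed."""
        self._samples = [kernel(s) for s in self._samples]
        return self

    def download(self):
        DeviceBuffer.transfers += 1          # device -> host readback
        return list(self._samples)

# One upload, three on-device stages, one download.
out = (DeviceBuffer([0.5, -0.5])
       .apply(lambda s: s * 2.0)                  # gain
       .apply(lambda s: max(-0.8, min(0.8, s)))   # hard clip
       .apply(lambda s: s * 0.5)                  # make-up gain
       .download())
print(out, "transfers:", DeviceBuffer.transfers)
```

    Two transfers total, regardless of how many stages run in between: that is the shape of pipeline a GPU-audio engine would need, as opposed to naively reading back after each plug-in.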
     
  19. mageye

    mageye Producer

    Joined:
    Jan 12, 2016
    Messages:
    205
    Likes Received:
    78
  20. Smoove Grooves

    Smoove Grooves Audiosexual

    Joined:
    Jan 26, 2019
    Messages:
    5,224
    Likes Received:
    1,964
     
  21. Pipotron3000

    Pipotron3000 Audiosexual

    Joined:
    Mar 13, 2013
    Messages:
    1,230
    Likes Received:
    611
    They need to create an OPEN format.
    Or it will end up like UAD etc...

    For now, it is not: it is an isolated experiment, like Acustica's CUDA Nebula.
     