Apple moves to self-produced ARM CPUs for the Mac - Bye bye, Intel

Discussion in 'Computer Hardware' started by taskforce, Jun 22, 2020.

  1. Xupito

    Xupito Audiosexual

    Joined:
    Jan 21, 2012
    Messages:
    3,619
    Likes Received:
    2,024
    Location:
    Europe
    Yes, good point. Compatibility with multi-GPU Nvidia-powered video FX workstations sounds very hard to pull off. I would be extremely troubled.

    I think for these cases additional GPUs would somehow still be possible because, well, there's no other way to get that insane rendering power. Nobody talked about leaving Nvidia, did they? :rofl:

    For regular users, in theory a bad-ass APU could be enough. Again, like AMD's PS5 APU (or Xbox's). I don't think there's the intermediate case of dual-GPU gamer users on Mac like there is on the PC side. Well, with the latest Nvidia monsters perhaps there aren't many even in the PC gaming market. I'm outdated on that.

    This is AMD's PS5 APU. And since ARM is already the king of integration... in fact the iPad's CPU/APU already includes several dedicated blocks.
    [image: AMD's PS5 APU]
     
    Last edited: Jun 27, 2020
  2. JMOUTTON

    JMOUTTON Kapellmeister

    Joined:
    Jan 10, 2016
    Messages:
    54
    Likes Received:
    42
    Location:
    Virginia


    The thing with large corporate or studio video-editing workstation NLEs and video-post rendering server farms is that they generally have a 3-year end-of-life cycle. So the hardware you buy today, even if it is a $20K Mac Pro, only needs to be reliable and competitively functional for 3 years before you really don't care. The expected increase in productivity has so far outpaced the cost of replacement - until Intel hit thermal and processing walls with Skylake. Skylake and its failures on efficiency, security and processing metrics probably accelerated the arrival of ARM Bionic macOS by a few years. This is mostly an issue on the artistic side before final rendering, as final rendering is done on cheaper Linux-based dedicated servers.

    It's also worth noting that, because of the way the UNIX kernel works, and because Apple is Apple - stubborn and willing to take risks - it is possible to run macOS on different architectures and suffer very little, if any, performance loss, provided they build proper virtualization into their new ARM chipsets and macOS.

    This isn't the case for smaller shops or musicians who do not tax their machines as heavily but expect them to last and perform for longer periods of time. However, since there are already kexts for most really high-end workstation graphics cards for different flavors of Unix and Linux, I find it highly unlikely that compatibility will be an issue. Also, just because a chip has built-in graphics doesn't mean you have to use it, and that has been the case for years now.
     
  3. taskforce

    taskforce Audiosexual

    Joined:
    Jan 27, 2016
    Messages:
    1,610
    Likes Received:
    1,595
    Location:
    Studio 54
    You are not wrong at all about this. Although SLI, now called NVLink, can yield up to 80% more performance in gaming, most games don't support it anymore. So it is now mostly used for rendering etc., in other words just in the pro/workstation app world. You may see a few top-tier, multi-thousand-dollar gaming rigs with SLI, but they are usually built just to impress rich kids with money to burn.
    AMD's rival tech CrossFire is practically extinct too; I haven't seen any game played under CrossFire in the past two-three years. On top of this, AMD seems to have ditched support for it altogether.
    Now, no gamer in the last 20 years went to buy a Mac to play their favorite games. It is obvious Apple has no interest in this, apart from the mobile side of things, and we should add this to the reasons why Apple is bringing full iPad/iPhone app integration to the new Macs.
    Of course one could say it's the same CPU, or just about, but - imho - all indications show Apple wants to outperform Intel and prove they're leaders in the tech world, not just in gross income numbers. So logically we should expect more cores and higher clock speeds in the new Macs as finished products than what's in the iPad Pros etc.
    Cheers
     
  4. Xupito

    Xupito Audiosexual

    Joined:
    Jan 21, 2012
    Messages:
    3,619
    Likes Received:
    2,024
    Location:
    Europe
    Nailed it. Besides, Unix/Linux itself is the king of server virtualization.
    But back to the video FX hardware topic: are you saying they don't use Nvidia cards for rendering?
    True, but it seems that for the ARM era the CPU and GPU will go in the same main chip. As I said, at least for most users. I'm not sure if you're talking about the possibility of using dedicated GPUs.
     
  5. Xupito

    Xupito Audiosexual

    Joined:
    Jan 21, 2012
    Messages:
    3,619
    Likes Received:
    2,024
    Location:
    Europe
    I agree. But even if they don't care about having the absolute fastest desktop CPU, they have to design faster ARM-based chips to keep up with and outperform Intel's in the near future.
    The modified iPad ARM CPU used in the presentation is fast, but still nowhere close to the fast i7s, i9s...
     
  6. tzzsmk

    tzzsmk Audiosexual

    Joined:
    Sep 13, 2016
    Messages:
    1,106
    Likes Received:
    644
    Location:
    Heart of Europe
    From what was shown in Apple's WWDC keynote, that basic "Transition Kit" was capable of playing triple 4K video in Final Cut Pro, and later in the video there was a mention of dedicated video acceleration within the Apple Silicon (ARM) architecture.
    Speaking of real performance, I doubt Apple will "open" the ARM platform in any way. I mean, this "Transition Kit" has no Thunderbolt, which also means no PCIe, which means no external "performance extensions", so the ARM chip itself needs to be good enough.
    I'd bet Apple will keep pushing/sustaining the Mac Pro product line, because that's where heat/battery stuff doesn't matter, and "thanks" to its uncompetitive cost, Apple will be able to focus on ARM for iPads, iMacs, MacBooks and Mac Minis.
    :chilling:
     
    • Interesting x 2
  7. JMOUTTON

    JMOUTTON Kapellmeister

    Joined:
    Jan 10, 2016
    Messages:
    54
    Likes Received:
    42
    Location:
    Virginia
    Unless you are running Redshift or a similar bi-directional render engine, most classic render farms are CPU-based for now.

    For conceptual workstations and NLEs, though, GPUs are always going to be king, and I think that external GPUs will always have a presence on top-end workstations.

    To be clear, though, we are talking about Nvidia Quadro, AMD FirePro and Radeon Pro W GPUs, although some consumer-level gaming GPUs can probably come close to their specs nowadays... As long as there is a data bus capable of the bandwidth and latency requirements, I don't see any reason for external GPUs not to have a role on new-architecture Macs. Apple has always been very conscious of, and nearer the bleeding edge on, bus speeds, network comms and peripheral bandwidth, so I don't expect them to hamstring their workstations in the future, regardless of chipset.

    I think the new line of Bionic chips will be very different from what is in the current iPad Pro, which was probably a proof of concept for the design team. I have a feeling there will be some up-scaling, and perhaps peripheral processors on the core - some for audio; something comparable to an Intel FPGA could be on the die.

    There was a trend towards GPU agglomerate rendering, but heat and the rise in GPU prices caused by crypto-miners, along with cheap higher core counts and faster processors with better thermal management in blades, have stymied progress. While thermal management isn't that big of a deal for a conceptual workstation or an editor's NLE, it is a big concern for server farms, which require dedicated cooling plans and equipment in order to function. The cost of cooling a couple of hundred or thousands of servers isn't inconsequential.
     
    • Like x 3
    • Interesting x 1
  8. JMOUTTON

    JMOUTTON Kapellmeister

    Joined:
    Jan 10, 2016
    Messages:
    54
    Likes Received:
    42
    Location:
    Virginia
    The lack of external connectivity might have something to do with how bad things are between Apple and Intel at the moment. There is a lot of anger and resentment between the two over multiple areas, from 5G modems to Skylake to USB-C patents, plus a feeling at Apple that Intel sold them down the river in some big lawsuits because it had no balls...

    I doubt that production personal computers or laptops will have no connectivity; what kind of connectivity is in the pipeline is a mystery, though.
     
  9. Bermuda Triangle

    Bermuda Triangle Member

    Joined:
    Jun 23, 2020
    Messages:
    15
    Likes Received:
    7
    I am here for everything! Why tell me what I do?

    Always the same!
    You make rude remarks to all Mac users every time there's a Mac thread - FACT! :deep_facepalm:
    Why do you like posting in Mac threads if you're a PC user?
    Maybe a Mac user killed your hamster? Is that why you hate so much?
     
    Last edited: Jun 27, 2020
  10. taskforce

    taskforce Audiosexual

    Joined:
    Jan 27, 2016
    Messages:
    1,610
    Likes Received:
    1,595
    Location:
    Studio 54
    Apple has been at war with Nvidia for years. From Mojave onward, Apple has ditched support for Nvidia. It all started in '08, when Apple released a MacBook with an Nvidia GPU with a supposedly revolutionary function that would take over both the northbridge and southbridge during realtime rendering. This led to a lawsuit from Intel against Nvidia. If this wasn't enough, many of these Nvidia chips were apparently faulty, leading to a class-action lawsuit against Nvidia. Apple was in the middle of this, hurting from the inter-company turmoil, and at the same time Nvidia had filed a lawsuit against Apple, Qualcomm and Samsung for patent infringement on their mobile phones.
    So years went by with bad blood (commonly lawsuits like these take forever), and in '16 Apple decided they'd had enough of Nvidia; they went AMD for their GPUs, and that was the end of Nvidia on the Mac.
    Speaking of Nvidia, many important tech people hate them, and not without reason of course. The last one to blatantly speak out was none other than Linus Torvalds, who publicly uttered a big "fuck you" and middle finger (!) at Nvidia for not supporting Linux, calling them "the single worst company we've ever dealt with".
    Personally I find their top-tier cards have had no rival for years (c'mon AMD, give us something better), but yes, their leadership does seem to be sort of tech bullies, if that makes any sense...
    On @tzzsmk 's comment, I think Apple is still searching for a way to bring Thunderbolt to their CPUs. Especially now that the spec is royalty-free it should be easier than before, but it's a difficult bet. The whole serious-peripherals-for-Apple world of the last ten years or so is built around Thunderbolt. Abandoning Thunderbolt would have a strong impact on their clientele, imho. Almost everything pro for Apple machines is TB-based: eGPUs, RAID and NAS boxes, audio interfaces etc. If this happens, I can already hear tens of thousands cursing :rofl:...
    Just some thoughts :)
    Cheers
     
    Last edited: Jun 28, 2020
    • Agree x 3
    • Like x 2
  11. Area51

    Area51 Ultrasonic

    Joined:
    May 3, 2020
    Messages:
    232
    Likes Received:
    30
    Xupito stop distracting us from the original technical post!!
     
    • Dislike x 2
    • Disagree x 2
  12. phumb-reh

    phumb-reh Producer

    Joined:
    Jun 20, 2019
    Messages:
    149
    Likes Received:
    100
    Location:
    Neverwhere
  13. Bermuda Triangle

    Bermuda Triangle Member

    Joined:
    Jun 23, 2020
    Messages:
    15
    Likes Received:
    7
    :dunno:
     
    Last edited: Jun 28, 2020
  14. Xupito

    Xupito Audiosexual

    Joined:
    Jan 21, 2012
    Messages:
    3,619
    Likes Received:
    2,024
    Location:
    Europe
    I agree, and that's the most interesting part for me. Actually, it could be the most interesting part for audio software users, because an audio chip would be included for sure. Perhaps more (TTS and vice versa).
    My mistake. I assumed Nvidia.
    Of course you make sense. Like Intel when it wiped the floor with AMD. Like pretty much any tech giant when it dominates a certain market.
     
    Last edited: Jun 28, 2020
  15. Bermuda Triangle

    Bermuda Triangle Member

    Joined:
    Jun 23, 2020
    Messages:
    15
    Likes Received:
    7
    :mates:
     
    Last edited: Jun 28, 2020
  16. taskforce

    taskforce Audiosexual

    Joined:
    Jan 27, 2016
    Messages:
    1,610
    Likes Received:
    1,595
    Location:
    Studio 54
    I forgot to quote this, but better late than never, right?
    Errr... faster at what, Angry Birds? I've had their top-of-the-line iPad Pro for a week. And although it tries hard to be a laptop with its new mouse and keyboard peripherals, well... the only laptops it's faster than are very thin ultrabooks with 1-1.2 GHz dual-core Intel CPUs. It's really far from being a pro machine for anything (except perhaps pro Facebookers - pun intended), apart from its beautiful, amazing display, which I can sit and look at for hours. And it surely doesn't come close to the performance of a MacBook Pro or any other decent-spec Windows laptop.
    Cheers :)
     
    Last edited: Jun 29, 2020
    • Agree x 2
    • Useful x 2
  17. tzzsmk

    tzzsmk Audiosexual

    Joined:
    Sep 13, 2016
    Messages:
    1,106
    Likes Received:
    644
    Location:
    Heart of Europe
    Honestly, I'd say Apple will come up with its own "superior" interface, retaining the USB-C connector but providing "their own" proprietary Thunderbolt/USB controller chip (or part of the ARM chip).
    Apple users got used to the removal of PCIe slots, removal of Nvidia, removal of the Thunderbolt/Mini DP connector, removal of MagSafe etc., so Apple can do literally anything IF they convince users that the new ways are superior :rofl:
     
  18. taskforce

    taskforce Audiosexual

    Joined:
    Jan 27, 2016
    Messages:
    1,610
    Likes Received:
    1,595
    Location:
    Studio 54
    Hehe, I'll copy-paste from the web:
    Developers who signed up for the transition kit had to agree to terms and conditions that say, in part, that they will not "display, demonstrate, video, photograph, make any drawings or renderings of, or take any images or measurements of or run any benchmark tests on the Developer Transition Kit (or allow anyone else to do any of the foregoing), unless separately authorized in writing by Apple."
    This alone, in my book, says the dev kit is a rushed device and they want to avoid comparisons. They should perhaps give it more time, but I guess they assume it's enough for devs and think the final hardware will be ready by Xmas.
     
    • Like x 2
    • Agree x 1
    • Interesting x 1
  19. Qrchack

    Qrchack Platinum Record

    Joined:
    Dec 15, 2015
    Messages:
    593
    Likes Received:
    234
    This is literally no different from every single devkit ever released (game consoles or whatever).
    Well, it's not like most developers will be rewriting their apps from scratch. For anyone using Swift and the provided APIs, it'll probably be as simple as updating any 3rd-party libraries they're using and changing the build target to ARM in Xcode. That covers all your mass-market software - think apps like Evernote, Discord, Spotify, etc. And frankly, Apple keeps confirming that they don't care about more demanding tasks; they believe they're a lifestyle brand of consumer devices only.
     