Recent ruling from U.S. Supreme Court regarding "Fair Use"

Discussion in 'Industry News' started by quadcore64, Apr 11, 2026 at 7:49 AM.

  1. quadcore64

    quadcore64 Audiosexual

    Joined:
    Jun 13, 2011
    Messages:
    1,981
    Likes Received:
    1,097
    The Supreme Court just rewrote the rules of secondary copyright liability — and almost nobody is talking about what it means for YOU, the creator.

    On March 25, 2026, in Cox Communications v. Sony Music Entertainment, the Supreme Court unanimously held that simply knowing your users might infringe copyright is NOT enough to make you liable. The Court killed a 50-year-old legal theory called "knowledge plus material contribution" that had been the foundation of nearly every secondary copyright lawsuit against internet service providers, cloud services, and platforms.

    So here's the question I can't stop thinking about: if the legal pressure that built YouTube's three-strike policy just got dramatically reduced... could YouTube finally drop the three-strike system? Could we be standing at the doorway of a creator renaissance — a moment where remixing, fair use, criticism, commentary, and transformative work can flourish without the constant fear of channel termination?

    www.youtube.com/watch?v=NoT_dsC3ZX8
     
    • Interesting x 1
  3. PulseWave

    PulseWave Audiosexual

    Joined:
    May 4, 2025
    Messages:
    4,992
    Likes Received:
    2,927
    In my opinion, it's the sheer volume of data that makes it impossible to control. In short, it's simply impossible to prosecute all copyright infringements; they lack both the personnel and the funds, which is why they're relaxing the rules. Please keep in mind that different rules apply to YouTube in Europe, and copyright law remains strict.

    If you make a mistake, YouTube will typically issue one or two warnings before deleting your channel.
     
  4. quadcore64

    quadcore64 Audiosexual

    Joined:
    Jun 13, 2011
    Messages:
    1,981
    Likes Received:
    1,097
    The point being made is that there is no longer a reason for YouTube to continue its hard-line policies.

    Leonard also explains how YouTube could rework its policies, using its existing workforce along with its AI to better assess content usage before dropping the hammer even once.
     
  5. PulseWave

    PulseWave Audiosexual

    Joined:
    May 4, 2025
    Messages:
    4,992
    Likes Received:
    2,927
    Hopefully, YouTube executives will watch the video and discuss it afterward.
    In my opinion, YouTube employees still decide on account deletions! Right?

    I don't know why you think YouTube's guidelines are strict! YouTube does something for artists and protects copyright.
     
  6. quadcore64

    quadcore64 Audiosexual

    Joined:
    Jun 13, 2011
    Messages:
    1,981
    Likes Received:
    1,097
    From my understanding, YouTube prefers to let its AI make decisions, leading to instant strikes and a tedious appeal process. Just ask Rick Beato.

    The point is not just about copyright protection but more about how the protection mechanisms are used as a blunt instrument rather than a scalpel.

    I'm not saying there is never a reason for a video takedown, just that the way they go about it is brutal.

    A lot of the takedowns are initiated by companies/people who use bots to scan content and generate copyright complaints for YouTube's bot/AI.
     
  7. clone

    clone Audiosexual

    Joined:
    Feb 5, 2021
    Messages:
    10,234
    Likes Received:
    4,419
    From your first post, it seems like you think the ruling means open season on copyright itself. But what you describe is a protection FOR the platforms that operate systems which can be used by users to violate copyrights. That's way different.

    Think about Suno in this context. A user can certainly use its technology to create work product which infringes on copyrights. All those platforms do is provide a vehicle, and they cannot reasonably be expected to assume that is how their platform will be used, even if they know it can be. That does not mean they are suddenly going to stop checking for infringing material, because that could defeat the reason for the decision. YouTube, or other platforms, can say they are checking everything hosted on their service by using those scans, normal audits, etc. to prevent the service being used that way; not doing anything about it could be viewed as either negligent or complicit. Self-policing does not need to be perfect, and if crafty users violate the rules, the platforms can't be expected to catch every instance of it. They are not just going to stop self-policing efforts and become some mecca of copyright infringement. That's how they maintain plausible deniability.

     
  8. quadcore64

    quadcore64 Audiosexual

    Joined:
    Jun 13, 2011
    Messages:
    1,981
    Likes Received:
    1,097
    The post is directly from the video description section. Those are Leonard's words, not mine. He is presenting an analysis of the ruling and how it can apply going forward to avoid inaccurate findings of infringement based on a complaint before an actual review of the facts is done.
     
  9. mino45

    mino45 Producer

    Joined:
    Sep 3, 2021
    Messages:
    213
    Likes Received:
    111
    I guess it does not come down to preferring anything; it is more the sheer amount of data. If you have 20 million new videos every day, there is no way to have people actually watch them all to decide if they are compliant with the rules. There are 30,000 hours of video added every hour. With 10,000 moderators employed worldwide, and only roughly a third of them working at any given time, there is no way to handle this without AI. YouTube staff probably cannot even watch just the flagged videos to check whether the AI was right, so the only viable option is to let the AI take down videos and have people handle the cases where creators actually fight the decision.
    I am sure YouTube would prefer to be able to have humans or AI do the job in a way that would not result in false takedowns, etc., but it probably is just not able to do it, as there are roughly 2 petabytes of data added every day that need to be reviewed.
    The screening process is probably optimized for throughput rather than accuracy, not because that is what they want, but because it is the only way to handle the amount of data that they have to deal with every day.
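    The figures quoted above make the throughput argument easy to check. A back-of-envelope sketch in Python, using only the numbers from this post (public estimates, not official YouTube figures):

    ```python
    # Back-of-envelope: can human moderators keep up with upload volume?
    # All figures are the rough estimates quoted in the post above.

    HOURS_UPLOADED_PER_HOUR = 30_000   # hours of new video added each hour
    MODERATORS = 10_000                # moderators employed worldwide
    SHIFT_FRACTION = 1 / 3             # roughly a third working at any given time

    on_shift = MODERATORS * SHIFT_FRACTION

    # Hours of new footage arriving per on-shift moderator, per hour worked:
    backlog_per_moderator = HOURS_UPLOADED_PER_HOUR / on_shift

    print(f"~{on_shift:.0f} moderators on shift at any moment")
    print(f"~{backlog_per_moderator:.0f} hours of new video per moderator, per hour")
    ```

    In other words, every on-shift moderator would need to watch roughly nine hours of footage for every hour they work just to keep pace with uploads, before doing any actual review. That is why triage has to be automated and humans can only handle appeals.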
     
    Last edited: Apr 11, 2026 at 1:33 PM
  10. xorome

    xorome Audiosexual

    Joined:
    Sep 28, 2021
    Messages:
    1,725
    Likes Received:
    1,311
    Providing basic infrastructure/services under fair/reasonable/non-discriminatory terms should never make the provider liable for damages caused by those abusing the service. Nobody in their right mind would sue a parking lot company for obscenity because the local dogging scene decides to make it its preferred gathering spot.

    Likely good for archive.org, MEGA, reddit and the likes - "generic"/"normie" providers - and bad for the lawyers. But no one likes those anyway.
     