https://bannedbyanthropic.com/

Stumbled across this today and I have to admit the capriciousness of it put me on my heels.

I guess once you get big enough you can stop having to explain yourself or deal with customer service / resolution.

  • Scipitie@lemmy.dbzer0.com
    11 days ago

    The problem is that this is one of the few use cases where I can afford the (dangerous and unreliable) cloud service, but I’m far away from being able to do the same self-hosted.

    I’m actually using LLMs quite a lot to counter some of my brain’s weirder bugs (working-as-intended features). But this is way beyond what I could do locally.

    For anyone in a similar boat: you can mix local and remote quite well, using local for everything that works out and remote for everything else. E.g., speech-to-text and email analysis are done on my local server, while bigger jobs are done on the CLI with remote providers.
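    A rough sketch of that split. All the task names, the token threshold, and the routing rule here are made up for illustration; the real decision depends on what your local hardware actually handles well:

    ```python
    from dataclasses import dataclass

    @dataclass
    class Task:
        kind: str           # e.g. "speech_to_text", "email_analysis", "code_gen"
        approx_tokens: int  # rough size of the prompt/context

    # Hypothetical: tasks the local server is known to handle reliably.
    LOCAL_KINDS = {"speech_to_text", "email_analysis"}
    # Hypothetical context ceiling for the local model.
    LOCAL_TOKEN_LIMIT = 4096

    def route(task: Task) -> str:
        """Return 'local' if the self-hosted server can take it, else 'remote'."""
        if task.kind in LOCAL_KINDS and task.approx_tokens <= LOCAL_TOKEN_LIMIT:
            return "local"
        return "remote"

    print(route(Task("speech_to_text", 800)))   # small transcription -> local
    print(route(Task("code_gen", 20000)))       # big job -> remote provider
    ```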

    • that’s what I thought too, but apparently there are laptops these days that are incredibly powerful for AI. anything with the newish AMD Ryzen AI Max+ 395, for example, has 128 GB of unified memory, so it can run very large models quite easily. it’s in the Asus ROG Flow Z13, which I’m very seriously considering getting. it’s been out about a year, so there are a lot of reviews out that have tested its AI performance.

      • Scipitie@lemmy.dbzer0.com
        11 days ago

        For the price of the Asus ROG I’d get 12 years of a pro subscription to any of the big companies.

        And I don’t have that kind of cash at hand anyway…

        • neither do I; if I buy it, it’ll be on credit. and while you can get years’ worth of a subscription to an AI company for that price, it’s built primarily for gaming, and you can’t run Cyberpunk 2077 on Claude Code.

          I wouldn’t buy a laptop just for AI, but I have been known to spend more money than I should on gaming tech.