• 0 Posts
  • 15 Comments
Joined 1 year ago
Cake day: June 9th, 2023


  • Long story short: you are being made (by default) to give up rights that you should have, particularly around class-action lawsuits. It’s strictly bad for you and strictly good for the company. They probably shouldn’t be allowed to do this. Since they are, the only thing we can do to protest it is to opt out.

    Maybe you’ll never sue Discord. But maybe someday there will be a lawsuit brought against Discord by someone else. A few possible topics: a security vulnerability that leaks personal information, the use of Discord content as AI training data (e.g. copyright issues), or the safety of minors online. If you don’t opt out, you can’t be part of such lawsuits if they ever become relevant. That weakens these lawsuits overall and empowers companies like Discord to do more shady things with less fear of repercussions.

    And because the vast majority of people will never opt out (you’re opted in by default), these kinds of lawsuits are weakened from the start. That’s why every company in the US is doing this forced-arbitration thing. At this point they would be crazy not to, since it’s such a good deal for them and the average person doesn’t care enough to push back.


  • Some of that is content categorization in the eyes of the all-seeing algorithm. Let’s say you upload one type of content, “A”, that gets big views, but you’ve been uploading another type, “B”, that gets small views for a while. The YouTube algorithm will aggressively try to grow content A and massively deprioritize content B, even relative to other channels that produce content B.

    A guy I know who does YouTube/Twitch had to create a second channel for his content B because it would get sub-1k views while his content A was getting tens of thousands. Just by uploading somewhere else, he started getting higher view counts.

    Exactly why that happens isn’t known, but a common theory is that YouTube wants to push what it knows works. It has no real reason to give your content B a chance when it knows content A will sell, even though that outcome is itself the product of a feedback loop (a toy simulation of that loop is sketched below).
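    To make the feedback-loop point concrete, here is a toy sketch in Python. It is not how YouTube actually works; the proportional-to-past-clicks rule and all the numbers are invented purely for illustration.

    ```python
    # Toy "push what already works" recommender: impressions are handed out in
    # proportion to past clicks, so an early hit for A starves B indefinitely,
    # even though both categories have the same underlying quality (CTR).
    # Everything here is made up for illustration.
    import random

    random.seed(0)

    TRUE_CTR = {"A": 0.05, "B": 0.05}   # identical underlying quality
    clicks = {"A": 50, "B": 5}          # A just happened to have one early hit
    impressions = {"A": 0, "B": 0}

    for day in range(30):
        total_clicks = clicks["A"] + clicks["B"]
        for kind in ("A", "B"):
            # "the algorithm" allocates today's 10,000 impressions by past performance
            share = clicks[kind] / total_clicks
            todays_impressions = int(10_000 * share)
            impressions[kind] += todays_impressions
            clicks[kind] += sum(
                random.random() < TRUE_CTR[kind] for _ in range(todays_impressions)
            )

    print(impressions)  # A keeps roughly 90% of all traffic off one early lead
    ```

    Even with identical click-through rates, the category that got lucky early keeps almost all of the traffic, which is the loop described above.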


  • I’m almost starting to wonder if that’s the plan. Just keep saying “IPO IPO IPO” to get funding from over-eager VCs who want a piece of the IPO before it becomes widely available.

    But then you just never IPO. Keep making minor-to-moderate mistakes along the way so you can be all “weeeeell, we would have IPO’d, but [insert thing here], so we want to wait another 6 months to let it die down.” Repeat until you’re ready to quit, then actually IPO and ride the initial IPO high all the way down via golden parachute.



  • Huh, go figure. Thanks for the info! I honestly never would have found that myself.

    I still think it should be possible to use in:channel in the channel-specific search, though. It’s one less button press, and it can’t be that confusing UX-wise since your intent is clear when you do it (if anything, the fact that the two searches work differently is more confusing UX-wise).



  • You say “only” 6 months ago, but it’s surprising to me just how quickly that time has passed.

    I was an everyday Reddit user pre-Lemmy. I happened to get linked to something there yesterday and saw all my subs’ “last visited” dates sitting at 6 months. It’s crazy how easy it was to go cold turkey, and I haven’t seen a need to go back.



  • Copilot, yes. You can find some reasonable alternatives out there, but I don’t know if I would use the word “great”.

    GPT-4… not really. Unless you’ve got serious technical knowledge, serious hardware, and lots of time to experiment, you’re not going to find anything even remotely close to GPT-4. Probably the best the “average” person can do is run a quantized Llama-2 on an M1 (or better) MacBook, making use of its unified memory (a rough sketch of that setup is included after this comment). Lack of GPU VRAM makes running even the “basic” models a challenge. And, for the record, this will still perform substantially worse than GPT-4.

    If you’re willing to pony up, you can rent hardware from the usual cloud providers, but it will not be cheap, and it will still require serious effort, since you’re basically going to have to fine-tune your own LLM to get anywhere in the same ballpark as GPT-4.
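    For reference, a minimal sketch of the quantized-Llama-on-a-Mac setup mentioned above, assuming llama-cpp-python and a quantized GGUF model you have downloaded yourself. The file name and parameters are placeholders, not a recommendation.

    ```python
    # Minimal local-LLM sketch with llama-cpp-python (pip install llama-cpp-python).
    # The GGUF path below is a placeholder -- download a quantized Llama-2 yourself.
    # On Apple Silicon, the Metal build can offload all layers into unified memory.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./llama-2-7b-chat.Q4_K_M.gguf",  # placeholder file name
        n_ctx=2048,       # context window
        n_gpu_layers=-1,  # offload every layer (Metal on M1/M2, CUDA elsewhere)
    )

    out = llm(
        "Q: Write a Python function that reverses a string.\nA:",
        max_tokens=256,
        stop=["Q:"],
    )
    print(out["choices"][0]["text"])
    ```

    Even a setup like this lands in the “substantially worse than GPT-4” territory described above; the point is only that it’s the realistic ceiling on consumer hardware.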



  • isildun@sh.itjust.works to Memes@lemmy.ml · “Sure it is” · 30 points · edited · 8 months ago
    Definitely AI-generated. Look at the bottom-right of the Confederate flag: it’s all messed up, with the classic generative-AI “artifacting”, for lack of a better word.

    Edit: the original was posted lower down in the thread. This version was upscaled (very poorly) by AI.