• 0 Posts
  • 54 Comments
Joined 1 year ago
Cake day: June 13th, 2023


  • I didn’t read the article yet, and this is basically what I expected; hardly a secret, and a fairly good summary. The economy is going to downturn anyway because people just can’t afford the new housing. A two-bedroom anywhere with people is $650,000 and up. At that point you have to be married to someone who also works full time to have any reasonable chance of affording the mortgage. At an interest rate of around 6.5% you’re paying about $4,100 per month on a 30-year loan, which is close to the average pre-tax income (~$54,000 per year, so ~$1,038 per week). Of course, that leaves very little wiggle room to save, and if either of you lost your job or the interest rate went up, you’d probably default.
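    A quick sanity check of that mortgage arithmetic (a sketch, assuming a 30-year amortization, no down payment, and the standard amortized-loan payment formula):

```python
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard amortized-loan payment: P*r / (1 - (1+r)^-n)."""
    r = annual_rate / 12      # monthly interest rate
    n = years * 12            # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

# A $650,000 loan over 30 years:
print(round(monthly_payment(650_000, 0.06, 30)))   # 3897 at 6%
print(round(monthly_payment(650_000, 0.065, 30)))  # 4108 at 6.5%
```

    So a figure of roughly $4,100/month corresponds to a rate closer to 6.5% than 6%, and either way it excludes property taxes and insurance.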

    EDIT: I forgot to add the point, which is this: a whole bunch of people are going to default. People take bad mortgages because they have to; being homeless isn’t an option, and sharing a house with strangers can really, really suck. But those people are going to be defaulting, moving overseas, or just staying at home with their parents. The market will correct itself.



  • I remember Yudkowsky being a thing a decade ago, and people were making fun of his “AI” research even then. It’s scary to realize that not only did some people take him seriously, but those people are at the helm of AI companies, making decisions that affect tens of billions in investment capital. I think there’s a Kurt Vonnegut quote along the lines of “true terror is waking up one morning and realizing your high school class is running the country”.

    For the other poster lower down: I’d almost successfully forgotten about his Harry Potter fanfiction. People kept praising it, so I actually read through a bit of it, and it’s painful reading. He also wrote a Superman fanfiction, and that’s even worse. I think they both say a lot about his internal mental state and his perception of other people, though.


  • I read the BBC article on the recent shooting, and you’re right, it’s tragic and wrong. Two gunmen shot up a settlement and then started exchanging fire with an IDF unit, and the child was shot in the crossfire. This is awful, but it has nothing to do with Palestine being an independent nation, since it happened in the West Bank, not the Gaza Strip, which, as I understand it, are separate territories.

    Also, this article (which is the only one I can access) seems to be posted in bad faith: the whole tragic situation started with a group of people opening fire on random Israeli citizens, but there’s no mention of that at all while you’re decrying Israeli violence. The same goes for your talk about “strategic bombing”, which no one is doing, and for the political situation in the West Bank (which, again, is different from the Gaza Strip).










  • NixOS, because it’s a completely atomic, declarative distribution, a bit like running your OS the way you’d run a Docker container. You define the state of the system in a configuration file, which can control everything down to the kernel, and you can roll back to an older configuration at any reboot. It’s more of a pain than the others, but it works okay out of the box, and when you fix something it stays fixed, so you’ll never end up in a situation where something breaks and you can’t fix it.

    Also, all the packages bring their own versions of their libraries and link to them directly, so they’ll never break during upgrades; conversely, a lot of Linux installers that try to link against system libraries won’t work.





  • It gets me how every single comment is the same three tired old jokes about AI. Between “It’s only good for imagining up stuff that has no right answer”, “And so the downward spiral begins as LLMs are trained on the output of previous LLMs”, and “But somehow it still can’t remember what I said one minute ago”, we’ve covered every single anti-LLM talking point.

    For one, it looks like the actual training data hasn’t changed; even if the model has changed and has been able to access the internet for a while, this is non-news. For another, a lot of people haven’t tried GPT-4 and are just complaining about the free version sucking. Well, things you get for free often suck.

    And so the downward spiral begins as LLMs are trained on the output of previous LLMs

    I think we can be fairly confident that the people in charge of training the LLMs have heard this too and are probably on top of it.


  • This is what I got.

    Step 1: The Setting
    An old farmhouse, slightly dilapidated but filled with charm and history, stands surrounded by fields and woods. The farmhouse, located in a rural area, has been in the old woman’s family for generations. However, the closest college town is just a 20-minute drive away, making it a feasible place for college students to stay.

    Step 2: The Characters

    Lillian (Lilly): An 88-year-old woman, sharp as a tack, with a mischievous sense of humor. Widowed, with stories of her late husband and her adventures. She's fiercely independent but realizes she could use some help around the place.
    Mike: A 20-year-old college student studying agriculture. He's grounded, serious about his studies, and is always looking for practical experience. However, he's a bit socially awkward.
    Jasper: A 21-year-old college student majoring in digital media and film. He's spontaneous, energetic, and often comes up with wild ideas. He's in college more for the experience than the education.
    
    

    Step 3: The Premise
    Mike finds an advertisement where Lilly offers affordable rent at her farmhouse in exchange for help with chores. Seeing it as a great opportunity to gain hands-on farming experience, he jumps on the offer. Jasper, looking for cheaper rent and a break from the monotony of dorm life, decides to join Mike despite having no interest in farming. Lilly is initially hesitant about having two young men living with her but warms up to the idea, thinking it could be fun and helpful.

    Step 4: Recurring Themes/Scenarios

    Cultural Clashes: Lilly's old-school ways often clash with the boys' modern college lifestyle, leading to comedic misunderstandings.
    Teachable Moments: Lilly imparts wisdom from her long life, while the boys teach her about modern technology and current college life.
    Odd-couple Dynamics: Mike and Jasper have contrasting personalities, which often lead to humorous conflicts and reconciliations.
    Farm Misadventures: Jasper's lack of farming knowledge causes regular chaos, like accidentally releasing all the chickens or trying to turn a part of the farm into a film set.
    College Intrusions: College friends and events spill over into the farmhouse, like surprise parties or study groups, much to Lilly's amusement or chagrin.
    
    

    Step 5: Sample Plotlines

    The Drone Debacle: Jasper tries to film the farmhouse using a drone for a class project. However, the drone scares the livestock, leading to chaos. Lilly, Mike, and Jasper then have to work together to restore order.
    Secrets of the Attic: Mike discovers old photographs and artifacts in the attic, sparking a trip down memory lane for Lilly and history lessons for the boys.
    The Farmhand Competition: After another one of Jasper's mishaps, Mike challenges him to handle the farm chores for a day. Lilly becomes the judge, and comedic challenges ensue.
    Digital Farming: The boys try to modernize the farm operations using apps and digital tools, leading to humorous consequences.
    Lilly Goes to College: Jasper thinks it would be fun to bring Lilly to one of his classes. Her old-world wisdom ends up stealing the show.
    
    

    Step 6: Underlying Message
    Despite the age and cultural gap, the trio forms an unbreakable bond. The sitcom highlights the beauty of intergenerational friendships, the blending of old-world wisdom with modern innovation, and the comedic adventures that ensue when worlds collide.

    In the end, the show would be a heartwarming comedy that celebrates the unlikely friendships that form when people from different walks of life come together.

    EDIT: The point of posting this is to demonstrate that even if two outputs are similar, they’re very sensitive to the user’s built-in pre-prompts and the version of ChatGPT being used, so two people can get very different outputs from the same prompt.