• 0 Posts
  • 17 Comments
Joined 1 year ago
Cake day: January 26th, 2024



  • The way I would classify it is: if you could somehow extract the “creative writing center” from a human brain, you’d have something comparable to an LLM. But LLMs lack all the other bits (reason, learning, and memory), or only badly imitate them.

    If you were to combine multiple AI algorithms similar in power to an LLM but designed to do math, logic, and reasoning, and then add some kind of memory, you’d probably get much further towards AGI. I don’t believe we’re as far from this as people want to believe, and I think sentience is on a scale.

    But it would still not be anchored to reality without some control over a camera and the ability to see and experience reality for itself. Even then it wouldn’t understand empathy as anything but an abstract concept.

    My guess is that eventually we’ll create a kind of “AGI compiler”: you describe in a prompt what kind of mind you want to create, and the compiler generates it. A kind of “nursing AI”. Hopefully it won’t be about profit, but about a prompt that teaches it to be friends with humans, genuinely enjoy our company, and love us.





  • I’d argue that weird isn’t being used as an insult, but to state that the bullies are in fact not representing the norm. They are outside the norm but pretend to be normal, they insist on being normal - which makes it weird. It’s not an insult to us, but it is an insult to them. Which makes it funny.

    Fascists believe in inequality based on identity, while the rest of us kinda thought we had this sorted already: that we all believe in equality now, that all people are created equal. But their need to define one identity as superior and then attack anybody outside the norm is being used against them. And it IS weird to do that; most people simply don’t care if you’re a little weird. We still have to learn to be more tolerant of weirdness and not react with biases or irrational emotions. Respect weirdness.

    So weird isn’t being used as an insult but as a way to rob them of their power: their attempt to define a new normal. And their arguments and attacks against everyone else are becoming increasingly bizarre, less founded in reality, full of absurd claims. Comical. Weird.

    There is the metaphor about slowly boiling a frog without it noticing, shifting the Overton window. Weird sort of resets that. It is more an attack on what they DO than on what they are.





  • LarmyOfLone@lemm.ee to Technology@lemmy.ml · *Permanently Deleted* · 7 up / 1 down · 1 year ago

    Unfortunately marketing matters a lot. A single brand is easier to understand than the many federated servers of Mastodon.

    I wanted to check out where this Reddit community migrated to, and it was some server running something called Lemmy. It said something about Mastodon, so I made an account to try to participate. It wasn’t really clear to me that Lemmy isn’t another Mastodon instance, but a different platform that shares a federation protocol with it. My fault, but the marketing is a bit confusing.



  • Yeah that’s what I was just thinking. Once we somehow turn LLMs into a new type of programming language, it gets interesting. Maybe a more natural language that gets the gist of what you are trying to do. Then a unit test to see if it works, and then you verify. Not sure if that can work.
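    The describe–generate–test loop above could be sketched roughly like this. This is a hypothetical illustration, not any real tool: `generate_candidates` stands in for an LLM call, and here it just returns hand-written stub functions so the sketch stays runnable. The human-written unit test is the acceptance gate.

    ```python
    # Hypothetical sketch of a "describe, generate, unit-test, verify" loop.

    def generate_candidates(description):
        # A real system would prompt a model with `description`; these stubs
        # simulate a model proposing one wrong and one right implementation.
        return [
            lambda xs: sum(xs),            # wrong: sums instead of averaging
            lambda xs: sum(xs) / len(xs),  # right: arithmetic mean
        ]

    def passes_unit_test(fn):
        # The human-written unit test decides whether a candidate is accepted.
        try:
            return fn([2, 4, 6]) == 4
        except Exception:
            return False

    def first_accepted(description):
        # Keep generating until a candidate passes the test, then hand it
        # to the human for final verification.
        for fn in generate_candidates(description):
            if passes_unit_test(fn):
                return fn
        return None

    mean = first_accepted("return the average of a list of numbers")
    print(mean([10, 20]))  # 15.0
    ```

    The open question the comment raises still applies: the unit test only checks what you thought to describe, so the final human verification step can’t be skipped.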

    TBH I’m a bit shocked that programmers are already using AI to generate code; I only program as a hobby anymore. But it sounds interesting. If I can get more of my ideas done with less work, I’d love it.

    I think fundamentally, philosophically there are limits. Ultimately you need language to describe what you want to do. You need to understand the problem the “customer” has and formulate a solution and then break it down into solvable steps. AI could help with that but fundamentally it’s a question of describing and the limits of language.

    Or maybe we’ll see brain interfaces that can capture some of the subtleties of intent from the programmer.

    So maybe we’ll see the productivity of programmers rise by like 500% or something. But something tells me (Jevons paradox) the economy would just use that increased productivity for more apps or more features. Though maybe the qualifications needed to be a programmer will be reduced.

    Or maybe we’ll see AI generating programming libraries and development suites that are more generalized. Or existing crusty libraries rewritten to be more versatile and easier for AI-powered programmers to use. Maybe AI could help us create a vast library of more abstract, standard problems and solutions.