Cake day: September 5th, 2023

  • Full disclosure - my background is in operations (think IT), not AI research, so some of this might be wrong.

    What’s marketed as AI is actually something called a large language model (LLM). The distinction is important because “AI” implies intelligence, whereas an LLM is something else. At a high level, LLMs use “tokens” to break natural language into elements a machine can work with, and then recombine those tokens to “create” something new. When an LLM is generating output it does not know what it is saying - it only knows which token statistically comes after the token(s) it has already generated.

    So to answer your question: an “AI” can hallucinate because it does not actually know the answer - it’s using advanced math to know that the period goes at the end of the sentence, and not in the middle.
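    The “which token statistically comes next” idea can be sketched with a toy bigram model. This is a massive simplification - real LLMs use neural networks over huge vocabularies, not simple counts, and the tiny corpus here is made up - but the core loop is the same: look at what came before, pick a statistically likely follower, repeat. The model never knows what the words mean.

    ```python
    from collections import defaultdict, Counter

    # Made-up toy corpus; real models train on vastly more text.
    corpus = "the cat sat on the mat . the dog sat on the rug .".split()

    # Count which token follows each token (a bigram table).
    counts = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        counts[prev][nxt] += 1

    def next_token(prev):
        # Return the statistically most common follower of `prev`.
        # The model has no idea what the words mean - only which
        # one tends to come next in the training data.
        return counts[prev].most_common(1)[0][0]

    # Generate output by repeatedly predicting the next token.
    tok, out = "the", ["the"]
    for _ in range(6):
        tok = next_token(tok)
        out.append(tok)
    print(" ".join(out))
    ```

    Run it and the “sentence” it produces is grammatical-looking nonsense stitched from statistics - which is also, roughly, why hallucinations happen: plausible-next-token is not the same thing as true.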

  • Story time! There is a series by Tad Williams called “Otherland” - it’s a riff on the standard stuck-in-VR story.

    Anywho. There is a group of hackers, weirdos, and nerds who did not like the corporate VR experience and built their own (TreeHouse). In all honesty it’s like an expansion of the Tor project.

    But it’s what I hope for: a place on the web that’s not saturated to hell and back by corporate interests, where you need to know someone for the ladder to be let down and for you to be let in.