With the way AI is improving by the week, it just might be a reality.

[–] TheBananaKing@lemmy.world 19 points 1 year ago* (last edited 1 year ago) (1 children)

An AGI with an actual personality? Cool!

A blow-up doll made of a glorified Markov chain? Yeahno.

[–] yuunikki@lemmy.dbzer0.com 2 points 1 year ago (4 children)

What's a Markov chain?

[–] TheBananaKing@lemmy.world 7 points 1 year ago

Take a whole bunch of text.

For each word that appears, note down a list of all the words that ever directly follow it - including end-of-sentence.

Now pick a starting word, pick a following-word at random from the list, rinse and repeat.

You can make it fancier if you want by noting how many times each word follows its predecessor in the sample text, and weighting the random choice accordingly.

Either way, the process that produces this string of almost-language is called a Markov chain.

It's a bit like constantly picking the middle button in your phone's autocomplete.

It's a fun little exercise to knock together in your programming language of choice.
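
Here's a minimal sketch of that exercise in Python (the function names and toy corpus below are just illustrative, and it only splits sentences on periods):

```python
import random
from collections import defaultdict

END = None  # sentinel marking end-of-sentence


def train(text):
    """For each word, list every word that ever directly follows it."""
    followers = defaultdict(list)
    for sentence in text.split("."):
        words = sentence.split()
        for current, nxt in zip(words, words[1:]):
            followers[current].append(nxt)
        if words:
            followers[words[-1]].append(END)  # note end-of-sentence too
    return followers


def generate(followers, start):
    """Pick a starting word, pick a follower at random, rinse and repeat."""
    word, output = start, [start]
    while word in followers:
        word = random.choice(followers[word])
        if word is END:
            break
        output.append(word)
    return " ".join(output)


chain = train("the cat sat on the mat. the dog sat on the cat. the cat ran.")
print(generate(chain, "the"))
```

Note that because every occurrence gets appended to the list, picking uniformly at random already weights choices by frequency, so the fancier weighted version described above falls out for free.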

If you make a prompt-and-response bot out of it, learning from each input, it's like talking to an oracular teddy bear. You almost can't help being nice to it as you teach it to speak; humans will pack-bond with anything.

LLMs are the distant and very fancy descendants of these - but pack-bonding into an actual romantic relationship with one would be as sad as marrying a doll.

[–] ivanafterall@kbin.social 4 points 1 year ago (1 children)

I believe a Markov chain is an old, old wooden ship.

[–] ZILtoid1991@kbin.social 5 points 1 year ago (1 children)

If I replace all of its code line by line, will it be the same ship? If not, at what point does it become a different ship?

[–] xmunk@sh.itjust.works 1 points 1 year ago

Trick question! Nothing is permanent, and the person you were a moment ago is completely different from the person you are now.

Using this one simple trick I made millions on the stock market... I just held an apple in my hand for five minutes and then sold all the billions of different apple moments on the commodity market. Imagine how rich Theseus could've been with that one simple trick! (Smash that like button and hit subscribe!)

[–] Dirk@lemmy.ml 4 points 1 year ago

A chain of pseudorandom results.

[–] MxM111@kbin.social 2 points 1 year ago

It is a memoryless random process.
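
For what it's worth, that memorylessness is the defining Markov property. In standard textbook notation (generic symbols, not from this thread), the next state depends only on the current one and never on the earlier history:

$$P(X_{n+1} = x \mid X_n = x_n, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)$$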