May 13, 2024
11:36:40am
franklyvulgar All-American
why is it scary? It's still curve/manifold fitting, just adding other
modalities (images, sound) besides text in one model, and improving the speed. They're still trying to get everyone to pay to use their GPT when you can download and run your own LLM locally (e.g. with ollama).

I wouldn't worry about GPTs directly leading to AGI. The program is just predicting the next word token from a probability distribution based on the context, which improves as they build bigger networks. No matter what you ask ChatGPT, it spends the same amount of compute per token regardless of how 'hard' your question is; compare that to humans, where some questions are genuinely harder and require more thought and energy to process. In that sense it's much like a lookup table. Not to say these models can't be very useful and transformative; they're going to be ubiquitous.
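The "predicting the next token from a probability distribution" part can be sketched in a few lines. This is a toy illustration, not a real model: the vocabulary and logits are made up here, whereas an actual LLM produces logits over ~100k tokens from a transformer forward pass. The key point stands, though: turning scores into probabilities and sampling costs the same no matter what the prompt was.

```python
import math
import random

def softmax(logits):
    # Turn raw scores (logits) into a probability distribution.
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical vocabulary and logits for a context like "the cat sat on the".
vocab = ["mat", "dog", "moon", "roof"]
logits = [3.2, 0.5, -1.0, 1.1]

probs = softmax(logits)
random.seed(0)  # fixed seed so the sample is reproducible
next_token = random.choices(vocab, weights=probs, k=1)[0]
print(dict(zip(vocab, [round(p, 3) for p in probs])), "->", next_token)
```

A real model just repeats this loop, appending the sampled token to the context and running the network again, one token at a time.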

The new models in research that involve energy-based methods, chain-of-thought, and world models are trying to incorporate more of a self-supervised approach; we'll see how far they get.

Here is a good interview on LLMs and their limitations:
https://www.youtube.com/watch?v=5t1vTLU7s40&t=910s