
ChatGPT: 30 Year History | How AI Learned to Talk



This video explores the journey of AI language models, from their modest beginnings through the development of OpenAI’s GPT …

37 thoughts on “ChatGPT: 30 Year History | How AI Learned to Talk”

  1. STAY TUNED: Next video will be on "History of RL | How AI Learned to Feel"
    SUBSCRIBE: https://www.youtube.com/@ArtOfTheProblem?sub_confirmation=1
    WATCH AI series: http://www.youtube.com/playlist?list=PLbg3ZX2pWlgKV8K6bFJr5dhM7oOClExUJ
    Timestamps:
    00:32 Hofstadter's thoughts on ChatGPT
    01:00 recap of supervised learning
    01:55 first paper on sequential learning
    02:55 first use of state units (RNN)
    04:33 first observation of word boundary detection
    05:30 first observation of word clustering
    10:10 sentiment neuron (Ilya & Hinton)
    12:30 transformer explanation
    15:50 GPT-1
    17:00 GPT-2
    17:55 GPT-3
    18:20 In-context learning
    19:40 ChatGPT
    21:10 tool use
    23:25 philosophical question: what is thought?

  2. And then you have work like Google's geometry Olympiad solver, which I would say blows everything out of the water. The only cap on AI's growth now is its context window, since it still doesn't technically have memory.

  3. Just imagine what 10x more compute can do. The models won't predict a single token but an entire coherent paragraph in the future, an idea, completely eradicating the downside of the last token having had a different idea than the first. It will be like us, thinking totally. Eventually LLMs will feed thoughts recurrently into themselves thousands of times and refine them. All this will happen in <10 years.

  4. Just wanted to say I love you guys. I’ve been watching you guys for almost a decade, when you guys were the only computer science channel on YouTube. You guys deserve WAY more attention than you’re getting

  5. Dude, duck your music when you are using a source with embedded music. Hofstadter at 1min has music on his video and it's hurting our brains. AI would never make that mistake. Great vid tho

  6. Wait, so they still haven't solved the memory problem? ChatGPT's memory is still only as large as its context window. Isn't there a way to give it permanent long-term memory like we do?

Comments are closed.