The genie is out of the bottle
There will be no AI winter. Unless you mean total, utter civilizational disarray and societal turbulence due to seismic shifts and transfers of skills between AI and humans with economy-crippling asymmetries. Then, yes - AI winter is coming.
ChatGPT is OpenAI's test run of a sophisticated chatbot, which can do anything from holding a proper common-sense conversation and detecting whether you are asking a bullshit question to answering questions in any discipline at university level or writing code for you. I have been playing with it: conversing with it, running some adversarial attacks, and letting it spit out chunks of the code I write, and it produced results ranging from very sensible to pretty excellent.
This is like Google on steroids. Search with a brain. A virtual exocortex finally starting to be worthy of that moniker. It is not AGI yet, but it is pseudo-AGI. And 2023 will be rife with pseudo-AGIs. We have entered the AGI precursor era.
The great AI disruption is coming, and it's coming even a tad faster than I anticipated mere months ago. That is why most of my upcoming blog posts here will cover the issues we will face as a civilization in the coming decade.
And I say this while also stressing the following: large language models are like the shadows in Plato's cave with respect to meaning and understanding. The sheer amount of data and the precursor to salience in transformers (attention is all you need) gets you very far, which is why we'll see pseudo-AGIs popping up like shrooms in a forest after a rainy day. However, ultimately, understanding is an AI-complete problem, like vision: solving them entails solving AGI. For symbol grounding you need to ground symbols in something, and it's not endless vectors. You can't transformer or Markov chain your way into understanding.
Language and meaning cannot be "reverse-engineered" to get symbol grounding. But yes, understanding is certainly possible. NGI (natural general intelligence) is possible, so AGI is as well. But understanding piggybacks on multi-modality and on interfacing with itself and the world in various ways, so it requires specific hardware and I/O systems, a deep topic for another time. The point is that language is not meaning. It's a symptom of meaning.
However, because we can simulate and mimic plenty of features of intelligence and understanding, these latest developments will fuel a lot of precursors to AGI in the coming months. Why "months" and not "years"? I expect paradigm shifts to start hitting us more frequently in the coming years, so I truly think that a year from now we'd be shocked by the transformation and disruption the world has undergone, and we would massively update our views on what's next. So let's stick to months for now.
Thus, the conclusion for now is that we're already facing many of the aforementioned challenges in the coming year. And we could not be less ready. AI will start eating everything: education, search, art, programming. Let's face it, a lot of human activity and labor amounts to busywork. I stress again: this is not just hype and hot air. This is the beginning of the biggest self-induced disruptive event any civilization faces once it reaches this technological point of no return. And here we are.
The genie...is out of the bottle.
Mental Contractions is a reader-supported publication. To receive new posts and support my work, consider becoming a free or paid subscriber.