In an earlier post (2021), I argued that much of the “powered by AI / ML” labelling and marketing out there was bogus and disingenuous, and that AI / ML technologies were getting commoditised to the point of being as simple as a pip install, where most organisations would not need to do any serious R&D to be able to use these technologies, certainly not enough to warrant the claim “Powered by AI / ML”. Excerpt from the post:
“Assuming good levels of competence, one is not missing out on anything, and the day one finds a legitimate, objectively quantifiable usecase for an “AI / ML” model, it is most likely going to be a simple pip install away.”
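As a concrete sketch of what “a pip install away” means in practice (using Hugging Face’s transformers library purely as an illustrative pick; the library, the sentiment-analysis task, and the model defaults here are my choices, not anything from the original post):

# Install the library and a backend, then run a pre-trained sentiment
# model with its defaults (the model is downloaded on first run).
pip install transformers torch
python -c "from transformers import pipeline; print(pipeline('sentiment-analysis')('This is great!'))"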
Turns out, one does not even need to do a pip install anymore. The most sophisticated AI / ML technologies can now be availed via an even simpler HTTP / curl request, and for non-techies, via simple, familiar chat interfaces. It is now possible to integrate such technologies into existing products in minutes.
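For instance, here is roughly what such a request looks like against OpenAI’s chat completions API (the model name and prompt are placeholders, and other hosted LLM APIs follow much the same pattern):

# A single HTTP call to a hosted LLM; assumes an API key is set
# in the OPENAI_API_KEY environment variable.
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Summarise this support ticket in one sentence: ..."}]
  }'

That is the entire integration surface: one endpoint and a JSON payload.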
Forget needing to know anything about AI / ML or neural networks or weights or the latent space; one does not even need to know the basics of programming to use sophisticated AI technologies. One just needs to be able to converse in natural language. That is, prompt. Write code, stories, screenplays, synthesize any kind of text in any language, get instant answers to questions, dissect logical conundrums (of course, today, at the peril of the answers being wrong), semantically digest large amounts of unstructured information, generate high quality imagery …

The breakneck speed of breakthroughs in the space, initially exciting and now increasingly worrying, has been stunning. Just in the LLM (Large Language Model) space, off the top of my head, there is GPT-3/4, Chinchilla, PaLM-1/2, LLaMA, BLOOM, Alpaca, Vicuna, AutoGPT, llama.cpp, LangChain, and numerous other models, their derivatives, tools, and hacks coming out practically by the week. In no time, people have figured out how to run large models on everything from phones to Raspberry Pis.[1] Then there is the entire other genre of image and voice models that are also exploding. Multi-modal models will soon be a thing. One just has to look at the rate of model-related stories posted on HackerNews[2] to get a sense of the hype and the dizzying pace. Forget the cat, the elephant in the room has broken loose and has started to run wild.
In the past several months, I have come across people who do programming, legal work, business, accountancy and finance, fashion design, architecture, graphic design, research, teaching, cooking, travel planning, event management etc., all of whom have started using the same tool, ChatGPT, to solve use cases specific to their domains and problems specific to their personal workflows. This is unlike everyone using the same messaging tool or the same document editor. This is one tool, a single class of technology (the LLM), that has achieved widespread adoption across demographics thanks to its multi-dimensionality; people with no technical training are discovering how to solve a multitude of problems with it, in the one way that is most natural to humans: via language and conversations.
That is both fascinating and terrifying. I have been actively writing software, tinkering, and participating in technology/internet stuff for about 22 years. I cannot recall the last time a single tool gained such widespread acceptance so swiftly, for so many use cases, across entire demographics.[3] Until the recent breakthroughs, that is.
Skepticism
I have been an ardent skeptic of technology hype cycles, someone who has always mentally blocked out phrases like “paradigm shift”. The level of frenzy and hype surrounding the recent AI breakthroughs is unprecedented, but understandable. To evaluate some of these breakthroughs objectively, I have experimented with the tools, read, reflected, and questioned myself over and over to make sure that I have not accidentally drunk some sort of LLM Kool-Aid. More than anything, my increasing personal dependency on these tools for legitimate problem solving convinces me that there is significant substance beneath the hype. And