OpenAI vs Open Source
In November 2022, I started stumbling upon articles, TikTok videos, and reels about some kind of AI chat. AI products had been hyped for a long time, and I was always skeptical about them and usually skipped them. But this one was generating so much buzz that I decided to check it out. After using it for a while, I was shocked by its capabilities: it answered in a human-like way, and the answers were quite reasonable. ChatGPT even helped me write a privacy policy and terms and conditions for the app.

Since then, the world has changed significantly. Let me give you an example. In November, when I showed my colleagues the privacy policy, terms and conditions, and app description written by ChatGPT, they would not believe that something non-human had written them. They thought I had done it, and it was almost impossible to convince them that AI had. A month later, they could not believe that I had written something myself rather than through ChatGPT. This time, it was impossible to convince them that I wrote it.
The question I asked myself was: ‘Would a company such as OpenAI have too much control and influence with such technology, and should it be regulated by antitrust authorities?’
A lot has changed between November and the present day. OpenAI has already released a new model, GPT-4, which is exclusively available to Plus subscribers. Many companies quickly adapted to the trend and rushed to build on OpenAI's API: Notion now has Notion AI, Intercom added the AI chatbot Fin, and Microsoft announced Bing with AI and a whole family of AI-powered features across Office 365. I could keep listing companies for hours, but the most disturbing thing is that all of them rely on the OpenAI API. There is basically no alternative: as of today, it is the only decent solution, and that is a little scary.
It also makes you think when you consider that OpenAI, founded as a non-profit, announced its transition to a for-profit structure: maybe the temptation of money and power prevailed over the mission to make "Open" AI for everyone.
The former AI leader, Google, has been desperately trying to launch a competitor called Bard, because ChatGPT is essentially an alternative to Google Search. Things have not gone very well for them, which again makes you think about the gap between OpenAI and everyone else.
So does this mean we are doomed to depend on one big player and spend the coming years under a monopoly? The good news is that, apparently, we are not.
In February, Meta (Facebook) announced its very own AI model, LLaMA, and, again, the good news for us ordinary people is that it leaked in March. Why is that good news, you might ask? I think Google's leaked memo explains it very well. I strongly recommend reading the memo yourself, but in short: within three weeks of the leak, the open-source community pushed the model's quality from roughly 68% to 92% of ChatGPT's, by the memo's own measure. In only three weeks!
Here is a quote from the memo to give you an understanding of the power of the open-source community:
"Open-source models are faster, more customizable, more private, and pound-for-pound more capable. They are doing things with $100 and 13B params that we struggle with at $10M and 540B. And they are doing so in weeks, not months."
To give you a better understanding of how fast things change, I will give you a timeline from the Google memo:
Feb 24, 2023 — LLaMA is Launched
Meta launches LLaMA, open sourcing the code, but not the weights.
March 3, 2023 — The Inevitable Happens
Within a week, LLaMA is leaked to the public. The impact on the community cannot be overstated.
March 12, 2023 — Language models on a Toaster
A little over a week later, Artem Andreenko gets the model working on a Raspberry Pi.
March 13, 2023 — Fine Tuning on a Laptop
The next day, Stanford releases Alpaca, which adds instruction tuning to LLaMA.
March 18, 2023 — Now It’s Fast
Georgi Gerganov uses 4 bit quantization to run LLaMA on a MacBook CPU. It is the first “no GPU” solution that is fast enough to be practical.
I will stop here; you can check out the whole timeline in the leaked memo. But why stop at this point? Because on March 18, ordinary people could run that AI model on their Macs without an internet connection. I think that is enough to show how much faster and more capable the open-source community is compared to the IT giants.
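The "no GPU" trick in the March 18 entry is weight quantization: storing each weight in 4 bits instead of 16 or 32, trading a little accuracy for a model several times smaller that fits in ordinary RAM and runs on a CPU. Here is a minimal, illustrative sketch of symmetric 4-bit block quantization in Python; the block size, rounding, and toy weights are my own assumptions for illustration, not llama.cpp's exact format.

```python
import math

def quantize_4bit(weights, block_size=32):
    """Quantize a flat list of floats to 4-bit ints (-8..7),
    with one shared float scale per block of weights."""
    quants, scales = [], []
    for i in range(0, len(weights), block_size):
        block = weights[i:i + block_size]
        # Scale so the largest magnitude in the block maps to the int4 limit.
        amax = max(abs(w) for w in block) or 1.0
        scale = amax / 7.0
        scales.append(scale)
        quants.extend(max(-8, min(7, round(w / scale))) for w in block)
    return quants, scales

def dequantize_4bit(quants, scales, block_size=32):
    """Recover approximate floats from the 4-bit ints and per-block scales."""
    return [q * scales[i // block_size] for i, q in enumerate(quants)]

# Toy stand-in for a slice of model weights.
weights = [0.1 * math.sin(k) for k in range(64)]
q, s = quantize_4bit(weights)
restored = dequantize_4bit(q, s)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print(f"worst-case reconstruction error: {max_err:.4f}")
```

Real implementations such as llama.cpp's quantized formats additionally pack two 4-bit values per byte and use more careful block layouts, but the principle is the same: a 13B-parameter model shrinks from roughly 26 GB of 16-bit weights to under 8 GB, small enough for a laptop's RAM.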
As the Google memo argues, open source is going to outcompete OpenAI and the other players; it is only a matter of time. So what does this mean for us? It means there is room for a competitive market, which is always good for customers. It also means that AI will be easily and relatively cheaply accessible, so anyone will be able to build it into their business without being forced to use a specific API.