The age of AI-based Copilots has arrived — but what does this mean for the future?
With thousands of tech companies rushing to adopt or integrate AI, we’re finally seeing AI-enabled products such as “Copilots” entering the fray.
2023 marked the year big tech dove into the power of LLM-based digital assistants designed to expedite user productivity. But how did the age of AI-enabled products like Copilots arrive, and where do we see them going in the future?
The Rise of AI-based Copilots
All of this is mostly enabled by large transformer models, which, while having been around for a while, never skyrocketed into commercial popularity until OpenAI introduced GPT-3 back in June 2020. It was around that same time that GitHub began development of GitHub Copilot (which took three years until general availability!). But why didn’t they start building back in 2018, when OpenAI first released GPT-1?
With every transformer model OpenAI (and other research teams) has released since 2018, compute costs have dropped drastically while performance (across zero-, one-, and few-shot learning) has improved. AI-based products were perhaps far too expensive to run a few years back, due to both limited GPU availability and the absurd amount of processing power needed for a single inference (one prompt and its response).
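To make "shots of learning" concrete: a zero-shot prompt asks the model to do a task cold, while a few-shot prompt packs a handful of worked examples into the context window. Here's a minimal sketch in Python; the client setup and model name are illustrative assumptions, not a claim about what any particular Copilot runs on.

```python
# Minimal sketch: zero-shot vs. few-shot prompting.
# Assumes the openai Python SDK (>= 1.0) and an API key in the
# OPENAI_API_KEY environment variable; the model name is illustrative.
from openai import OpenAI

client = OpenAI()

zero_shot = "Classify the sentiment of this review: 'The update broke my workflow.'"

few_shot = """Classify the sentiment of each review.

Review: Setup took five minutes and everything just worked.
Sentiment: positive

Review: Support never answered my ticket.
Sentiment: negative

Review: The update broke my workflow.
Sentiment:"""

for prompt in (zero_shot, few_shot):
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content)
```

Note that every example you pack into the prompt consumes more tokens per request, which is exactly the compute-cost tradeoff described above.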
But fast forward to early 2024, and all of that has changed. "Copilot" has become a dominant product name amid the commercial AI surge, and Copilots will soon become even more mainstream than regular chatbots.
Future Predictions
1. AI-based Copilots will become a saturated market, and a “Copilot” will emerge for most industrial and personal use cases.
If there’s a market for it, there will be a Copilot. And that’s hardly a joke.
Microsoft has already embraced this idea with its M365 Copilot, which serves all subscribed users across the M365 suite of products, including Word, Excel, PowerPoint, and even Power BI. Moreover, they’re making Copilot available for Microsoft Fabric — an integrated connector tool that brings data services from third-party solutions into one single platform for valuable insights. Just imagine the possibilities.
But obviously, many companies across the tech landscape, including the B2B enterprise space, are making a splash with “Copilot-esque” AI products. Planview — the company I work for — has already announced its own AI-enabled Copilot, which will support Planview’s suite of project management, portfolio management, and value stream management products.
The use cases seem limitless. Whether it’s code development (like GitHub Copilot), productivity (like M365 Copilot), value stream management, project management, analytical dashboards, backlog tracking, or other use cases, there will probably be a Copilot.
My personal predictions for where Copilots will be built include:
Game development scenarios (like Unity/Unreal or even art programs)
Advertisement and copywriting scenarios
Music production
Digital art creation
Recipe and food apps
Browsers and tab management
2. The GPU market will become a battleground, making the costs of Copilots incredibly volatile.
While I do think AI will continue to evolve, you can’t forget the importance of the GPUs and other hardware needed to train and support this evolution.
As of 2023, it’s pretty clear who’s winning the AI battle: Nvidia. Based on this article by Vahid Karaahmetovic, Nvidia not only controls over 80% of the GPU market but also holds over 72% of the data center market share. Among its major competitors, which include Intel and AMD, Nvidia has branded itself as the more high-end GPU producer, with machines that can support some of the world’s most powerful computational needs. Training generative AI models, of course, falls within this category.
Nvidia’s A100 GPUs are key to this market, albeit obviously not the only solution. And while holding the market majority has played in their favor, it’s not a monopoly. One can only imagine that competition will rise as generative AI continues its growth and adoption across industries. With news that AMD plans to launch its MI300 GPU — a machine anticipated to rival even Nvidia’s most high-end products — I can firmly predict that the market will turn more competitive in the coming years. I see AMD and even Intel making big bets in an all-out competition over the machines that support this age of generative AI, and with that comes a war over both performance and pricing.
Copilots, due to their generative AI nature, are directly dependent on this GPU market. If competition continues to grow, then LLMs could become cheaper and more efficient to train, thus making Copilots cheaper as well.
3. AI regulation will continue, causing a battle between pro-AI investors and pro-regulation advocates.
With many governments, including the EU, already enacting “AI acts” that aim to regulate the use and activity of AI, it’s not hard to state the obvious. AI and its various applications have the potential to change not only the landscape of current and future jobs, but also mankind as a whole in terms of lifestyle, communication, and politics. Even right now, we’re seeing many creative or customer-facing roles become “at-risk” thanks to the power of LLMs; these include copywriters, customer specialists (think: those at the other end of an online chat conversation), and online marketers. The EU AI Act is worth reading; it describes a comprehensive view of the various negative risks and downstream impacts that AI could have on humanity. Of course, there’s also Biden’s AI Executive Order, which highlights similar concerns.
Because of this potential impact, I believe we’ll see a battle between venture capitalists and investors who think like Marc Andreessen (that “AI will save the world”) and staunch defenders of heavy regulation. Regulators could push for transparency requirements, protections for people’s safety and fundamental rights, and the prevention of extensive AI-driven job loss. Even radical regulators or protesters may gain popularity thanks to the expansive growth of AI; these groups would view AI and its applications with immense disapproval, believing they would lead to the end of humanity.
Now of course, this applies to Copilots as well — these products rely on the raw speed and sophistication of large transformer-based LLMs. We’ll see in the coming decade how heavily Copilots are regulated — there could be a big push from regulators to ensure Copilots don’t breach privacy, security, or data transparency requirements, or threaten job stability.
4. A whole market for self-built Copilots will emerge.
Well, to be honest, it already has.
Microsoft has already introduced the notion of allowing any developer or tech enthusiast to build their own copilot using its cloud services.
Azure AI Studio formally introduced this concept at Microsoft Ignite in 2023: build your own copilot with OpenAI LLMs and Azure infrastructure. Now of course, those who understand LLMs a bit better will know that costs based on token usage will be huge, especially for larger context windows. But thankfully, OpenAI doesn’t have a monopoly on the world’s best LLMs; a pricing war between OpenAI, Anthropic, and Meta’s LLaMA could make for faster improvements in price efficiency and performance across all shots of learning. Perhaps X’s Grok model — one that introduces more humor into its responses — could even do some damage in the future. Anyhow, a pricing war between LLM providers will undoubtedly make Copilots continuously cheaper to build and thus sustain for long-term profitability.
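To see why token usage dominates the cost of a self-built copilot, here's a back-of-the-envelope sketch. The per-token prices below are placeholder assumptions for illustration only, not any provider's actual rates.

```python
# Back-of-the-envelope cost of a self-built copilot.
# Prices are PLACEHOLDER assumptions (USD per 1K tokens), not any
# provider's actual rates -- always check current pricing pages.
INPUT_PRICE_PER_1K = 0.01   # cost per 1K prompt tokens (assumed)
OUTPUT_PRICE_PER_1K = 0.03  # cost per 1K completion tokens (assumed)

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Cost of one prompt/response round trip at the assumed rates."""
    return (prompt_tokens / 1000) * INPUT_PRICE_PER_1K + (
        completion_tokens / 1000
    ) * OUTPUT_PRICE_PER_1K

# A copilot that stuffs retrieved docs plus chat history into a large
# context window pays for every one of those input tokens:
per_request = estimate_cost(prompt_tokens=8_000, completion_tokens=500)
print(f"Per request: ${per_request:.3f}")                        # $0.095
print(f"At 1M requests/month: ${per_request * 1_000_000:,.0f}")  # $95,000
```

At these assumed rates, the large prompt (not the short response) drives almost all of the cost, which is why both bigger context windows and provider pricing wars matter so much for anyone building their own copilot.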
About Me
My name is Kasey, AKA J.X. Fu (pen name). I’m passionate about writing, and thus I’ve found myself deep in the abyss on weeknights creating novels. I do this while working a full-time tech PM job during the day.
Follow me on Substack for more writing, product, gaming, productivity, and job-hunting tips! Check out my website and my Linktree, and add me on LinkedIn or Twitter, telling me you saw my articles!