⚡[Insights #14] OpenAI Is Going All in on Chips 🃏

🤑 AI Is A Money Loser (For Now)

GM Readers! ☀️

Welcome to the 14th issue of Evolving Internet Insights.

Quote we’ve been thinking about this week: “Using [generative AI] to summarize an email is like getting a Lamborghini to deliver a pizza.” 🍕 (WSJ Article)

Thanks for reading!

Liang and Dan 🙌

🌰 In a Nutshell

  • OpenAI wants to build its own chips

  • Big Tech is not (yet) making money on AI

  • Adobe expands its Gen AI product suite with Stardust (an AI editing tool)

  • ChatGPT significantly lags major consumer apps in terms of monthly visits

All In on [AI] Chips, Image Made with AI

💾 Byte-sized Stories

This week’s top stories with our insights on top.

1. OpenAI Is Going All in on Chips 🃏

⚡️ TL;DR: OpenAI wants to develop its own graphics processing units (GPUs), aka computer chips. While the company does not design or manufacture chips today, Sam Altman, the CEO of OpenAI, is reportedly open to acquiring a chipmaker to bring design and manufacturing in-house.

⚡️ So What: The generative (gen) AI boom and the resulting demand for GPUs have strained supply chains. Nvidia’s best-performing GPUs are sold out until 2024. GPUs are the critical infrastructure that powers AI computation, so acquiring a chipmaker would allow OpenAI, one of Nvidia’s biggest customers, to diversify its business model and reduce its dependency on external chip makers. Fun Fact: Nvidia has 80% market share in the AI chips market.

⚡️ Zoom Out: There are a few other reasons why OpenAI may want to bring chipmaking in-house: optimizing costs and creating bespoke circuitry for its hardware plans (e.g. OpenAI’s rumored AI-first phone, which we covered in 🧠 Brain Food #8 and ⚡ Insights 13).

  • Costs: A Bernstein analyst noted that if ChatGPT queries “grew to a tenth the scale of Google Search, it would require roughly $48.1 billion worth of GPUs initially and about $16 billion worth of chips a year to keep operational.” OpenAI may be able to drive costs down by designing its own chips and fine-tuning them for its specific products and use cases.

  • Hardware Synergy: With ambitions to build its own AI-first mobile phone, OpenAI would gain a competitive advantage from having AI chips designed in-house and customized for the hardware. Remember when Apple “cut the cord” with Intel, its longtime computer processor supplier, in favor of its own in-house chips? 👀

Read More Here, Here

2. Big Tech Is Not (Yet) Making Money on AI 💸

⚡️ TL;DR: Big Tech companies like Microsoft and Google do not yet make money off of their generative AI products. This WSJ article reported that Microsoft loses $20 per user per month (and sometimes as much as $80 per user per month) on its GitHub Copilot tool. Since building gen AI models and apps requires heavy upfront investment and costly ongoing operations, Big Tech companies want to raise prices on their AI products. 😭

⚡️ So What: Even while losing money, Big Tech companies can absorb the losses. These “loss leaders” can be seen as strategic investments to build market share, with the hope that in the long term they can either drive down costs or raise prices. One could compare Big Tech’s AI business units to a startup like OpenAI, which is willing to raise a bunch of money to compete for and capture market share. The difference is that Big Tech’s AI bets are funded by their core businesses (which are profitable).

⚡️ Zoom Out: The WSJ article also characterized the product-market fit problem well: “using [gen AI] to summarize an email is like getting a Lamborghini to deliver a pizza.” When you need to write an email, do you really need the next-generation model? Said another way, do you need the cutting-edge and more GPU-demanding GPT-4, or would the slightly less refined GPT-3.5 do the job?

Our take is that in the future, the orchestration layer of AI (read: the integration of various AI models into a company's products) will play an even greater role in optimizing which model to use for a given business case or product. 

Read More Here, Here

P.S. Our next Brain Food issue will cover AI usage and how AI might be stuck in the skeuomorphic phase.

Subscribe to our newsletter to get it straight to your inbox 💌

3. Adobe Expands Its Gen AI Product Suite With Stardust  

⚡️ TL;DR: Adobe is launching a gen AI-powered “editing tool” in its product suite. In a tour de force demo of its capabilities, Adobe’s teaser video highlights how a user can easily swap out the clothes in an image.

⚡️ So What: In the creative process, editing is as important as creation but takes much longer. This post-production phase is where Adobe truly stands out in the minds of consumers. For most creative use cases, effective editing tools are huge time savers. With this new gen AI product suite, Adobe is doing its best to “level up” its creator tools segment.

⚡️ Zoom Out: Adobe’s big competitive advantage is consumer mindshare and the number of users it already has. It has become the go-to tool suite in creators’ workflows. By betting on gen AI creator tools, Adobe gives its existing users a reason to stay and potential users a reason to adopt.

Read More Here, Here

📊 Let’s Get Graphic

One visual we couldn’t stop thinking about.

Number of Monthly Visits for Popular Consumer Apps

⚡️ Takeaway: While ChatGPT broke many records, like being the fastest app to reach the 100M user mark, usage of the app still pales in comparison to traditional consumer applications like WhatsApp and YouTube.

The above chart highlights that ChatGPT is comparable to Reddit, LinkedIn, and Twitch in terms of monthly visits.

Basically, it’s still the early days 🌅

🐇 Down the Rabbit Hole

Some deeper dives to help you get smarter on emerging tech.

  1. 40 Companies Beating the West: a curated list of the top 40 companies built outside of the West.

  2. Future of AI Survey: 10 AI investors break down their thinking on the Future of AI.

  3. What does the AI revolution mean for our future: a debate on the future of AI between Mustafa Suleyman (Co-founder of DeepMind) & Yuval Noah Harari (Author of Sapiens).

Subscribe for free to see some elite jobs in our “Elite Jobs Corner” 👇

🗳 We’d Love Your Feedback

What did you think of today's issue?


🙏 Shameless Asks

It takes us days to put this together, sharing it takes you 19.45 seconds 😘

An easy way to support us and the newsletter is through the referral program.

⚡️ Share Evolving Internet Insights with a friend and ask them to subscribe

⚡️ Share on Twitter and Linkedin with a short note

⚡️ Share on your company Slack/Teams channels and communities

DISCLAIMER: This post is provided strictly for educational and informational purposes only. Nothing written in this post should be taken as financial advice or advice of any kind. The contents of this post are the opinions of the authors and not representative of other parties. Empower yourself, DYOR (do your own research).
