Microsoft "gets" AI

Let me start by saying that I can’t believe I’m writing a post about Microsoft “getting” something. I’ve disliked this company for decades because I’ve seen it as a dumbed-down follower of others’ ideas and better technology, even though it’s hard to argue with its market success (except in the smartphone era, where Ballmer’s laughing at the iPhone was the seminal sell-MSFT moment).

But I have to admit, Microsoft is doing pretty well in the new AI world. It started early this year, when Microsoft first announced an investment in OpenAI and then an integration of ChatGPT into its Bing search engine. Google’s response was an AI demo that gave a wrong answer, and Alphabet stock dropped 7% that day.

That’s not all. Microsoft is integrating AI into a number of its other software products. I’ve previously written here about GitHub’s Copilot product, which acts as the reviewer in a pair-programming arrangement, catching errors and suggesting optimizations for a human programmer and making them dramatically more efficient. Now Microsoft has co-opted the “Copilot” name for more than four dozen new products.

You can find many articles on the release, but I found this one, which includes the author’s own experience with the products, informative.

We have built our own Josh Bersin Copilot (come see the preview at Irresistible 2023) and in only a few weeks we now have full conversational access to more than 20 years of research, more than 600 blogs and articles, our 100+ podcasts, and soon the Josh Bersin Academy. You can literally ask it any question and 90% of the time it delivers a perfect, easy to understand answer – right out of our large corpus of research, case studies, vendor analysis, and data. It’s unbelievable to tell you the truth.


Microsoft is very focused on this solution now, and it’s going to transform how we do HR, implement HR technology, and develop IT systems in general. I can speak directly from our experience: our use of a Copilot is totally transformational to our users.

Read the article. I find myself wanting to quote almost every paragraph, but obviously that’s not the best use of a post. So, let me touch upon one aspect that I think isn’t being considered by many:

not only can we use these tools to better understand and use all the Microsoft tools, but we can now build applications for HR self-service, onboarding, training, leadership development, recruiting, and much more with the out-of-the box Copilot development stack. The actual LLM (GPT4) is simply available as a platform service, and Microsoft protects your security and performance management behind the scenes.

What’s happening here is that one use of these Copilots is to upload your own data to them so that the LLM can operate on it. (As an aside, I had this discussion with a Disney VP of Imagineering recently and pointed out to her that Disney should be looking at generative AI to produce copyrighted images based on Disney’s own copyrighted image/video portfolio.) I believe that LLMs that have the whole internet as their database are just one AI application, and that there is probably a market for AI that operates on a company’s own proprietary database. Microsoft apparently agrees and is providing such tools.
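The idea of an LLM operating on a company’s own proprietary data is usually implemented as retrieval-augmented generation: index the private corpus, retrieve the passages most relevant to a question, and paste them into the model’s prompt as grounding context. Below is a minimal sketch of that flow. The bag-of-words scorer stands in for a real embedding model, the final LLM call is omitted, and every name and document here is illustrative; this is not Microsoft’s actual stack.

```python
import math
from collections import Counter

def vectorize(text):
    # Toy bag-of-words vector; a production system would use an embedding model.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=1):
    # Rank the private documents by similarity to the question.
    qv = vectorize(query)
    ranked = sorted(corpus, key=lambda doc: cosine(qv, vectorize(doc)), reverse=True)
    return ranked[:k]

def build_prompt(query, corpus):
    # The retrieved text is pasted into the prompt so the LLM answers
    # from the company's own data rather than from the open internet.
    context = "\n".join(retrieve(query, corpus, k=1))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Hypothetical private corpus standing in for years of internal research.
corpus = [
    "Our 2023 onboarding policy requires a 30-day check-in with HR.",
    "Quarterly revenue grew 12 percent driven by cloud subscriptions.",
]

prompt = build_prompt("What does the onboarding policy require?", corpus)
# `prompt` would now be sent to the hosted LLM; the model never needs
# to have been trained on the private corpus.
```

The point of the sketch is the architecture: the model is a generic platform service, and the proprietary value lives in the corpus and the retrieval layer in front of it.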

I think it’s a really big deal. Microsoft is probably too big for this to be an investing thesis, but it does reinforce my previously posted thoughts on what Datadog should be doing but apparently isn’t (or, at least, isn’t telling anyone about). AI tools that operate on your own data could be the next wave; in any case, it’s something I’m paying attention to when looking for new investment opportunities (or deciding whether to exit existing ones).


I fully agree with this. I think the bigger opportunity here is SNOW rather than DDOG, though. SNOW could be the clearinghouse providing access to proprietary data in a secure and governed way for AI use through its data marketplace.

They talked about this on their earnings call. I’m looking forward to their June summit to see the announcements.

The Snowflake mission is to steadily demolish any and all limits to data, users, workloads, applications and new forms of intelligence. You will, therefore, continue to see us add, evolve and expand our functions and feature sets. Our goal is for all the world’s data to find its way to Snowflake and not encounter any limitations in terms of use and purpose. From our perspective, machine learning, data science and AI are workloads that we enable with increased capability, continuous performance and efficiency improvements.


I share those thoughts. Regarding Datadog, the CFO elaborated a bit during SVB MoffettNathanson’s Inaugural Technology, Media and Telecom Conference on May 17.

And then the last thing [Referring to ways they’re looking at AI] would be in our platform itself, which is to create greater utility to discover problems and use. And we’ve always had machine learning and automation, and we’ve been on that journey for a long time. And we’re optimistic that we’re going to be able to add parts to the platform.

We have a number of things we’re working on. We tend to announce things at DASH. So, we’re working on different things to see if we can find the things that will be most useful to customers and then have the time to get customer feedback to sort of be on that schedule.

So that was what that was. [Referring to the announcement about API integration to OpenAI] That’s like a - that’s like 1 thing you do, but that’s not embedded in the platform, and that’s what I was speaking about in my previous remarks about embedding the large language models within the platform.

(Shortened snippets. [  ] = added context. The full conversation might provide more insight.)

That, in my opinion, is a lot more inspiring than the earnings call.