
Flipping the math: How AI changes Build vs. Buy

Flipping the Build vs. Buy Script. Image by TW with some help from ChatGPT
For the longest time, companies have been trapped by enterprise software vendors.

First by shrink-wrapped software packages.

Then by SaaS offerings. 

Both situations led to what one can, even in a SaaS world, call shelfware – although these days the shelf is a virtual one instead of a physical one. Buyers still get enticed into purchasing more capabilities than they need, which leaves them paying more than necessary, often for software packages with overlapping capabilities.

One of the promises SaaS started with was to end this. Sadly, it looks like this promise was not kept. And no wonder: vendors want to be sticky, and they need growing revenues. This means they need to offer an ever-increasing number of capabilities, aka features, to justify their pricing and, eventually, regular price increases. Combined with the frequently used strategy of selling related capabilities, i.e., seats for adjacent software that a customer does not yet need, this led to two things: bloat and shelfware. Both come at the expense of the enterprise buyer.

Since the dawn of packaged software, the argument to buy, i.e., to voluntarily step into this trap, has been the same: Buying is cheaper than building.

Which probably was correct. Buying from a specialist was the logical choice. Engineering talent was, and still is, scarce. Building software includes a lengthy process of requirements engineering, years of development and ultimately never-ending maintenance.

Except that most of this is true for most implementations of purchased enterprise software, too.

And the buying process is arguably broken. Need identification is often done without the right stakeholders, the software selection becomes a procurement-heavy process that is more about checking boxes than about fulfilling user needs, and the implementation turns out to be a death march. Who has not read – or at least heard of – the statistics that put implementation failure rates at more than 60 percent?

But then, who ever got fired for buying IBM – or Salesforce or SAP, for that matter? Or Oracle? Take your pick.

The result?

We see processes that are not improved and that do not necessarily differentiate the company, as they are implemented to follow either the “same old” or “best practices”, which often translates to “average”, i.e., mediocrity. Users are forced to adapt to the tool, not the other way round. Their pain is not solved. This, combined with shelfware, contributes to low adoption and shadow IT, which ultimately harms all efforts at digital transformation.

And it is costly.

Enter low-code, no-code and GenAI

Low- and no-code environments have basically been around since, well … forever. At least as measured in Internet time. I had my first experiences with one back in 1995 (yeah, I am that old).

Depending on who got their fingers on these environments, results have been good or not so good. Anyone remember the infamous Lotus Notes app graveyards? It needs guardrails.

However!

The combination of low-code, no-code and generative AI has the potential to reduce the marginal cost of software development to almost zero. Instead of engaging in a multi-person-year software implementation project, it is now theoretically possible to “vibe-code” a bespoke application in a short time and at low cost. The scarcity of IT personnel is mitigated, and the procurement process is no longer a hurdle.

At least theoretically. Again, it needs guardrails and the right tools for the right people.

Still, and this is important, it is now possible to create what one could call a throwaway MVP, or a working prototype, that covers a requirement’s happy path at almost zero cost. To be clear, this is a capability that we didn’t have, or only barely had, with the traditional low-code and no-code environments. And this is a big deal.

With this prototype it is possible to quickly identify whether a real problem is solved, or at least mitigated, before big money is spent on the customization and deployment of a new SaaS solution. This flips the procurement process into something one could call prompt-to-product.

A new paradigm?

As said, the traditional software procurement lifecycle – requirements identification → software selection → implementation – is flawed. It relies on abstract and static written requirements. Text is ambiguous, whereas software is explicit. The gap between a fuzzy written requirement that we see all too often (e.g., "The system must support flexible workflows") and the delivered reality is where millions of dollars in enterprise value can get tanked.

With the help of generative AI, it is possible to establish a methodology that moves the build phase to the very beginning. Building serves not as the delivery mechanism, but as an agile discovery tool in a three-phase process.

Phase 1: Dynamic Discovery

Instead of collecting stakeholder needs in a static document, this process begins with live prototyping. Business stakeholders work with an AI engineer or directly with an LLM-enabled no-code environment to describe their problem in natural language, rapidly developing a working prototype that supports the happy path to the desired outcome. This is agile development on steroids. The prototype does not need to be secure or scalable; it only needs to do the job and be interactive. As a result, it becomes very clear what the users actually want. Plus, some implicit requirements get surfaced early in the process instead of after the purchasing decision and the assignment of the project budget.

Questions like “Does the user actually want a dashboard, or just a daily email summary?” or “Does the data structure actually fit the way the team works?”, and more, are answered before they require costly change requests.
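To make this tangible, here is a minimal sketch of what such a throwaway happy-path prototype could look like, picking up the hypothetical daily e-mail summary from the question above. The file name, columns, and currency are made up for illustration; error handling, security, and scheduling are deliberately absent.

```python
# Hypothetical throwaway prototype: a daily summary instead of a dashboard.
# All names (orders.csv, the columns, the recipient side) are made up.
import csv
from collections import defaultdict

def build_daily_summary(path: str) -> str:
    """Aggregate orders per region and return a plain-text summary."""
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["region"]] += float(row["amount"])
    lines = [f"{region}: {amount:,.2f} EUR" for region, amount in sorted(totals.items())]
    return "Daily order summary\n" + "\n".join(lines)

if __name__ == "__main__":
    # Happy path only: no error handling, no auth, no scheduling -- that is the point.
    print(build_daily_summary("orders.csv"))
```

A prototype at this level of ambition is enough to put something interactive in front of users and find out whether the summary, not a dashboard, is what they actually need.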

Phase 2: Stress Test

After the prototype solves the business users’ pains, IT leadership is in a better position to decide whether to buy, build, or opt for composing a low-code solution. Assuming that the existing software packages do not already cover the requirements, this decision can be made by answering three main questions about the generated code (a simple sketch of this mapping follows the list).

  • Does this tool need to read/write to business-critical systems like the ERP, or does it live in isolation?
  • Does the logic involve high-liability calculations (tax, payroll, health data)?
  • Is the logic static, or will it require constant updates based on external factors (e.g., changing shipping tariffs)?
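One way to read the answers is sketched below. The mapping from answers to paths is an illustrative simplification of the strategic fork described in phase 3, not a fixed rule.

```python
# A sketch of the phase 2 stress test as code; the three booleans mirror the
# questions above. The mapping to a path is illustrative, not a formal policy.
from dataclasses import dataclass

@dataclass
class StressTest:
    touches_critical_systems: bool   # reads/writes the ERP or similar?
    high_liability_logic: bool       # tax, payroll, health data?
    volatile_external_rules: bool    # e.g., changing shipping tariffs?

def recommend(t: StressTest) -> str:
    if t.high_liability_logic or t.volatile_external_rules:
        return "buy"      # let a specialist carry liability and rule maintenance
    if t.touches_critical_systems:
        return "compose"  # custom logic, but under platform governance
    return "build"        # self-contained, low liability, company-specific

print(recommend(StressTest(False, False, False)))  # -> build
```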

Phase 3: Strategic Fork

Based on the answers, the organization moves down one of three paths. Crucially, the outcome of phase 1 is valuable on all three paths.

Build

If the prototype is self-contained, low-liability, and specific to the company’s internal operations, the decision is to build.

The prototype code gets refined to cater for edge scenarios and for compliance and security, if the prototype’s development environment didn’t already take care of these. After that, it can be deployed.
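Continuing the earlier daily-summary sketch, “refined” can be as mundane as handling the rows the happy path ignored; compliance and security hardening would follow the same pattern, either by hand or via the guardrails of the generation environment.

```python
# Hardening the earlier happy-path sketch for the "build" path: same function,
# now tolerating the malformed input the prototype ignored. Still illustrative only.
import csv
from collections import defaultdict

def build_daily_summary(path: str) -> str:
    totals: dict[str, float] = defaultdict(float)
    with open(path, newline="", encoding="utf-8") as f:
        for line_no, row in enumerate(csv.DictReader(f), start=2):
            try:
                region = row["region"].strip() or "UNKNOWN"
                totals[region] += float(row["amount"])
            except (KeyError, ValueError) as exc:
                # Edge case the prototype skipped: bad rows are reported, not fatal.
                print(f"skipping row {line_no}: {exc}")
    lines = [f"{r}: {a:,.2f} EUR" for r, a in sorted(totals.items())]
    return "Daily order summary\n" + "\n".join(lines)
```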

Because the cost of generation stays at near zero, the resulting software is disposable. If the process changes over time, the application is not patched but simply discarded and regenerated.

As a result, the company has a solution with perfect process fit, low implementation cost and zero licensing fees.

Buy

If the prototype reveals that the requirements are more complex than anticipated, for example, if there are more regulations to consider, the decision is to buy. In contrast to the traditional process, this is now an informed decision.

The organization stops building but uses the functional prototype as a key part of the RFP that demonstrates the desired process. The conversation shifts from "Can you meet our requirements?" to "Here is exactly how our process works; demonstrate that your software can replicate this specific behavior."

The result is risk mitigation for both the company and the winning vendor. The prototype shows that building in-house would potentially create unmanageable technical debt. It also prevents buying vaporware by forcing vendors to prove capability against a live model. For the vendors, it removes considerable uncertainty in assessing the project size.

Compose

If the prototype requires the flexibility of custom logic but the governance of a standard platform (Microsoft, SAP, Oracle, Zoho, Salesforce, etc.), the decision is to compose it using a low-code environment.

The AI-generated logic gets transferred to an existing low-code/no-code platform. This platform handles identity management, UI standardization, and hosting, while the generated code still handles the unique business rules.
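As a rough illustration of that split, with all names hypothetical: the generated code shrinks down to the unique business rule, exposed as a small function that the platform’s flow builder or a webhook can call, while identity, UI, and hosting stay with the platform.

```python
# Sketch of the "compose" path: the AI-generated business rule lives in one small,
# testable function; the low-code platform (hypothetical here) handles identity,
# UI, and hosting and simply calls it, e.g., from a flow step or webhook.

def approval_threshold(order_total: float, customer_tier: str) -> bool:
    """Unique business rule kept as generated code: who may auto-approve an order."""
    limits = {"gold": 50_000.0, "silver": 20_000.0, "bronze": 5_000.0}
    return order_total <= limits.get(customer_tier, 0.0)

# The platform-side flow passes the record in and branches on the result:
print(approval_threshold(12_000.0, "silver"))  # -> True: auto-approve
```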

This enables speed of deployment with the safety net of IT governance.

What does this mean?

Executives should flip the purchasing process using three key actions.

  • Provide an infrastructure that allows for rapid, AI-supported prototyping, aka vibe-code environments. Ideally, this environment already embraces security and compliance rules.
  • Train users, business analysts or IT personnel to use this environment to bridge the gap between business and AI.
  • Instead of asking for written requirements only, make prototypes created in this environment a mandatory core element of every demand.

This flipped process means the build vs. buy question is no longer binary. It creates a build-to-define process that ensures the decision on how to deliver the required functionality is made in a better-informed, de-risked way, with a higher chance of success at lower cost. It isn’t about killing SaaS, but about no longer buying hope. Low-code/no-code in combination with GenAI can help you know much more precisely what gets delivered, regardless of whether you build or buy.


 
