Enterprise Generative AI Enters Its Citizen Development Era

There are times when we get a clear before-and-after moment that demands a reevaluation of our most basic assumptions. This month, OpenAI announced custom GPTs, a no-code tool that lets people create their own GPT models based on their own data and using their own plugins. What used to be a tight mandate for a team inside a large R&D group or a chatbot startup can now be accomplished by my grandfather in five minutes, using a couple of wiki links as a knowledge base. More importantly, these GPTs can act on the user's behalf. OpenAI's tight integration with Zapier puts thousands of connectors at your disposal, letting the AI query your CRM, update your ERP, or monitor your servers with a few clicks. How does the AI authenticate to all these services, you might ask? Great catch, but more on that later.

One thing you might be thinking is: well, this is amazing and all, but we will never allow it in our highly regulated, security-focused enterprise. You might even have blocked ChatGPT at the network level long ago, and are now constantly monitoring for more bots to add to that deny-list, which is annoying but manageable.

Enter Microsoft. Last week at its Ignite conference, Microsoft announced Copilot Studio, its own no-code GPT creator. It has everything the OpenAI tool has, from uploading files to use as a knowledge base, to a chat interface for configuration, to click-to-add integrations called plugins. Copilot Studio allows users to integrate their Copilots with Microsoft 365, Azure SaaS, and hundreds of other enterprise systems. This integration is done via user impersonation, meaning the Copilot acts on behalf of users.

Here's the thing about these Microsoft-generated user impersonation bots: You can't block them. You have no way to distinguish between an AI-generated operation and a user-triggered operation, because the two look exactly alike in the logs (the sketch below illustrates the problem). Copilots are hosted as applications inside your Microsoft 365 environment, so forget about network-level blocks. Users log in to these Copilots with their corporate credentials. The bottom line is that while GPTs live in the consumer world, Copilots live in the enterprise world.
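To make that visibility gap concrete, here is a minimal sketch of what triage against exported audit data looks like. It assumes you have exported records from the Microsoft 365 unified audit log to a JSON file (the file name is illustrative); the field names follow the audit record common schema, and the point is that none of them tells you whether a Copilot was behind the operation.

```python
import json

# Records exported from the Microsoft 365 unified audit log.
# "audit_records.json" is an illustrative export path for this sketch.
with open("audit_records.json") as f:
    records = json.load(f)

for record in records:
    # Fields from the audit record common schema.
    user = record.get("UserId")          # e.g., alice@contoso.com
    operation = record.get("Operation")  # e.g., FileAccessed, Send
    workload = record.get("Workload")    # e.g., SharePoint, Exchange
    client_ip = record.get("ClientIP")

    # Note what is missing: there is no field along the lines of
    # "InitiatedByCopilot". A record produced by a Copilot acting as
    # alice@contoso.com carries the same UserId, Operation, and
    # Workload as one Alice triggered herself, so there is nothing
    # here to filter or alert on.
    print(f"{user} performed {operation} on {workload} from {client_ip}")
```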
How Did This Happen So Quickly?

Well, it didn't. Microsoft and other major vendors, including Salesforce, UiPath, and ServiceNow, have spent years building low-code/no-code platforms that lower the bar to building enterprise applications. These companies have built out hundreds of integrations, visual builders, automated production deployments, and credential sharing as a service. Chatbots are the killer app for low-code/no-code platforms. Who needs to code when you can leverage a platform that gives you everything you need, out of the box, to create, share, monitor, upgrade, and embed your bot inside the enterprise within minutes, directly on top of business data?

A crucial point here is just how easy it has become to build no-code apps. In recent years, professional developers and business users alike have used platforms like the Power Platform to build millions of new business applications, including some that handle sensitive data and facilitate business-critical processes. While some companies have started to centralize the GenAI apps created by their engineering teams, this won't be enough. We have to look at what business users are building as well. In fact, the sheer number of business users, combined with the ease of creating bots, suggests that business users deserve the greater share of our attention.

Where Do We Even Begin?

Luckily, a growing number of organizations have already integrated citizen development (business users building apps) into their application security programs, and some of their insights have been shared publicly. Industry standards that categorize, explain, and suggest remediation for the security risks of low-code/no-code apps have emerged, such as the OWASP Low-Code/No-Code Top 10. Not using code doesn't mean no vulnerabilities, especially logical ones. It does, however, typically mean no SDLC, no visibility, and no controls. Whether our users are creating a GPT or a Copilot, they are doing so today, and in large quantities. For security leaders, the choice is to get on board now and bring these new developers under the security umbrella, or to miss the train and hope for the best.