Going Past Chatbots: The Connector Strategy

January 3, 2026
Written By Christi Brown

Christi Brown is the founder of AdapToIT, where modern IT strategy meets hands-on execution. With a background in security, cloud infrastructure, and automation, Christi writes for IT leaders and business owners who want tech that actually works—and adapts with them.

Most organizations built custom AI tools because the native options didn’t exist yet. Now they do. Here’s how to think about what to keep, what to retire, and what to connect.

Six months ago, if you wanted your AI assistant to look up ticket information or query your documentation system, you had to build it yourself. Custom GPTs, API integrations, middleware layers – the whole stack. It worked, but it required development time, ongoing maintenance, and someone who understood both the AI side and the business systems side.

That landscape has changed fast.

Microsoft’s Copilot connector gallery now includes over 100 prebuilt connectors for services like ServiceNow, Salesforce, Jira, Confluence, and Azure SQL. Model Context Protocol (MCP) reached general availability in Copilot Studio in May 2025, enabling real-time connections to external data sources. And as of November 2025, admins can create and deploy custom MCP connectors directly from the Microsoft 365 admin center.

The custom solutions we built a year ago are now competing against turnkey alternatives that didn’t exist when we started.

This isn’t a theoretical problem for me. I’m actively evaluating whether to retire our internal GPT – a custom assistant we built in Azure AI Foundry for ticket lookups and documentation searches – in favor of just connecting Copilot directly to our existing data. Our entire staff already has Copilot licenses. The data is already in Azure SQL. The question is whether the custom tool still earns its keep.

Here’s how I’m thinking through that decision, and how you might approach similar choices in your environment.

Understanding the Connector Landscape

Before diving into decision frameworks, it helps to understand what’s actually available now versus six months ago.

Microsoft 365 Copilot Connectors (formerly Graph Connectors) ingest external content into Microsoft Graph, making it searchable and accessible to Copilot. According to Microsoft’s documentation, these connectors use semantic indexing to enable natural language queries against your data. When you connect Azure SQL to Copilot, users can ask questions like “show me open tickets for Contoso” and get answers without knowing SQL or navigating to a separate tool.

The Azure SQL connector specifically requires that your Microsoft 365 subscription and Azure subscription hosting the database exist within the same Entra ID tenant. It supports both full and incremental crawls to keep the index current, and you can restrict access to search results based on users or groups defined in your SQL queries.
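To make the crawl setup concrete, here's roughly what those queries look like. The table and column names below are hypothetical, and the @watermark placeholder and access-control column mapping follow the pattern in Microsoft's Azure SQL connector setup docs – verify against the current documentation before copying anything:

```python
# Hypothetical crawl queries for the Copilot Azure SQL connector.
# Table and column names (Tickets, ModifiedAt, AllowedGroups) are
# illustrative -- substitute your own schema.

# Full crawl: everything the connector should index.
FULL_CRAWL_QUERY = """
SELECT TicketId, CompanyName, Summary, Status, ModifiedAt, AllowedGroups
FROM Tickets
"""

# Incremental crawl: only rows changed since the last crawl.
# The connector substitutes its stored high-water mark for @watermark.
INCREMENTAL_CRAWL_QUERY = """
SELECT TicketId, CompanyName, Summary, Status, ModifiedAt, AllowedGroups
FROM Tickets
WHERE ModifiedAt > @watermark
"""

# During setup, AllowedGroups maps to the connector's access-control
# settings so search results are trimmed to the groups named per row.
```

The incremental query is what keeps crawl load reasonable on a busy ticket table – only changed rows get re-indexed.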

Model Context Protocol (MCP) takes a different approach. Instead of indexing content into the Graph, MCP servers provide real-time access to external data. Think of it as the difference between a search index (Graph connectors) and a live API call (MCP). MCP-powered agents can fetch current data on demand, which matters when you’re querying a ticketing system where status changes by the minute.

Microsoft’s roadmap shows continued investment here. Beyond the custom MCP connector creation now in the admin center, the December 2025 roadmap update lists hierarchical ACL support for the ServiceNow and Jira connectors (coming April 2026) and incremental permission syncs (coming June 2026).

The Build vs. Connect Decision Framework

When we built our internal GPT, the calculus was simple: we needed AI to query our ticketing system and documentation, and there was no off-the-shelf way to do it. Custom was the only option.

Now there are three paths for any AI integration:

Build custom – You control everything, but you own all the maintenance. Makes sense when your requirements are genuinely unique or when native options don’t exist.

Use native connectors – Copilot’s Azure SQL connector, SharePoint integration, or similar. Less control, but Microsoft handles the infrastructure and your users already know the interface.

Hybrid approach – Use native connectors for the common stuff, keep custom solutions for the edge cases.

The mistake I see most IT leaders make is treating this as a one-time decision. It’s not. The connector landscape changes quarterly. What required custom code in January might have a native solution by June.

A Real Example: Our Ticket Lookup Decision

Here’s the specific situation I’m working through right now.

What we built: A custom GPT in Azure AI Foundry that lets technicians look up ticket information and search our documentation system (currently IT Glue, migrating to Hudu). It works. Technicians use it. But it requires maintenance, and every time we want to add a data source or change how queries work, someone has to touch code.

What already exists: We have an internal automation dispatching tool that syncs ConnectWise ticket data to Azure SQL. It’s been running for months – we’ve got six months of historical ticket data stored and we’re expanding what we sync. The database is already there, already populated, already maintained as part of a different system.

The new option: Copilot with Azure SQL connectors can query that same database. Every technician already has Copilot. They already know how to use it. No additional training, no separate interface, no custom maintenance.

The question: Does our custom GPT do anything that Copilot with connectors can’t?

Evaluating What Custom Actually Buys You

When I dig into what our internal GPT actually does versus what Copilot could do with the right connectors, a few things become clear.

Things the custom GPT does that matter:

The custom solution has specific prompt engineering tuned to how our technicians ask questions. When someone types “what’s going on with Contoso,” it knows to search by company name across open tickets, recent closures, and recurring issues. It formats ticket summaries in a consistent way that matches how our team thinks about problems. And it integrates with IT Glue’s API for documentation lookups – something Copilot doesn’t have a native connector for.
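That query-expansion behavior is the part worth understanding. Here's an illustrative sketch of the idea in Python – not our actual implementation, and the search categories and function name are made up for the example:

```python
import re

# Illustrative sketch of the fan-out the custom GPT performs: a vague
# "what's going on with <company>" question becomes the specific
# searches technicians actually want. All names here are hypothetical.

def expand_company_query(question: str) -> list[dict]:
    """Turn a free-form company question into structured ticket searches."""
    match = re.search(
        r"\b(?:with|for|at)\s+([A-Z][\w&.-]*(?:\s+[A-Z][\w&.-]*)*)",
        question,
    )
    if not match:
        return []
    company = match.group(1)
    return [
        {"search": "open_tickets", "company": company},
        {"search": "recent_closures", "company": company, "days": 30},
        {"search": "recurring_issues", "company": company},
    ]
```

With a declarative agent, this same intent mapping moves out of code and into instructions and prompt configuration – which is exactly the trade-off being weighed here.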

Things I thought mattered but probably don’t:

I initially valued having “control over the AI behavior,” but Copilot is controllable enough for this use case through declarative agents and prompt configuration. I thought the custom UI mattered, but our technicians would rather stay in the tools they already use than context-switch to another interface. And performance optimization? The native connectors are fast enough for lookups.

The adoption reality I didn’t want to admit:

Here’s the uncomfortable truth: people didn’t use the custom GPT much. Not because it didn’t work – it worked fine. They didn’t use it because it was just another window to open. Another bookmark. Another login. Another thing to remember exists.

Meanwhile, Copilot is already sitting in Teams, in Outlook, in the apps they have open all day. The barrier to asking a question drops from “let me find that tool we built” to “let me just ask Copilot.” That’s not a technical difference. That’s a behavioral one. And behavioral friction kills adoption faster than any missing feature.

I say this as someone who sometimes overcomplicates things just because I can. It’s easy to justify building custom solutions – the control, the flexibility, the satisfaction of solving a hard problem. But if the result sits unused because it’s one more thing in an already crowded tool landscape, what was the point?

With the number of AI tools flooding the market right now, there’s real value in consolidation. One interface that handles most of what your team needs, connected to the data sources that matter, inside the applications they already live in. It’s less impressive than a custom build. It’s also more likely to get used.

The honest assessment:

The IT Glue integration is the only thing keeping the custom GPT relevant. Once we migrate to Hudu, that calculus might change again. And frankly, if I spent the same hours I spend maintaining the custom GPT on building a proper connector for Copilot, I might come out ahead.

This is where MCP becomes interesting. If Hudu exposes an MCP server (or I build one), I could connect it directly to Copilot without writing a full custom application. The MCP integration in Copilot Studio automatically discovers tools and resources from connected servers and keeps them updated as the server evolves.

Connector Architecture: What’s Actually Happening

Understanding the technical architecture helps make better decisions about which approach fits your needs.

Graph Connectors (Indexed):

When you configure an Azure SQL connector, Microsoft crawls your database on a schedule and indexes the content into Microsoft Graph. Your data gets semantic labels applied – things like title, url, and iconUrl – that help Copilot understand what it’s looking at. Users query against this index, not your live database.
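Those labels get assigned when the connection's schema is registered through the Microsoft Graph API. Here's a hedged sketch of what that schema body looks like – the property names are hypothetical, but the structure and label names follow Graph's externalConnection schema:

```python
# Sketch of a Graph connector schema registration body. Property names
# are hypothetical; the structure follows the Microsoft Graph
# externalConnection schema API. Semantic labels (title, url, iconUrl,
# etc.) tell Copilot how to interpret each column.

SCHEMA_BODY = {
    "baseType": "microsoft.graph.externalItem",
    "properties": [
        {"name": "ticketTitle", "type": "String",
         "isSearchable": True, "labels": ["title"]},
        {"name": "ticketUrl", "type": "String",
         "isRetrievable": True, "labels": ["url"]},
        {"name": "ticketIcon", "type": "String",
         "labels": ["iconUrl"]},
        {"name": "companyName", "type": "String",
         "isSearchable": True, "isQueryable": True},
        {"name": "lastModified", "type": "DateTime",
         "labels": ["lastModifiedDateTime"]},
    ],
}

# This body is sent to the connection's /schema endpoint; the service
# builds the index before any items can be ingested against it.
```

The labels matter more than they look: they're how Copilot knows which field to surface as a result title versus a link versus a timestamp.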

This means there’s latency between when data changes in your source system and when it appears in Copilot. For ticket lookups, that might be minutes to hours depending on your crawl schedule. For historical reporting or knowledge base queries, that’s usually fine. For “is this ticket still open right now,” it might not be.

The upside is that queries are fast and don’t hit your production database. The index handles the load.

MCP (Federated/Real-time):

MCP servers respond to queries in real-time. When a user asks about a ticket, the request goes to your MCP server, which queries the live data source and returns current information. No indexing delay, but you’re now responsible for handling the query load and ensuring your data source can respond quickly.
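Stripped to its core, the handler such a server exposes is just a function that hits the live source on every call. This stdlib-only Python sketch stubs out the ConnectWise call to make the request path visible; a real server would register the handler through an MCP SDK rather than call it directly, and the data here is invented:

```python
from dataclasses import dataclass

# Sketch of the tool a real-time ticket-status MCP server would expose.
# The ConnectWise call is stubbed; in a real server this handler is
# registered with an MCP SDK and the stub becomes a live REST call.

@dataclass
class Ticket:
    ticket_id: int
    status: str
    last_note: str

def fetch_ticket_live(ticket_id: int) -> Ticket:
    """Stand-in for a live ConnectWise API call -- no cached index,
    so every request reflects the current state of the ticket."""
    # Hypothetical data for illustration only.
    return Ticket(ticket_id, "Open", "Awaiting customer reply")

def ticket_status_tool(ticket_id: int) -> dict:
    """MCP tool handler: return the current status of one ticket."""
    t = fetch_ticket_live(ticket_id)
    return {"id": t.ticket_id, "status": t.status, "last_note": t.last_note}
```

The trade-off is right there in the shape of the code: no index means no staleness, but every user question now lands on your API and your source system's rate limits.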

Microsoft’s documentation notes that MCP servers integrate with enterprise security controls including Virtual Network integration, Data Loss Prevention, and multiple authentication methods. This isn’t a wild west connection – it goes through the same governance framework as other Copilot connectors.

Choosing between them:

Use indexed connectors (Graph) when your data doesn’t change frequently, query latency of minutes is acceptable, and you want Microsoft to handle the infrastructure. Use MCP when you need real-time data, your source system has APIs but no native connector, or you’re already maintaining an API layer anyway.

For our situation, ticket status needs to be current, but ticket history (which is most of what technicians search for) doesn’t change. A hybrid makes sense: Azure SQL connector for historical data, MCP for real-time status checks.

The Connector Strategy: Principles Over Products

Here’s what I’ve learned from working through this decision:

Start with the data, not the AI.

The reason I can even consider this switch is that the data already lives in Azure SQL. Our internal dispatching tool did the hard work of syncing and storing ticket data. The AI layer is almost an afterthought – it just needs to query what’s already there. If you’re planning AI integrations, get your data architecture right first. Connectors are worthless without clean, accessible data.
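The sync pattern behind that dispatching tool is worth spelling out, because it's what makes the connector decision easy later: upsert rows keyed by source ID so re-syncs update instead of duplicate. A generic sketch, using sqlite3 as a portable stand-in for Azure SQL, with made-up table and column names:

```python
import sqlite3

# Generic sketch of the sync pattern: upsert ticket rows keyed by
# source ID so repeated syncs update in place. sqlite3 stands in for
# Azure SQL here; the schema is illustrative.

def upsert_tickets(conn: sqlite3.Connection, rows: list[tuple]) -> None:
    conn.executemany(
        """INSERT INTO tickets (ticket_id, company, status, modified_at)
           VALUES (?, ?, ?, ?)
           ON CONFLICT(ticket_id) DO UPDATE SET
             company=excluded.company,
             status=excluded.status,
             modified_at=excluded.modified_at""",
        rows,
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE tickets (
    ticket_id INTEGER PRIMARY KEY, company TEXT,
    status TEXT, modified_at TEXT)""")

upsert_tickets(conn, [(1, "Contoso", "Open", "2026-01-02T10:00:00")])
# A later sync updates the same row instead of duplicating it.
upsert_tickets(conn, [(1, "Contoso", "Closed", "2026-01-03T09:00:00")])

status = conn.execute(
    "SELECT status FROM tickets WHERE ticket_id = 1").fetchone()[0]
```

Once the data lands in a shape like this, any AI layer – custom GPT, Copilot connector, MCP server – can sit on top of it interchangeably. That's the point.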

Native connectors reduce your maintenance burden.

Every custom integration is technical debt. It works until it doesn’t, and then you’re debugging at 2 AM. Native connectors push that burden to Microsoft. When the Graph connector gallery moved to the Copilot Control System in the admin center, I didn’t have to do anything. When they add incremental permission syncs in June 2026, same thing.

Your users don’t care how it works.

They care that it works. If Copilot can answer the same questions our custom GPT answers, using the same data, with less friction – the custom GPT doesn’t have a compelling reason to exist. Don’t preserve custom tools out of sunk cost attachment.

Meet users where they already are.

The best AI implementation is the one people actually use. A sophisticated custom solution that requires opening a new window loses to a “good enough” native solution that’s already in their workflow. With AI tools multiplying everywhere you look, consolidation has value. One interface connected to multiple data sources, inside the apps your team already has open – that’s not settling. That’s strategy.

Plan for migration, not permanence.

We’re migrating from IT Glue to Hudu. Our documentation connector will break regardless of whether it’s custom or native. The question is which approach makes the migration easier. Sometimes that’s custom (more control), sometimes that’s native (someone else’s problem). Think about your next two platform changes, not just your current state.

Audit your custom tools quarterly.

The connector landscape moves fast. That custom integration you built in Q1 might have a native alternative by Q3. Set a calendar reminder to ask “could this be simpler now?” The Copilot Studio “What’s New” page is worth bookmarking – they’re shipping meaningful connector updates monthly.

What I’m Actually Going to Do

I haven’t made the final call yet, but here’s where I’m leaning:

Phase 1: Enable Copilot’s Azure SQL connector.

Let technicians query the ticket database directly through Copilot. The database already exists, the schema is defined, and the connector setup is documented. Don’t retire the custom GPT yet – run them in parallel and see which one people actually use. This takes a few hours to configure, not weeks to build.

Phase 2: Build an MCP server for real-time status.

For queries that need current data – “is this ticket still open” or “what’s the latest note” – build a lightweight MCP server that queries ConnectWise directly. This gives us real-time capability without duplicating all our custom GPT logic.

Phase 3: Evaluate documentation integration after Hudu migration.

Once we complete the Hudu migration, evaluate whether to build a Hudu connector for Copilot or maintain a custom integration. The answer probably depends on whether Hudu or a partner releases their own connector in the meantime. If not, MCP is probably the path.

Phase 4: Retire the custom GPT if it’s not earning its keep.

If Copilot handles 90% of what the custom GPT does, retire it. Keep the Azure AI Foundry setup documented in case we need custom capabilities later, but don’t keep maintaining a tool that no longer justifies the effort.

The goal isn’t “all native” or “all custom.” It’s matching the right tool to the right problem, and being honest about when that answer changes.

The Bigger Picture

This is what “going past chatbots” actually looks like in 2026. It’s not about bolting more capabilities onto AI assistants. It’s about recognizing that the integration layer is often more valuable than the AI layer.

The AI is commoditized. GPT-4, Claude, Copilot – they’re all capable enough for most business queries. What differentiates your AI implementation is what data it can access and how seamlessly it fits into your users’ workflows.

Microsoft is clearly betting on connectors as the extensibility model. The November 2025 announcement showed MCP-based agents appearing in the admin center, connector visibility for governance, and user-level federated connectors that let individuals connect their own apps (like Notion or Canva) to Copilot.

The trajectory is clear: more connectors, more real-time options, more admin control. Custom code isn’t going away, but its role is shifting from “build everything” to “fill the gaps.”

Connectors are the strategy. The AI is just the interface.

If you’re still maintaining custom AI tools you built when there weren’t other options, it might be time to ask whether those options exist now. The answer might surprise you.


This is part of the “Going Past Chatbots” series, exploring what comes after the initial AI deployment. Next up: When AI Agents Replace Workflows – understanding when to let AI take action versus just inform.