Part 2 of 5
🤖 Ghostwritten by Claude · Curated by Tom Hundley
This article was written by Claude and curated for publication by Tom Hundley.
A developer asks for help. I point him at AI. A server dies. Good riddance.
Last week, one of my developers came to me with a familiar problem.
"The build server is out of date," he said. "It's going to be a nightmare to update. I think we need to build a new one, and I need your help walking through it to make sure we do it right."
This is the kind of request that used to eat days. Maybe weeks. Provisioning a new VM, configuring Visual Studio, setting up the X++ toolchain, getting the AOS running, connecting to source control, testing the build pipeline—all while keeping the old server limping along so the team can keep shipping.
I've done this before. Multiple times. It's soul-crushing work.
So instead of rolling up my sleeves, I asked him a question.
"Why don't you ask AI?"
Here's the thing: I could have walked him through the build server setup. I've done it enough times that the muscle memory is still there. But that's exactly the problem—I would have given him the answer I already knew. The answer from two years ago. The answer that assumes the world hasn't changed.
AI doesn't have that problem. It doesn't have muscle memory pulling it toward outdated solutions.
So I sat down with Claude and framed the question. Not "how do I set up a D365 build server" but something more open: "We need to automate builds for Dynamics 365 Finance & Operations. Our current build VM is outdated. What are our options?"
Claude came back with clarifying questions, and after a few exchanges—maybe five minutes of conversation—we landed on something I hadn't expected.
We don't need a build server at all.
Turns out Microsoft has been quietly enabling a different approach: Azure Pipelines with Microsoft-hosted agents.
Instead of maintaining a dedicated build VM—yes, you can deprovision it when not in use, but that's still overhead, still a process, still time spent managing infrastructure instead of shipping code—you can use Microsoft's hosted agents. Ephemeral VMs that spin up when your pipeline runs and disappear when it's done. The X++ compiler and all the build tools? They're packaged as NuGet packages that get installed fresh on each build.
No server to maintain. No updates to apply. No disk space to monitor. No monthly VM costs eating into the budget.
One server gone. Just like that.
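To make that concrete: in Azure Pipelines YAML, the whole difference between "maintain a build VM" and "let Microsoft host it" is the pool declaration. A minimal sketch—nothing here is D365-specific, and the self-hosted pool name is a made-up placeholder:

```yaml
# Self-hosted: the pipeline runs on a build VM you own and maintain.
# (Pool name is a hypothetical placeholder.)
# pool:
#   name: 'D365-Build-Pool'

# Microsoft-hosted: an ephemeral Windows VM spins up per run and is discarded afterward.
pool:
  vmImage: 'windows-latest'
```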
Let me be clear about what we're dealing with here. The traditional D365 build setup looks like this:
The Old Way:

- A dedicated build VM to provision, patch, and pay for every month
- Visual Studio, the X++ toolchain, and an AOS configured on the box
- Source control and the build agent wired up by hand
- Ongoing care and feeding to keep it all limping along while the team keeps shipping

The New Way:

- Microsoft-hosted agents that spin up for each pipeline run and disappear when it's done
- The X++ compiler and reference packages pulled fresh from NuGet on every build
- Nothing to patch, monitor, or pay for between builds
The community has been writing about this for a while, but somehow it hadn't penetrated my awareness. I was still operating on the mental model from 2020.
Before I get too excited, let me be honest about what this approach doesn't do.
According to Microsoft's documentation:
"This feature is limited to compilation and packaging. There is no support for X++ unit testing (SysTest), database synchronization, or other features that require the runtime (Application Object Server [AOS]) or its components."
So if your build pipeline includes automated tests or database sync, you still need a VM. For us, that's not a blocker—our testing happens elsewhere in the workflow. But your mileage may vary.
The other consideration: you need to keep those NuGet packages updated. Every time you upgrade your D365 environment to a new platform version, you need to download the matching compiler tools and reference packages from LCS and publish them to your Artifacts feed. It's not automatic.
Here's why this matters beyond just saving money on one VM.
I've been wanting to rip out TFVC and move to Git. Microsoft has been phasing out TFVC—it's no longer available for new projects as of 2024—and staying on it is technical debt accumulating interest.
But modernizing version control means modernizing the entire CI/CD pipeline. And I was dreading that work because it meant dealing with the build server problem.
Now? The build server problem is solved. Or at least, it's solvable in a way that doesn't involve maintaining another dinosaur VM.
The path forward is clearer: move builds to Microsoft-hosted agents, migrate source control from TFVC to Git, and retire the old build VM.
One less server. Modern version control. Cloud-native CI/CD. This is what progress looks like in the D365 world—incremental, painful, but real.
Important disclaimer: I haven't actually implemented this yet.
This article isn't me telling you "here's how I did it and it worked great." This is me saying "here's what I discovered, and here's the plan I'm going to follow."
The next part of this series will document the actual implementation—the setup process, the gotchas, the things that didn't work the first time. I want to follow my own instructions and see where they break.
Based on my research, here's the implementation plan I'm going to follow. I'm documenting it here so you can follow along—and so I have something to compare against when reality inevitably diverges from the plan.
Before starting, you need:

- An Azure DevOps organization and project, with permission to create Artifacts feeds and pipelines
- Access to LCS and its Shared Asset Library to download the compiler and reference packages
- Your X++ source code in a repo the pipeline can reach—Git, or TFVC if you haven't migrated yet
Step 1: Create an Azure Artifacts Feed
In Azure DevOps, go to Artifacts and create a new feed. This will host the D365 compiler tools and reference packages.
Step 2: Download NuGet Packages from LCS
Go to LCS → Shared Asset Library → NuGet packages. Download the packages that match your D365 environment version. For environments on 10.0.40+, you need five packages:
| Package | Purpose |
|---|---|
| Microsoft.Dynamics.AX.Platform.CompilerPackage | The X++ compiler (xppc.exe) and build tools |
| Microsoft.Dynamics.AX.Platform.DevALM.BuildXpp | Platform module compiled references |
| Microsoft.Dynamics.AX.Application1.DevALM.BuildXpp | Application module part 1 |
| Microsoft.Dynamics.AX.Application2.DevALM.BuildXpp | Application module part 2 |
| Microsoft.Dynamics.AX.ApplicationSuite.DevALM.BuildXpp | Application Suite module |
Critical: Download versions that match or exceed your environment version. If you're on 10.0.38, don't download 10.0.35 packages.
Step 3: Publish Packages to Your Feed
Download NuGet.exe and run:
```
nuget.exe push -Source "https://pkgs.dev.azure.com/<org>/<project>/_packaging/<feed>/nuget/v3/index.json" -ApiKey AZ Microsoft.Dynamics.AX.Platform.CompilerPackage.nupkg
```

Repeat for all five packages.
Warning: Free Azure DevOps organizations have a 2GB limit on Artifacts storage. These packages are large. Delete old versions when you upgrade.
Step 4: Create Configuration Files
Create nuget.config in your repository:
```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="D365-Feed" value="https://pkgs.dev.azure.com/<org>/<project>/_packaging/<feed>/nuget/v3/index.json" />
  </packageSources>
</configuration>
```

Create packages.config with your specific versions:
```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.Dynamics.AX.Platform.CompilerPackage" version="7.0.7279.40" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.Platform.DevALM.BuildXpp" version="7.0.7279.40" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.Application1.DevALM.BuildXpp" version="10.0.1935.21" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.Application2.DevALM.BuildXpp" version="10.0.1935.21" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.ApplicationSuite.DevALM.BuildXpp" version="10.0.1935.21" targetFramework="net40" />
</packages>
```

Step 5: Create the Pipeline
Microsoft provides sample pipeline files:
- xpp-ci.yml for Git repos on 10.0.40+
- xpp-classic-ci.json for TFVC repos

For Git (which is where we're heading), the YAML pipeline needs these tasks: restore the compiler and reference packages from the feed, compile the X++ modules, generate the deployable package, and publish it as a build artifact.
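Here's a rough skeleton of what I expect that pipeline to look like. To be clear about what's assumed: the task names (NuGetToolInstaller, NuGetCommand, VSBuild, PublishBuildArtifacts) are standard Azure Pipelines tasks, but the solution name, paths, and MSBuild properties are placeholders based on my reading of Microsoft's sample—I haven't run this yet, and the next article will show what actually works.

```yaml
# Untested sketch of an X++ CI pipeline on Microsoft-hosted agents.
# Solution name, paths, and MSBuild properties are placeholders to be
# replaced with the values from Microsoft's xpp-ci.yml sample.
trigger:
  - main

pool:
  vmImage: 'windows-latest'   # ephemeral Microsoft-hosted agent

steps:
  # Packaging requires NuGet 3.3.0 or earlier (see the note below).
  - task: NuGetToolInstaller@1
    inputs:
      versionSpec: '3.3.0'

  # Restore the compiler and reference packages from the Artifacts feed.
  - task: NuGetCommand@2
    inputs:
      command: 'custom'
      arguments: >-
        restore $(Build.SourcesDirectory)\packages.config
        -PackagesDirectory $(Pipeline.Workspace)\NuGets
        -ConfigFile $(Build.SourcesDirectory)\nuget.config

  # Compile the X++ modules. The solution and the property pointing MSBuild at the
  # restored compiler tools are placeholders; Microsoft's sample defines the real ones.
  - task: VSBuild@1
    inputs:
      solution: '$(Build.SourcesDirectory)\MyXppModules.sln'
      msbuildArgs: >-
        /p:BuildTasksDirectory=$(Pipeline.Workspace)\NuGets\Microsoft.Dynamics.AX.Platform.CompilerPackage\DevAlm

  # (Deployable package generation from Microsoft's sample goes here.)

  # Publish the build output as a pipeline artifact.
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)'
      ArtifactName: 'drop'
```

The restore uses NuGet's custom command so the -PackagesDirectory and -ConfigFile flags can be passed straight through to nuget.exe.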
Critical detail: The packaging step requires NuGet 3.3.0 or earlier. Version 3.4+ uses semantic versioning that breaks D365 deployable package generation. Yes, really.
Step 6: Run the Pipeline
Trigger a build and watch it fail. I'm being realistic here—it's probably not going to work the first time.
Common issues I expect to hit: package versions that don't quite match our environment, the NuGet 3.3.0 packaging quirk from Step 5, and the 2GB Artifacts storage limit.
Step 7: Document What Breaks
This is where the next article in this series will pick up. Whatever goes wrong, I'll document it.
Here's the meta-point of this whole exercise.
I didn't discover the hosted agent approach by reading Microsoft documentation. I didn't find it by searching Stack Overflow or the Dynamics community forums. I found it by having a conversation with AI.
The AI asked the right clarifying questions. It didn't assume I needed a build VM just because that's how it's always been done. It presented options I hadn't considered.
This is what vibe coding looks like when you're not writing code—it's using AI to navigate complex technical decisions, to cut through the accumulated assumptions, to find the path you couldn't see because you were too close to the problem.
My developer came to me asking for help building a server. AI helped us realize we didn't need to build anything.
In the next installment, I'll document the actual implementation: the setup process, the gotchas, and the things that didn't work the first time.
This is still an experiment. I don't know if it's going to work smoothly or blow up in unexpected ways. But that's the point of documenting it publicly—you get to see the real process, not the sanitized after-the-fact success story.
Stay tuned.
Previous in the series: Part 1 - Setting the Stage
Next in the series: Part 3 (Coming when I have results to share)
Have you implemented Azure Pipelines with hosted agents for D365 F&O? I'd love to hear about your experience. Reach out through our contact page.
For those following along at home, here are the key references:
Microsoft Documentation:
Community Resources: