Migrating a 6-Year-Old .NET Core 2 SaaS to .NET 8: Lessons Learned
A pragmatic, zero-downtime migration strategy for a production freight SaaS — from .NET Core 2 to .NET 8 with microservice decomposition.
Why Migrate?
The Gamasuite codebase was born on .NET Core 2 in 2018. By 2023 it had:
- 6 tightly-coupled services in a monorepo
- Performance bottlenecks at 2,000+ concurrent rate requests
- No support for newer C# features (records, pattern matching, async streams)
- End-of-life runtime with no security patches
Migration Strategy: Strangler Fig
We used the Strangler Fig pattern — incrementally replace subsystems without a big-bang rewrite.
Phase 1: Upgrade in-place to .NET 6
Minimal code changes. Surfaced all the deprecated APIs we needed to fix.
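The in-place step is mostly a target-framework bump followed by package updates and warning triage. A minimal sketch of the kind of csproj change involved (the property values here are illustrative, not our exact project file):

```xml
<!-- Illustrative: bump the target framework, then update Microsoft.* packages
     to matching 6.x versions and work through obsoletion warnings. -->
<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <TargetFramework>net6.0</TargetFramework>
    <Nullable>enable</Nullable>
    <ImplicitUsings>enable</ImplicitUsings>
  </PropertyGroup>
</Project>
```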
Phase 2: Extract the Rate Engine Microservice (.NET 8)
The rate calculation engine was the hottest path. We extracted it first:
```csharp
// Old: synchronous, blocking
public List<CarrierRate> GetRates(RouteRequest request)
{
    var results = new List<CarrierRate>();
    foreach (var carrier in _carriers)
    {
        results.Add(_carrierClient.GetRate(carrier, request)); // blocking HTTP
    }
    return results;
}
```
```csharp
// New: concurrent fan-out with async streams
public async IAsyncEnumerable<CarrierRate> GetRatesAsync(
    RouteRequest request,
    [EnumeratorCancellation] CancellationToken ct = default)
{
    // Start all carrier calls concurrently, then yield each result as it
    // completes. (Task.WhenEach would simplify this loop, but it only ships
    // in .NET 9; on .NET 8 we drain the set with Task.WhenAny.)
    var tasks = _carriers
        .Select(c => _carrierClient.GetRateAsync(c, request, ct))
        .ToList();
    while (tasks.Count > 0)
    {
        var completed = await Task.WhenAny(tasks);
        tasks.Remove(completed);
        yield return await completed;
    }
}
```

This alone gave us a 40% latency reduction on quote generation: results streamed to the UI as each carrier responded, rather than waiting for all of them.
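The fan-out-and-stream idea can be exercised in isolation. A minimal, self-contained sketch with simulated carriers (the carrier names, delays, and prices are made up for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.CompilerServices;
using System.Threading;
using System.Threading.Tasks;

// Drain a set of in-flight rate calls in completion order, yielding each
// result as soon as its carrier responds.
static async IAsyncEnumerable<(string Carrier, decimal Price)> StreamAsCompleted(
    IEnumerable<Task<(string Carrier, decimal Price)>> pending,
    [EnumeratorCancellation] CancellationToken ct = default)
{
    var tasks = pending.ToList();
    while (tasks.Count > 0)
    {
        var done = await Task.WhenAny(tasks); // first carrier to respond
        tasks.Remove(done);
        ct.ThrowIfCancellationRequested();
        yield return await done;              // propagates any carrier failure
    }
}

// Simulated carrier call: respond with a price after a fixed delay.
static async Task<(string Carrier, decimal Price)> Quote(
    string carrier, int delayMs, decimal price)
{
    await Task.Delay(delayMs);
    return (carrier, price);
}

// The slower carrier is listed first but streams second.
var order = new List<string>();
await foreach (var rate in StreamAsCompleted(new[]
{
    Quote("SlowFreight", 300, 120m),
    Quote("FastShip", 50, 135m),
}))
{
    order.Add(rate.Carrier);
}
Console.WriteLine(string.Join(", ", order)); // FastShip, SlowFreight
```

Because each result is yielded as its task completes, a slow carrier delays only its own row, not the whole quote.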
Phase 3: Minimal APIs for Internal Services
Internal microservice comms moved to .NET 8 Minimal APIs:
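The endpoint below gates access with a named authorization policy. A hedged configuration sketch of how such an `internal-service` policy might be registered in Program.cs (the claim type and value here are assumptions, not our actual setup):

```csharp
// Sketch only: require an authenticated caller carrying an assumed
// "scope: internal" claim before internal endpoints are reachable.
builder.Services.AddAuthorization(options =>
{
    options.AddPolicy("internal-service", policy =>
        policy.RequireAuthenticatedUser()
              .RequireClaim("scope", "internal"));
});
```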
```csharp
app.MapPost("/rates/calculate", async (
    RateRequest req,
    IRateCalculatorService calculator,
    CancellationToken ct) =>
{
    var rates = await calculator.CalculateAsync(req, ct);
    return Results.Ok(rates);
})
.RequireAuthorization("internal-service");
```

Key Metrics After Migration
| Metric | Before | After |
|--------|--------|-------|
| Avg quote latency | 2,800 ms | 480 ms |
| Memory per instance | 1.2 GB | 380 MB |
| Cold start (App Service) | 12 s | 3 s |
| Concurrent requests/node | ~800 | ~3,200 |
Lessons Learned
- Profile before you optimise — used dotnet-trace to find the real bottlenecks
- Async all the way down — partial async caused deadlocks in transitional code
- Feature flags for gradual rollout — used Azure App Configuration to route % of traffic
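Percentage rollout only works if routing is deterministic per tenant. A self-contained sketch of the bucketing idea (our production routing used Azure App Configuration feature flags; `RouteToNewService` is a hypothetical helper, not our actual code):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Deterministic percentage routing: hash a stable key (e.g. a tenant id)
// into a 0-99 bucket so the same tenant stays on the same code path as the
// rollout percentage grows from 5% to 100%.
static bool RouteToNewService(string tenantId, int rolloutPercent)
{
    // SHA-256 is stable across processes and machines
    // (string.GetHashCode is not, so it would reshuffle tenants on restart).
    byte[] hash = SHA256.HashData(Encoding.UTF8.GetBytes(tenantId));
    int bucket = BitConverter.ToUInt16(hash, 0) % 100;
    return bucket < rolloutPercent;
}

Console.WriteLine(RouteToNewService("tenant-42", 100)); // True
Console.WriteLine(RouteToNewService("tenant-42", 0));   // False
```

Raising the percentage only ever moves tenants from the old path to the new one, never back and forth, which keeps rollback and incident analysis sane.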