
---
title: ASP 3.0 to .NET Core Migration - Cargo Tracking System
slug: asp-to-dotnet-migration
summary: Tech Lead in the gradual migration of a mission-critical ASP 3.0 system to .NET Core, with dual-write data synchronization and a $20k/year reduction in mapping API costs.
client: Logistics and Tracking Company
industry: Logistics & Security
timeline: 12 months (complete migration)
role: Tech Lead & Solution Architect
image:
tags:
  - ASP Classic
  - .NET Core
  - SQL Server
  - Migration
  - Tech Lead
  - OSRM
  - APIs
  - Architecture
featured: true
order: 2
date: 2015-06-01
seo_title: ASP 3.0 to .NET Core Migration - Carneiro Tech Case Study
seo_description: Case study of a gradual ASP 3.0 to .NET Core migration with data synchronization and $20k/year savings in API expenses.
seo_keywords: ASP migration, .NET Core, legacy modernization, SQL Server, OSRM, tech lead, routing API
---

Overview

Mission-critical cargo monitoring system for high-value loads (LED TVs worth $600 each, shipments up to 1000 units) using GPS satellite tracking. The application covered the entire lifecycle: from driver registration and evaluation (police background checks) to real-time monitoring and final delivery.

Main challenge: Migrate legacy ASP 3.0 application to .NET Core with zero downtime, maintaining 24/7 critical operations.


Challenge

Critical Legacy System

The company operated a mission-critical system in ASP 3.0 (Classic ASP) that couldn't stop:

Legacy technology:

  • ASP 3.0 (Classic ASP, late-1990s-era technology)
  • SQL Server 2005
  • On-premises failover cluster (perfectly capable of handling the load)
  • Integration with GPS satellite trackers
  • Google Maps API (cost: $20,000/year just for route calculation)

Constraints:

  • 24/7 system operation with high-value cargo
  • No downtime allowed during migration
  • Multiple interdependent modules
  • Team needed to continue developing features during migration

Solution Architecture

Phase 1: Infrastructure Preparation (Months 1-3)

Database Upgrade

SQL Server 2005 → SQL Server 2014
- Full backup and validation
- Stored procedure migration
- Index optimization
- Performance testing
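The backup-and-validate steps above can be sketched in T-SQL (database name and backup path are illustrative, not the originals):

```sql
-- Full backup of the legacy database before the upgrade (illustrative names)
BACKUP DATABASE CargoTracking
TO DISK = 'D:\Backups\CargoTracking_pre2014.bak'
WITH CHECKSUM, INIT;

-- Integrity validation after restoring on SQL Server 2014
DBCC CHECKDB ('CargoTracking') WITH NO_INFOMSGS;

-- Raise the compatibility level so the 2014 engine features apply
ALTER DATABASE CargoTracking SET COMPATIBILITY_LEVEL = 120;
```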

Dual-Write Synchronization Strategy

I implemented a bidirectional synchronization system that allowed:

  1. New modules (.NET Core) wrote to the new database
  2. Automatic trigger synchronized data to the legacy database
  3. Old modules (ASP 3.0) continued working normally
  4. Zero downtime throughout the entire migration

// Synchronization implementation example
public class DualWriteService
{
    private readonly NewDbContext _newDbContext;

    public DualWriteService(NewDbContext newDbContext)
    {
        _newDbContext = newDbContext;
    }

    public async Task SaveDriver(Driver driver)
    {
        // Write to the new database (.NET Core / EF Core)
        await _newDbContext.Drivers.AddAsync(driver);
        await _newDbContext.SaveChangesAsync();

        // A SQL trigger on the new table automatically syncs the row
        // to the legacy database, so ASP 3.0 modules keep working.
    }
}
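The database-side half of the dual-write was the trigger. A minimal sketch of what such a trigger looks like, assuming a cross-database setup on the same SQL Server instance (all table, column, and database names here are hypothetical):

```sql
-- Illustrative AFTER INSERT trigger on the new database's Drivers table.
-- Copies each inserted row to the legacy database so ASP 3.0 modules
-- keep reading current data without any code changes.
CREATE TRIGGER trg_Drivers_SyncToLegacy
ON dbo.Drivers
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO LegacyDb.dbo.Drivers (DriverId, Name, LicenseNumber)
    SELECT DriverId, Name, LicenseNumber
    FROM inserted;
END;
```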

Why this approach?

  • Enabled module-by-module migration
  • Team could continue developing
  • Simple rollback if needed
  • Reduced operational risk

Phase 2: Gradual Module Migration (Months 4-12)

I migrated modules in increasing complexity order:

Migration sequence:

  1. Basic registrations (drivers, vehicles)
  2. Risk assessment (police database integration)
  3. Cargo and route management
  4. Real-time GPS monitoring
  5. Alerts and notifications
  6. Reports and analytics

Migrated application stack:

  • .NET Core 1.0 (adopted from pre-release builds; 1.0 shipped in mid-2016)
  • Entity Framework Core
  • SignalR for real-time monitoring
  • SQL Server 2014
  • RESTful APIs

Phase 3: Cost Reduction with OSRM ($20k/year Savings)

Problem: Prohibitive Google Maps Cost

The company spent $20,000/year just on Google Maps Directions API for truck route calculation.

Solution: OSRM (Open Source Routing Machine)

I implemented a solution based on OSRM (open-source routing engine):

Solution architecture:

┌─────────────────┐
│  Frontend       │
│  (Leaflet.js)   │
└────────┬────────┘
         │
         ▼
┌─────────────────┐      ┌───────────────┐
│  API Wrapper    │─────▶│  OSRM Server  │
│  (.NET Core)    │      │  (self-hosted)│
└────────┬────────┘      └───────────────┘
         │
         ▼
┌─────────────────┐
│  Google Maps    │
│  (display only) │
└─────────────────┘

Implementation:

  1. OSRM server self-hosted on our own infrastructure
  2. User-friendly API wrapper in .NET Core that:
    • Received origin/destination
    • Queried OSRM (free)
    • Returned all route points
    • Formatted for frontend
  3. Frontend drew the route on Google Maps (visualization only, no routing API)

[HttpGet("route")]
public async Task<IActionResult> GetRoute(double originLat, double originLng,
                                           double destLat, double destLng)
{
    // Query OSRM (free)
    var osrmResponse = await _osrmClient.GetRouteAsync(
        originLat, originLng, destLat, destLng);

    // Return formatted points for frontend
    return Ok(new {
        points = osrmResponse.Routes[0].Geometry.Coordinates,
        distance = osrmResponse.Routes[0].Distance,
        duration = osrmResponse.Routes[0].Duration
    });
}
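Under the hood, OSRM's HTTP route API takes coordinates in lon,lat order (the reverse of the usual lat,lng), joined by semicolons. A small helper showing the request URL the wrapper would build (base URL and parameter choices are illustrative):

```javascript
// Build an OSRM /route/v1 request URL. OSRM expects lon,lat pairs,
// which is the opposite of the common lat,lng convention.
function osrmRouteUrl(baseUrl, originLat, originLng, destLat, destLng) {
  const coords = `${originLng},${originLat};${destLng},${destLat}`;
  // overview=full + geometries=geojson returns every route point,
  // ready to hand to the frontend for drawing.
  return `${baseUrl}/route/v1/driving/${coords}?overview=full&geometries=geojson`;
}
```

A self-hosted OSRM instance answers these requests with no per-call cost, which is where the savings came from.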

Frontend with Leaflet:

// Draw route on map (Google Maps only for tiles)
L.polyline(routePoints, {color: 'red'}).addTo(map);
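One gotcha worth noting here: OSRM's GeoJSON geometry stores points as [lng, lat], while Leaflet's L.polyline expects [lat, lng]. A small conversion helper (a sketch; names are illustrative):

```javascript
// Swap GeoJSON [lng, lat] pairs into Leaflet's [lat, lng] order.
function toLatLngs(geoJsonCoords) {
  return geoJsonCoords.map(([lng, lat]) => [lat, lng]);
}

// Usage with the route returned by the wrapper API:
// L.polyline(toLatLngs(route.points), { color: 'red' }).addTo(map);
```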

OpenStreetMap Attempt

I tried to also replace Google Maps (tiles) with OpenStreetMap, which worked technically, but:

  • Users didn't like the appearance
  • They preferred the familiar Google Maps interface

Decision: Keep Google Maps for visualization only (no routing API cost)

Result: Savings of ~$20,000/year while maintaining route quality.


Results & Impact

Complete Migration in 12 Months

  • 100% of modules migrated from ASP 3.0 to .NET Core
  • Zero downtime throughout the entire migration
  • Team remained productive throughout the process
  • Faster and more scalable system

Cost Reduction

  • 💰 $20,000/year saved by replacing the Google Maps Directions API
  • 📉 Optimized infrastructure with SQL Server 2014

Technical Improvements

  • 🚀 Performance: .NET Core application 3x faster than ASP 3.0
  • 🔒 Security: Modern stack with active security patches
  • 🛠️ Maintainability: Modern C# code vs legacy VBScript
  • 📊 Monitoring: SignalR for more efficient real-time tracking


Unexecuted Phase: Microservices & Cloud

Initial Planning

I participated in the design and conception of the second phase (never executed):

Planned architecture:

  • Migration to Azure (cloud was just starting in 2015)
  • Break into microservices:
    • Authentication service
    • GPS/tracking service
    • Routing service
    • Notification service
  • Event-driven architecture with message queues

Why it wasn't executed:

I left the company right after completing the .NET Core migration. The second phase was planned but not implemented by me.


Tech Stack

ASP 3.0 · VBScript · .NET Core 1.0 · C# · Entity Framework Core · SQL Server 2005 · SQL Server 2014 · OSRM · Leaflet.js · Google Maps · SignalR · REST APIs · GPS/Satellite · Migration Strategy · Dual-Write Pattern


Key Decisions & Trade-offs

Why dual-write synchronization?

Alternatives considered:

  1. Big Bang migration (too risky)
  2. Keep everything in ASP 3.0 (unsustainable)
  3. Gradual migration with sync (chosen)

Rationale:

  • Critical system couldn't stop
  • Enabled module-by-module rollback
  • Team remained productive

Why OSRM instead of others?

Alternatives:

  • Google Maps: $20k/year
  • Mapbox: Paid license
  • GraphHopper: Complex setup
  • OSRM: Open-source, fast, configurable

Why not OpenStreetMap for tiles?

UX-based decision:

  • Technically worked perfectly
  • Users preferred familiar Google interface
  • Compromise: Google Maps for visualization (free) + OSRM for routing (free)

Lessons Learned

1. Gradual Migration > Big Bang

Migrating module by module with synchronization enabled:

  • Continuous learning
  • Route adjustments during the process
  • Team and stakeholder confidence

2. Open Source Can Save a Lot

OSRM saved $20k/year without quality loss. But requires:

  • Expertise to configure
  • Own infrastructure
  • Continuous maintenance

3. UX > Technology Sometimes

OpenStreetMap was technically superior (free), but users preferred Google Maps. Lesson: Listen to end users.

4. Plan for Cloud, but Validate ROI

In 2015, cloud was just starting. On-premises infrastructure (SQL Server cluster) was perfectly capable. Don't force cloud if there's no clear benefit.


Context: Why 2015 Was a Special Moment?

State of technology in 2015:

  • ☁️ Cloud in early stages: AWS existed, Azure growing, but low corporate adoption
  • 🆕 .NET Core 1.0 launched in June 2016 (we used RC during the project)
  • 📱 Microservices: New concept, Docker in early adoption
  • 🗺️ Google Maps dominant: Paid APIs, few mature open-source alternatives

Challenges of the time:

  • Non-existent ASP→.NET migration tools
  • Scarce .NET Core documentation (version 1.0!)
  • Architecture patterns still consolidating

This project was pioneering in adopting .NET Core right at the beginning, when most were migrating to .NET Framework 4.x.


Result: Successful migration of 24/7 critical system, $20k/year savings, and solid foundation for future evolution.

Want to discuss a similar migration? Get in touch