
ASP.NET Web Forms application migration FAQ

Nadezhda Petrova
28 Oct 2021
5 min read


While your ASP.NET Web Forms applications will remain functional for as long as you have the talent to support them, they will, like Classic ASP applications, grow increasingly brittle and outdated over the years.

Broadly, there are three ways forward:

  • Rewrite the application from scratch using MVC or Blazor – a very time-consuming task.
  • Rewrite the application while maintaining the ASP.NET Web Forms version – building a new version in parallel with maintaining the old one proves very expensive.
  • Modernize the ASP.NET Web Forms application – by replacing Web Forms with a modern UI framework while keeping the business logic intact, the development team can incrementally move to .NET Core, keeping the lights on in the Web Forms application while modernizing the system.

Classic ASP is part of Internet Information Services (IIS), which is a component of Windows Server. This means it has the same support lifecycle as the underlying OS. That’s up to 2027 on Windows Server 2016 and 2029 on Server 2019. However, there’s been no active development on Classic ASP since 2000, when version 3 was last released. This means that, while supported, the framework has long been considered a legacy technology.

Technology life cycle, project duration, impact on the business, technical skillset, and engineering capacity are some of the primary risk assessment areas that you should consider.

Modern tech stack choices may create special considerations for the data layer, where the new system must run against the same database and schema version. For example, suppose your legacy Web Forms application uses Entity Framework or another enterprise-quality ORM. In that case, it may be easier to preserve the data layer by replacing the ORM with a version compatible with modern .NET, such as Entity Framework Core.
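As a rough sketch of that approach, an EF Core `DbContext` can be mapped onto the existing legacy schema so both applications share one database during the transition. The entity, table, and column names below are hypothetical stand-ins, not part of the original article:

```csharp
using Microsoft.EntityFrameworkCore;

// Hypothetical domain entity carried over unchanged from the legacy model.
public class Order
{
    public int Id { get; set; }
    public string CustomerName { get; set; }
    public decimal Total { get; set; }
}

// EF Core DbContext mapped onto the existing database schema, so the
// legacy Web Forms app and the new app can run against the same database.
public class StoreContext : DbContext
{
    public StoreContext(DbContextOptions<StoreContext> options) : base(options) { }

    public DbSet<Order> Orders { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Map to the assumed legacy table/column names rather than EF Core defaults.
        modelBuilder.Entity<Order>(e =>
        {
            e.ToTable("tblOrders");                            // assumed legacy table name
            e.Property(o => o.Total).HasColumnName("OrderTotal"); // assumed legacy column name
        });
    }
}
```

Keeping the mapping explicit like this lets the new code adopt clean .NET naming conventions without touching the legacy schema.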

Your business layer (also called Application Core) should generally be divided into multiple sub-areas:

  • Entities – plain model classes and data containers representing your domain entities
  • Business services – your domain-specific business logic classes
  • Interfaces – the public API facade of your application core, facilitating decoupled system components through dependency injection
  • Other system objects – domain events, exception classes, aggregates, value objects – all players in your domain model and required artifacts in an enterprise service architecture
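A minimal sketch of how these sub-areas fit together might look like the following; the invoice domain, interface names, and repository abstraction are all illustrative assumptions:

```csharp
// Entities: a plain data container for a domain concept (hypothetical example).
public class Invoice
{
    public int Id { get; set; }
    public decimal Amount { get; set; }
    public bool IsPaid { get; set; }
}

// Interfaces: the public API facade of the Application Core,
// consumed by the UI layer through dependency injection.
public interface IInvoiceService
{
    Invoice GetById(int id);
    void MarkAsPaid(int id);
}

// Assumed data-access abstraction, also part of the interfaces sub-area.
public interface IInvoiceRepository
{
    Invoice Find(int id);
    void Save(Invoice invoice);
}

// Business services: domain-specific logic behind the interface.
public class InvoiceService : IInvoiceService
{
    private readonly IInvoiceRepository _repository;

    public InvoiceService(IInvoiceRepository repository) => _repository = repository;

    public Invoice GetById(int id) => _repository.Find(id);

    public void MarkAsPaid(int id)
    {
        var invoice = _repository.Find(id);
        invoice.IsPaid = true;
        _repository.Save(invoice);
    }
}
```

Because the UI layer only sees `IInvoiceService`, the service implementation can move to .NET Core without any change rippling up to the presentation code.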

When mapping between legacy and modern .NET versions, the official .NET Standard compatibility table can help you choose the correct .NET Standard version to target. Start migrating from the bottom of your dependency tree (usually a project named Entities, Model, or BLL) and work your way up through the dependency graph. Migrating projects between .NET Framework and .NET Standard is sometimes trivial and other times hard, depending on the framework APIs and NuGet packages you are using.

Most modernization projects turn into major UI overhaul initiatives, introducing modern user interaction paradigms like touch and voice, multi-device and on-the-go access through a responsive user interface, and better security and encryption. Picking the right technology, tools, and patterns for your user interface is key to achieving success in a world of ever-increasing user expectations for fast, engaging, and intuitive applications. Developers call these “consumer-grade applications” – a term denoting the trend that massively popular consumer apps like Facebook, Twitter, and Google apps have created – shaping user expectations for all app experiences, including business apps. The modern business application faces the exact same user expectations for high-quality, interactive, easy-to-use visual interfaces.

ASP.NET Web Forms is a “server pages” paradigm at its core – it presents fully rendered HTML pages from the web server. When the page loads, users can interact with it – click buttons, expand sections, navigate further into detail pages. Each such change in visual state requires a roundtrip to the server, where the server generates an updated page. This interaction pattern is, by design, bound to suffer from the network overhead of making requests between the browser and the server. Not only is the user waiting, but they also see flickering as the old page is dismissed and the new page is rendered, creating a jarring user experience. The original Web Forms technology alone is insufficient to meet the requirements of rich, modern browser applications.

The Single-Page Application (SPA) paradigm shifted the responsibility for building and updating the web page to the browser with the help of JavaScript APIs and browser capabilities. Frameworks like Angular and React can help you create a clean and maintainable layered architecture for the application front end and help scale the server by offloading more and more of the data processing to the browser.

A recent alternative by Microsoft – Blazor – promises to combine the best of modern JavaScript MVC patterns with the ability to write C# running in the browser for maximum .NET skill reuse. Blazor bootstraps a slimmed-down version of the .NET runtime in the browser process and allows .NET developers to build web interfaces and share code in .NET Standard assemblies – all in their favorite programming language.
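To make the model concrete, here is a minimal sketch of a Blazor component: the click handler runs as C# in the browser, with no JavaScript required for this interaction. The component name and markup are illustrative:

```razor
@* Counter.razor – a minimal Blazor component: the click handler is C#, not JavaScript. *@
<button @onclick="Increment">Clicked @count times</button>

@code {
    private int count;

    private void Increment() => count++;
}
```

State changes made in the `@code` block are re-rendered by Blazor in the browser, without a full-page roundtrip to the server.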

Not all business applications require robust and dynamic client-side user interfaces that change state rapidly in a highly interactive manner. The modern .NET equivalent of the server-pages model is Razor Pages, which provides a simple, elegant, and intuitive programming model for developers who need to build page-focused app scenarios.
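A minimal Razor Pages page model, sketched under the assumption of a hypothetical contact form, shows the page-focused shape of that model – one class, with handlers per HTTP verb:

```csharp
// Pages/Contact.cshtml.cs – page model for a hypothetical contact page.
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.RazorPages;

public class ContactModel : PageModel
{
    [BindProperty]
    public string Message { get; set; }

    // Handles GET: render the page.
    public void OnGet() { }

    // Handles POST: validate the form, then delegate to business code.
    public IActionResult OnPost()
    {
        if (!ModelState.IsValid)
            return Page();

        // ... hand Message off to a business service here ...
        return RedirectToPage("/Thanks"); // assumed confirmation page
    }
}
```

For page-focused scenarios this keeps routing, binding, and handlers together in one place, much closer in spirit to a Web Forms code-behind than a full SPA.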

Making the right decision requires a careful assessment of the core system requirements and dealing with the impact of that choice. The decisions we make in this area affect the rest of the application layers. The frameworks have their strengths and weaknesses, learning curves, ecosystem strength, and varying degrees of complexity (framework complexity and system complexity).

Both Angular and React are considered mature and well supported, but picking the proper framework for your project still requires additional research on framework capabilities, the richness of third-party components, and their long-term roadmap.

Angular, for example, is an all-in-one framework that comes bundled with a component model, routing, forms, and localization in addition to the core MVC framework. On the other hand, React is considered more of a UI library that deals mainly with rendering the UI and tracking changes on the page, where additional application components must be brought in as external dependencies and third-party components from the ecosystem. The level of support components get from the engineering team behind the framework should also impact your choice.

Blazor’s major value proposition is that developers can use C# to create web applications typically coded in JavaScript. Our take on Blazor is that it certainly has its place in the list of viable choices for modern web applications running on .NET, particularly where developers need to share and run the same code on both the client and the server. However, we don’t consider the notion of “C# instead of JavaScript” true. Instead, we can generalize our thoughts as “C# and JavaScript together in the browser,” as Blazor alone cannot replace all cases where JavaScript is required, particularly when applications need modern browser capabilities such as local storage, IndexedDB, web workers and encryption.

We still observe the need to write significant amounts of JavaScript in Blazor applications, and Blazor apps still require the same level of JavaScript code organization and discipline as pure JavaScript SPA frameworks.
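As one illustration of that "C# and JavaScript together" reality, browser capabilities like local storage are typically reached from Blazor through JS interop. The wrapper class below is a hypothetical sketch; `localStorage.setItem`/`getItem` are the standard browser APIs being invoked:

```csharp
using System.Threading.Tasks;
using Microsoft.JSInterop;

// A thin C# wrapper over the browser's localStorage, reached via Blazor JS interop.
public class BrowserStorage
{
    private readonly IJSRuntime _js;

    public BrowserStorage(IJSRuntime js) => _js = js;

    // Calls the browser's localStorage.setItem(key, value) from C#.
    public ValueTask SetAsync(string key, string value) =>
        _js.InvokeVoidAsync("localStorage.setItem", key, value);

    // Calls the browser's localStorage.getItem(key) and marshals the result back.
    public ValueTask<string> GetAsync(string key) =>
        _js.InvokeAsync<string>("localStorage.getItem", key);
}
```

Even in this simple case, the JavaScript side of the boundary still exists – Blazor is orchestrating it rather than eliminating it.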

When a Web Forms application is the subject of modernization, a REST API may be a new system requirement, since the typical Web Forms application has its client and server code tightly coupled through page code-behind files (.aspx.cs files) instead of a decoupled, generic public API. When migrating to a modern .NET application, the need to access data and business logic on the server through a standalone API interface has significant implications for your overall system design. It creates a shift from a page-focused to an endpoint-focused design. Since modern APIs are modeled around the notion of a resource in your domain model that exposes data and operations through a uniform interface, this often constitutes a change of abstraction compared to Web Forms, where you expose functionality local to a specific server page, aggregating multiple entities and actions. This shift from a function-based to a resource-based system design affects the data pathways (how data is exchanged) and the “chattiness” (how frequently data is exchanged) between the client side and the server side of your application.
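The resource-based shape can be sketched as an ASP.NET Core API controller; the `Order` resource and the stub responses here are hypothetical placeholders:

```csharp
using Microsoft.AspNetCore.Mvc;

// Resource-focused design: endpoints are modeled around the Order resource
// (hypothetical), not around functionality local to one server page.
[ApiController]
[Route("api/orders")]
public class OrdersController : ControllerBase
{
    [HttpGet]                 // GET  /api/orders      – list the resource collection
    public IActionResult GetAll() => Ok(new[] { "order-1", "order-2" });

    [HttpGet("{id}")]         // GET  /api/orders/42   – one resource instance
    public IActionResult GetById(int id) => Ok($"order-{id}");

    [HttpPost]                // POST /api/orders      – create a new instance
    public IActionResult Create() =>
        CreatedAtAction(nameof(GetById), new { id = 1 }, null);
}
```

Note how the uniform interface (HTTP verbs against one resource URL) replaces the page-specific event handlers a Web Forms code-behind would have exposed.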

Regardless of whether you use Razor actions or API controller actions, your client application calls into server code and expects the operation result. The principle of thin controllers states that you must keep your controllers thin and simple, limiting their responsibility to user authorization, data validation, sanitization, and formatting. Any business logic, data queries, or modifications must happen within the dedicated business layer (BLL). Thus, API or Razor controllers must inject and call into the required business classes, returning the computation result to the client through JSON results or rendered Razor views. The principle of thin controllers constitutes a significant architectural benefit in your modern .NET application, enabling app developers to quickly refactor, enhance, and extend the client application without introducing change to the domain-specific business layer. Further, decoupling your business functions from your presentation layer facilitates code reuse, preventing the duplication of near-identical data-entry and data-manipulation code we typically see in long-maintained legacy systems.
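A thin controller along these lines might look like the sketch below; the `IOrderService` abstraction and `OrderDto` shape are assumptions standing in for your real BLL types:

```csharp
using Microsoft.AspNetCore.Mvc;

// Assumed business-layer abstraction; the controller never contains
// business logic itself.
public interface IOrderService
{
    OrderDto GetOrder(int id);
}

public record OrderDto(int Id, decimal Total);

[ApiController]
[Route("api/[controller]")]
public class OrdersController : ControllerBase
{
    private readonly IOrderService _orders;

    // The business service is injected rather than constructed here.
    public OrdersController(IOrderService orders) => _orders = orders;

    // Thin controller: validate input, delegate to the BLL, shape the result.
    [HttpGet("{id}")]
    public ActionResult<OrderDto> Get(int id)
    {
        if (id <= 0)
            return BadRequest();          // input validation only

        var order = _orders.GetOrder(id); // all business logic lives in the BLL

        if (order is null)
            return NotFound();

        return Ok(order);                 // formatting the result as JSON
    }
}
```

Everything the controller does is traffic direction; swapping the UI layer later leaves `IOrderService` and its implementation untouched.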

A healthy testing strategy includes a combination of unit tests for the domain model and business services and a set of automated end-to-end tests that verify at least the critical use cases in the system, optimally all use cases.
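For the unit-test side of that strategy, a test against a pure domain calculation might look like this xUnit sketch; `DiscountCalculator` and its 10% rule are hypothetical examples, not from the original article:

```csharp
using Xunit;

// Hypothetical domain logic under test: orders over 100 get a 10% discount.
public class DiscountCalculator
{
    public decimal Apply(decimal total) =>
        total > 100m ? total * 0.9m : total;
}

public class DiscountCalculatorTests
{
    [Fact]
    public void OrdersOver100GetTenPercentDiscount()
    {
        var calculator = new DiscountCalculator();

        var discounted = calculator.Apply(200m);

        Assert.Equal(180m, discounted);
    }

    [Fact]
    public void SmallOrdersAreNotDiscounted()
    {
        var calculator = new DiscountCalculator();

        Assert.Equal(50m, calculator.Apply(50m));
    }
}
```

Because the business services live behind interfaces in the Application Core, tests like these run without any web server or database in the loop.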

For your test projects, you can also choose the runtime experience (CoreCLR or Mono).

We love using NUnit and xUnit for unit testing, and Postman and SoapUI for API testing. For JavaScript frameworks, some of the tools we use include Jasmine and Jest for writing test specs, Protractor for end-to-end tests running in the browser, and Jenkins for tying everything together in nicely orchestrated build, test, and deploy workflows.

With the proliferation of highly independent, loosely coupled components in a microservice architecture, agile teams are increasingly adopting continuous delivery practices like automated testing, automated environment provisioning, source control-based build and deployment, quality gates, deployment promotion, and zero downtime deployment in production.

Combining these practices, collectively termed Continuous Integration and Delivery (CI/CD), reduces the cost of deploying and operating a live system and increases the rate of delivering new value to end users. The principles of CI/CD are increasingly considered an essential part of the modern software development life cycle. They lead to a DevOps culture that builds significant deployment and operational efficiency – a positive outcome not just for the software development team but for the entire business.

