
Project Overview

The client had already built a desktop application in Delphi, a Web API 2 layer (to communicate with the desktop application), a customer portal on the ASP.NET MVC framework, and SAP Sybase SQL as the database. The system comprises multiple small modules that are interconnected and tightly coupled. Database connectivity is handled through the NHibernate ORM, and dependency injection is managed with Autofac. There are separate servers for the database, web application, utilities, file service, etc., and the deployment process on each server was manual.


Challenges

The client is a well-known entity in the legal and lender process outsourcing industry, and its system came with its own set of challenges:

  • Multiple small applications tightly coupled to each other.
  • A manual deployment process across multiple servers.
  • Most of the code written on the older .NET Framework.
  • Significant performance issues: slow responses and occasional hang-ups.
  • A need to implement OAuth 2.0 to integrate with various third-party APIs.
  • Data that had to be fetched from both SAP Sybase SQL and third-party providers.
  • Multiple requests to the web server for JavaScript and CSS files.
  • Server-side calls vulnerable to CSRF attacks.
  • Circular dependency issues due to unorganized dependency injection practices.
  • Log files created by home-grown code that was not thread-safe.

Solution

We convinced the client to migrate the legacy application to a modern .NET stack on the latest version of the framework.

  • Message Broker: As the client's application consists of multiple small modules, we decided to implement RabbitMQ as the message broker. By doing so, we achieved loose coupling between these modules, enabling them to communicate with each other in a decoupled manner. For instance, when the customer portal needs to send an email, it publishes a request to the RabbitMQ exchange. The email-sending service, which has subscribed to this exchange, receives the request and initiates the process of sending the email. This decoupled architecture allows for scalability and flexibility, as new modules can be added or existing modules modified without disrupting the overall system (a sketch of this flow appears after this list).
  • CI/CD: Given the client's requirement of deploying the application to multiple different servers, we adopted Azure DevOps as our CI/CD platform. Azure DevOps provided us with a comprehensive set of tools to automate the build, test, and deployment processes. We created a separate pipeline for each server, ensuring that changes pushed to the release branch are automatically built and deployed to the respective server (a sample pipeline sketch appears after this list). This streamlined approach eliminated manual intervention and reduced the chance of human error, resulting in more efficient and reliable deployments.
  • .NET 7 Implementation: To meet the client's needs for a project with messaging functionality and a login screen, we developed a backend minimal API using .NET 7. This version of .NET offers various enhancements that facilitate rapid development and improved performance. Leveraging .NET 7's capabilities, we designed an API that efficiently handled messaging tasks, such as sending and receiving messages, while also incorporating a user-friendly login screen (a trimmed sketch appears after this list). To complement the backend, we used Blazor WebAssembly for the frontend, allowing for a responsive and interactive user interface.
  • Upgrading to the Latest .NET Framework Version: Recognizing the importance of maintaining a supported and secure codebase, we upgraded the existing code from .NET Framework 4.5.2 to the latest supported version, .NET Framework 4.8. This ensured that the client's application benefited from the latest security patches, performance improvements, and compatibility with other components. The upgrade involved a meticulous process of updating dependent NuGet packages to compatible versions, making the necessary code changes, and conducting comprehensive testing to guarantee the stability and reliability of the system, allowing the client to stay up to date with current industry standards.
  • Implementation of Dapper: In areas of the application where efficiently reading lists of records from the database was critical, we replaced the existing NHibernate ORM with Dapper. Dapper, a lightweight micro-ORM, offers superior performance for such scenarios thanks to its optimized data mapping and minimal overhead. By implementing Dapper, we improved the response time of select operations, enhancing the overall performance of the application, particularly in areas involving large datasets (a sketch of the pattern appears after this list).
  • Third-party API Integration with OAuth 2.0 Mechanism: The client's requirement of displaying combined data from a legacy database and an external API necessitated integrating a third-party API secured with OAuth 2.0. We developed an integration that merges the results obtained from the legacy database with the API response, and implemented pagination and filtering so users can efficiently navigate through the combined data. As OAuth 2.0 requires periodically generating access tokens from a refresh token, we designed a reliable mechanism using generics and delegates: whenever an error such as an expired token was encountered during API communication, the system automatically fetched an updated token using the refresh token and seamlessly resumed the original request (a sketch of this wrapper appears after this list). This provided a secure and efficient integration with the third-party API, ensuring a smooth user experience and reliable data retrieval.
  • Improved Front-end Page Load Performance: Recognizing the importance of optimal page load times for a seamless user experience, we focused on several strategies to improve front-end performance:
    • Minifying JavaScript Files: We analyzed and optimized the JavaScript files used in the application, ensuring they were minimized to their essential code. By utilizing minified versions of these files, such as jquery.min.js instead of the regular jquery.js, we reduced file sizes and improved loading times.
    • Leveraging Content Delivery Networks (CDNs): We identified specific JavaScript libraries that were commonly used across the application and replaced local file references with their corresponding versions hosted on reputable CDNs. This approach allowed us to leverage the distributed infrastructure of CDNs, resulting in faster and more reliable delivery of these files to end-users.
    • Script Bundling: To further optimize the loading of common scripts used across multiple pages, we implemented script bundling. By combining and compressing these scripts into a single file, we reduced the number of HTTP requests required and minimized the overall file size (an MVC bundling sketch appears after this list). This approach contributed to improved load times, especially for repeat visits where the bundled script could be cached by the browser.
    • Optimized Script Code: We reviewed and refactored script code in critical sections, identifying opportunities to enhance performance. This included optimizing loops, reducing unnecessary DOM manipulations, and improving asynchronous operations where applicable. These optimizations aimed to minimize execution time and improve the responsiveness of the user interface.
    • Asynchronous Functionality: Wherever feasible, we converted synchronous functions to asynchronous counterparts. This allowed the application to handle time-consuming operations, such as API calls or database queries, without blocking the main thread. By leveraging asynchronous programming, we achieved better concurrency and responsiveness, resulting in an overall improved user experience.
    By implementing these front-end performance optimizations, we significantly reduced page load times, ensuring that users could access and interact with the application more quickly and efficiently.
  • Antiforgery Token Mechanism: To enhance the security of the application and prevent Cross-Site Request Forgery (CSRF) attacks, we incorporated the built-in antiforgery mechanism provided by the .NET Framework. This mechanism generates a unique token for each request and validates it upon submission. By including this token in forms or AJAX requests, the server can verify the authenticity of the request, mitigating the risk of unauthorized actions (the standard usage is sketched after this list). This implementation effectively protects the application and its users from potential CSRF vulnerabilities.
  • Serilog Implementation: To improve the logging capabilities of the application, we replaced the hand-rolled log file implementation with Serilog, a popular third-party logging library. Serilog offers a wide range of features, including various sinks for storing logs in different formats and destinations, easily configurable log levels, and comprehensive support for log rotation and retention policies. Serilog also handles concurrency automatically, ensuring that every log line is correctly recorded in the designated log files (a configuration sketch appears after this list). This made logging more efficient and reliable for monitoring and debugging purposes.
  • Azure KeyVault: Recognizing the importance of secure storage for sensitive information, we integrated Azure KeyVault into the application. Azure KeyVault provides a centralized and highly secure platform for storing and managing secrets such as API keys, database connection strings, and cryptographic keys. By using Azure KeyVault, we removed the need to store secrets directly in configuration files, reducing the risk of unauthorized access; instead, the application retrieves the required secrets securely from Azure KeyVault at runtime, ensuring a higher level of security and compliance with best practices (a retrieval sketch appears after this list).
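
The sketches below illustrate several of the techniques above. They are minimal, hedged examples: all exchange, queue, route, table, and secret names are illustrative assumptions, not the client's actual code.

First, the RabbitMQ publish/subscribe flow from the Message Broker item, using the RabbitMQ.Client library (connection setup via ConnectionFactory is omitted; an open IModel channel is assumed):

```csharp
using System.Text.Json;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

public record SendEmailRequest(string To, string Subject, string Body);

public static class EmailMessaging
{
    // Illustrative names; the real exchange/queue names belong to the client's system.
    const string Exchange = "notifications";
    const string Queue = "email-service";

    // Publisher side: the customer portal publishes an email request and moves on.
    public static void PublishEmailRequest(IModel channel, SendEmailRequest request)
    {
        channel.ExchangeDeclare(Exchange, ExchangeType.Fanout, durable: true);
        var body = JsonSerializer.SerializeToUtf8Bytes(request);
        channel.BasicPublish(Exchange, routingKey: "", basicProperties: null, body: body);
    }

    // Consumer side: the email service subscribes to the exchange and handles
    // requests as they arrive, acknowledging each one on success.
    public static void StartEmailConsumer(IModel channel)
    {
        channel.ExchangeDeclare(Exchange, ExchangeType.Fanout, durable: true);
        channel.QueueDeclare(Queue, durable: true, exclusive: false, autoDelete: false);
        channel.QueueBind(Queue, Exchange, routingKey: "");

        var consumer = new EventingBasicConsumer(channel);
        consumer.Received += (_, ea) =>
        {
            var request = JsonSerializer.Deserialize<SendEmailRequest>(ea.Body.Span);
            // ... hand the request to the SMTP sender here ...
            channel.BasicAck(ea.DeliveryTag, multiple: false);
        };
        channel.BasicConsume(Queue, autoAck: false, consumer);
    }
}
```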
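
For the CI/CD item, each server had its own pipeline along these lines (a generic azure-pipelines.yml sketch; the trigger branch, variables, and deployment target are assumptions, not the client's actual configuration):

```yaml
trigger:
  branches:
    include:
      - release        # pushes to the release branch trigger build + deploy

pool:
  vmImage: 'windows-latest'

steps:
  - task: NuGetCommand@2                 # restore packages
    inputs:
      restoreSolution: '**/*.sln'

  - task: VSBuild@1                      # build the solution
    inputs:
      solution: '**/*.sln'
      configuration: 'Release'

  - task: VSTest@2                       # run the test assemblies
    inputs:
      testAssemblyVer2: '**/*Tests.dll'

  # One pipeline per server; this deployment step targets that server's site.
  - task: AzureWebApp@1
    inputs:
      azureSubscription: '$(serviceConnection)'   # illustrative variable
      appName: '$(webAppName)'                    # illustrative variable
      package: '$(Build.ArtifactStagingDirectory)/**/*.zip'
```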
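
The .NET 7 minimal API can be sketched as a trimmed Program.cs (assuming the standard Microsoft.NET.Sdk.Web project with implicit usings; routes and payload shapes are illustrative):

```csharp
// Program.cs — assumes the Microsoft.NET.Sdk.Web SDK with implicit usings enabled.
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Illustrative endpoints; the real routes and payloads belong to the client's system.
app.MapPost("/login", (LoginRequest login) =>
    login is { UserName.Length: > 0, Password.Length: > 0 }
        ? Results.Ok(new { token = "..." })        // issue a real token here
        : Results.Unauthorized());

app.MapGet("/messages/{userId:int}", (int userId) =>
    Results.Ok(Array.Empty<Message>()));           // fetch the user's messages here

app.MapPost("/messages", (Message message) =>
    Results.Created($"/messages/{message.Id}", message));

app.Run();

public record LoginRequest(string UserName, string Password);
public record Message(int Id, int FromUserId, int ToUserId, string Text);
```

The Blazor WebAssembly frontend calls these endpoints over HttpClient.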
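
The Dapper pattern for the read-heavy list screens looks essentially like this (table, column, and type names are illustrative; IDbConnection keeps the sketch provider-agnostic, since the client targets SAP Sybase SQL):

```csharp
using System.Collections.Generic;
using System.Data;
using Dapper;

public record CaseSummary(int Id, string Reference, string Status);

public static class CaseRepository
{
    // Dapper maps the result set straight onto the record with far less
    // overhead than a full ORM session, which is why it was chosen for
    // read-heavy list screens.
    public static IEnumerable<CaseSummary> GetCasesByStatus(IDbConnection connection, string status)
    {
        const string sql = @"SELECT Id, Reference, Status
                             FROM Cases
                             WHERE Status = @Status";
        return connection.Query<CaseSummary>(sql, new { Status = status });
    }
}
```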
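
The generics-and-delegates token refresh from the OAuth 2.0 item can be reconstructed roughly as below (a hedged sketch: the delegate shapes and the 401 check are assumptions about the described design):

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public class TokenRefreshingClient
{
    private string _accessToken;
    // Exchanges the stored refresh token for a fresh access token.
    private readonly Func<Task<string>> _refreshAccessToken;

    public TokenRefreshingClient(string accessToken, Func<Task<string>> refreshAccessToken)
    {
        _accessToken = accessToken;
        _refreshAccessToken = refreshAccessToken;
    }

    // Generic wrapper: run any API call with the current token; on a 401
    // (expired token), refresh once and transparently retry the original call.
    public async Task<T> ExecuteAsync<T>(Func<string, Task<T>> apiCall)
    {
        try
        {
            return await apiCall(_accessToken);
        }
        catch (HttpRequestException ex) when (ex.StatusCode == HttpStatusCode.Unauthorized)
        {
            _accessToken = await _refreshAccessToken();
            return await apiCall(_accessToken);
        }
    }
}
```

A caller wraps any request, e.g. `await client.ExecuteAsync(token => FetchOrdersAsync(token, page: 1))`, where FetchOrdersAsync is a hypothetical API call.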
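
Script bundling in classic ASP.NET MVC terms (System.Web.Optimization; the bundle names and file paths are placeholders):

```csharp
using System.Web.Optimization;

public static class BundleConfig
{
    // Called from Application_Start: BundleConfig.RegisterBundles(BundleTable.Bundles);
    public static void RegisterBundles(BundleCollection bundles)
    {
        // One bundle per group of shared scripts: the browser makes a single
        // request and caches the combined, minified file.
        bundles.Add(new ScriptBundle("~/bundles/common").Include(
            "~/Scripts/jquery-{version}.js",
            "~/Scripts/site.js"));

        bundles.Add(new StyleBundle("~/Content/css").Include(
            "~/Content/site.css"));

        // ASP.NET serves minified bundles automatically in Release builds;
        // this forces optimization on in every configuration.
        BundleTable.EnableOptimizations = true;
    }
}
```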
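
The antiforgery mechanism is the framework's standard one; in ASP.NET MVC it is wired up as follows (the controller and model are illustrative):

```csharp
using System.Web.Mvc;

public class ProfileController : Controller
{
    // The Razor view renders a hidden token with @Html.AntiForgeryToken()
    // inside the <form>; the attribute below validates that token on postback
    // and rejects requests whose token is missing or does not match the cookie.
    [HttpPost]
    [ValidateAntiForgeryToken]
    public ActionResult Update(ProfileModel model)
    {
        if (!ModelState.IsValid)
            return View(model);

        // ... persist the changes ...
        return RedirectToAction("Index");
    }
}

public class ProfileModel
{
    public string DisplayName { get; set; }
}
```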
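
A Serilog configuration along the lines described (assuming the Console and File sink packages; the path and retention values are illustrative):

```csharp
using Serilog;

public static class LoggingConfig
{
    public static void Configure()
    {
        // Serilog's file sink is thread-safe and handles rolling and retention,
        // replacing the hand-rolled, non-thread-safe log writer.
        Log.Logger = new LoggerConfiguration()
            .MinimumLevel.Information()
            .WriteTo.Console()
            .WriteTo.File(
                path: "logs/app-.log",                  // rolls to app-20240101.log, etc.
                rollingInterval: RollingInterval.Day,
                retainedFileCountLimit: 31)
            .CreateLogger();

        Log.Information("Logging initialised");
    }
}
```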
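
And secret retrieval through Azure KeyVault with the Azure SDK (Azure.Security.KeyVault.Secrets; the vault URL and secret name are placeholders):

```csharp
using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

public static class SecretStore
{
    // Secrets live in Key Vault instead of config files; the app authenticates
    // via DefaultAzureCredential (managed identity in production, developer
    // credentials locally) and reads secrets at runtime.
    public static string GetConnectionString()
    {
        var client = new SecretClient(
            new Uri("https://my-vault.vault.azure.net/"),     // placeholder vault URL
            new DefaultAzureCredential());

        KeyVaultSecret secret = client.GetSecret("Db-ConnectionString"); // placeholder name
        return secret.Value;
    }
}
```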

Legal and Lender Process Outsourcing (REST API)

[architecture diagram]

Legal and Lender Process Outsourcing (Customer Portal)

[architecture diagram]

Technology we used

The goal was to migrate a legacy desktop-based application to .NET 7. Variance Infotech evaluated various technology options and chose .NET 7 and SQL Server as the base technologies. We used RabbitMQ and Action Step to implement a publish-subscribe pattern.


Frequently Asked Questions

  • Why did the Legal Agency decide to migrate their Legacy/Desktop application to .NET 7?

    The decision to migrate to .NET 7 was driven by the Legal Agency's aim to modernise their technology stack. .NET 7 provides advanced features, improved performance, and robust support, aligning with the agency's commitment to streamlined legal and lender process outsourcing.

  • How does the .NET 7 migration contribute to the services offered by the Legal Agency?

    The migration to .NET 7 enhances the capabilities of the Legal Agency in providing services such as Digital Mortgage Documentation and Settlements, Mortgagee in Possession Services, Mortgage Debt Recovery, Online Search Services, and Legal Agent Services. It enables a more efficient and modernised service delivery.

  • What specific functionalities and features of .NET 7 are leveraged in the migration?

    The migration leverages specific functionalities and features of .NET 7, including performance optimizations, new language features, and compatibility improvements. These elements contribute to a more robust and future-ready application for the Legal Agency.

  • How is the migration process planned and executed for the Legal Agency's Legacy/Desktop application?

    The migration process is meticulously planned and executed, involving steps such as code refactoring, compatibility testing, and performance optimization. The goal is to ensure a smooth transition with minimal disruptions to the agency's legal and lender process outsourcing activities.

  • Are there any considerations for legal compliance during the migration?

    Yes, the migration process includes considerations for legal compliance to uphold the agency's commitment to regulatory requirements. This encompasses data security, privacy, and any legal standards relevant to the services provided by the agency.
