Reviewing the structure of a web application I'm working on

There is a practice in Scrum called the retrospective. It is about reviewing the work done in a sprint. I love the idea; I think talking and communication in a software development team are very important. Inspired by the Scrum retrospective, I'd like to review the architecture and design of a project I've recently been involved in. The project is not finished yet, but I think it is a good time to review its structure.

 

The project's back end is implemented with ASP.NET Core MVC and Entity Framework Core and serves both a web API and server-rendered content (ASP.NET MVC). Development is done mostly on Ubuntu, but also on Windows.

(Image credit: https://hakanforss.wordpress.com/2012/04/25/agile-lego-toyota-kata-an-alternative-to-retrospectives/)

Project Structure

While the project is not very big or complex and has about 20 tables in the database, we decided to have four projects: one for domain models, DTOs, and contracts, named Domain; another for the business logic, called Core; another for the web project itself, containing the cshtml views, controllers, and the wwwroot directory, called Web; and another for unit tests, called Test. I agree this is a very common project structure for ASP.NET web applications, but I saw no benefit from it other than a simple categorization that was also achievable with directories. I think it would have been better to have two projects: one for the web (combining domain, core, and web) and another for tests.
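For reference, the solution was split roughly like this (the solution name and the notes next to each project are illustrative; the four project names are the real ones):

    MyApp.sln
    ├── Domain/    domain models, DTOs, service contracts
    ├── Core/      business logic, service implementations
    ├── Web/       controllers, cshtml views, wwwroot
    └── Test/      unit tests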

 

Interfaces

Programming against interfaces is very popular in C#. It has gained more popularity with the wide use of dependency injection in ASP.NET, and ASP.NET Core's built-in dependency injection has increased it further. We followed this convention in our project and created an interface for each service class. However, we have no mocks in our unit tests, so I think using so many interfaces is a bit of over-engineering: no interface in our project has more than one implementation. Having a large number of interfaces just decreased development velocity, because adding each new method needed changes in two places, the service itself and the interface it implements.
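A minimal sketch of the pattern we followed (IOrderService, OrderService, and OrderDto are made-up names for illustration): each service has exactly one interface and is registered through the built-in container.

    using System.Threading.Tasks;

    // Hypothetical DTO living in the Domain project.
    public class OrderDto
    {
        public int Id { get; set; }
    }

    // The contract: one interface per service class.
    public interface IOrderService
    {
        Task<OrderDto> GetAsync(int id);
    }

    // The single implementation of that interface, in the Core project.
    public class OrderService : IOrderService
    {
        public Task<OrderDto> GetAsync(int id)
        {
            // Adding any new method means touching both this class and IOrderService.
            return Task.FromResult(new OrderDto { Id = id });
        }
    }

    // Registration with ASP.NET Core's built-in container, e.g. in Startup.ConfigureServices:
    // services.AddScoped<IOrderService, OrderService>();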

 

Soft Delete

Soft delete means not deleting database records physically, but instead keeping them in the database and setting a field named IsDeleted to true on each deleted record. Soft-deleted records are then not shown or processed by the application. We added this feature so we could track data changes and never really lose any data. For this purpose we could instead have used a logging mechanism: whenever a record is deleted, a log entry is added saying who deleted which data and when. Implementing soft delete imposed many manual data integrity checks on the application: when deleting each record we must check whether any dependent item exists, and if so, prevent the deletion.
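A minimal sketch of how such a flag can be wired up in Entity Framework Core, assuming a hypothetical Customer entity; EF Core's global query filter (HasQueryFilter) then keeps soft-deleted rows out of normal queries automatically.

    using Microsoft.EntityFrameworkCore;

    // Hypothetical entity carrying the soft-delete flag.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public bool IsDeleted { get; set; }
    }

    public class AppDbContext : DbContext
    {
        public DbSet<Customer> Customers { get; set; }

        protected override void OnModelCreating(ModelBuilder modelBuilder)
        {
            // Global query filter: soft-deleted customers never appear in normal queries.
            modelBuilder.Entity<Customer>().HasQueryFilter(c => !c.IsDeleted);
        }
    }

    // "Deleting" a customer then just flips the flag instead of removing the row:
    // customer.IsDeleted = true;
    // await context.SaveChangesAsync();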

 

Authorization

I personally over-engineered authorization. In many cases I checked roles in both controllers and services; my emphasis on the separation of controllers and services was a bit too high. There is no entry point into the app other than the MVC and API controllers (they are the same thing in ASP.NET Core), so checking roles in the controllers is enough.
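A sketch of the simpler approach, checking roles only at the controller boundary with the standard Authorize attribute (the Admin role and OrdersController are illustrative names):

    using Microsoft.AspNetCore.Authorization;
    using Microsoft.AspNetCore.Mvc;

    // Role check at the single entry point; the services behind it trust the caller.
    [Authorize(Roles = "Admin")]
    public class OrdersController : Controller
    {
        [HttpPost]
        public IActionResult Delete(int id)
        {
            // No second role check inside the service layer.
            return Ok();
        }
    }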

 

Using DTOs to access the data layer

Many application designs allow direct access to the database models. For example, a controller action receives a MyModel instance directly and passes it to a DbSet or a service to save it. This is dangerous because the ORM's dirty-checking mechanism may save it to the database by mistake. In this project I used DTOs to pass data to and from the CRUD services, so the controllers are not aware of the database models. It increased the volume of development, but I think it saves us from mysterious data updates in the database.
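A minimal sketch of the idea, reusing the hypothetical Customer entity and AppDbContext from the soft-delete sketch above; the controller only ever sees the DTO, and the service copies the allowed fields onto a tracked entity.

    using System.Threading.Tasks;

    // DTO exposed to controllers; no EF-tracked entity ever leaves the service.
    public class CustomerDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class CustomerService
    {
        private readonly AppDbContext _context;

        public CustomerService(AppDbContext context) => _context = context;

        public async Task UpdateAsync(CustomerDto dto)
        {
            var customer = await _context.Customers.FindAsync(dto.Id);
            if (customer is null) return; // not found (or soft-deleted)

            // Only the fields we explicitly copy can reach the database,
            // so accidental changes on a tracked entity cannot be saved by mistake.
            customer.Name = dto.Name;
            await _context.SaveChangesAsync();
        }
    }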

Update:

The second part of this post can be found here.