WCF Data Services ships with two built-in query providers: an Entity Framework provider that uses the conceptual model (CSDL) to infer all the service metadata for the exposed entities and their associations, and a Reflection Provider that uses .NET reflection over the exposed object model to infer the same metadata.
The Entity Framework provider is the one most people use because of its simplicity: you don't need to do much to get a data service up and running. You only need to define an Entity Framework model and expose it in the data service using the DataService<T> class.
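As a sketch of how little is required, assuming a generated ObjectContext named NorthwindEntities (a hypothetical model name), the whole service can be as small as this:

```csharp
using System.Data.Services;
using System.Data.Services.Common;

// Minimal EF-backed data service; NorthwindEntities is a hypothetical
// Entity Framework ObjectContext generated from an .edmx model.
public class NorthwindDataService : DataService<NorthwindEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        // Wide-open read access for illustration only; lock this down in production.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
        config.DataServiceBehavior.MaxProtocolVersion = DataServiceProtocolVersion.V2;
    }
}
```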
The Reflection Provider, on the other hand, requires more work, as you also need to implement the IUpdatable interface if you want to make your model read/write (otherwise, it's read-only by default).
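A minimal Reflection Provider context is just a class exposing IQueryable<T> properties; each entity type needs a key, declared here with the DataServiceKey attribute (the Customer type and the in-memory list are hypothetical):

```csharp
using System.Collections.Generic;
using System.Data.Services;
using System.Data.Services.Common;
using System.Linq;

[DataServiceKey("Id")]
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Any public IQueryable<T> property on the context becomes an entity set.
public class InMemoryContext
{
    private static readonly List<Customer> customers = new List<Customer>();

    public IQueryable<Customer> Customers
    {
        get { return customers.AsQueryable(); }
    }
}

public class CustomerDataService : DataService<InMemoryContext>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("Customers", EntitySetRights.AllRead);
    }
}
```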
While the Entity Framework provider is simple to use, the resources you expose in the data service get tied to all the limitations of an Entity Framework model (for instance, you might have entities or properties you don't really want to persist in a database). This provider also supplies its own IUpdatable implementation, which cannot be customized, extended, or replaced to add business logic to the data service. And although you can use interceptors, I find that technique very limited, as interceptors are aspects applied to a single entity set; there is no way to inject cross-cutting aspects that affect all the entities (entity-based authorization, for instance). You can use the data service processing pipeline to inject that logic, but you don't have entities at that point, only the messages on the wire (Atom feeds or JSON messages). A technique I used in the past was to extend the EF entities with partial classes to add business logic, and to attach the data service to the Entity Framework's SavingChanges event to run some logic before entities were created, updated, or deleted. However, I still don't like this technique much, because you end up with a model completely limited to what you can define in EF.
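To make that limitation concrete, this is roughly what the interceptor technique looks like when added to the service sketched earlier; the filter is bound to one entity set, so a rule like this has to be duplicated for every set (the Order type, its Customer.ContactName property, and the user lookup are hypothetical):

```csharp
using System;
using System.Data.Services;
using System.Linq.Expressions;
using System.Web;

public class NorthwindDataService : DataService<NorthwindEntities>
{
    // Applies only to the "Orders" entity set; every other set needs its
    // own interceptor, which is why this does not scale to cross-cutting
    // concerns such as entity-based authorization.
    [QueryInterceptor("Orders")]
    public Expression<Func<Order, bool>> OnQueryOrders()
    {
        string user = HttpContext.Current.User.Identity.Name;
        return o => o.Customer.ContactName == user;
    }
}
```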
The advantage of using the Reflection Provider is that you can expose a much richer object model in your data service, even if you need to write some more code. In addition, since you are also writing the IUpdatable implementation, you can inject in that class all the business logic or cross-cutting concerns that are common to all the entities. The challenge with the Reflection Provider, however, is making the service implementation efficient enough to resolve queries in the data source rather than in memory. It makes no sense to put a rich object model on top of Entity Framework, for instance, if you still need to load all the objects into memory and run the queries with LINQ to Objects (unless the number of entities you manage is really small). So the only practical way to implement a service that manages a large number of entities and exposes a rich object model at the same time is to use an ORM other than EF, such as LINQ to SQL or NHibernate.
NHibernate, in that sense, is a much more mature framework: you are not tied to a single database implementation (SQL Server), and the number of features it offers is considerably higher, making NHibernate a good technology for implementing data services. In addition, NHibernate 3.0 already ships with a LINQ provider out of the box that works really well (Entity Framework Code Only looks promising too, but it is only a CTP at this point).
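A sketch of what that looks like: the Reflection Provider context hands out IQueryable instances produced by NHibernate's LINQ provider, so OData query options ($filter, $orderby, $top) get translated to SQL instead of being evaluated in memory (Customer and Order are hypothetical mapped entities):

```csharp
using System.Linq;
using NHibernate;
using NHibernate.Linq;

public class NorthwindContext
{
    private readonly ISession session;

    public NorthwindContext(ISession session)
    {
        this.session = session;
    }

    // session.Query<T>() returns an IQueryable that NHibernate 3.0
    // translates to SQL when the data service composes the OData
    // query options on top of it.
    public IQueryable<Customer> Customers
    {
        get { return session.Query<Customer>(); }
    }

    public IQueryable<Order> Orders
    {
        get { return session.Query<Order>(); }
    }
}
```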
In order to use NHibernate in a data service, you need to provide an IUpdatable implementation to make it read/write (supporting POST, PUT, and DELETE). Otherwise it will behave as read-only by default (only GETs supported).
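A minimal sketch of such an implementation over an NHibernate ISession follows; only the core members are fleshed out, the type resolution assumes the entities live in the service assembly, and in a real service the reflection context class itself would typically implement IUpdatable. Note that SaveChanges is the natural seam for the cross-cutting logic mentioned earlier, since every create, update, and delete funnels through it:

```csharp
using System;
using System.Collections.Generic;
using System.Data.Services;
using System.Linq;
using NHibernate;

public class NHibernateUpdatable : IUpdatable
{
    private readonly ISession session;
    private readonly List<object> pendingInserts = new List<object>();

    public NHibernateUpdatable(ISession session)
    {
        this.session = session;
    }

    // POST: instantiate the entity so the runtime can populate it.
    public object CreateResource(string containerName, string fullTypeName)
    {
        // Assumes entity types are resolvable from this assembly.
        Type type = Type.GetType(fullTypeName, true);
        object resource = Activator.CreateInstance(type);
        pendingInserts.Add(resource);
        return resource;
    }

    // PUT/MERGE/DELETE: locate the single entity addressed by the request URI.
    public object GetResource(IQueryable query, string fullTypeName)
    {
        object resource = null;
        foreach (object item in query)
        {
            if (resource != null)
                throw new DataServiceException(500, "Query returned more than one resource");
            resource = item;
        }
        if (resource == null)
            throw new DataServiceException(404, "Resource not found");
        return resource;
    }

    public void SetValue(object targetResource, string propertyName, object propertyValue)
    {
        targetResource.GetType().GetProperty(propertyName)
            .SetValue(targetResource, propertyValue, null);
    }

    public object GetValue(object targetResource, string propertyName)
    {
        return targetResource.GetType().GetProperty(propertyName)
            .GetValue(targetResource, null);
    }

    public void DeleteResource(object targetResource)
    {
        session.Delete(targetResource);
    }

    // Every create, update, and delete funnels through here: the place to
    // run business rules, validation, or authorization for all entities.
    public void SaveChanges()
    {
        foreach (object entity in pendingInserts)
            session.Save(entity);
        pendingInserts.Clear();
        session.Flush();
    }

    public object ResolveResource(object resource) { return resource; }

    public void ClearChanges()
    {
        pendingInserts.Clear();
        session.Clear();
    }

    // Members below are omitted from this sketch.
    public object ResetResource(object resource) { throw new NotImplementedException(); }
    public void SetReference(object targetResource, string propertyName, object propertyValue) { throw new NotImplementedException(); }
    public void AddReferenceToCollection(object targetResource, string propertyName, object resourceToBeAdded) { throw new NotImplementedException(); }
    public void RemoveReferenceFromCollection(object targetResource, string propertyName, object resourceToBeRemoved) { throw new NotImplementedException(); }
}
```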
Read more: Pablo M. Cibraro (aka Cibrax)