Ratko Ćosić - lamentations of one programmer

Tuesday, 28.10.2008.

I'm preparing my first session!

Yes, that's right! Enough with these petty lamentations, I'm going to say something live!
Support me (and don't throw eggs and potatoes at me)!



The title of the session will be: C# Business Objects: Patterns of Application Business Core

The agenda of the session should look something like this:
- Intro,
- New challenges in dev-world,
- Paradigm shift,
- Business Application Core,
- Business objects,
- ....
- CSLA.NET project demo,
- Future of BO: ADO.NET Entity Framework, Oslo
- Summary.

I will publish the complete text and slides here as soon as I survive next Monday, November 3, 2008, at MSCommunity in Cakovec.

Until then...

see what I have actually done!


Saturday, 25.10.2008.

ORM of the Future (ADO.NET Entity Framework)

And now, let me finish the lament. After presenting NHibernate and LINQ, I will try to write something about the ADO.NET Entity Framework. Here it is:

Entity Framework: LINQ to Entities

The king is dead! Long live the king!
Well, LINQ is not exactly dead: the majority of developers today still use the 2.0 and 3.5 frameworks, so it is still a bleeding-edge technology (although the whispers of 4.0 can already be felt in the water). BUT, the future ORM (and more!) is going to be here soon.

The ADO.NET Entity Framework, part of the ADO.NET components of the .NET Framework, is Microsoft's framework for providing services on top of data models. Although an object-relational mapping service is an important part of it, it aims to provide more, such as query, view, and reporting services. It is geared towards solving the mismatch between the format in which data is stored in a database and the format in which it is consumed in an object-oriented programming language or other front ends.

The ADO.NET Entity Framework is included with .NET Framework 3.5 Service Pack 1 and Visual Studio 2008 Service Pack 1, released on August 11, 2008. It also includes the capability of executing LINQ queries against Entity Framework entities.



The architecture of the ADO.NET Entity Framework, from the bottom up, consists of the following:


  • Data source specific providers, which abstract the ADO.NET interfaces used to connect to the database when programming against the conceptual schema.

  • Map provider, a database-specific provider that translates the Entity SQL command tree into a query in the native SQL flavor of the database. It includes the Store specific bridge, which is the component that is responsible for translating the command tree into a store-specific command tree.

  • EDM parser and view mapping, which takes the SDL specification of the data model and how it maps onto the underlying relational model, and enables programming against the conceptual model. From the relational schema, it creates views of the data corresponding to the conceptual model. It aggregates information from multiple tables in order to compose a single entity, and splits an update to an entity into multiple updates to whichever tables contributed to that entity.

  • Query and update pipeline, which processes queries, filters, and update requests, converting them into canonical command trees that are then turned into store-specific queries by the map provider.

  • Metadata services, which handle all metadata related to entities, relationships and mappings.

  • Transactions, to integrate with transactional capabilities of the underlying store. If the underlying store does not support transactions, support for it needs to be implemented at this layer.

  • Conceptual layer API, the runtime that exposes the programming model for coding against the conceptual schema. It follows the ADO.NET pattern of using Connection objects to refer to the map provider, using Command objects to send the query, and returning EntityResultSets or EntitySets containing the result.

  • Disconnected components, which locally cache datasets and entity sets for using the ADO.NET Entity Framework in an occasionally connected environment.

  • Embedded database: the ADO.NET Entity Framework includes a lightweight embedded database for client-side caching and querying of relational data.

  • Design tools, such as the Mapping Designer, are also included with the ADO.NET Entity Framework; they simplify the job of mapping a conceptual schema to the relational schema and specifying which properties of an entity type correspond to which table columns in the database.

  • Programming layers, which expose the EDM as programming constructs that can be consumed by programming languages.

  • Object services, which automatically generate code for CLR classes that expose the same properties as an entity, thus enabling instantiation of entities as .NET objects.

  • Web services, which expose entities as web services.

  • High level services, such as reporting services which work on entities rather than relational data.



An Entity Framework application requires creating a conceptual model defining the entities and relationships, a logical model that represents the underlying relational model, and the mappings between the two. A programmable object model is then generated from the conceptual model.
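Just to give a taste of the programming layer, here is a minimal LINQ to Entities sketch. NorthwindEntities, Customers, City, and CompanyName are hypothetical names of the kind the EDM tools would generate from a database:

using System;
using System.Linq;

class EntityFrameworkDemo
{
    static void Main()
    {
        // NorthwindEntities is a hypothetical ObjectContext generated from an EDM
        using (NorthwindEntities context = new NorthwindEntities())
        {
            var customers = from c in context.Customers   // LINQ to Entities
                            where c.City == "Zagreb"
                            orderby c.CompanyName
                            select c;

            foreach (var customer in customers)
                Console.WriteLine(customer.CompanyName);
        }
    }
}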

Entity Data Model Tools

The Entity Data Model (EDM) is a model for defining data as sets of entities and relationships to which common language runtime (CLR) types and storage structures can be mapped. The EDM enables developers to program against a conceptual data model instead of directly against a storage schema.

There are three tools that are designed to help you graphically build applications with the EDM: the Entity Data Model Wizard, the ADO.NET Entity Data Model Designer (Entity Designer), and the Update Model Wizard. These tools work together to help you generate, edit, and update an EDM, as follows:

Entity Data Model Wizard allows you to generate an EDM from an existing database, add database connection information to the application, and generate C# or Visual Basic classes based on the conceptual model. When the Entity Data Model Wizard finishes generating an EDM, it launches the Entity Designer.

Entity Designer allows you to visually create and modify entities, associations, mappings, and inheritance relationships. You can also validate an EDM with the Entity Designer.

Update Model Wizard allows you to update an EDM when changes have been made to the underlying database. You must launch this tool from within the Entity Designer.

There is also a command-line tool designed to help you build applications with the EDM: the EdmGen.exe tool. This tool can generate an EDM, validate an existing model, produce source code files that contain object classes based on the conceptual model, and produce source code files that contain views generated by the model.

So, what to choose?

Well, I can't answer that. In fact, some of the partners I work with have built their own application architecture on this ground. And they did it quite well. But the question arises: isn't the ADO.NET Entity Framework going to sweep them away soon? Well, I can't answer that either. Maybe they coded it so well, and tailored it so closely to their needs, that they won't need anything else. But from my own experience, I know that most dev companies have no idea what it is or how to use it.

Would they crash and stumble? Nah! But did they pick the right track?

We'll see....


Friday, 24.10.2008.

ORM of the future (LINQ)

Let's continue with the lamentation about the second "ORM of the future":

Language Integrated Query (LINQ)



LINQ is Microsoft's invention, released as part of .NET Framework 3.5 on November 19, 2007.

LINQ defines a set of query operators that can be used to query, project, and filter data in arrays, enumerable classes, XML, relational databases, and third-party data sources; so it's not limited to database storage. While it allows any data source to be queried, it requires that the data be encapsulated as objects. If the data source does not natively store data as objects, the data must be mapped to the object domain.
The results of a query are returned as a collection of in-memory objects that can be enumerated.
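To illustrate, here is a minimal LINQ to Objects query over an in-memory array (the data is made up):

using System;
using System.Linq;

class LinqToObjectsDemo
{
    static void Main()
    {
        string[] cities = { "Zagreb", "Cakovec", "Split", "Zadar" };

        var query = from city in cities            // querying plain in-memory objects
                    where city.StartsWith("Z")
                    orderby city
                    select city.ToUpper();

        foreach (string city in query)
            Console.WriteLine(city);               // ZADAR, ZAGREB
    }
}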

That reminds me of the Repository pattern from M. Fowler and the usage of ActiveRecord. Well, actually, you have this ActiveRecord pattern entangled inside the Castle project's application building blocks. All these solutions mingle with each other, and I admit it's really hard to pin down the real differences.

Expression trees are at the core of the LINQ extensibility mechanism, by which LINQ can be adapted to any data source. The expression trees are handed over to LINQ providers, which are data source-specific implementations that adapt LINQ queries for use with the data source. The LINQ providers analyze the expression trees representing the query ("query trees") and generate a DynamicMethod (a method generated at runtime) by using the reflection APIs to emit CIL code. These methods are executed when the query is run.
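A small sketch of that mechanism: the same lambda can be kept as an inspectable expression tree, and the Compile call is what emits a dynamic method at runtime:

using System;
using System.Linq.Expressions;

class ExpressionTreeDemo
{
    static void Main()
    {
        // the lambda is stored as a data structure, not compiled into a delegate
        Expression<Func<int, bool>> isEven = n => n % 2 == 0;

        Console.WriteLine(isEven.Body);              // ((n % 2) == 0), what a provider analyzes

        Func<int, bool> compiled = isEven.Compile(); // emits a dynamic method at runtime
        Console.WriteLine(compiled(4));              // True
    }
}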

The power of LINQ shows in the LINQ providers shipped for in-memory object collections, SQL Server databases, ADO.NET DataSets, and XML documents. These providers define the different flavors of LINQ: LINQ to Objects, LINQ to XML, LINQ to SQL, and LINQ to DataSets.

The LINQ providers can be implemented by third parties for various data sources as well:
Data Services: LINQ to ADO.NET Data Services
Entity Framework: LINQ to Entities
DbLinq: LINQ to MySQL, PostgreSQL, Oracle, Ingres, SQLite and Microsoft SQL Server
Google search: LINQ to Google
Windows Search: LINQ to System Search
NHibernate: LINQ to NHibernate

In the last post on this subject, I will present the third king: the ADO.NET Entity Framework. Until then...


Thursday, 23.10.2008.

ORM of the future (NHibernate)

"We hire people who build doghouses, give them cranes and ask them to build skyscrapers. Then we're surprised when they fail.
- Eileen Steets Quann

Some may ask why I'm doing these lamentations about such an obscure topic. Well, it's all about sharpening tools for building skyscrapers...

...



NHibernate is an object/relational persistence and query engine adapted for .NET (a port of its predecessor, Hibernate). It lets you develop classes using association, inheritance, polymorphism, composition, and collections. NHibernate allows you to write queries in its own SQL extension (HQL), as well as in native SQL, or with an object-oriented Criteria and Example API. It is an open source solution.

The creators also commonly advocate using TDD (Test Driven Development) with NAnt, and DDD (Domain Driven Design). NHibernate can be used to generate the domain model starting from an existing database schema, but you can also go in the opposite direction: start with a domain model first and let NHibernate generate the database schema from the model (and the mapping metadata).

Since NHibernate is an open source project, anybody has free access to the binaries and/or source code. The source code is stored in an SVN repository (Subversion source control system). The most current source code is called by a funny buzzword: the "trunk".

An instance of an entity in the domain corresponds to a row in a table in the database. We have to define a mapping between the entity and the corresponding table. This mapping can be done either by defining a mapping file (an XML document) or by decorating the entity with attributes (similar to declarative security in .NET).
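To make it concrete, here is a minimal sketch of such an entity in C#. The Customer class and its properties are hypothetical; the members are virtual because NHibernate's lazy-loading proxies need to override them:

using System;

// hypothetical entity; members are virtual so NHibernate's runtime proxies can override them
public class Customer
{
    public virtual Guid Id { get; set; }
    public virtual string Name { get; set; }
    public virtual string City { get; set; }
}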

A mapping file has to be named *.hbm.xml. Please note the "hbm" part of the file name; this is a convention NHibernate uses to automatically recognize the file as a mapping file (the file also has to be built as an "Embedded Resource").

NHibernate doesn't get in your way, in the sense that it defines many reasonable defaults. If you don't provide a column name for a property explicitly, it will name the column after the property. NHibernate can also automatically infer the name of the table or the type of a column from the class definition. As a consequence, my XML mapping file is not "cluttered" with redundant information.

We also have to tell NHibernate which database product we want to use and provide the connection details in the form of a connection string. This is done by adding an XML-based configuration file (hibernate.cfg.xml).
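Once the mapping and configuration are in place, all work goes through a session. A minimal sketch, assuming NHibernate 2.0 and the hypothetical Customer entity from above, showing both an HQL and a Criteria query:

using System;
using System.Collections.Generic;
using NHibernate;
using NHibernate.Cfg;
using NHibernate.Criterion;

class NHibernateDemo
{
    static void Main()
    {
        Configuration cfg = new Configuration();
        cfg.Configure(); // reads hibernate.cfg.xml

        using (ISessionFactory factory = cfg.BuildSessionFactory())
        using (ISession session = factory.OpenSession())
        {
            // HQL: query in terms of the entity, not the table
            IList<Customer> byHql = session
                .CreateQuery("from Customer c where c.City = :city")
                .SetString("city", "Zagreb")
                .List<Customer>();

            // the same query through the object-oriented Criteria API
            IList<Customer> byCriteria = session
                .CreateCriteria(typeof(Customer))
                .Add(Restrictions.Eq("City", "Zagreb"))
                .List<Customer>();

            Console.WriteLine("{0} / {1}", byHql.Count, byCriteria.Count);
        }
    }
}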

To summarize, "NHibernate's primary feature is mapping from .NET classes to database tables (and from CLR data types to SQL data types). NHibernate also provides data query and retrieval facilities. NHibernate generates the SQL commands and relieves the developer from manual data set handling and object conversion, keeping the application portable to most SQL databases, with database portability delivered at very little performance overhead." - from the documentation.

NHibernate is primarily tested on Microsoft SQL Server 2000. It is also known to work on these databases:

- Microsoft SQL Server 2005/2000
- Oracle
- Microsoft Access
- Firebird
- PostgreSQL
- DB2 UDB
- MySQL
- SQLite

NHibernate 2.1 is currently under development and will include a provider for LINQ (see my next post).

Continue with LINQ


Wednesday, 22.10.2008.

ORM of the future - Bring it on!



"A Repository mediates between the domain and data mapping layers, acting like an in-memory domain object collection. Client objects construct query specifications declaratively and submit them to Repository for satisfaction. Objects can be added to and removed from the Repository, as they can from a simple collection of objects, and the mapping code encapsulated by the Repository will carry out the appropriate operations behind the scenes. Conceptually, a Repository encapsulates the set of objects persisted in a data store and the operations performed over them, providing a more object-oriented view of the persistence layer. Repository also supports the objective of achieving a clean separation and one-way dependency between the domain and data mapping layers."
- Martin Fowler (Patterns of Enterprise Application Architecture)


The Repository pattern is a very intriguing one, as it has troubled myriads of developers through the years. I have had some experience implementing it myself, as well as using some proprietary third-party application architecture models.
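In C#, the shape of the pattern boils down to something like this minimal sketch (the interface names are mine, not Fowler's):

using System.Collections.Generic;

// a query specification the client builds declaratively
public interface ISpecification<T>
{
    bool IsSatisfiedBy(T candidate);
}

// the repository looks like an in-memory collection; mapping code hides behind it
public interface IRepository<T>
{
    void Add(T item);
    void Remove(T item);
    IList<T> FindAll(ISpecification<T> specification);
}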

...

The problem with business objects is that they cannot be mapped to data storage objects without our intervention. This issue has troubled software architects for years. Whether developing their own frameworks or adapting their existing architectures to somewhat mysterious solutions, there have been no good answers on how to do it.
Well, we have had some solutions, but personally, somehow they just do not fit into my paradigm of thinking.

Please feel free to visit the blog to see how I'm getting on with the lamentation about the ORMs I consider worth investing in for the future.

I plan to begin with NHibernate, then switch to LINQ, and finish with the ADO.NET Entity Framework, which is a promising child.

Kind regards,
Ratko

You can start following my lament by clicking here!


Monday, 20.10.2008.

New Data Types in SQL Server 2K8

Seven new data types are being built into SQL Server 2008, and they provide the means for working with and simplifying the management of more complicated data.

Frankly, the extensions come in two fields: time and spatial information. I'll present these brave ones in this post...



The Date Data Type

The problem with the old SQL Server datetime data type was that users did not have the ability to work with date and time information separately. The date data type stores only the date component, ranging from January 1, 0001 to December 31, 9999. Each date variable requires only 3 bytes and has a precision of 10 digits. So, the unit is a single day.

The Time Data Type

The time data type separates the time component from the old datetime data type. It deals with hours, minutes, seconds, and fractions of a second. It is based on a 24-hour clock and has a supported range from 00:00:00.0000000 to 23:59:59.9999999 (the dot separates seconds from their fractions). The default precision is 7 digits, but you can adjust it when you create the column. The accuracy is 100 ns.

CREATE TABLE WorkingHours
(
EmployeeID uniqueidentifier,
WorkingDay date,
StartTime time,
EndTime time
)

The Datetimeoffset Data Type

This data type provides time zone information along with the datetime information. The time zone offset is appended to the time part with a plus or minus sign, so one value could look like this: '2008-10-21T08:34:00.1234567+01:00', i.e. October 21, 2008, 08:34 AM in the +1 hour zone (Zagreb, Vienna). Notably, this is the ISO 8601 format, a standard format for date and time data.

The Datetime2 Data Type

Datetime2 provides us with a more precise datetime data type. It ranges from January 1, 0001 (instead of datetime's January 1, 1753) through December 31, 9999. The precision of the time component is the same as in the time data type, i.e. seven fractional second digits. The original datetime type provided three digits of precision and a time range of 00:00:00 through 23:59:59.999.
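On the client side, .NET 3.5 SP1 added matching SqlDbType values, so the new types flow naturally through ADO.NET. A minimal sketch against the WorkingHours table from above (the connection string is assumed):

using System;
using System.Data;
using System.Data.SqlClient;

class DateTypesDemo
{
    static void Main()
    {
        string connectionString = "..."; // a valid SQL Server 2008 connection string

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "INSERT INTO WorkingHours VALUES (@id, @day, @start, @end)", connection))
        {
            command.Parameters.Add("@id", SqlDbType.UniqueIdentifier).Value = Guid.NewGuid();
            command.Parameters.Add("@day", SqlDbType.Date).Value = new DateTime(2008, 10, 20); // date <-> DateTime
            command.Parameters.Add("@start", SqlDbType.Time).Value = new TimeSpan(8, 30, 0);   // time <-> TimeSpan
            command.Parameters.Add("@end", SqlDbType.Time).Value = new TimeSpan(16, 30, 0);
            // datetimeoffset and datetime2 map to SqlDbType.DateTimeOffset and SqlDbType.DateTime2

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}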

The Hierarchyid Data Type

Now we come to something very cool: a special type to help us store hierarchically structured data.
Imagine that you need to create a table with some project structure in which developers are responsible to project managers and project managers are responsible to chief executive officers. You would normally implement this as one table with some additional key depicting the relation. Well, hierarchyid is such a key; in fact, it encodes this structure and is filled with the corresponding values by using special functions, as presented below:


  • GetAncestor - returns a hierarchyid that represents the nth ancestor of this hierarchyid node

  • GetDescendant - returns a child node of this hierarchyid node

  • GetLevel - returns an integer that represents the depth of this hierarchyid node in the whole hierarchy

  • GetRoot - returns the root node of this hierarchy tree (static method)

  • IsDescendantOf - returns true if this hierarchyid node is a descendant of the passed node

  • Parse - converts the string representation of a hierarchy (e.g. '/2/1/') to a hierarchyid value (static method)

  • GetReparentedValue - returns a value representing this node moved from one ancestor to another location within the hierarchy

  • ToString - returns a string representation of this hierarchyid node


Here is one example on how you can employ this data type:

CREATE TABLE ProjectStructure
(
EmployeeId uniqueidentifier NOT NULL,
EmployeeName varchar(50) NOT NULL,
ProjectNode hierarchyid NOT NULL
);

DECLARE @manager hierarchyid = hierarchyid::GetRoot();
DECLARE @employee hierarchyid;

INSERT INTO ProjectStructure VALUES (NEWID(), 'Ratko', @manager);

SET @employee = @manager.GetDescendant(NULL, NULL);

INSERT INTO ProjectStructure VALUES (NEWID(), 'Tvrtko', @employee);

SET @employee = @manager.GetDescendant(@employee, NULL);

INSERT INTO ProjectStructure VALUES (NEWID(), 'Simon', @employee);

From this code snippet you can see that 'Ratko' is the manager, and that 'Tvrtko' and 'Simon' are both his direct reports (GetDescendant is called on @manager both times, so they are siblings). We can easily query to whom 'Tvrtko' should report by using the following lines:

SELECT EmployeeName FROM ProjectStructure
WHERE ProjectNode = (SELECT ProjectNode.GetAncestor(1) FROM ProjectStructure WHERE EmployeeName = 'Tvrtko');

Hierarchyid columns tend to be very compact, because the number of bits required to represent a node in a tree depends on the average number of children of the node (commonly referred to as the node's fanout). For example, a new node in an organizational hierarchy of 100,000 employees, with an average fanout of six, would take around 5 bytes of storage.

GEOSPATIAL DATA TYPES



Yes, this is my house somewhere in the countryside... thanks to Google Earth! And it is one small example of the usage of spatial data types: types that identify geographic locations and shapes, primarily on our mother Earth. These can be your cottage, a company location, a road, or a hidden treasure on some pirate island.
In fact, SQL Server 2k8 provides us with two different data types: a geography and a geometry data type.

The Geography Data Type

The geography data type works with 'round-earth' data, taking the curved surface of the earth into account in its calculations. Position is given as longitude and latitude (not yet as height ;) ).

The Geometry Data Type

On the other hand, the geometry type works with the 'flat-earth', or planar, model. In this model, the earth is treated as a flat surface starting from one known point. The flat model doesn't take the curvature of the earth into account, so it's primarily used for short distances, such as a building interior or a metropolitan area.
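Both types are also exposed to .NET code through the Microsoft.SqlServer.Types assembly that ships with SQL Server 2008. A minimal client-side sketch (the coordinates are approximate):

using System;
using Microsoft.SqlServer.Types;

class GeographyDemo
{
    static void Main()
    {
        // two round-earth points in the WGS 84 coordinate system (SRID 4326)
        SqlGeography zagreb  = SqlGeography.Point(45.815, 15.982, 4326);
        SqlGeography cakovec = SqlGeography.Point(46.384, 16.434, 4326);

        // STDistance returns the distance in meters for the geography type
        Console.WriteLine(zagreb.STDistance(cakovec));
    }
}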

All this data is standardized in format by the Open Geospatial Consortium (OGC) Simple Features for SQL specification. You can find more information on this site.




Monday, 06.10.2008.

Welcome to DevArena - Strength and honor

Marcus Aurelius: And what is Rome, Maximus?
Maximus: I've seen much of the rest of the world. It is brutal and cruel and dark, Rome is the light!




I proudly welcome you to join us at DevArena this year. The conference will be held at the hotel International, Miramarska 24, Zagreb, on 21.10.2008. Reserve your time and apply today!

The biggest tech event is about to take place, at which Ekobit experts, in cooperation with eminent guest speakers, will present current development topics on the Windows platform, solutions in team development, application lifecycle management (ALM), integration, and portal development. Besides classes and presentations, DevArena 2008 will be enriched with a completely new type of interactivity: chalk & talk sessions!
Use the opportunity to join in choosing the topics on site, and find answers and solutions for your everyday work.

Feel free to follow this link for further info: Dev Arena 2008

Come, listen, participate!

Included sessions:
BizTalk Server Jumpstart for Developers (by Nadja Roić & Kornel Boros)
VSTS Rosario wonders (by Ognjen Bajić)
Microsoft Platform vNext: Cloud + Services + Software (by Ratko Mutavdžić)
Automated Testing with PEX and White (by Renata Kovačević, Janja Tomšić, Marko Lohert)
Modelling of web/wcf services using Web Service Software Factory (by Ivan Kardum)
Unified Communication (by Tomislav Bronzin)
Design Patterns in C# 3.0 (by Maro Marčinko)
Windows Workflow Foundation (by Martin Kralj & Vedran Mustać)
Integrations (EAI, B2B, B2C) (by Kornel Boros)
Sharepoint web parts & Silverlight (by Saša Tomičić & Zoran Šantek)
Web UI Patterns (by Kristijan Prikratki & Dejan Martinčević)
Pro's & Con's of Agile Software Development (by Ognjen Bajić & Ivan Kardum)
ASP.NET MVC Framework (by Siniša Kušanić & Hrvoje Hudoletnjak)
AJAX - ClientControls (by Renato Železnjak & Branko Vlaisavljević)


Friday, 03.10.2008.

Book Review: Pro WF: Windows Workflow in .NET 3.0 (...)

(continued from the previous post)



A Quick Tour gives a good example of how to establish a working environment and build one simple "Hello world" project. Inside you'll see what to expect from WF and how to manage to live with it. The Foundation Overview chapter is also very well explained. It's well structured, in the sense that you don't have to install and use all of the features of WF at once. The phenomenon of "services" is heavily used throughout the book, as it's the hype these days. Well, I also have an opinion about them, but currently I'm keeping it to myself...

What makes the story about workflows very intriguing is that they are "persistent" and "compensatable". It thrilled me from the beginning, because I have had a couple of "close encounters" with this "persistence" and "compensation" before in my short life. But more about that later...

Activities

Why does WF constantly remind me of SQL DTS packages? Huh, maybe it's derived from them, but it's evolutionarily better. As you probably know, DTS is history now; it has been replaced by SSIS (SQL Server Integration Services) and BIDS (Business Intelligence Development Studio), a wicked brother of VS.
As in a DTS package, you have a very neat collection of components you can work with, and only one "real" component where you can "massage" the code. It's the code activity, and it's intended for messing up the workflow with non-eligible code-behind. Well, you can't avoid it after all.
Another good point is that you can take a base activity class (the workflow itself is also treated as one big activity) and extend it as you wish (as much as you dare...). That means you can have your own custom activities which you can later reuse and test separately (hey, how do I unit test this stuff, anyway?). For example, consider creating ReservationBooking, CreatingRequest, OrderProcessing, or CalculateTax as some of your custom activities; a sketch follows below.
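A minimal sketch of such a custom activity in WF 3.0 terms, using the CalculateTax idea from above (the business logic is just a stand-in):

using System;
using System.Workflow.ComponentModel;

// a reusable, separately testable unit of work
public class CalculateTax : Activity
{
    public decimal Amount { get; set; }
    public decimal Tax { get; private set; }

    protected override ActivityExecutionStatus Execute(ActivityExecutionContext executionContext)
    {
        Tax = Amount * 0.22m;                   // stand-in business logic
        return ActivityExecutionStatus.Closed;  // tell the runtime this activity is done
    }
}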



As a big surprise, WF is not a system; it's part of a system. And this is good.
Why? You can host it wherever you like: a desktop app, a web service, a web app, you name it. For that, you'll typically create some management or handling classes. I once had an example of this: a monitoring app which displayed semaphore lights indicating the statuses of executing background tasks, with some config features. Parameterization and configuration are also managed here. It means you wouldn't normally need to "reset" the app to change the settings (as you'd need to with NT services). So, you benefit.
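A minimal console-host sketch, assuming a MyWorkflow type defined elsewhere in the project:

using System;
using System.Threading;
using System.Workflow.Runtime;

class WorkflowHost
{
    static void Main()
    {
        using (WorkflowRuntime runtime = new WorkflowRuntime())
        {
            AutoResetEvent done = new AutoResetEvent(false);
            runtime.WorkflowCompleted += (sender, e) => done.Set();
            runtime.WorkflowTerminated += (sender, e) => done.Set();

            // MyWorkflow is a hypothetical workflow type defined elsewhere
            WorkflowInstance instance = runtime.CreateWorkflow(typeof(MyWorkflow));
            instance.Start();

            done.WaitOne(); // the host waits while the runtime drives the workflow
        }
    }
}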

Flow of Workflow



The flow looks the same as SSIS does today. It also allows you to debug the workflow on the fly. Even on a live system, but that's not recommended (of course, who would recommend it?).
The first thing that bugs me is that I think it's primarily intended for "middle-earth" applications, i.e. building business models for specific business problems. I cannot imagine it driving a GUI's flow of instructions, or data-driven apps. Just can't. But you can nicely see how it mingles inside some robust large-scale business app.

Persistence

Imagine you have some application you want to run for a year, or a couple of months. I have survived this scenario before, but with the performance monitor on the screen, memory-usage tuning, and a headache. Now it won't be like that; WF promises so. The thing is that it's guaranteed by the persistence service, not by the workflow itself. The workflow just "conforms" to this service, and therefore it lives in some special store managed by the persistence service, independent of a particular instance of an app. When there is no work to be done, it just stays idle and waits for its master.
For this purpose, state machine workflows are particularly useful. Of course, you can also do it without them, just as you can go from Europe to Africa on foot. Very practical. They function by raising and handling events. That reminds me of a debate in which there are no "services" at all, but rather only resident apps (like viruses) which stay in memory and keep in mind that CPU cycles should be low...
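Wiring the SQL persistence service into the host is just a few lines. A sketch, assuming a database prepared with the WF persistence schema scripts:

using System;
using System.Workflow.Runtime;
using System.Workflow.Runtime.Hosting;

class PersistentHost
{
    static void Main()
    {
        string connectionString = "..."; // points to a database with the WF persistence schema

        WorkflowRuntime runtime = new WorkflowRuntime();

        // unloadOnIdle = true: idle workflows are saved to the store and evicted from memory
        runtime.AddService(new SqlWorkflowPersistenceService(
            connectionString, true, TimeSpan.FromMinutes(2), TimeSpan.FromSeconds(5)));

        runtime.StartRuntime();
        // ... create and start workflows as usual; idle ones now survive a host restart
    }
}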

Compensation

The term "compensation" is misused here. Actually, we're not compensating something, as you'd thing of it as "else block", but rather try to "undo" the stuff you mistakenly done before. How you'd get this thing work? Through the usage of transactions. You create a transaction and stuff it with some code (i.e. activity work) and commit it. If it succeeds it can be "compensated". Ugly terms. Traditionally, that work must involve a resource manager such as a relational database or other store. By implementing the IPendingWork interface, you can undo things in a way you desire.

Rules of Engagement

One of my favourite themes: business rules management. As I've already tackled this topic in one of my book reviews, it can be done in myriads of ways. And programmers are just one creative kind of people. So they are never satisfied with it, and neither am I. Really, every time I think it's not good enough. The nearest solution to perfection, I have to confess, is the Validation Application Block. It rocks... But for the sake of this tech, we'll stick to the Workflow Rules, which are not so neat, but they work. You have the Rule class, the RuleSet class, dependency properties (check out WPF), and you can manage it all through code or externally (declarative rulez).

Other warez

Dynamic Workflow Updates allow you to make dynamic changes to the structure of a workflow instance (I just pasted that from the book ;)). Tracking is a built-in mechanism that automatically instruments your workflows (also pasted).

Hosting Workflow Designer

And, finally, the workflow designer itself can be hosted by your app.
I actually thought this was WF in the first place, but now it seems to be just a feature! Nevertheless, it is awesome. I just can't stop the ideas coming into my head: how many applications I could build that would look awesome with it!! Imagine a PBX system you can configure visually (how a call is forwarded, the girl on the answering machine whispering soft words to you, ...), the hardware network of your company (or home, in my case), a delivery system (hey Amazon, I have a solution for you!)...
The bad part is that you have to use lots of low-level classes such as DesignSurface, WorkflowLoader, and so on. But once you establish the style, you'll fly with it.
I've seen that some guy did it as an AJAX application, and it's available somewhere on the web. Wonders you can do with it.

So, what to say about the book and its topic?
"We'll meet again, but not yet.. not yet.." (Gladiator)




Creative Commons License
This blog is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike license.