Ratko Ćosić - lamentations of one programmer

srijeda, 27.02.2008.

SLP services - licensing and packaging

So, to continue with the story from before...

LICENSING AND PACKAGING

As I wrote before, SLP Services are a great thing in terms of code protection, but they also provide a complete solution for implementing an application licensing system. Gone are those sorry days when we only bought retail versions of software (ok, 'buying' is a strong word... maybe borrowing ;). In the era of 'software-as-a-service' and 'pay-per-use' solutions, SLP gives us a reliable model for supporting all this, through:

- turning feature sets (modules) on and off without recompiling the code,
- tailoring the application to each user (personalization),
- business models such as SaaS (software as a service), pay-as-you-go and subscriptions,
- different kinds of application upgrades (ehm, 'upgrade'... it's usually just a patch or a bug fix),
- electronic product delivery, 'on-demand' shipping, shared installation media, etc.

1. Feature-level control (modular configuration)
The point is that each specific method inside the code is mapped to a predefined module (feature), and before it is called, a check is made whether that module is activated or not. This represents a new philosophy called "license as business logic". It means that the full version is always shipped to the customer, and then business rules defined through SLP Services determine which modules are activated and what, overall, is affected by that choice.
Additionally, the SVM records (audits) how many times a module (feature set) is used. Although the code remains untouched, developers can also use the Code Protector SDK if they want to further 'massage' that functionality and adjust it to their special needs.
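To make the 'license as business logic' idea more concrete, here is a minimal sketch of what such a feature gate amounts to in application code. The FeatureGate type and its API are my own invention for illustration - the real SLP runtime and Code Protector SDK expose their own types, and the actual check happens inside the SVM, not in hand-written code like this:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical feature gate - invented for illustration only.
static class FeatureGate
{
    // In a real deployment this set would be read from the license,
    // not hard-coded.
    static readonly HashSet<string> ActiveFeatures =
        new HashSet<string> { "Core", "Reporting" };

    public static bool IsActive(string feature)
    {
        return ActiveFeatures.Contains(feature);
    }
}

class Program
{
    static void Main()
    {
        // The full version ships to every customer; the license
        // decides which modules actually run.
        if (FeatureGate.IsActive("Reporting"))
            Console.WriteLine("Reporting module enabled");

        if (!FeatureGate.IsActive("Analytics"))
            Console.WriteLine("Analytics module locked");
    }
}
```

The point stays the same: everyone gets the same binaries, and the set of active features decides what runs.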

This topic takes me back to the time when I was working on my first bigger projects, and to the problems of delivering and handing over different kinds of program versions. It was extremely difficult to separate what was and wasn't being sent to the customer - especially in terms of the database and data objects in particular (for example, triggers that update tables belonging to other application modules, etc), plus the mess around the application and other components, references, files of all kinds, reports, and business rules and policies changing on-the-fly. Altogether, that story will, I guess, continue to be told and lived, because the war isn't over - it's just one battle won.

2. Adjustable licensing options
SLP offers building blocks ('lego bricks') for creating a licensing model suited to the customer and the market:
- demo version - the application works for a certain time period and then either stops working or continues with a limited set of functionality,
- free basic version - a version offering a basic set of functionality (most commonly for a limited time period), while more advanced options (or continued usage) cost some money,
- subscription model - the application license must be renewed regularly for the application to keep working.
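As a rough illustration of the time-limited models above (demo, free basic, subscription), here is a sketch of what a validity check with a grace period boils down to. The License type and the dates are made up for the example; in SLP the enforcement is done by the runtime, not by application code like this:

```csharp
using System;

// Illustrative-only license model - not the SLP API.
class License
{
    public DateTime ValidUntil;
    public TimeSpan GracePeriod;

    // The app may run until the license expires plus the grace period.
    public bool AllowsRun(DateTime now)
    {
        return now <= ValidUntil + GracePeriod;
    }
}

class Program
{
    static void Main()
    {
        var demo = new License
        {
            ValidUntil = new DateTime(2008, 3, 1),
            GracePeriod = TimeSpan.FromDays(7)
        };

        // Inside the grace period the app still runs...
        Console.WriteLine(demo.AllowsRun(new DateTime(2008, 3, 5)));
        // ...after it, the license must be renewed.
        Console.WriteLine(demo.AllowsRun(new DateTime(2008, 3, 20)));
    }
}
```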

[picture: license configuration forms on the SLP online portal]

All of this is done via the SLP online portal, which has a couple of nice forms for entering licenses and options while preparing the packages (see the picture). You can set the start and end of the license period (or unlimited usage), the 'grace period', whether the license is renewable, and so on.
Licenses can be configured to limit the number of concurrent users and/or the number of application starts. You can enable application activation over the internet or manually (offline). You can also allow licensing during the testing period even on virtual machines - VMware/VPC (I'm just surprised that Microsoft put VMware first...), etc. Anyway, play around yourself to see what else it offers.

There is an option to monitor how users license their applications. In fact, there is the so-called SLP Server 2008 (simply put: Microsoft just cannot learn not to use the word 'server') and the SLP Online Service, which monitors and waits 'somewhere dark and quiet' to pop up when an activation or module registration occurs. We can, for example, find out how many times and when the application has been used, and which parts are heavily used (or not used at all), and then we can gradually put all that into some 'cube' and data mine it, of course.

LIFE CYCLE OF A LICENSE

Bah, how I like those 'buzzwords', especially from Microsoft (although this one is rather old). The day is near when we'll have 'GMO code', or 'hormonal services', or internet clouds (oops, we've already got that one). So, a license is a living being. It starts its miserable life with the definition of the products and the capabilities they offer. All this can be done using SLP Server 2008 (locally) or the SLP Online Service (web).
Next, we define the sets of functionality - the modules we can license separately - specifying what goes into the demo version, the basic version, the complete (full) version, or whatever flavor we choose to distribute our solution in (home, enterprise, etc). We then align all this with our source code by linking the methods to the SLP terminology, and that's it.

Activation of the license

When the software is distributed, it comes with an activation key required to unlock it. That's a nice thing, because it spares us the suffering of encrypting the code ourselves, which anyway has nothing to do with the business rules. When the code has been protected using SLP Code Protector, during the application's startup sequence, if the application is not activated, a small wizard pops up that allows us to enter the activation key and continue with our work.
What happens in the background is that the app connects to the SLP Licensing Control Portal to check the key and unlock the app for execution. If no internet connection is available, the user can install the license file manually.

And, at the end of this lament, a review of the tools we can use with SLP Services:

SLP SERVICES TOOLS

SLP Code Protector
As I said, this one handles the protection of the source code. It transforms .NET MSIL into protected, encrypted SVML and links parts of the code with the 'feature sets', i.e. licenses.

SLP Permutation
It performs the transformation of MSIL into Secure Virtual Machine Language (SVML) - it contains the distributable SVM to be included in the app, and it supports digital signing for licensing. Consider it the SLP SDK.

SLP Server 2008 (local) / SLP Online Service (web)
They allow the management and maintenance of licensing models - defining licenses and modules (feature sets), publishing licenses, and reacting to activations.

Activation Packs
Packages of licenses (commercial or free) for software activation. They are installed, by the way, during the creation of the .msi file.

There, I hope I've brought this topic closer to you; it personally interests me a lot, and it impressed me with its simplicity and practical usage.

And now, the bad news...

Microsoft sells the whole story in a few flavors.
The SLP Online Service is sold subscription-based, in three editions depending on its power and might: Basic ($42 a month), Standard ($625 a month), and Enterprise (which costs a 'chilly' $1,667 a month).
SLP Server 2008 (which is actually the beating heart of everything) costs, in the standard edition, 'from 23 thousand up', and for Enterprise you have to spend 'at least 60 grand'.

Well, obviously Microsoft applied the SLP system first to SLP itself, and in a packaged fashion at that. The so-called basic permutation (5 product methods) costs an additional $750, and the unlimited permutation a whole $5,000.
Activation packs are also purchased separately: $1,000 for 1,000 licenses for commercial purposes, and $100 for 10,000 licenses in the non-commercial activation packs. I knew there was a catch. Phew!

Anyway, SLP stands as a well-designed framework for the protection and licensing of applications.

Kind regards,
Ratko.

- 18:15 - Comments (0) - Print - #

ponedjeljak, 25.02.2008.

Microsoft Software Licensing and Protection (SLP) Services

What is SLP?
SLP Services represent a collection of applications and services designed to protect intellectual property rights against illegal or incorrect usage, through code protection and a licensing system.
In the next posts, I will try to explain the purpose and usage of SLP Services, the differences from already existing solutions, and the advantages of the SLP technology.


You can find all that in more detail at:
Microsoft SLP Services

The most important facts about SLP Services:
Code protection - although we can protect our code using obfuscation, encryption and code splitting, SLP code protection brings a new perspective on this and solves the deficiencies of the earlier approaches.
Packaging and licensing - SLP Services bring different possibilities for feature-level control and for creating different licensing models for our solutions - purposes for which separate tools had to be used before.
A licensing lifecycle was introduced - adjusted tools used during the different phases of development and deployment, personalized for every user.


So, let's begin...

PROTECTION OF CODE

What's been before the SLP services?

1. Code obfuscation
Most obfuscation tools rename classes, methods, parameters and variables into meaningless text. Obfuscation tools also change the program flow in a manner that produces the same result but is harder to decipher. Code obfuscation can slow an attacker down a little, but it doesn't prevent decompiling and reverse engineering of the code - it just makes the program code harder to read. Also, bugs may occur when renaming objects and changing the program flow.

2. Code encryption
Some code-protection approaches encrypt the code with a certain encryption algorithm and decrypt it again during execution. The problem in that case is delivering and keeping the key secret. Moreover, the moment of decryption leaves hackers a chance to break into the system and hack the code.

3. Code splitting

Code splitting is also a method of protecting program code; it splits the code into two halves. The less sensitive part is delivered as before, and the more sensitive part is delivered on a special piece of hardware made for that purpose - mostly a smart card or a security key. It is thus the most secure approach, but, as you may guess, it has multiple disadvantages - high costs, clumsy handling of special equipment and parts, and so on.

SLP approach to code protection

The general strategy of SLP Services is to take the source code and transform it in such a way that its logic is encrypted and obfuscated, and finally to avoid its direct execution by the CLR. So, all the good stuff from the previous strategies is combined to produce one solid solution.
For the code transformation a special tool is used, named SLP Code Protector, which transforms compiled MSIL code into the so-called Secure Virtual Machine Language (SVML), which can no longer be executed by the CLR directly or discovered through decompilers.
Each version of SVML is different and, as such, demands a unique protective virtual engine (SVM). It even goes so far that each application from the same vendor can have a different permutation.


After building the app, the already mentioned SLP Code Protector should be used to mark which methods will be protected. Because it is a relatively 'costly' operation, only the necessary methods should be picked, i.e., those containing confidential information (for example, connecting to a database, storing and entering a password, etc).

private static OleDbConnection GetDBConnection()
{
    if (_dbConnection == null)
    {
        string connectionString = "Provider=Microsoft.Jet.OLEDB.4.0;"
            + "Data Source=Access.mdb;"
            + "Password=somepassword";
        _dbConnection = new OleDbConnection(connectionString);
        _dbConnection.Open();
    }
    return _dbConnection;
}


After the protection process, when you look at the code, say in .NET Reflector, it looks like the following:

private static OleDbConnection GetDBConnection()
{
    object[] args = new object[0];
    object obj2 = SLMRuntime.ExecMethod(null, "DS9FvZGluZyAvV2luQW5zaUVuY29kaW5A0vQmXZ", args);
    return (OleDbConnection)obj2;
}


That's all for now...

Continue with the topic

- 11:23 - Comments (0) - Print - #

četvrtak, 21.02.2008.

Book Review: Pro SQL Server 2005

Title: Pro SQL Server 2005
Authors: Louis Davidson, Robin Dewson, Adam Machanic, Jan D. Narkiewicz, Thomas Rizzo, Joseph Sack, Julian Skinner, Rob Walters
Publisher: Apress
Pages: 704
Links: http://www.apress.com/book/view/1590594770

Overall mark: B+

Altogether, finally a good booklet.
The theme is generally well-known and clear, the chapters have good structure, there is no exaggerated 'touring the engine' at the beginning, no appendices at the end, and each chapter covers one and only one topic. I just say: right on schedule.

First chapter: overview and installation. A good review of all SQL Server versions and features.
Second chapter: database technologies. Also a good review of just the differences and improvements from the previous version; no boring lamentations about terms like database, table, instance, and so on. Just the differences. Short. Functional.

Chapter 3: improvements for developers. Really good. A special review of the differences and improvements just for programmers. In fact, I was astonished at how much the new SQL Server 2005 brings, and at how little we get spammed about it (for example: pivot, recursive queries, sampling, except, intersect, CLR, the xml data type).

Chapter 4: improvements for sys-ops. Again good (at least I think so, I'm not a sys-op). Dynamic management views are not such a breakthrough as DDL triggers - a really useful thing. Indexes, partitioned views and tables, snapshots (I immediately associate those with VMware and its option to keep multiple snapshots, but it has nothing to do with that, even conceptually ;) ).

Integration with .NET. That's what we've all been waiting for. Shall we? As I found out from the book, not really. .NET integration should be used 'with caution', only when we really have to, otherwise it's too expensive performance-wise. Anyway, a nicely explained theme, indeed.

SQL Server and XML. Just like .NET integration, two chapters. I guess that should point out the importance of the subject (it is one of the reasons the book didn't get an A). The new data type (xml) is explained, along with updating data through XML, 'pinging' data via query methods, full-text queries, dynamic views, and so on...

And now, the best part: overviews of additional technologies (features) - one per chapter:

- SQL Server 2005 Reporting Services (chapter 9)
- Analysis Services (chapter 10)
- Service Broker (chapter 12)
- Integration Services (chapter 14)
- Notification Services (chapter 16)

Each of these chapters is, although short, good and remarkably well explained. In each of them we can find the main 'rules of engagement' for these components, along with how we should use them. It's interesting that Apress printed each of these topics as a separate book. So, if you're interested in more, get them and read them one by one. Whoever wants it, let him grasp it.

Oh, yeah... There are some mixed-up chapters. Advanced database functionality is a little bit mixed in with the features mentioned above, but what the heck. We will forgive the guys at Apress for such a lapse. These are:

Security (chapter 11)
Automation and Monitoring (chapter 13)
Database Mirroring (chapter 15)

Altogether, a nicely made book with some minor weaknesses which do not significantly affect its readability and quality.

Happy reading!

- 00:28 - Comments (0) - Print - #

srijeda, 20.02.2008.

Book Review: MCPD Self-Paced Training Kit (Exam 70-548): Designing and Developing Windows-Based Applications Using the Microsoft .NET Framework

Title of the book: MCPD Self-Paced Training Kit (Exam 70-548): Designing and Developing Windows-Based Applications Using the Microsoft .NET Framework
Authors: Bruce Johnson and Mike Snell of GrandMasters, with Shawn Wildermuth
Publisher: Microsoft Press
Pages: 704
Links: http://www.microsoft.com/MSPress/books/10093.aspx

Overall mark: C

I remember my first passed exam, back in the 'ancient' year 2001. It was the exam then named 70-100: Analyzing Requirements & Defining Solution Architectures, on the, at that time more recent, Visual Studio 6.0 platform.
That exam was the alpha and omega of Microsoft's thinking on how to architect the theoretical basis on which everything regarding development should be built, in the time of class modules, COM, DLL hell, MTS, and distributed applications. At that time, the first thing I had read on the theoretical basis was Designing Information Systems by prof. Brumec (jeez, those melancholic Informbiro times), and for the first time I heard about some 'white papers' of the MSF (Microsoft Solutions Framework).
If I were grading the book for the preparation of that exam, it would probably get the worst mark. It was, in my personal opinion, the worst book I've ever read - partly because of the MSF concept itself, which was everything but practice, and partly because the answer to everything was: use Microsoft's stuff, whatever solution you need. Just imagine: the solution was to use InterDev or FrontPage for a web application! (instead of, of course, PHP, or at least DHTML).

In this book - and a Microsoft Press book is especially a reflection of the official thinking inside the Microsoft envisioning teams - an important step forward was made toward reality in the development of IT projects (not to say, solutions).
In fact, there are no more drawings of unnecessary and almost always questionable diagrams and 'mind flows' like the waterfall idea, except in the second chapter. The advancement is that the terminology is more 'natural' and clearer, and the topics are more concrete and not so 'conceptual' - but that's unfortunately everything.

On the other hand, I like the chapter lineup.

The first chapter (Application Requirements and Design) has a nice title, a firmly developed subject and lots of good advice. For example, the philosophy of 'proof-of-concept' projects is very well known to me because I meet it in everyday work; the same goes for the suggestions about buying components or developing them ourselves.

The chapter that follows, which I would rather skip over, but there it is: Decomposing Specifications for Developers.
A promising title, but unfortunately nothing new. We are again guided to a logical model, then to a logical architecture, and then to their evaluation (reading straight into the third chapter).
Jeez, Object Role Modeling is still there! It's just like a mother-in-law that just won't go! It's unbelievable how this concept, which generally doesn't make sense and which nobody uses (at least, nobody I know), is still being pushed. Who uses it, people? In the modern era of class diagrams and database diagrams, not to mention Workflow Foundation, who on earth would use ORM? Application layers are also there, physical models, blah blah ...

The following conclusion crosses my mind: as long as the technology - that is, the development tools - does not get to the level of interpreting the theory, it doesn't make sense to use that theory.

An example? The class diagram.
How many of you have created and worked with diagrams of classes and components (not on paper, we all still do that), maintained them and kept them up to date until the end of a project? Not many of you, I think. The reason: lack of technology, i.e. tools which would do the 'hard work' for us. Then the class diagram arrives in great style, and as a consequence - we use it. Collaboration diagrams? Pseudo-code? No, thanks! I'll use them when VS generates the lifecycle of my class by itself, and when, based on that, I can conclude that something is wrong and then re-engineer it myself.

More about the evaluation and grading of an application...

Now it's called design evaluation (performance, maintainability, extensibility, scalability, availability, recovery, data integrity, use case correctness), but before, it was the PASS ME analysis (an acronym for: performance, availability, scalability, security, maintainability, extensibility). Moreover, there is also QoS (Quality-of-Service), agile development with all those sprint phases, and so on... Gosh, one really starts to believe that someone actually sits on top of all that and makes sure nothing evil happens. Well, that usually isn't the case. And even when it is, errors might still occur.

Chapters 4 and 5, which concern the visual appearance, i.e. the GUI, will soon be (if they haven't been already) 'swallowed' by WPF or Silverlight (and, btw, they don't work for the web). Chapter 6 is for children (I put my underlining marker to rest for a while), while on chapter 7 I sturdily wore it out (because that one is for the experts). Why split them when they cover the same subject? Why not simply say: Component Design and Development?

The rest of the chapters, up to chapter 12, follow the same premise as the previous booklets for this kind of exam; nothing new, nothing spectacular, nothing special to lament about. Besides, this 'meat' is probably very well 'chewed up' in at least two other exams (70-536, 70-526...). I'd say - redundant in this book and exam, but, then again - it never hurts to know too much (I hope).

Testing is something new that Microsoft has included in its philosophy, so chapters 12 and 13 bring some refreshment to the story. Unit tests, integration and stress tests are explained, and more.
It's interesting that the company Ekobit has one good solution for working with tests, so feel free to examine it a little bit more (free trial):

Test Manager

Also, code reviews, code refactoring, bug triage - all things already in use (I hope you do use them) - have their place in the book and in the MCPD certificate. Not to mention deployment and maintenance - included by default. It should be there, and it is.

And now: surprise!
Multimedia inside Microsoft's applications! Hurray!
Naturally, don't you even think the new DirectX 10 is explained all the way through. It's in fact about audio (wav, wma, mp3) and video (wmv, mpeg4) formats. Sampling, compression, streaming, and even Digital Rights Management (bah, we're afraid of piracy, are we!) are all sympathetically explained.

And that's all folks!

P.S. Don't you worry: in the exam there are again the world-famous Contoso, Woodgrove Bank and Worldwide Importers. In other words, nothing new under the sun.

Kind regards,
Ratko.

- 17:34 - Comments (0) - Print - #

ponedjeljak, 18.02.2008.

Book Review: MCTS Self-Paced Training Kit (Exam 70-536): Microsoft® .NET Framework 2.0—Application Development Foundation



Title: MCTS Self-Paced Training Kit (Exam 70-536): Microsoft® .NET Framework 2.0—Application Development Foundation
Authors: Tony Northrup and Shawn Wildermuth, with Bill Ryan of GrandMasters
Publisher: Microsoft Press
Pages: 1088
Links: http://www.microsoft.com/MSPress/books/9469.aspx

Overall mark: C

This book is intended, as the title says, to prepare you for the first exam in the row for the MCTS certificate. A certificate which, unlike the previous ones (MCSD, MCAD), has a relatively more useful and worthwhile goal, and that is to prepare a programmer - a software engineer - for real life (kidding: for life with the .NET 2.0 platform). Jeez, what a pitiful mission.

Unfortunately, I'm a little bit late with the review, although I read the book a long time ago. Moreover, I wanted to start this series of dev-book reviews exactly with this book, especially because it is the first of the new series.

Anyway, I'm not a fan of books which begin with two or three introductory chapters in which 'nothing happens' - chapters like 'what's an object, what's a class, what is the .NET Framework'. Heaven's sake, these topics barely let the programmer dive into the matter, so I just avoid such books or easily skip over those intros.
One example of a good book which follows the 'load me with meat' principle is Pro SQL Server 2005, which is made on the principle of one topic - one chapter. No intros. Just great. But not this one...

So, let's begin...

First, the arrangement of the chapters is bad and impractical.
Let's just say that topics like types, classes, events and similar are situated in two different chapters, and moreover with a hole in between (it's about chapter 1 - Framework Fundamentals, and chapter 4 - Collections and Generics). I think those topics are unnecessarily split over two different chapters just to conform to the '16-chapters-per-book' pattern, which has become a certain standard for a 'good book'. Phew!

Also, regex, an important part and a relatively complex one for a simple programmer, is included in the chapter on encoding, which is by itself a topic worth no more than a page or two. It would have been better to put regex in a separate chapter, maybe in the appendices.

A similar situation occurs with the two concurrent topics in chapter 8: application domains and services. I mean, why one chapter for such isolated topics? Plus, we don't even know which services we are talking about (actually, it's about NT services). Moreover, the same mistake appears in the following chapter (chapter 9: Installing and Configuring Applications).

The same 'helter skelter' arrangement applies to the topics regarding graphics, text and multimedia. There are just too many saturated chapters throughout the book.

And now, listen carefully: a whole chapter about writing an email? It seems just unbelievable. The theme is so trivial that it could be covered on one page. Bravo.

You can definitely notice the different writing styles in some chapters, a product of the cooperation of different authors, which is generally OK (maybe there are even more of them, just not mentioned ;). But, not to exaggerate, it's fine for a general-purpose book and for preparing yourself for the exam.

It's a nice, positive surprise to read the chapter about .NET security. As a matter of fact, this topic is always hard for me to read and understand (probably because Microsoft itself doesn't give concise rules and solutions in this area), but here it was good and concrete.
Reflection and interoperability follow this path (so it's probably the case of the same author(s)). There are also serialization and threading, also very well explained.

Altogether, the book is good to read if you're (still) going for the exam.
And if you aren't, you can at least read the more 'technical' chapters.

Nice reading!



- 23:21 - Comments (0) - Print - #

petak, 15.02.2008.

Vista Gold Certification - RM-aware applications


To wrap up this story, I want to mention that reacting to the Restart Manager - that is, creating the environment for application recovery - is a prerequisite for an application to earn the prestigious Vista Gold certificate. Test number 30 speaks about that, so do read the following if you are interested:

Certified for Windows Vista Test Cases

Whether the application satisfies the criteria of test no. 30 is easy to demonstrate by following the suggested instructions (Microsoft):

1. Launch the application.
2. Open Windows Task Manager
a. Click on Processes tab
b. From the View menu select Select Columns
c. Check PID (Process Identifier) check box and click OK to add the PID column to Windows Task Manager
d. Click Image Name to list processes by name
e. Find the application image name and make note of the PID
3. Open command window.
4. Change directories to the directory that contains the Restart Manager Tool.
5. From the command window inject a shutdown message to the application through the RMShutdown API in the following manner:
a. If the application is designed to shutdown and restart; type “rmtool.exe –p dwPID –S –R” (dwPID is the application's Process ID and can be obtained via task manager) this will force the application to shutdown and restart.
b. Make a note if the application does not shutdown and restart
c. If the application is not designed to restart after shutdown; type “rmtool.exe –p dwPID –S” this will force the application to shutdown only
d. If your application is a service and is designed to restart after shutdown; type “rmtool.exe –f $dir –S –R” ($dir is the path to the service executable or dll) this will force the application to shutdown and restart.
e. If your application is a service and is not designed to restart after shutdown; type “rmtool.exe –f $dir –S” ($dir is the path to the service executable or dll) this will force the application to shutdown only.
f. Open Event Viewer by typing eventvwr from the command line or from the Administrator Tools.
i. Expand Windows Logs
ii. Click on Application
iii. There will be a RestartManger (Information) message for each Shutdown and Restart action performed by Restart Manager on each executable.
a. The shutdown message will contain the executable name and ProcessID in the log.
b. The restart message will contain the executable name and ProcessID in the log, with the ProcessID being different than in the shutdown log. This log will have a later time stamp

VERIFICATION:

1. Using the Restart Manager Tool the application was able to be shutdown via the RMShutdown API quietly without user interaction while idle and restarted if the application was designed to restart after shutdown.
2. If the application is a service; using the Restart Manager Tool the service was able to be shutdown via the RMShutdown API quietly without user interaction while idle and restarted if the service was designed to restart after shutdown.
3. There must be both an Information message with “Source” listed as RestartManger for each shutdown and restart action performed by Restart Manager in the Windows Application Logs on each executable above in order to pass this test case.

NOTES:

1. The application must be idle and must not be running or performing any operations while performing this test case.
2. All applications must listen and respond to shutdown messages using the Restart Manager API quietly without user interaction while idle in order to pass this test case.
3. The application or service must not cause an Access Violation and shutdown/restart safely in order to pass this test case.
4. Restart Manager shutdown messages are:
a. WM_QUERYENDSESSION with LPARAM = ENDSESSION_CLOSEAPP(0x1): GUI applications must respond (TRUE) immediately to prepare for a restart.
b. WM_ENDSESSION with LPARAM = ENDSESSION_CLOSEAPP(0x1): The application must shutdown within 5 seconds (20 seconds for services).
c. CTRL_SHUTDOWN_EVENT: Console applications must shutdown immediately.
5. If the application or service is not designed to restart after shutdown, then there will only be an Information message with “Source” listed as RestartManger for each shutdown action performed by Restart Manager in the Windows Application Logs on each executable above.
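For GUI applications, reacting to the Restart Manager messages listed in note 4 boils down to handling them in the window procedure. Here is a minimal WinForms sketch; the message and flag values are the documented Win32 constants, while the save-state logic is of course application-specific:

```csharp
using System;
using System.Windows.Forms;

// A form that answers Restart Manager's shutdown messages so the app
// can be shut down and restarted quietly, without user interaction.
class RmAwareForm : Form
{
    const int WM_QUERYENDSESSION = 0x0011;
    const int WM_ENDSESSION = 0x0016;
    const int ENDSESSION_CLOSEAPP = 0x1;

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_QUERYENDSESSION &&
            ((int)m.LParam & ENDSESSION_CLOSEAPP) != 0)
        {
            // Save state here, then answer TRUE immediately so RM
            // knows we are ready to be shut down and restarted.
            m.Result = (IntPtr)1;
            return;
        }
        if (m.Msg == WM_ENDSESSION &&
            ((int)m.LParam & ENDSESSION_CLOSEAPP) != 0)
        {
            // We have at most 5 seconds to shut down cleanly.
            Close();
            return;
        }
        base.WndProc(ref m);
    }
}
```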


I've tried it, and it really writes the mentioned entries into the Event Log. So, the test has been passed!


- 17:03 - Comments (0) - Print - #

četvrtak, 14.02.2008.

Application Recovery - finally Restart Manager

To continue with the story of the application recovery...

The main benefit, in my opinion, that Restart Manager has over the previous approach is the ability to register the processes, files and services the application relies on. That means that if our application shuts down, all related objects will try to shut down or restart, depending on the situation.

You can find more information on that at:
Application Shutdown Changes in Windows Vista

But now, let me just continue further...

RM uses the concept of a session, i.e. a piece of isolation where everything related to RM is handled. Up to 64 sessions can exist per user. So, we start with RmStartSession and close the session with RmEndSession. With the RmShutdown command we 'shut down' all registered resources, and with RmRestart we restart them.

Step 1:
Copy-paste the P/Invoke declarations (this time from rstrtmgr.dll).

// 'Com' is an alias for the COM interop types, assumed to be declared as:
// using Com = System.Runtime.InteropServices.ComTypes;

[StructLayout(LayoutKind.Sequential)]
struct RM_UNIQUE_PROCESS
{
    public int dwProcessId;
    public Com.FILETIME ProcessStartTime;
}

[Flags]
enum RM_SHUTDOWN_TYPE : uint
{
    RmForceShutdown = 0x1,
    RmShutdownOnlyRegistered = 0x10
}

delegate void RM_WRITE_STATUS_CALLBACK(UInt32 nPercentComplete);

[DllImport("rstrtmgr.dll", CharSet = CharSet.Auto)]
static extern int RmStartSession(out IntPtr pSessionHandle, int dwSessionFlags, string strSessionKey);

[DllImport("rstrtmgr.dll")]
static extern int RmEndSession(IntPtr pSessionHandle);

[DllImport("rstrtmgr.dll", CharSet = CharSet.Auto)]
static extern int RmRegisterResources(IntPtr pSessionHandle, UInt32 nFiles, string[] rgsFilenames, UInt32 nApplications, RM_UNIQUE_PROCESS[] rgApplications, UInt32 nServices, string[] rgsServiceNames);

[DllImport("rstrtmgr.dll")]
static extern int RmShutdown(IntPtr pSessionHandle, RM_SHUTDOWN_TYPE lActionFlags, RM_WRITE_STATUS_CALLBACK fnStatus);

[DllImport("rstrtmgr.dll")]
static extern int RmRestart(IntPtr pSessionHandle, int dwRestartFlags, RM_WRITE_STATUS_CALLBACK fnStatus);

[DllImport("kernel32.dll")]
static extern bool GetProcessTimes(IntPtr hProcess, out Com.FILETIME lpCreationTime, out Com.FILETIME lpExitTime, out Com.FILETIME lpKernelTime, out Com.FILETIME lpUserTime);


Step 2:
What follows is the part of the code that does the job: it creates an RM session, registers dependent resources, shuts the resources down and restarts them, and closes the session. The thing is trivial:

static void Main(string[] args)
{
    IntPtr handle;
    string key = Guid.NewGuid().ToString();

    int res = RmStartSession(out handle, 0, key);
    if (res == 0)
    {
        Console.WriteLine("Restart Manager session created with ID {0}", key);

        RM_UNIQUE_PROCESS[] processes = GetProcesses("notepad");

        res = RmRegisterResources(handle,      // session handle
            1, new string[] { @"Shared.xml" }, // number of shared files and list of shared files
            (uint)processes.Length, processes, // number of dependent processes and list of processes
            1, new string[] { "W3SVC" });      // number of dependent services and list of services

        if (res == 0)
        {
            Console.WriteLine("Successfully registered resources.");

            res = RmShutdown(handle, RM_SHUTDOWN_TYPE.RmForceShutdown, ReportPercentage);
            if (res == 0)
            {
                Console.WriteLine("Applications stopped successfully.\n");

                Console.ForegroundColor = ConsoleColor.Yellow;
                Console.Write("Installing... ");
                Thread.Sleep(5000);
                Console.WriteLine("Done.\n");
                Console.ResetColor();

                res = RmRestart(handle, 0, ReportPercentage);
                if (res == 0)
                    Console.WriteLine("Applications restarted successfully.");
            }
        }

        res = RmEndSession(handle);
        if (res == 0)
            Console.WriteLine("Restart Manager session ended.");

        Console.ReadLine();
    }
}


For initializing the session, I use a new GUID and a new handle, which I later use for calling the other functions as well. When registering the resources, besides the session handle, count-plus-list pairs are required: number of files plus the file list, number of processes plus the process list, and number of services plus the service list. Then come the calls that shut down the linked resources and start the necessary resources again. What I've found out is that if one of the resources is incorrectly specified, the application either fails or just continues to work incorrectly and doesn't react to RM events. Odd. So, be cautious.
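Note that the Rm* functions return plain Win32 error codes instead of throwing, which is why every call above is checked against 0. A small helper for decoding the usual suspects might look like this (the numeric values are the standard winerror.h constants; the class and method names are my own, not part of the API):

```csharp
using System;
using System.Collections.Generic;

static class RmErrors
{
    // Win32 error codes commonly returned by the Restart Manager API
    // (values from winerror.h).
    static readonly Dictionary<int, string> Known = new Dictionary<int, string>
    {
        { 0,   "ERROR_SUCCESS - the operation completed" },
        { 14,  "ERROR_OUTOFMEMORY - not enough memory to complete the operation" },
        { 121, "ERROR_SEM_TIMEOUT - the Restart Manager operation timed out" },
        { 160, "ERROR_BAD_ARGUMENTS - an invalid handle or parameter was passed" },
        { 353, "ERROR_MAX_SESSIONS_REACHED - the per-user limit of 64 sessions is hit" }
    };

    // Translates a return code from RmStartSession & co. into readable text.
    public static string Describe(int code)
    {
        string text;
        return Known.TryGetValue(code, out text)
            ? text
            : string.Format("Unknown Win32 error {0}", code);
    }
}
```

Usage would be something like `if (res != 0) Console.WriteLine(RmErrors.Describe(res));` after each Rm* call.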

Step 3:
For getting process information, I use an, at first glance, complex function which looks processes up by name and returns an array of objects in the form RM expects (it additionally requires the process start time), as described in the following lines:


static RM_UNIQUE_PROCESS[] GetProcesses(string name)
{
    List<RM_UNIQUE_PROCESS> lst = new List<RM_UNIQUE_PROCESS>();
    foreach (Process p in Process.GetProcessesByName(name))
    {
        RM_UNIQUE_PROCESS rp = new RM_UNIQUE_PROCESS();
        rp.dwProcessId = p.Id;
        Com.FILETIME creationTime, exitTime, kernelTime, userTime;
        GetProcessTimes(p.Handle, out creationTime, out exitTime, out kernelTime, out userTime);
        rp.ProcessStartTime = creationTime;
        lst.Add(rp);
    }
    return lst.ToArray();
}

static void ReportPercentage(UInt32 percent)
{
    Console.WriteLine(percent);
}


So much for that, for now.

Continue with this topic

- 22:48 - Comments (0) - Print - #

Application Recovery - some code in C#

Let's just begin with something ...

The first example is a small block of code which shows how we can respond to wreaked havoc, and in such a way that we:
1. acknowledge that shutdown of the OS has been initiated,
2. make sure that all necessary work is done, call it 'graceful degradation' if you like (just call all the Flushes, Closes, Disposes and so on, depending on the situation),
3. tell Windows that we are:
a) ready for the final countdown, or
b) ready to block the shutdown because our app has a higher priority to finish some important operations.

Step 1:
We modify the entry point of the app so that it handles being called with a command-line parameter, in the following way:


static void Main(string[] args)
{
    if (args.Length == 1 && args[0] == crashHint)
    {
        Console.ForegroundColor = ConsoleColor.Red;
        Console.WriteLine("I crashed but Vista restarted me :-)\n");
        Console.ResetColor();

        RecoverMe();
    }
    else
    {
        RunAndCrash();
    }
}

That parameter will be filled in by AR itself when it tries to start our application again after the crash.
So, this is the first crossroads in our app, routing the program flow depending on whether we started the app ourselves or the system restarted it automatically.

Step 2:
The next step is to define the P/Invoke functions from the Application Recovery API. So, just copy-paste the following code:

static string crashHint = "Restarted";

[DllImport("kernel32.dll", CharSet = CharSet.Auto)]
static extern uint RegisterApplicationRestart(string pszCommandline, RestartFlags dwFlags);

[DllImport("kernel32.dll")]
static extern uint RegisterApplicationRecoveryCallback(APPLICATION_RECOVERY_CALLBACK pRecoveryCallback, object pvParameter, int dwPingInterval, int dwFlags);

delegate int APPLICATION_RECOVERY_CALLBACK(object pvParameter);

[DllImport("kernel32.dll")]
static extern uint ApplicationRecoveryInProgress(out bool pbCancelled);

[DllImport("kernel32.dll")]
static extern uint ApplicationRecoveryFinished(bool bSuccess);

[Flags]
private enum RestartFlags
{
    NONE = 0,
    RESTART_CYCLICAL = 1,
    RESTART_NOTIFY_SOLUTION = 2,
    RESTART_NOTIFY_FAULT = 4,
    RESTART_NO_CRASH = 8,
    RESTART_NO_HANG = 16,
    RESTART_NO_PATCH = 32,
    RESTART_NO_REBOOT = 64
}


The first function handles registering the application for restart (RegisterApplicationRestart), and the other three handle the Application Recovery machinery:
RegisterApplicationRecoveryCallback - registers a callback which lets us 'hack and slash' whatever we can when the 'inevitable' comes,
ApplicationRecoveryInProgress - a function whose goal is to 'ping' the system to prove we're alive while we're doing the recovery (otherwise Windows will shut us down once the ping interval expires), and
ApplicationRecoveryFinished - a function which sends the signal that we're through with the recovery.

Step 3:
Cool, what we have to do now is register our app for restart and for application recovery. This can be accomplished by calling two methods at the beginning of the method which enters the 'normal workflow' of the program, in our case RunAndCrash():

static void RunAndCrash()
{
    uint i = RegisterApplicationRestart(crashHint, RestartFlags.NONE);
    Console.WriteLine("Application restart registration {0}.", i == 0 ? "succeeded" : "failed");

    i = RegisterApplicationRecoveryCallback(Recovery, "Just something", 50000, 0);
    Console.WriteLine("Application recovery callback registration {0}.\n", i == 0 ? "succeeded" : "failed");

    // SOME FUNCTIONAL CODE GOES HERE!

    throw new Exception("Kaboom!");
}


Step 4:
What remains is to fill in the method registered by RegisterApplicationRecoveryCallback, which is called after the crash and, accordingly, performs the recovery triage:

static int Recovery(object o)
{
    Console.ForegroundColor = ConsoleColor.Yellow;
    Console.WriteLine("\nRecovering ... ");

    // Ping the system every second so it knows the recovery is still alive.
    Timer t = new Timer(KeepAlive, null, 1000, 1000);

    // SOME RECOVERY CODE GOES HERE!

    ApplicationRecoveryFinished(true);

    return 0;
}


So, it is important to always signal at the end of the function that we're done with the recovery procedure, and to pass control back to RM, which then tries to start the app again, and so on.

Step 5:
As I wrote, it's important that we 'ping' the system all the time so that it knows we're 'alive' while the recovery procedure is still running. This can be accomplished by defining an alarm clock (Timer) which 'ticks' every, let's say, second and calls a callback which, let's say, refreshes a progress bar, calculates the recovery percentage, and so on. It's vital just to keep calling the function ApplicationRecoveryInProgress during its execution.

static void KeepAlive(object o)
{
    // PROGRESS FUNCTIONALITY GOES HERE!

    bool cancelled;
    ApplicationRecoveryInProgress(out cancelled);

    if (cancelled)
    {
        Console.WriteLine("Recovery cancelled");
        Environment.FailFast("Recovery cancelled");
    }
}


BTW, this is just a re-delivery of various articles, but, in the end, I made some adjustments and simplified all this stuff. When it's run, the effect is really cool. Try it!

More on this topic to come

- 15:18 - Comments (1) - Print - #

Application Recovery on Windows Vista

Here is a good re-delivery of articles from other blogs and the msdn2 site...

The subject is really cute, so I will explain it a little bit, and maybe in the future do a session about it!
It's about the Application Recovery and Restart Manager functionality in Windows Vista.

Description of a problem:

Windows breaks apart, or a Windows Update is under way. The application you made is still running during that time and, probably, doing something, processing. What happens to the application? What happens to the data in flight? What about the database, the transaction, the file, the in-memory variables? Houston, we've got a problem!

Solution:

Windows Vista gives us something for exactly that: the so-called 'Application Recovery' mechanism (through the kernel32.dll API), and even more with Restart Manager (rstrtmgr.dll). What is it all about? The application registers itself as an 'RM-aware' app which reacts to the 'pinging' of the already mentioned RM by closing all open files, transactions, and so on, and preparing itself for the inevitable end. You can even force Vista to cancel its restart. Altogether, 'sounds promising'.

I have made a little code snippet for that, and I will present it along the way.

Continue with the topic

- 14:58 - Comments (0) - Print - #


Creative Commons License
This blog is made available under a Creative Commons Attribution-NonCommercial-ShareAlike license.