
Thursday, October 31, 2013

EventSource & Performance for High Volume Events

.NET Framework 4.5 introduced the EventSource API for writing events through the ETW (Event Tracing for Windows) infrastructure. Building on ETW makes writing events extremely fast, and the events can be consumed by any ETW consumer. If the event provider is not enabled, the events simply fall on the floor. If a session is already established, they are written to the ETW session buffers, from where consumers pick them up. The event provider (EventSource) doesn't have to wait until the messages are consumed, and this is independent of the number of consumers registered for a particular type of event. We have discussed ETW tools and their usage with the EventSource API before [Reference].



As discussed above, the choice of ETW infrastructure already makes the EventSource API extremely fast, but we can improve on it further. Keep in mind that these recommendations are generally useful for high-volume events. For low-frequency events you would still gain some performance, but the improvement might not be very noticeable.

Optimization # 1: Use IsEnabled() before WriteEvent()
ETW controllers can enable a provider even before the provider registers itself. So when a provider registers and starts emitting events, a session is automatically created and the events are forwarded to the buffer. To verify this, we can start the Semantic Logging Service (from the Application Block) first and then run our application; the service is still able to consume the events. On the other hand, if the provider is not enabled, the events are simply dropped by the ETW infrastructure. To optimize for this case, we can check whether the event provider is enabled so that we don't even call the WriteEvent() definition in the base class. The base class performs the same check internally, but checking first at the call site avoids the cost of setting up the arguments for the call.

When an event provider (generated for an EventSource) is enabled, it receives a command to enable itself. We can hook into this command in a custom EventSource by overriding the OnEventCommand() method. There is no need to call the base class method, as it is a virtual method with an empty implementation.



EventSource caches the enabled state in a private field, so it can check cheaply whether it is enabled. There is no property on the EventSource type for this, but the IsEnabled() method does just that. A method call might make you suspect it is slow, doing extra work under the hood, but as the development team suggests, it is just a field fetch. We can verify this by decompiling the framework assembly from the Windows folder with dotPeek.



As a matter of fact, there are two overloads of the method. One is parameterless. The other is more interesting: it lets us check whether the provider is enabled for a particular level and keyword.
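For example, here is a minimal sketch of Optimization #1 using the level-and-keyword overload. The provider name, keyword, and event method are hypothetical:

```csharp
using System;
using System.Diagnostics.Tracing;

[EventSource(Name = "MyCompany-MyApp")]
public sealed class AppEventSource : EventSource
{
    public static readonly AppEventSource Log = new AppEventSource();

    public static class Keywords
    {
        public const EventKeywords Database = (EventKeywords)1;
    }

    [Event(1, Level = EventLevel.Informational, Keywords = Keywords.Database)]
    public void QueryExecuted(string query)
    {
        // Guard with IsEnabled() so that, when no session is listening (or the
        // level/keyword is filtered out), we skip the argument setup and the
        // call into WriteEvent() entirely.
        if (IsEnabled(EventLevel.Informational, Keywords.Database))
            WriteEvent(1, query);
    }
}
```

The parameterless IsEnabled() would work as well; the overload shown above additionally short-circuits events that are filtered out by level or keyword.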



Optimization # 2: Convert static field fetch to local member fetch
This suggestion is not about the EventSource implementation itself; it concerns the use of the EventSource instance at the logging site, i.e. the code from which we call the EventSource's methods. As we have seen in several examples on this blog, an EventSource implementation is singleton-based [See singleton], so each call involves a static field fetch at the call site. Here is an example of the usage:



Since a static field fetch is more expensive than an instance member fetch, we can assign the singleton instance to a local or instance member once and then use that member at the logging site.
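A sketch of this call-site change, assuming a singleton EventSource exposed through a static Log field (the type and member names are hypothetical):

```csharp
using System.Diagnostics.Tracing;

[EventSource(Name = "MyCompany-MyApp")]
public sealed class AppEventSource : EventSource
{
    public static readonly AppEventSource Log = new AppEventSource();

    [Event(1)]
    public void RequestStarted(string url) { WriteEvent(1, url); }
}

public class RequestHandler
{
    // One static field fetch when the handler is constructed...
    private readonly AppEventSource _log = AppEventSource.Log;

    public void Handle(string url)
    {
        // ...and the cheaper instance field fetch at every logging site.
        _log.RequestStarted(url);
    }
}
```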



Optimization #3: Minimize the number of EventSources in the application
In a previous post, we discussed the mapping between EventSource and ETW event providers. The framework registers an ETW event provider, using the ETW EventRegister API, for every EventSource in the application. This is what allows external controllers to pass commands to the EventSource.

The framework also maintains a list of the EventSources in the application domain. We can obtain this list using the static GetSources() method on the EventSource type.
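For instance, a small sketch that lists every EventSource registered in the current application domain:

```csharp
using System;
using System.Diagnostics.Tracing;

class ListEventSources
{
    static void Main()
    {
        // GetSources() returns all EventSource instances created in this AppDomain.
        foreach (EventSource source in EventSource.GetSources())
        {
            Console.WriteLine("{0} ({1})", source.Name, source.Guid);
        }
    }
}
```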



Both of these tasks are performed at startup. Since the cost depends on the number of EventSources in the application, a higher number of event sources means a slower startup. We can mitigate this by minimizing the number of EventSources in the application; I would definitely not suggest one EventSource per type. There are two options to achieve this.

The first option is to use partial types. We can span the same type across different files in the same assembly; the compiler combines all of these definitions into a single consolidated type. We generally organize our types in different folders in a project, where each folder represents types belonging to the same group. We can then put the event method definitions for each group into a separate partial-class file.
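A sketch of one EventSource split across files with partial classes (file, type, and event names are hypothetical):

```csharp
using System.Diagnostics.Tracing;

// AppEventSource.Database.cs -- events for the data-access types
[EventSource(Name = "MyCompany-MyApp")]
public sealed partial class AppEventSource : EventSource
{
    public static readonly AppEventSource Log = new AppEventSource();

    [Event(1)]
    public void QueryExecuted(string query) { WriteEvent(1, query); }
}

// AppEventSource.Ui.cs -- events for the UI types; the compiler merges
// both files into one type, so only one ETW provider is registered.
public sealed partial class AppEventSource
{
    [Event(2)]
    public void WindowOpened(string windowName) { WriteEvent(2, windowName); }
}
```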

Most real-life solutions span more than one project. In this case we should still minimize the number of EventSources, defining one for each group, in order to improve startup performance. Each event source can then take care of the instrumentation requirements for a group of types in your application, grouped logically or however you prefer.

Optimization # 4: Avoid fallback WriteEvent method with Object Array Parameter
There are various overloads of the WriteEvent() method in the EventSource type, which we call from our [Event] methods. One of these overloads takes a params object array. If none of the typed overloads matches your call, the compiler automatically falls back to this one.



We should avoid this fallback overload as much as possible for performance reasons, as it is reported to be 10 to 20 times more expensive. The arguments need to be cast (boxed) to object, an array has to be allocated, and the cast arguments are copied into the array before being serialized for the call.

To avoid the fallback, we can introduce a new WriteEvent() overload that matches our argument types exactly.
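A sketch based on the WriteEventCore pattern from the EventSource documentation. There is no typed WriteEvent(int, string, long) overload in the base class, so without the [NonEvent] overload below the call would fall back to WriteEvent(int, params object[]). Requires compiling with /unsafe; the provider and event names are hypothetical:

```csharp
using System;
using System.Diagnostics.Tracing;

[EventSource(Name = "MyCompany-MyApp")]
public sealed class AppEventSource : EventSource
{
    public static readonly AppEventSource Log = new AppEventSource();

    [Event(1)]
    public void FileLoaded(string fileName, long elapsedMs)
    {
        WriteEvent(1, fileName, elapsedMs); // resolves to the typed overload below
    }

    [NonEvent]
    private unsafe void WriteEvent(int eventId, string arg1, long arg2)
    {
        fixed (char* arg1Ptr = arg1) // assumes arg1 is non-null
        {
            EventData* data = stackalloc EventData[2];
            data[0].DataPointer = (IntPtr)arg1Ptr;
            data[0].Size = (arg1.Length + 1) * 2; // UTF-16 chars incl. null terminator
            data[1].DataPointer = (IntPtr)(&arg2);
            data[1].Size = sizeof(long);
            WriteEventCore(eventId, 2, data);
        }
    }
}
```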

Sunday, February 3, 2013

[MPGO] Managed Profile Guided Optimization - .net framework 4.5

In the previous post we discussed what native assembly images are and how they can help us improve application startup and facilitate code sharing. In this post, we are going to discuss another CLR feature provided in .NET Framework 4.5, called Managed Profile Guided Optimization. Unlike profile optimization, which improves JIT compilation based on the historical usage of the application by the user, profile guided optimization improves performance by optimizing the layout of native assembly images based on profile data gathered from test scenarios.

It must be remembered that this is the same optimization used for the .NET Framework assemblies themselves. Profilers have long helped software developers with code optimization; they are used to measure various statistics for running software, including memory footprint, memory leaks, and CPU and I/O usage. This feature simply creates profile data for assemblies so that NGEN [Native Image Generator] can generate better native binary images.

How does it work?
The workflow involving the use of MPGO is very simple. It is fed the list of IL assemblies along with the scenarios to run, and it generates the updated IL images in the specified folder. The simplified workflow can be presented as follows:


The first part of this procedure happens on the side of the vendor providing the software. The vendor uses [MPGO] to associate profile data with the IL assemblies. The assemblies are then shipped to the client in a deployment package, where they can be NGENed manually during installation. They can also be queued for NGENing later, when the machine is idle, to be taken care of by the Native Image Service [or the Native Image Task on Windows 8 machines]. If the installation package specifies nothing, the framework can also NGEN them automatically.

Running MPGO
MPGO can be run from the Developer Command Prompt of Visual Studio 2012 Ultimate. Please make sure that you are running the command prompt with administrator privileges.


It will ask for administrator privileges if your Developer Command Prompt is not running with them; you will get an access denied error if you do not grant them.


MPGO & Application Types
It must be remembered that neither native images nor MPGO are silver bullets for application performance. They are recommended for very large applications where start-up time is an issue, and only for desktop applications. Neither tool is recommended for ASP.NET and WCF applications, because of their dynamic deployment model and the significantly lower importance of start-up time for those application types. MPGO is not supported for Windows Store apps.

MPGO & Application Deployment
As discussed, MPGO requires a Visual Studio 2012 Ultimate license. Our clients certainly won't have this license on their machines, and they don't need to: the tool is not part of the .NET Framework but rather a Visual Studio tool. We create the MPGO-optimized IL assemblies ourselves and include them in our deployment package. During installation we can run NGEN to create the native images manually and copy them to the native image cache.

If your clients are on Windows 8 machines, these native images can also be created automatically while the application runs, for future application start-up performance gains.

Using MPGO:
Let's use some existing code to do some cool stuff with [MPGO]. We can use the project from one of our previous posts, developed while discussing Portable Class Libraries in the context of MVVM-based designs. You can find the project here. You can also download the code.

We will set the output folder for all the projects in the solution to [././ILAssemblies]. This creates a folder at the same level as the [bin] folder.


This is where all the assemblies would get copied once you build the project.


Let's create another folder where we want the optimized images of the assemblies to be copied. The optimized assemblies will be generated by [MPGO].


Running MPGO is easy; we just need to specify the required switches. Here we are running it for the assemblies (including DLLs and EXEs) in the [ILAssemblies] folder and generating the optimized images in the [OptimizedILImages] folder. [-Scenario] specifies how we want to run our application, including the start-up arguments. The other option is to use the profile data from previously optimized assemblies; in that case we don't need to run the application. This is supported through another switch, [-Import]. Please remember that [-Scenario] and [-Import] are mutually exclusive.
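Based on the MPGO documentation, the command could look like the following sketch (the folder and assembly names follow this example solution; check the exact switch spellings against your MPGO version):

```shell
mpgo -Scenario "ILAssemblies\CoffeeHouseApp.exe" ^
     -AssemblyList "ILAssemblies\CoffeeHouseApp.exe" "ILAssemblies\CoffeeHouseApp.PortableViewModels.dll" ^
     -OutDir "OptimizedILImages" ^
     -Timeout 120
```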


When we run this, MPGO launches the application. We run it through the pre-planned scenarios and then close it; as soon as we do, MPGO has all the data it needs to optimize the application. (Alternatively, a timeout can be specified, in which case MPGO profiles the application for that duration and uses the resulting data.) So it runs our [Coffee House App] as follows:


As soon as we close the application, we have our optimized assemblies in the [OutDir] folder.


[MPGO] doesn't overwrite the optimized assemblies already in the [OutDir] folder; it appends the next available integer to the file names. In this case, I have run the tool again; notice how it has appended "-1" to the names of the assemblies.


[Note] I don't want you to miss one important detail. Two of these assemblies are portable class libraries; it is really great to see that the tool supports them for the generation of optimized images.

Mechanism of Optimized Assembly Generation
This should be enough about using the tool, but we still need to see what is happening under the hood. When we run MPGO, it first compiles our assemblies and their dependencies into native code using NGEN, and copies only the native images of the source assemblies to the [OutDir] folder.


We can use Process Explorer to see how MPGO uses NGEN to create the native images. Remember we discussed MSCORSVW.exe in the previous post?


You can see the details about the assemblies in registry as follows:


When the compilation of these assemblies finishes, [MPGO] adds instrumentation to them.


After instrumenting the assemblies, MPGO launches the application as specified by the [-Scenario] switch, from the source directory. The assemblies are processed individually: as each one is compiled, it is picked up for instrumentation.


It then runs the application with the scenario and waits until the application finishes or the timeout expires. During this time, MPGO profiles the application and keeps the details in temporary storage. The tool creates IBC data files in the [OutDir] folder, saved with the [IBC] extension as follows:


As soon as the scenario finishes or the timeout expires, the instrumented assemblies are uninstalled.


Since MPGO has profiled the application while running the scenario, it copies the assemblies from the source folder, merges the profile data into them, and then removes the IBC data files. We now have the assemblies with the profile data embedded.


Using Optimized data from Previously optimized Assemblies
Once we have the optimized assemblies with the profile information, we don't need to run the scenario every time a new version of an assembly is created. We can source the profile information from the previously optimized assemblies and create the newer versions of the assemblies for native image generation. Let's create another folder, called [ReOptimizedILImages], at the same level as follows:


We specify the new versions of the assemblies with the -assemblyList or -assemblyListFile switch, and the previously optimized assemblies to take the profile data from with the -Import switch. MPGO processes them and generates the newer versions of the assemblies in the -OutDir folder.


Here is how we can specify the same in Command Prompt.
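Under the same assumptions about folder and assembly names as above, the -Import run could be sketched as:

```shell
mpgo -Import "OptimizedILImages" ^
     -AssemblyList "ILAssemblies\CoffeeHouseApp.exe" "ILAssemblies\CoffeeHouseApp.PortableViewModels.dll" ^
     -OutDir "ReOptimizedILImages"
```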


Finally we have the merged assemblies in the [-OutDir] folder.


It must be remembered that if an assembly (specified in -assemblyList / -assemblyListFile) has no corresponding assembly among those containing the profile data (specified with -Import), the merge simply fails for that particular assembly; the rest of the assemblies are still merged successfully. To see this, let's remove the [CoffeeHouseApp.PortableViewModels] assembly from the [-Import] folder, execute the same command as above, and notice the output in the command prompt.


MPGO In Enterprise Eco System
In the enterprise world, MPGO can be made part of the build process. As discussed, we first need to run the scenarios; this can be done on a sample machine. Running the scenarios against the application creates assemblies with the profile data, which can then be uploaded to a central location (or checked into source control).

Now build servers (like TeamCity or CruiseControl) come into action. We can make this one of the last build steps: pass the assemblies from the current build via -assemblyList and the checked-in assemblies via -Import. When MPGO runs, it creates the merged assemblies in the specified folder. It could get tricky, though, with build triggers and dependencies if this build triggers other builds on the build server.

[Note] If we are using Visual Studio, we can include a post-build event with the MPGO command and have the merged assemblies produced in the specified folder.

Thursday, March 17, 2011

WPF - Performance Improvement for MVVM Based Applications - Part # 3

This is the third part of our discussion about .NET features and techniques to improve the performance of MVVM-based applications. The other parts of this discussion are:

Part 1: http://shujaatsiddiqi.blogspot.com/2011/01/wpf-performance-improvement-for-mvvm.html

Part 2: http://shujaatsiddiqi.blogspot.com/2011/02/wpf-performance-improvement-for-mvvm.html

In this post we will discuss how we can use caching to improve the performance of an application. Caching support in desktop applications is a new feature of .NET Framework 4.0.

Assembly and Namespace:
Most of the classes used for the caching feature live in the System.Runtime.Caching namespace in the System.Runtime.Caching assembly. We need to add a reference to this assembly in order to use these types.


Extensible Caching Implementation:
The cache system available in .NET 4.0 has been designed from the ground up to be extensible. A cache provider must inherit from ObjectCache, available in the System.Runtime.Caching namespace. The provider that ships with .NET is MemoryCache, which represents an in-memory cache. It is similar to the ASP.NET cache, but you don't need System.Web.Caching, as it lives in the same System.Runtime.Caching namespace and assembly. Another benefit is that we can create multiple instances of MemoryCache in the same AppDomain.
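A minimal sketch of this: MemoryCache.Default is the shared instance, and we can create additional named instances in the same AppDomain (the cache name and keys below are arbitrary):

```csharp
using System;
using System.Runtime.Caching;

class CacheDemo
{
    static void Main()
    {
        // The shared, default in-memory cache.
        MemoryCache.Default.Set("greeting", "Hello", DateTimeOffset.MaxValue);

        // A second, independent cache instance with its own entries.
        var reportCache = new MemoryCache("reports");
        reportCache.Set("greeting", "Bonjour", DateTimeOffset.MaxValue);

        Console.WriteLine(MemoryCache.Default["greeting"]); // Hello
        Console.WriteLine(reportCache["greeting"]);         // Bonjour
    }
}
```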


Simple Caching Usage for temporal locality of reference:
Let's consider the example of a WPF application using this, utilizing the idea of temporal locality of reference. We will read the contents of a file; since I/O is a time-consuming operation, we cache the contents. We will then use this cached content in another window, showing it in a TextBlock. The definition of MainWindow is as follows:
<Window x:Class="WpfApp_MVVM_Caching.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:local="clr-namespace:WpfApp_MVVM_Caching"
        Title="MainWindow" Height="350" Width="525">
    <Window.DataContext>
        <local:MainWindowViewModel />
    </Window.DataContext>
    <Grid>
        <TextBlock Height="81" HorizontalAlignment="Left" Margin="12,17,0,0" 
                   Name="textBlock1" Text="{Binding FileContents}" VerticalAlignment="Top" Width="468" />
        <Button Content="Open Child Window" Height="26" HorizontalAlignment="Left" Margin="12,273,0,0" Name="btnOpenChildWindow" 
                VerticalAlignment="Top" Width="134" Click="btnOpenChildWindow_Click" />
    </Grid>
</Window>
This window has a TextBlock to display the contents of the file. The Text property of this TextBlock is bound to FileContents property of DataContext. The window also has a button. The definition of click event handler [btnOpenChildWindow_Click] from the code behind is as follows:
private void btnOpenChildWindow_Click(object sender, RoutedEventArgs e)
{
    new ChildWindow().Show();
}
This is just opening the ChildWindow as a modeless window using its Show method. MainWindow is using MainWindowViewModel instance as its DataContext. It is constructing its instance as part of its initialization code. The definition of MainWindowViewModel is as follows:
class MainWindowViewModel : INotifyPropertyChanged
{

    public MainWindowViewModel()
    {
        string localfileContents = File.ReadAllText(@"C:\Users\shujaat\Desktop\s.txt");
        FileContents = localfileContents;

        MemoryCache.Default.Set("filecontents", localfileContents, DateTimeOffset.MaxValue);
    }

    #region Properties
    
    private string _fileContents;
    public string FileContents
    {
        get { return _fileContents; }
        set
        {
            _fileContents = value;
            OnPropertyChanged("FileContents");
        }
    }

    #endregion

    #region INotifyPropertyChanged implementation

    public event PropertyChangedEventHandler PropertyChanged = delegate { };
    private void OnPropertyChanged(string propertyName)
    {
        PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
    }

    #endregion INotifyPropertyChanged implementation

}
As expected by the view, it just has a property FileContents. In the constructor, the contents of a file are read and assigned to this property. Since the view model implements INotifyPropertyChanged, the change is propagated to the view through the PropertyChanged event. Additionally, we copy the data read from the file to MemoryCache.Default with the key 'filecontents'.

The definition of ChildWindow is as follows:
<Window x:Class="WpfApp_MVVM_Caching.ChildWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:local="clr-namespace:WpfApp_MVVM_Caching"
        Title="Child Window" Height="300" Width="300">
    <Window.DataContext>
        <local:ChildWindowViewModel />
    </Window.DataContext>
    <Grid>
        <TextBlock Height="61" HorizontalAlignment="Left" Margin="36,49,0,0" Name="textBlock1" 
                   VerticalAlignment="Top" Width="206" Text="{Binding CacheEntryValue}" />
    </Grid>
</Window>
It has a TextBlock whose Text property is bound to CacheEntryValue property from the DataContext. The DataContext is an instance of ChildWindowViewModel. Its definition is as follows:
class ChildWindowViewModel : INotifyPropertyChanged
{
    private string _cacheEntryValue;
    public string CacheEntryValue
    {
        get { return _cacheEntryValue; }
        set
        {
            _cacheEntryValue = value;
            OnPropertyChanged("CacheEntryValue");
        }
    }

    public ChildWindowViewModel()
    {
        CacheEntryValue = MemoryCache.Default["filecontents"] as string;
    }

    #region INotifyPropertyChanged implementation

    public event PropertyChangedEventHandler PropertyChanged = delegate { };
    private void OnPropertyChanged(string propertyName)
    {
        PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
    }

    #endregion INotifyPropertyChanged implementation
}
We are just copying the contents of the cache entry, set in MainWindow, to the Text property of the TextBlock. This is temporal locality of reference at work: the file is read once and its contents reused from the cache. Let's run the application and open the child window.


As you might already have guessed, the contents of the file are: "Muhammad Shujaat Siddiqi".

Changes in Underlying DataSource:
A cache keeps data available in an application for faster access when it is needed. The user of the cache might or might not know about the actual data source. The caching feature in .NET Framework 4.0 keeps track of the data source through the provision of ChangeMonitor, which is essentially an implementation of the Observer design pattern. When the underlying data source changes, the ChangeMonitor notifies the ObjectCache implementation [MemoryCache] about the change, and the cache can take appropriate action; it might even evict the cache entry, as it is now invalid [as MemoryCache does].


Let us create a new window to present an example of a WPF application using ChangeMonitor. The window below is similar to the one in the above example; it just has an extra TextBox and a Button.
<Window x:Class="WpfApp_MVVM_Caching.MainWindowChangeMonitorExample"
         xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:local="clr-namespace:WpfApp_MVVM_Caching"
        Title="MainWindow" Height="350" Width="525">
    <Window.DataContext>
        <local:MainWindowChangeMonitorExampleViewModel />
    </Window.DataContext>
    <Grid>
        <TextBlock Height="81" HorizontalAlignment="Left" Margin="12,17,0,0" 
                   Name="textBlock1" Text="{Binding FileContents}" VerticalAlignment="Top" Width="468" />
        <TextBox Height="138" HorizontalAlignment="Left" Margin="279,125,0,0" Name="textBoxCachedFileContents"
                 VerticalAlignment="Top" Width="212" Text="{Binding Path=MessageToDataSource, UpdateSourceTrigger=PropertyChanged}" />
        <Button Content="Update Data Source" Height="30" HorizontalAlignment="Left" 
                Margin="314,269,0,0" Name="btnUpdateDataSource" VerticalAlignment="Top" 
                Width="177" Command="{Binding UpdateDataSourceCommand}" />
        <Button Content="Open Child Window" Height="26" HorizontalAlignment="Left" Margin="12,273,0,0" Name="btnOpenChildWindow" 
                VerticalAlignment="Top" Width="134" Click="btnOpenChildWindow_Click" />
    </Grid>
</Window>
The above view uses a new instance of MainWindowChangeMonitorExampleViewModel as its DataContext. It expects a string property FileContents to bind to the Text property of the TextBlock, and a MessageToDataSource property to bind to the Text property of the only TextBox. When the user clicks the Update Data Source button, the contents of the TextBox should be written to the data source (the file). Since FileContents is then stale, it should be refreshed with the latest contents; for that, the view model uses a ChangeMonitor. But first, have a look at the code behind of this view.
public partial class MainWindowChangeMonitorExample : Window
{
    public MainWindowChangeMonitorExample()
    {
        InitializeComponent();
    }

    private void btnOpenChildWindow_Click(object sender, RoutedEventArgs e)
    {
        new ChildWindow().Show();
    }
}
This is similar to the view in the previous examples. When the user clicks "Open Child Window", a new instance of ChildWindow is shown in a modeless fashion. Now let's have a look at the view model. Get ready to see change monitors in action...
class MainWindowChangeMonitorExampleViewModel : INotifyPropertyChanged
{
    #region Fields

    List<string> filePaths = new List<string>();

    #endregion Fields

    #region Constructors

    public MainWindowChangeMonitorExampleViewModel()
    {
        filePaths.Add(@"C:\Users\shujaat\Desktop\s.txt");
        
        string localfileContents = File.ReadAllText(@"C:\Users\shujaat\Desktop\s.txt");
        FileContents = localfileContents;

        // A ChangeMonitor supports a single use: it can either be inserted
        // into the cache through a CacheItemPolicy (the cache then evicts the
        // entry when the monitor fires) or be given a callback through
        // NotifyOnChanged(), not both. Since we want to refresh the entry
        // ourselves, we register the callback.
        var changeMonitor = new HostFileChangeMonitor(filePaths);
        changeMonitor.NotifyOnChanged(OnDataSourceUpdated);

        MemoryCache.Default.Set("filecontents", localfileContents, new CacheItemPolicy());

    }

    #endregion Constructors

    #region Change Monitor Callback

    private void OnDataSourceUpdated(Object State)
    {
        //Get file contents
        FileInfo file = new FileInfo(@"C:\Users\shujaat\Desktop\s.txt");
        using (FileStream stream = file.OpenRead())
        {
            using (StreamReader reader = new StreamReader(stream))
            {
                string updatedFileContents = reader.ReadToEnd();
                reader.Close();
                FileContents = updatedFileContents;
            }
        }

        //Update Cache Entry
        MemoryCache.Default["filecontents"] = FileContents;

        // A ChangeMonitor fires only once, so create a fresh monitor and
        // re-register the callback to keep watching for further file changes
        var changeMonitor = new HostFileChangeMonitor(filePaths);
        changeMonitor.NotifyOnChanged(OnDataSourceUpdated);
    }

    #endregion Change Monitor Callback

    #region Properties

    private string _fileContents;
    public string FileContents
    {
        get { return _fileContents; }
        set
        {
            _fileContents = value;
            OnPropertyChanged("FileContents");
        }
    }

    private string _messageToDataSource;
    public string MessageToDataSource
    {
        get { return _messageToDataSource; }
        set
        {
            _messageToDataSource = value;
            OnPropertyChanged("MessageToDataSource");
        }
    }

    #endregion

    #region INotifyPropertyChanged implementation

    public event PropertyChangedEventHandler PropertyChanged = delegate { };
    private void OnPropertyChanged(string propertyName)
    {
        PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
    }

    #endregion INotifyPropertyChanged implementation

    #region Commands

    ICommand _updateDataSourceCommand;
    public ICommand UpdateDataSourceCommand
    {
        get
        {
            if (_updateDataSourceCommand == null)
            {
                _updateDataSourceCommand = new RelayCommand(
                    param => this.UpdateDataSource(),
                    param => this.CanUpdateDataSource
                    );
            }
            return _updateDataSourceCommand;
        }
    }

    public void UpdateDataSource()
    {
        FileInfo file = new FileInfo(@"C:\Users\shujaat\Desktop\s.txt");
        using (FileStream stream = file.OpenWrite())
        {
            using (StreamWriter writer = new StreamWriter(stream) { AutoFlush = true })
            {
                writer.Write(MessageToDataSource);
                writer.Close();
            }
        }
    }

    bool CanUpdateDataSource
    {
        get { return true; }
    }

    #endregion Commands
}
In the constructor, as in the previous example, it reads the contents of a text file and adds them to the cache. Additionally, it creates a HostFileChangeMonitor over the list of file paths and registers the OnDataSourceUpdated callback through NotifyOnChanged(); when any of the monitored files is updated, the callback executes. HostFileChangeMonitor is a descendant of ChangeMonitor through FileChangeMonitor. In non-ASP.NET applications it internally uses FileSystemWatcher to monitor the files. It must be remembered that it does not support relative paths.

As expected by the view, it has two string properties, FileContents and MessageToDataSource. It also has a RelayCommand object, UpdateDataSourceCommand, which is executed when the user clicks the Update Data Source button. In the execute method [UpdateDataSource] of this command, we just update the contents of the file using a StreamWriter.

The soul of this example is the OnDataSourceUpdated method, specified as the callback for change notifications from the HostFileChangeMonitor. When the contents of the file are updated, this method is called; in it, we update the cache entry and the string property bound to the TextBlock in the view showing the contents of the file.

Now specify the new view as startup view of the project in App.xaml and run the application. The view appears as follows:


Now let us update the contents of the TextBox and click the "Update Data Source" button. As the contents of the file are updated, the ChangeMonitor invokes the callback, which updates the cache entry and the FileContents property with the updated contents of the file. Since this property is bound to TextBlock.Text in the view, the view shows the update.


Now we open the child window using "Open Child Window" button. The child window uses the same cache entry. The child window appears as follows:


Now update the contents of the file again by changing the contents of the TextBox and hitting the "Update Data Source" button while the ChildWindow is still shown. We are updating the contents with one of Jalaluddin Rumi's great quotes.


Cache Entry Change Monitor:

As you can see, clicking the Update Data Source button after editing the contents of the TextBox updates the TextBlock on the main window, but it does not update the contents of the second window. This is because the ChildWindow does not know that the cache entry value it used for its TextBlock has been updated. If the ChildWindow had some means of being notified about this cache entry update, it could refresh itself. For this purpose we can use another ChangeMonitor, called CacheEntryChangeMonitor.

Let's update the constructor of ChildWindowViewModel as follows:
public ChildWindowViewModel()
{
    CacheEntryValue = MemoryCache.Default["filecontents"] as string;
    MemoryCache.Default.CreateCacheEntryChangeMonitor(
        new string[] { "filecontents" }.AsEnumerable<string>()).NotifyOnChanged((a) =>
        {
            CacheEntryValue = MemoryCache.Default["filecontents"] as string;
        }
    );
}
Here we have just added an instance of CacheEntryChangeMonitor to the code. Again we are using the same cache entry; the only difference is that we have added a NotifyOnChanged callback for it. As the entry gets updated, this callback is called and it updates the property bound to TextBlock.Text in the view, refreshing it.


Download Code:

Friday, February 25, 2011

WPF - Performance Improvement for MVVM Applications - Part # 2

This is the second part of our discussion about performance improvement of an MVVM based application. You can find the other part of this discussion here:

- Part - 1: http://shujaatsiddiqi.blogspot.com/2011/01/wpf-performance-improvement-for-mvvm.html

- Part - 3: http://shujaatsiddiqi.blogspot.com/2011/03/wpf-performance-improvement-for-mvvm.html

In this example we will see how we can utilize Lazy Initialization to improve the performance of an MVVM based application. Lazy initialization is a feature newly available in .net framework 4.0. As I remember, it is based on a suggestion by Tomas Petricek which was recently incorporated into the framework, as part of an effort to bring lazy execution to C# code. You might already know that LINQ queries support lazy evaluation: they are only evaluated the first time they are enumerated, and if they are never enumerated then no evaluation takes place at all. With the new Lazy initialization feature, the framework now also supports lazy initialization of objects.

http://msdn.microsoft.com/en-us/vcsharp/bb870976

In this example we will base our discussion on a hierarchical view model. The main view consists of some collection based controls. Each of these controls also needs to define templates specifying how the items in the collection are displayed on the screen.

Let's consider a view for a Course with just two students. The students' data should be displayed as tab items in a TabControl. The student information has only two fields, first and last name. The student's first name should be displayed as the header of its TabItem; if the data has not been entered yet, the header should display "Default". The XAML definition of the view is as follows:
<Window x:Class="WpfApp_MVVM_LazyInitialize.MainWindow"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:local="clr-namespace:WpfApp_MVVM_LazyInitialize"
        Title="MainWindow" Height="350" Width="525">
    <Window.DataContext>
        <local:MainWindowViewModel />
    </Window.DataContext>
    <Grid>
        <TabControl Height="288" HorizontalAlignment="Left" Margin="12,12,0,0"
                    Name="tabControl1" VerticalAlignment="Top" Width="480"
                    ItemsSource="{Binding StudentViewModels}" >
            <TabControl.ItemTemplate>
                <DataTemplate>
                    <TextBlock Text="{Binding StudentFirstName}" />
                </DataTemplate>
            </TabControl.ItemTemplate>
            <TabControl.ContentTemplate>
                <DataTemplate>
                    <Grid>
                        <Label Content="First Name" Height="27" HorizontalAlignment="Left"
                               Margin="12,29,0,0" Name="label1" VerticalAlignment="Top" Width="105" />
                        <TextBox Height="30" HorizontalAlignment="Left" Margin="123,29,0,0"
                                 Name="textBoxFirstName" VerticalAlignment="Top" Width="345"
                                 Text="{Binding Path=StudentFirstName}" />
                        <Label Content="Last Name" Height="27" HorizontalAlignment="Left" Margin="12,65,0,0"
                               Name="label2" VerticalAlignment="Top" Width="105" />
                        <TextBox Height="30" HorizontalAlignment="Left" Margin="123,65,0,0"
                                 Name="textBoxLastName" VerticalAlignment="Top" Width="345"
                                 Text="{Binding StudentLastName}" />
                    </Grid>
                </DataTemplate>
            </TabControl.ContentTemplate>
        </TabControl>
    </Grid>
</Window>

We need two DataTemplates for the TabControl: one for the header, i.e. the ItemTemplate, and the other for the content of each TabItem, i.e. the ContentTemplate. The ItemsSource of the TabControl is bound to the StudentViewModels collection. Each member of this collection must have at least two properties, StudentFirstName and StudentLastName. The ItemTemplate is just a TextBlock whose Text property is bound to the item's StudentFirstName property. The ContentTemplate has two TextBoxes: one bound to StudentFirstName and the other to StudentLastName. We have kept the code behind of the view as default.

The DataContext of the above view is set to a new instance of MainWindowViewModel. It implements INotifyPropertyChanged, so it must provide a definition of the PropertyChanged event as part of that contract. As required by the view, it exposes a StudentViewModels collection, provided as an ObservableCollection whose members are of type StudentViewModel.
class MainWindowViewModel : INotifyPropertyChanged
{
    ObservableCollection<StudentViewModel> _studentViewModels =
        new ObservableCollection<StudentViewModel>();

    public ObservableCollection<StudentViewModel> StudentViewModels
    {
        get { return _studentViewModels; }
    }

    public MainWindowViewModel()
    {
        _studentViewModels.Add(new StudentViewModel());
        _studentViewModels.Add(new StudentViewModel());
    }

    public event PropertyChangedEventHandler PropertyChanged;
    private void OnPropertyChanged(string propertyName)
    {
        if (PropertyChanged != null)
        {
            PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
        }
    }
}

In the constructor of the above view model we have added two members to the collection. As expected, they are of type StudentViewModel, and they will be displayed as tab items in the view. The definition of StudentViewModel is as follows:
class StudentViewModel : INotifyPropertyChanged
{
    Lazy<Student> _model = new Lazy<Student>();

    string _studentFirstName;
    public string StudentFirstName
    {
        get { return _studentFirstName; }
        set
        {
            if (_studentFirstName != value)
            {
                _studentFirstName = value;
                _model.Value.StudentFirstName = value;
                OnPropertyChanged("StudentFirstName");
            }
        }
    }

    string _studentLastName;
    public string StudentLastName
    {
        get { return _studentLastName; }
        set
        {
            if (_studentLastName != value)
            {
                _studentLastName = value;
                _model.Value.StudentLastName = value;
                OnPropertyChanged("StudentLastName");
            }
        }
    }

    public StudentViewModel()
    {
        _studentFirstName = "Default";
    }

    public event PropertyChangedEventHandler PropertyChanged;
    private void OnPropertyChanged(string propertyName)
    {
        if (PropertyChanged != null)
        {
            PropertyChanged(this, new PropertyChangedEventArgs(propertyName));
        }
    }
}

As expected by the view it has two properties StudentFirstName and StudentLastName. It implements INotifyPropertyChanged to support change notification.

The above view model uses Student as its model. When the view model receives updates to these properties through the WPF Binding System, it just passes those updates on to the model. Instances created with the Lazy initialization feature are accessed through the Value property of the Lazy object reference; we access the StudentFirstName and StudentLastName properties this way. Let us assume the model requires heavy instantiation. We just display a MessageBox in the constructor, so we can tell exactly when the constructor is called. This is how we observe the lazy initialization of Student.
class Student
{
    public string StudentFirstName { get; set; }
    public string StudentLastName { get; set; }

    public Student()
    {
        MessageBox.Show("Student constructor called");
    }
}

Now let us run this. The application is shown as follows:



Enter some data in the First Name field of the first tab item. As soon as focus moves to the other field, the default Binding of the TextBox's Text property triggers the source update on LostFocus.


When you close the message box, you can see the header of the TabItem updated with the first name you entered.



Deciding which Constructor to use for initialization:
We can also decide which constructor of the type we want to use for initialization, via the constructors of the Lazy class: Lazy<T> accepts a Func<T> delegate as an argument in some of its overloads.
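A small sketch of the Func<T> overload, using a Student class with an illustrative non-default constructor (an assumption; the article's Student has only a default constructor):

```csharp
using System;

class Student
{
    public string StudentFirstName { get; set; }
    public Student(string firstName) { StudentFirstName = firstName; }
}

class Program
{
    static void Main()
    {
        // The factory delegate picks the constructor; it runs only on the
        // first access of .Value, not when the Lazy<T> itself is created.
        var lazyStudent = new Lazy<Student>(() => new Student("Default"));

        Console.WriteLine(lazyStudent.IsValueCreated);          // False
        Console.WriteLine(lazyStudent.Value.StudentFirstName);  // Default
        Console.WriteLine(lazyStudent.IsValueCreated);          // True
    }
}
```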

In a multithreaded scenario:
In a multithreaded scenario, the first thread to use the lazy instance runs the initialization procedure, and its value is then seen by the other threads. (Depending on the thread-safety mode, other threads may also run the initializer, but their values are discarded.) In such scenarios we can use one of the overloads of the EnsureInitialized method of LazyInitializer for even better performance. It too can use either the default constructor or a specialized one via a Func delegate.

http://msdn.microsoft.com/en-us/library/system.threading.lazyinitializer.ensureinitialized.aspx
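A sketch of LazyInitializer.EnsureInitialized with a Func delegate; the Student class and GetStudent helper are illustrative assumptions:

```csharp
using System;
using System.Threading;

class Program
{
    class Student
    {
        public string StudentFirstName = "Default";
    }

    static Student _student;  // initialized on first use, no Lazy<T> wrapper allocated

    static Student GetStudent()
    {
        // Thread-safe: the first thread to finish the factory wins; results
        // produced by racing threads are discarded.
        return LazyInitializer.EnsureInitialized(ref _student, () => new Student());
    }

    static void Main()
    {
        Console.WriteLine(GetStudent().StudentFirstName);  // Default
    }
}
```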

Download Code:

Friday, August 22, 2008

Performance Improvement of SQL Server

In very early ages we used to say that "information is virtue"; then it changed to "information is power". In the present age of information technology, information is not power any more, because information is everywhere; what matters is the correct and timely use of it. Take an example: two recruitment agencies both have countless resumes of job seekers in their databases. An organization reaches both of them and asks for a list of candidates matching particular criteria. The firm with the faster mechanism to fetch the information from its database will get the contract. So in the present age, only those organizations that have faster access to information will survive. This is a "survival of the fastest" scenario for the market.

Today I want to discuss several things you can do to improve the performance of your SQL Server based systems.

Hardware:

Use Faster Storage Technologies:
It is true that by writing efficient code we can improve performance, but this code has to run on hardware, so by using efficient hardware we can improve the performance of our systems as well. These days we use multi-core processors and large amounts of fast primary memory, but our database systems depend on data stored on persistent storage drives. A typical magnetic drive spins at 7200 rpm, and this proves to be a bottleneck. Being software people, we are generally not concerned with the speed of our hard drives; but just by using storage that is more responsive to the data requirements of our system, we can improve our systems manyfold.

Having discussed this, a question pops up in our minds: can we increase the speed of our storage? The answer is YES, by using drives with higher speeds. Gone are the days when organizations had to use magnetic storage drives because of their capacity, cost and reliability. These days organizations are moving towards solid state options for persistent storage. Such storage has far better random access ability, so it can serve data much faster than its magnetic counterparts.

The only problem is that these drives are not marketed as well as they deserve to be. But as enthusiasts, we should dig into the technology to get the best out of it.

Software:

a. Efficient use of indexes
First things first: an index does not mean improved performance for every operation. If a table sees mostly SELECT operations, index the relevant columns. But if there are more inserts / updates / deletes, think twice before adding one: every index must be maintained on each write, which further slows those operations and also creates fragmentation problems.
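As a sketch, indexing a column that is frequently filtered in SELECTs might look like this (the table and column names are illustrative):

```sql
-- Speeds up queries such as: SELECT ... FROM dbo.Candidates WHERE LastName = 'Khan'
CREATE NONCLUSTERED INDEX IX_Candidates_LastName
    ON dbo.Candidates (LastName);

-- For write-heavy tables, weigh this against the extra cost every
-- INSERT / UPDATE / DELETE pays to maintain the index.
```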

b. Computed Columns:
This is another decision you have to take based on your scenario. If you have more inserts / updates, then computed columns can significantly reduce the performance of your system. If instead your system involves more SELECT operations, then computed columns can prove to be a step towards drastic performance improvement.

c. Selecting the data types of columns in a table
This aspect is often ignored whenever improvement is discussed. Always consider the following points when you have to select a data type for one of your columns.

• You should have a very good reason not to use non-Unicode VARCHAR when you have to deal with character data. Valid reasons include needing to store more than 8000 bytes, or special Unicode characters you have to support. Otherwise, this data type will serve you very well.

• Always remember that integer based columns are much faster to sort in a result set when used in an ORDER BY clause.

• Never use TEXT or BIGINT data types unless you really need the extra storage.

• Using Integer based data type for Primary key would improve the performance compared to text based or floating point based ones.

d. Bulk Insert / Updates
Whenever bulk inserts or updates are involved, the FULL recovery model fully logs every row, which slows the operation down. What you can do is switch the recovery model to BULK_LOGGED (which minimally logs bulk operations), do your operation, and then change the model back to the previous one.
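The switch is a one-line ALTER DATABASE in each direction; the database name here is illustrative:

```sql
-- Minimally log bulk operations for the duration of the load.
ALTER DATABASE Recruitment SET RECOVERY BULK_LOGGED;

-- ... perform the bulk INSERT / UPDATE here ...

-- Restore the previous recovery model (and take a log backup afterwards,
-- since point-in-time recovery is limited across minimally logged work).
ALTER DATABASE Recruitment SET RECOVERY FULL;
```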

Index could also create delay for these operations.

Always try to pass data as XML to your stored procedures and use OPENXML for bulk INSERT or UPDATE. The operation is then done through a single INSERT / UPDATE statement as a SET based operation. Avoid using a DataAdapter for updates, even if you set UpdateBatchSize. This is because it is not SET based: more than one UPDATE statement is still sent to SQL Server. Batching only reduces the round trips between your application and SQL Server; the number of UPDATE statements executed on the server stays the same. The maximum value of UpdateBatchSize is also limited, whereas with the XML option you can send the complete data of a DataTable in one call.
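A sketch of a set-based update via OPENXML inside a stored procedure; the @studentXml parameter, XML shape, table and column names are all assumptions for illustration:

```sql
-- @studentXml is assumed to be an input parameter of the procedure, e.g.
-- '<Students><Student StudentId="1" FirstName="Ali"/>...</Students>'
DECLARE @hdoc INT;
EXEC sp_xml_preparedocument @hdoc OUTPUT, @studentXml;

-- One SET based statement updates every row carried in the XML.
UPDATE s
SET    s.FirstName = x.FirstName
FROM   dbo.Students AS s
JOIN   OPENXML(@hdoc, '/Students/Student', 1)
       WITH (StudentId INT, FirstName VARCHAR(100)) AS x
  ON   s.StudentId = x.StudentId;

EXEC sp_xml_removedocument @hdoc;  -- release the parsed document's memory
```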

e. Parameterized queries
You might have read that compiled queries run faster. This is because the DBMS does not have to create a plan for the query on each execution; with parameterized queries the plan is cached and reused. Parameterization also helps as a precaution against SQL injection attacks.
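From ADO.NET this might look like the following sketch; the connection string, table and column names are assumptions:

```csharp
using System;
using System.Data.SqlClient;

class CandidateSearch
{
    // A parameterized query: SQL Server caches and reuses one plan for every
    // lastName value, and the value can never be interpreted as SQL text.
    static void FindByLastName(string connectionString, string lastName)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT FirstName, LastName FROM dbo.Candidates WHERE LastName = @lastName",
            connection))
        {
            command.Parameters.AddWithValue("@lastName", lastName);
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader["FirstName"]);
            }
        }
    }
}
```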

f. Stored procedures
Stored procedures also help in preventing SQL injection attacks. They decrease the round trips between the application and the database server because many statements are combined in one procedure, and that bunch of statements can run as a single transaction. There are many other benefits, but the ones specified are enough to make stored procedures a first priority.

g. Stop lavishing use of explicit Cursors
This has been said by so many people so many times that repeating it is hardly necessary, so I will just share my own example. A few years back I was given a stored procedure in which the previous developer had made lavish use of explicit cursors. Its execution time had risen to 40 minutes because of the growth of data in the database, and was expected to increase further as the data grew. By just replacing these cursors with SET based operations, I brought that execution time down to 3 seconds. From this example alone you can see that careless use of cursors can kill the performance of your system.
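The shape of such a rewrite is usually the same: a FETCH loop issuing one statement per row collapses into a single statement. The table, columns and business rule below are illustrative, not the procedure from the story:

```sql
-- Before: row-by-row processing, one UPDATE per fetched row.
DECLARE cur CURSOR FOR SELECT StudentId FROM dbo.Students WHERE Score >= 50;
-- ... OPEN cur; FETCH loop running
--     UPDATE dbo.Students SET GradeFlag = 'P' WHERE StudentId = @id;
-- ... CLOSE cur; DEALLOCATE cur;

-- After: one SET based statement the optimizer can plan as a single unit.
UPDATE dbo.Students
SET    GradeFlag = 'P'
WHERE  Score >= 50;
```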