Showing posts with label C#. Show all posts

Tuesday, December 9, 2014

Using Overridden members with Partial Mocks for abstract Types

For serious unit testing, a mocking framework is a required piece. It allows us to create mocks for dependencies we don't want to instantiate in our code. We discussed partial mocks a long time ago [refer post]. They are especially useful for testing abstract types, since they provide a proxy implementation for the virtual / abstract members.

In this post we are going to discuss a special case: we have overridden some members of a parent type, but not others, or we are happy with the default null proxy implementation for the rest. Generally the mocking framework provides a proxy implementation for all virtual / abstract members. How can we make sure that it uses the implementation in our type where we have provided one?

For our demo project, we are using NUnit with Moq. Let's create a test project and add the following NuGet packages.



Here we have introduced BaseClass with a public member [Calculate] and two abstract members [CalculateCore and SomeOtherMethod]. It is inherited by another type, ChildClass, which provides an implementation of CalculateCore only.
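A minimal sketch of these types, assuming integer parameters and that CalculateCore simply adds them:

```csharp
public abstract class BaseClass
{
    // Public member; delegates the actual work to the abstract core method.
    public int Calculate(int a, int b)
    {
        return CalculateCore(a, b);
    }

    public abstract int CalculateCore(int a, int b);

    public abstract void SomeOtherMethod();
}

// ChildClass overrides CalculateCore only. SomeOtherMethod remains
// abstract, which is why we still need a partial mock to instantiate it.
public abstract class ChildClass : BaseClass
{
    public override int CalculateCore(int a, int b)
    {
        return a + b;
    }
}
```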


Now let us try to add unit tests for this type. Here we have added a method to check the result of the Calculate method. Since we have provided a definition of the CalculateCore member, it should add the two parameters and return the result.


When we run this test, it fails. If we look at the result, it doesn't make a lot of sense: the mock is not even using the CalculateCore implementation from ChildClass.



But can we actually tell the mocking framework to avoid providing an implementation for overridden members? Moq supports this through the CallBase member on Mock&lt;T&gt;. Here we have another test for the same ChildClass. The only difference is that the mock has this member set. Now it should use the overridden member, CalculateCore, and the test should pass.
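A sketch of the passing test, assuming the types above; CallBase is the actual Moq member, the other names are illustrative:

```csharp
using Moq;
using NUnit.Framework;

[TestFixture]
public class ChildClassTests
{
    [Test]
    public void Calculate_UsesOverriddenCalculateCore()
    {
        // CallBase = true tells Moq to call the real (overridden)
        // implementation where one exists, instead of a proxy stub.
        var mock = new Mock<ChildClass> { CallBase = true };

        var result = mock.Object.Calculate(2, 3);

        Assert.AreEqual(5, result);
    }
}
```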


And the test passes now, Zindabad!!!

Thursday, September 18, 2014

Amazon S3 Sink for Semantic Logging Service – Enterprise Library 6

The EventSource API was introduced in .NET Framework 4.5 to support semantic logging. It allows an application to author events, which are generated as ETW events and can be consumed in-proc or out-proc. For general in-proc consumption, the API introduced the EventListener type.

The Semantic Logging Application Block (SLAB) is developed on top of the EventSource API with more focus on event consumption. It introduces sinks, utilized by the ETW event listener in the application block, which are destination specific. The Patterns & Practices (p&p) team...

Tuesday, March 18, 2014

WPF - Custom ToolTip Placement with more Options

WPF gives us various options to customize ToolTips, including options for their placement. But these are very limited, covering only positioning the ToolTip on the top, bottom, right or left of a control. In this post, we are going to create a behavior to place our ToolTip more precisely. The behavior should allow us to position it around different edges and corners, e.g. Bottom-Right.



Let us create a WPF project for defining this behavior. First we need to add enumerations that allow users to specify the placement of the ToolTip.
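A sketch of these enumerations; the exact names are assumptions:

```csharp
// Where the ToolTip sits horizontally relative to the target control.
public enum HorizontalPlacement
{
    Left,
    Center,
    Right
}

// Where the ToolTip sits vertically relative to the target control.
public enum VerticalPlacement
{
    Top,
    Center,
    Bottom
}
```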


Now let us add a behavior for placement of the ToolTip. You would need to add an assembly reference to System.Windows.Interactivity from the Blend SDK to use Behavior&lt;T&gt;. In order to place the ToolTip, we are using the two enumerations defined above. Since these would be static values assigned in XAML, we are introducing them as CLR properties. If we needed to bind them, we could add them as dependency properties.


Since we are placing the ToolTip ourselves, we can use the CustomPopupPlacementCallback delegate defined for a ToolTip. We can assign a computation method to the delegate, which calculates the placement of the ToolTip based on the options specified in the HorizontalPlacement and VerticalPlacement properties.
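A sketch of the behavior's wiring. PlacementMode.Custom makes WPF invoke the CustomPopupPlacementCallback; member names other than the WPF ones are assumptions, and the offset calculations are simplified to the bottom-right case:

```csharp
using System.Windows;
using System.Windows.Controls;
using System.Windows.Controls.Primitives;
using System.Windows.Interactivity;

public class ToolTipPlacementBehavior : Behavior<FrameworkElement>
{
    // Static values assigned from XAML, hence plain CLR properties.
    public HorizontalPlacement HorizontalPlacement { get; set; }
    public VerticalPlacement VerticalPlacement { get; set; }

    protected override void OnAttached()
    {
        base.OnAttached();

        var toolTip = AssociatedObject.ToolTip as ToolTip;
        if (toolTip != null)
        {
            // Custom placement hands positioning over to our callback.
            toolTip.Placement = PlacementMode.Custom;
            toolTip.CustomPopupPlacementCallback = PlaceToolTip;
        }
    }

    protected virtual CustomPopupPlacement[] PlaceToolTip(
        Size popupSize, Size targetSize, Point offset)
    {
        var location = new Point(
            CalculateHorizontalOffset(popupSize, targetSize, offset),
            CalculateVerticalOffset(popupSize, targetSize, offset));

        return new[] { new CustomPopupPlacement(location, PopupPrimaryAxis.None) };
    }

    // Protected virtual so a child behavior can override the calculations.
    protected virtual double CalculateHorizontalOffset(
        Size popupSize, Size targetSize, Point offset)
    {
        // HorizontalPlacement.Right: move past the target's width.
        return targetSize.Width + offset.X;
    }

    protected virtual double CalculateVerticalOffset(
        Size popupSize, Size targetSize, Point offset)
    {
        return targetSize.Height + offset.Y;
    }
}
```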


Now we need to calculate where we want to show the ToolTip. In addition to the options specified by the VerticalPlacement and HorizontalPlacement properties, the delegate arguments provide us the size of the target control (the targetSize parameter) on which the ToolTip is being displayed, and the size of the ToolTip itself (the popupSize parameter). The behavior's user can also specify further offsets for the ToolTip using the ToolTipService.VerticalOffset and ToolTipService.HorizontalOffset attached properties on the control; these are available in the offset parameter. Now we have all the details to calculate the exact position to place the ToolTip.


Let's first define the templates for our ToolTip. This is how we want to display it.


Now let us see how to add this to a target control. Here we are adding our behavior to a Border control. Using the attached properties in ToolTipService, we have specified that we want to place the ToolTip based on custom calculations. For a complete description of the features of ToolTipService, refer to MSDN.


It must be remembered that, by default, when we just select the Custom placement option, all the calculations use the top left of the target control as the reference point.



Now let's see how to calculate the vertical and horizontal offsets of the ToolTip. For the horizontal offset, we just need to move it all the way to the right side of the target, so we can use the target's Width for this purpose. We can also add any horizontal offset that might be defined on the control. The vertical offset is calculated similarly.


Here is how we are calculating the horizontal offset. From this, you can infer the calculation of the vertical offset.
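The calculation itself is plain arithmetic; expressed as pure functions (the method names are illustrative):

```csharp
public static class PlacementMath
{
    // Bottom-right placement: push the ToolTip just past the target's
    // right edge, then apply any ToolTipService.HorizontalOffset the
    // user specified on the control.
    public static double HorizontalOffsetForRight(double targetWidth, double extraOffset)
    {
        return targetWidth + extraOffset;
    }

    // The vertical analogue, using the target's height.
    public static double VerticalOffsetForBottom(double targetHeight, double extraOffset)
    {
        return targetHeight + extraOffset;
    }
}
```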



What if the ToolTip cannot be displayed at the Calculated Position?
ToolTips depend on the location of the target control, and the window containing that control might be situated such that the ToolTip would be cut off by the screen edges. In that case, with just the above code, WPF moves the ToolTip on its own so that it can be shown. But we can fully control the alternate location as well. If we look at the signature of the CustomPopupPlacementCallback delegate, it returns an array. The placement options are prioritized: the first element has the highest priority, and the runtime uses the next one if the ToolTip cannot be displayed at the first location.

If we look at the definition of the above behavior, we defined the calculation methods as protected virtual. We can override them in a child behavior to achieve the required effect. Let's do it as follows:


Get it from Git

Wednesday, February 26, 2014

WPF CollectionViewSource and Live Shaping - WPF 4.5 & 4.0

Microsoft improved CollectionViewSource in WPF 4.5 to include Live Shaping. Live shaping is an idea in UI design which allows dependents to be re-evaluated when a value they depend on changes. For WPF's CollectionViewSource, we can define a sorting order by adding SortDescriptions: a list of properties that identify the sort order of the items in the collection. In earlier versions, the position of an item in the sort order was determined only when it was added or removed; if the identified property later changed value, the position was not re-evaluated.

Live Shaping is similar to the forward-chaining mode of an inference engine, where rules (the sort order) are re-evaluated based on other dependent rules (property changes). The feature was added in .NET Framework 4.5 with the ICollectionViewLiveShaping interface.



This post is based on a funny story. While working on a client project, I realized that live shaping was not working for a CollectionViewSource. Later I realized that the project was on .NET Framework 4.0 and could not upgrade. But the business requirements demanded that the items be re-sorted when an item changes value. Let's first start with an example in WPF 4.5 to see how Live Shaping works in the newer version. Then we will try to add a similar feature to the 4.0-based CollectionViewSource.

Let's create a sample WPF application using .NET Framework 4.5. We name the project WpfApp_LiveShaping.



Here we are updating MainWindow.xaml. The view uses a ListBox to show students' data, concatenating the first and last names of the students separated by a hyphen '-'. In order to hand the sorting over to the WPF runtime, we are using a CollectionViewSource. We are using the FirstName property for sorting (ascending by default), and IsLiveSortingRequested is set to true, which enables live shaping for sorting. There are other properties, IsLiveGroupingRequested and IsLiveFilteringRequested, to enable the same for grouping and filtering respectively.
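A sketch of the resource described above; the key, binding path and the scm namespace mapping are assumptions based on the description:

```xml
<!-- assumes xmlns:scm="clr-namespace:System.ComponentModel;assembly=WindowsBase" -->
<Window.Resources>
    <CollectionViewSource x:Key="studentsViewSource"
                          Source="{Binding Students}"
                          IsLiveSortingRequested="True">
        <CollectionViewSource.SortDescriptions>
            <!-- Ascending by default -->
            <scm:SortDescription PropertyName="FirstName" />
        </CollectionViewSource.SortDescriptions>
    </CollectionViewSource>
</Window.Resources>
```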


In the above view, we are using a local instance of MainViewModel as the DataContext. The view model is expected to have a Students collection. Each item in the collection is supposed to have FirstName and LastName properties.


And here is the StudentViewModel type used as items in the MainViewModel described above. As stated, it just has two properties FirstName and LastName.


Now, in order to stick it in for the MVVM police, we have a Click handler for the button in the view. It just changes one property in the view model from Vladimir to David. This should result in re-sorting and hence re-positioning of the item.


Now let's run the application. As we click the button at the bottom, the value of the last item changes and it is re-positioned in the list as follows:



What about WPF 4??
Now we need to see if we can achieve the same behavior in WPF 4. There is no such support out of the box, so we need to write custom code for this purpose. Let's extend CollectionViewSource to support live sorting.


Here we are overriding the OnSourceChanged member to hook up collection-changed events on the source collection of the collection view. It is assumed that the source of the collection view is an INotifyCollectionChanged type. Now we just need to update the Resources section with this updated collection view source definition as follows:
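A sketch of such an extension, assuming the items implement INotifyPropertyChanged; the type and member names other than OnSourceChanged are illustrative:

```csharp
using System.Collections;
using System.Collections.Specialized;
using System.ComponentModel;
using System.Windows.Data;

public class LiveSortingCollectionViewSource : CollectionViewSource
{
    protected override void OnSourceChanged(object oldSource, object newSource)
    {
        base.OnSourceChanged(oldSource, newSource);

        var oldCollection = oldSource as INotifyCollectionChanged;
        if (oldCollection != null)
            oldCollection.CollectionChanged -= OnCollectionChanged;

        var newCollection = newSource as INotifyCollectionChanged;
        if (newCollection != null)
        {
            newCollection.CollectionChanged += OnCollectionChanged;
            SubscribeItems(newSource as IEnumerable);
        }
    }

    void OnCollectionChanged(object sender, NotifyCollectionChangedEventArgs e)
    {
        // Keep item subscriptions in sync as items come and go.
        SubscribeItems(e.NewItems);
        UnsubscribeItems(e.OldItems);
    }

    void SubscribeItems(IEnumerable items)
    {
        if (items == null) return;
        foreach (var item in items)
        {
            var inpc = item as INotifyPropertyChanged;
            if (inpc != null)
                inpc.PropertyChanged += OnItemPropertyChanged;
        }
    }

    void UnsubscribeItems(IEnumerable items)
    {
        if (items == null) return;
        foreach (var item in items)
        {
            var inpc = item as INotifyPropertyChanged;
            if (inpc != null)
                inpc.PropertyChanged -= OnItemPropertyChanged;
        }
    }

    void OnItemPropertyChanged(object sender, PropertyChangedEventArgs e)
    {
        // Brute force: refresh the view so the sort order is re-evaluated.
        if (View != null)
            View.Refresh();
    }
}
```

Refreshing the whole view on every property change is the simplest approach; it is heavier than WPF 4.5's built-in live shaping, which re-positions only the changed item.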


The above code should incorporate live sorting in the above view for an earlier version of WPF. It is not recommended for version 4.5 or later, where the built-in support should be used instead.

Download Code:


Wednesday, November 13, 2013

ASP.NET Web API based REST Service - An Introduction

ASP.NET Web API is Microsoft's framework for creating HTTP services that can reach a broad range of clients, including browsers and mobile devices. In this post we are going to discuss how we can create a simple HTTP service using ASP.NET Web API and observe it using Fiddler.

Implementing an ASP.NET Web API Service
We can create ASP.NET Web API projects using the ASP.NET MVC project template. As soon as we select this type of project, we can select Web API in the second dialog shown.



The services can also be added later as project items as follows:



HTTP-based REST services are implemented as API controllers in ASP.NET Web API. An API controller is an object that handles HTTP requests. The service must inherit from ApiController in the System.Web.Http namespace in the System.Web.Http assembly.



As we discussed above, ASP.NET Web API suggests implementing HTTP-based REST services as controllers. We can add a controller directly to the Controllers folder in an ASP.NET Web API project. We can also use shortcut keys to bring up the Add Controller dialog.



Since we will be providing the complete definition ourselves, let's just add an empty MVC controller. You can see that we have added definitions of Get, Post, Put and Delete methods. These methods are used by the framework when the corresponding GET, POST, PUT and DELETE HTTP requests are received by the service. The names are based on convention; alternatively, we can decorate these methods with the HttpGet, HttpPost, HttpPut and HttpDelete attributes from the System.Web.Http namespace.



Now we update the contents of the file adding our definition of StudentsController.
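A sketch of such a controller; the in-memory student data and its shape are assumptions:

```csharp
using System.Collections.Generic;
using System.Web.Http;

public class StudentsController : ApiController
{
    static readonly List<string> Students =
        new List<string> { "John Doe", "Jane Doe" };

    // Matched by convention for GET api/students
    public IEnumerable<string> Get()
    {
        return Students;
    }

    // GET api/students/0
    public string Get(int id)
    {
        return Students[id];
    }

    // POST api/students
    public void Post([FromBody] string student)
    {
        Students.Add(student);
    }

    // PUT api/students/0
    public void Put(int id, [FromBody] string student)
    {
        Students[id] = student;
    }

    // DELETE api/students/0
    public void Delete(int id)
    {
        Students.RemoveAt(id);
    }
}
```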


Inspecting / Analyzing & Generating HTTP Requests
We can use Fiddler for debugging ASP.NET Web API based HTTP services. Fiddler is a Telerik utility used to inspect, analyze and generate HTTP requests. It can also compose HTTP request messages; after analyzing a request, we can update and replay the message.



ASP.NET Web API fully supports content negotiation out of the box. We can request data in the required format by specifying the details in the GET request. Here we have picked up an HTTP GET request in Fiddler. The request has the details about the content types supported by the client browser.



Requesting Content in JSON format
We can very well compose the GET request in Fiddler to request data in a different format. As we just stated, ASP.NET Web API fully supports content negotiation. In the following we are requesting data in text/json format using Fiddler.
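The composed request might look like this; the host, port and route are placeholders:

```http
GET http://localhost:12345/api/students HTTP/1.1
Host: localhost:12345
Accept: text/json
```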



The response from the API can be viewed in Fiddler. The above request results in the following response:



HTTP POST to ASP.NET Web API
We can also generate HTTP POST requests using Fiddler. Here we want to post some data in JSON format. We need to set the content type to text/json in order for the service to correctly handle the data.



The request is received by the ApiController in the method used for handling HTTP POST requests. By convention, this is the method whose name starts with Post. We can also decorate the methods with the specific HttpPost attribute if we don't want to follow conventions.



Download



Wednesday, November 6, 2013

Using Win ADK's Windows Performance Analyzer with EventSource

This is the third post in our discussion regarding the use of ETW tools with the EventSource API in .NET Framework 4.5. In the previous posts, we discussed how we can use PerfView (link), LogMan, PerfMon, PerfMonitor and TraceRpt with our EventSource (link). This post is about using Windows Performance Analyzer.

Windows Performance Analyzer can be used to analyze ETL [Event Trace Log] files. These ETL files can be generated by any ETW tool, including PerfView, XPerf or Windows Performance Recorder (WPR). Here XPerf is a legacy ETW tool. PerfView is available as a separate single-file download from the Microsoft Download Center.



WPR and WPA are part of the Windows Performance Toolkit, which is installed with the Windows ADK (Assessment and Deployment Kit). The toolkit can be downloaded from Microsoft's Download Center.



Just make sure that the Windows Performance Toolkit is selected in the installer.



After installation, you should see the toolkit in the Start menu. There are two tools: Windows Performance Analyzer (WPA) and Windows Performance Recorder (WPR).



Let's use PerfView to generate an ETL file for the ETW events of a sample application. WPA relies on additional specific events being written to ETL files when a trace is stopped. The release version of PerfView doesn't generate those events, but there is a separate download from Vance Morrison's blog that does. Hopefully, this feature will be added to the release version soon.



We need to run PerfView with admin privileges. We have a very simple application which just writes a couple of events to ETW and exits. PerfView should be able to push those events to an ETL file. But first let's look at the events' data with their Activity Id(s) and RelatedActivity Id(s).



Just look at the extra events generated by Morrison's PerfView in order to support WPA.



PerfView generates the ETL file in the same folder by default, in compressed format. We can decompress and open the ETL file in Windows Performance Analyzer, then drop Generic Events onto the Analysis view to see the details about particular events and their order. I think this can be improved further, as you still don't see activity and related-activity Ids here.



Download

   

Friday, November 1, 2013

October 2013, What a month this has been...

October is about to pass. But what an amazing month it has been...

The greatest skill a developer has to have is to adjust within a group of people. Believe me, this is even more difficult than developing a technical skill. Interpersonal communication is a craft, and one needs to master it in order to sell one's ideas. Since we are in the business of consulting for software services, we move between organizations every few months or years. Not all people are welcoming; they have different psychologies and motivations. These events prove to be good practice. You need to go to the venue, introduce yourself, and present in front of total strangers whose backgrounds you know nothing about.

A Few Pieces of Advice & Lessons Learned
Humor is an interpersonal lubricant. Since we are interacting with the audience for the first time, we need to break the barrier of being strangers. Humor is the universal lubrication of friction between people. You are never a stranger to people who smile with you. Although this is the most effective technique, it can go either way. You need to pull off the joke really well. Like an actor, the punch line has to be effective. Practice it; tell it a few times to your friends no matter how annoyed they become.

One must go through the session material before the session. I have realized that we tend to be very detail oriented when creating slides for our presentation. These conferences and code camps are not classroom sessions. People are interested in links and ideas they can take home, which give them direction for their technical development. We need to give them as much as possible in the limited time we have. But the most important thing is to motivate them enough that they study the material further after the event. We also need to be fresh for the event, so take a nap beforehand.

We can also record the session if possible. This is one of the take-aways for yourself. A recorded session can be useful for preparing yourself if you are speaking about the same topic later on, which is especially useful if you have been focusing on some other technology for a few months. You can also publish the recordings for a general audience if you think they might help people learn about the topic.

We must time-box our session. A session has a list of agenda items, and we need to time-box the agenda so that we are able to cover each item and still finish in time. If there is an item which is taking more time, give the audience some pointers to take away with them. We must also be ourselves, confident and to the point. Call your family or a friend for a little chit-chat in the morning.

Session Slides
I have presented my thoughts and ideas on two areas from the huge Microsoft technology stack. They are as follows:
  1. Cross Platform Development using Portable Class Libraries
  2. Event Source API, ETW and Semantic Logging Application Block
I have uploaded the session slides to my SkyDrive. You can view them there or download them if you want to view them offline.



Session Videos
Here are the session videos from the events in Silicon Valley, Boston and Southwest Florida Code camps.

semanticLogs from Muhammad Siddiqi on Vimeo.


Portable Class Libraries - Boston Codecamp 20 from Muhammad Siddiqi on Vimeo.


slab from Muhammad Siddiqi on Vimeo.


Cross Platform Development with Portable Class Libraries from Muhammad Siddiqi on Vimeo.

Download Code
You can also download the code from these sessions. I have uploaded it to my SkyDrive.

Sunday, October 20, 2013

SLAB, EventSource and Standard ETW Tools

As we have been discussing through the past few posts, we can direct event log data generated through EventSource to different destinations. EntLib6 provides a number of sinks for just that purpose out of the box, including support for console, files, SQL Server databases and Windows Azure Table storage. We have also seen how we can direct the events' data to the Windows Event Log [Discussion]. It would be interesting to see how we can use the existing ETW tools to view the events' data generated using the EventSource API in .NET Framework 4.5.

The EventSource API is based on registration-free ETW [Event Tracing for Windows]. This means we don't have to register an ETW event provider for generation and consumption of these events. The existing tools, including LogMan and Windows Performance Analyzer, are based on registered event providers. But since the EventSource API is still based on the same ETW infrastructure, we can register the event provider manually and direct the events' data to an ETL file. Since these tools can work with this format, we can use our expertise with them to troubleshoot and analyze the situations we need this data for.

It must be remembered that the EventSource API uses the ETW infrastructure only in the case of out-proc listeners.

Performance Monitor [Perfmon.exe]
Since EventSource establishes an ETW session for out-proc consumers, we should be able to use PerfMon to see the details of the session.



As an ETW provider, each event source is assigned a GUID. We can note this unique identifier in the properties of the provider; it is generally used by ETW controllers to start / stop a session. We can also verify that the streaming mode for the configured EventSource is Real time.



PerfMonitor.exe
Like PerfView, PerfMonitor is also based on the TraceEvent library. The major benefit of PerfMonitor over PerfMon is that the former can be used as an ETW controller.



https://bcl.codeplex.com/releases/view/99985

Logman
Internally, Logman seems to use the Performance Logs & Alerts service. If the service is not running, the utility starts it. Make sure to run the command prompt with administrative privileges.



Here we are creating a trace data collector using Logman utility.
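The commands might look like the following; the session name, provider GUID and output path are placeholders:

```shell
:: Create a trace data collector for our event source's provider GUID
logman create trace MyEventSession -p "{44cc45b2-1111-2222-3333-444455556666}" -o C:\Traces\MyEvents.etl

:: Verify the collector exists
logman query

:: Start collecting; stopping flushes the events to the ETL file
logman start MyEventSession
logman stop MyEventSession
```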


After running the above command, we should be able to find the specified collector when queried. This can also be done using Logman: the utility supports a "query" verb which lists all the data collectors.



We need to start the data collection for the events generated by the event provider. To do that, we can use Logman's start verb with the collector's name as follows:

Querying again should show the status of the collector as Running, which ensures that the command ran successfully.



Stopping the trace session flushes the trace data to the ETL file. Here is the file generated for our provider, in the folder we specified while creating the trace data collector.



Logman also registers a user-defined trace which can be viewed with Performance Monitor.



Tracerpt
This is another useful Windows utility for ETW data. One use of this utility is to process ETL files generated by any other tool; it can generate human-readable files from the binary data. Let's see the following usage:


The above command should generate dumpfile.xml and summary.txt in the same folder. For the events generated by our event provider, the following files are produced. You can open them to get an idea of the expected format. Note that the utility lets us control the names and formats of the dump and summary files; the supported formats for the dump file include XML (the default), CSV and EVTX.
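For reference, the invocations might look like this; the file names are placeholders:

```shell
:: Dump the binary ETL data to human-readable files
tracerpt C:\Traces\MyEvents.etl -o dumpfile.xml -summary summary.txt

:: Generate an HTML report from the same ETL file
tracerpt C:\Traces\MyEvents.etl -report report.html -f html
```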

  

We can also generate an XML (default) or HTML based report for the generated events data. Here we are generating the report in HTML format, based on the ETL file provided in the same command.


You can have a look at the following report generated for events generated from our event source.



Tracerpt can also use ETW data from real-time sessions. We can use PerfMon to determine the session we are interested in, and then use the same name with the -rt switch for TraceRpt. Here are the session details for our event source.



Thursday, October 10, 2013

VS2013 RC: Portable Class Libraries Target Frameworks

In this post we are going to discuss the changes in target framework selection in Visual Studio 2013 RC. Up until Visual Studio 2012, the following framework selections have been supported:
  1. .Net framework 4.0, 4.0.3 & 4.5
  2. Silverlight 4 & 5
  3. Windows Phone 7.x & 8.0
  4. Windows Store Apps
  5. Xbox 360
With Visual Studio 2013 RC, Portable Class Libraries can now also target .NET Framework 4.5.1 and Windows Store Apps (Windows 8.1). But Visual Studio 2013 has also removed other target frameworks for PCL. The following frameworks are not supported in Visual Studio 2013 for PCL development:
  1. Silverlight 4
  2. Windows Phone 7.x
  3. Xbox 360


Supported Framework Xml Changes
In order to support a combination of target frameworks for PCLs, Visual Studio keeps a list of XML documents in the SupportedFrameworks folder for each profile. A profile is selected when the chosen target frameworks match the profile.



Now we can develop PCLs in three different versions of Visual Studio: 2010 SP1, 2012 and 2013 RC. It appears that, in order to limit target frameworks for PCL development in Visual Studio 2013 RC, there are some changes in the XML schema for these documents. Let's see the document for Silverlight 4 in Profile44 in Visual Studio 2012.


Visual Studio 2013 has added a few new attributes: FamilyName, MinimumVisualStudioVersion and MaximumVisualStudioVersion. We must remember that the versions of Visual Studio 2012 and 2013 RC are 11.x and 12.x respectively.


The above is the XML document for Silverlight 4 in the SupportedFrameworks folder of the same profile after the installation of Visual Studio 2013 RC. Notice that the maximum version for the framework is "11.0", so the framework is not supported in Visual Studio 2013 RC. Below is the framework XML for Windows Store apps (Windows 8.1) in the \v4.6\Profile\Profile44\SupportedFrameworks folder. From this document, we can clearly see that the profile can only be used with Visual Studio 2013 RC at minimum.


Converting Project for Unsupported Frameworks
Opening a PCL project in Visual Studio 2013 RC can convert its target frameworks to one of the supported ones. For example, a Silverlight 4 project is converted to a Silverlight 5 project.



We can opt not to migrate the project to a new framework version, but then we wouldn't be able to work on it in Visual Studio 2013 RC. The project is shown in Solution Explorer as follows:



There seems to be a bug in the current version: if we don't opt to convert (by leaving the check box unchecked), the following error message is shown:

Friday, September 27, 2013

EntLib6 - EventSource & Keywords for Semantic Logs

In this post we are going to discuss two very common details for defining an event source: EventLevel and EventKeywords. They are central to defining and enabling events, yet it is very easy to make a mistake in using them or in expecting a particular behavior from their values. The value assigned to them might include other values as well, which can come as a surprise.

Event Level
As we discussed, the event level selected when enabling a listener is the maximum level that will be forwarded to the subscribed sinks. All events assigned levels equal to or less than the specified enumeration value are considered enabled.

I have really felt that IntelliSense is not very useful in this case. The levels are ordered in ascending order of their names. It would have been really useful if they were listed in the order of their values [but there are some things in life you cannot get, hence the void... :)]



Let us try to make it a little easier. From top to bottom, the levels form the acronym VIWECL. In order to remember this, we can come up with a catchy phrase like Very Intelligent Wife so Extremely Cool Life. Whatever level we specify to enable the event source, all the lower levels are automatically included. I think we should not forget it now. When someone special is involved, it is better to try not to forget...



Keywords As Power of 2
It must be remembered that keywords must be assigned values that are powers of 2. This makes it easy to enable and disable the events. Here is an inner class, Keywords, defined in an EventSource. You can see that the values are 1, 2, 4 and 8, i.e. 2 raised to the powers 0, 1, 2 and 3 respectively.
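The nested class might look like this; the keyword names are the ones used in this post, and EventKeywords comes from System.Diagnostics.Tracing:

```csharp
using System.Diagnostics.Tracing;

public class Keywords
{
    public const EventKeywords Creation            = (EventKeywords)1; // 2^0 -> bit 0001
    public const EventKeywords AdmissionApproved   = (EventKeywords)2; // 2^1 -> bit 0010
    public const EventKeywords FeeDeposited        = (EventKeywords)4; // 2^2 -> bit 0100
    public const EventKeywords TransferCreditHours = (EventKeywords)8; // 2^3 -> bit 1000
}
```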


In order to understand how we can use these keywords, we need to understand the bit mask for these keywords. We have the bit masks for Creation, AdmissionApproved, FeeDeposited and TransferCreditHours.



Definition of EventSource
Let us introduce an EventSource. The event source has four [Event]-based methods. The methods use the keyword values defined above. They are also assigned particular event levels.


Special Event Level & Keyword
EventLevel.LogAlways has a special meaning when we use it to enable events: it means to ignore levels when filtering is applied, i.e. all levels are considered. Since this is a perfectly valid EventLevel value for an Event method, it can get confusing when you just want to enable the events assigned to this level, as it has a specific meaning in this scenario. I think we should avoid it when defining our events; I still have it in my EventSource just to give you an example.

Similar is the case for Keywords.All, which means to include all keywords. Basically, when we don't specify a keyword while enabling a scenario, all those events which are assigned a specific keyword value are just ignored. If we want to consider all of them instead, we can use this value for EventKeywords when enabling events. It has the value -1; we can use this value to enable all keywords in the XML configuration of the Semantic Logging Service.

Configuration for In-Proc usage
As we have discussed before, we can use SLAB in both in-proc and out-proc scenarios. For the in-proc scenario, we can enable events using ObservableEventListener, specifying the event levels and keywords we are interested in. Let's first consider the special values for level and keyword. Here we are using EventLevel.LogAlways and Keywords.All, which is to say: I want to log everything.
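A sketch of that configuration; MyCompanyEventSource is a hypothetical event source, the rest are SLAB / EventSource APIs:

```csharp
using System.Diagnostics.Tracing;
using Microsoft.Practices.EnterpriseLibrary.SemanticLogging;

// Enable everything: ignore level filtering, include all keywords.
var listener = new ObservableEventListener();
listener.LogToConsole();
listener.EnableEvents(
    MyCompanyEventSource.Log,   // hypothetical singleton instance of our EventSource
    EventLevel.LogAlways,       // special value: consider all levels
    Keywords.All);              // special value: include all keywords (-1)
```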


And it does have all the logs, no matter what level or keywords they are assigned. Here is the output for the above code:



EventLevel.LogAlways with other keywords
Let's see the following combination. Here we are using EventLevel.LogAlways with two keywords.


If you go back and look at the definition of our EventSource, we don't have this combination of level and keywords for any event. There is only one event with its level assigned as EventLevel.LogAlways, but that has a different keyword assigned.

So does it mean it should include none of the events? Actually no!!!

As we have discussed before, it has a special meaning in the case when we are enabling events: it means to ignore whatever level is assigned to the events and use only the keywords specified for enabling them. So it should enable just the two events defined with the specified keywords. If we run this code, it shows the following output:



Level is the Max Level for Events
As we discussed in the stairs diagram for event levels, the level assigned when enabling is the maximum EventLevel considered for logging. In the following example, we are using EventLevel.Error for enabling the events. If we look at the diagram, we realize this automatically includes EventLevel.Critical and EventLevel.LogAlways. Let's change the configuration as follows:


As expected, it includes events assigned levels EventLevel.Error, EventLevel.Critical and EventLevel.LogAlways. Here is the output:



-matchAnyKeyword for Out-Proc Configuration
We can use a similar configuration for the out-proc scenario. In the Semantic Logging Service provided by the Semantic Logging Application Block, the matchAnyKeyword attribute is provided for source configuration. The value assigned to this attribute is treated differently: it is the sum of all the keywords we want to enable. Here we want to enable Keywords.Creation and Keywords.AdmissionApproved, so we just add the two numbers to calculate the value for the attribute. Equivalently, we can mask the two keyword bits together and convert the value to decimal as follows:
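The arithmetic can be checked in a couple of lines; the keyword values come from the Keywords class defined earlier in the post:

```csharp
// Creation = 1 (bit 0001), AdmissionApproved = 2 (bit 0010).
// ORing the bits and summing the numbers give the same result,
// because each keyword is a distinct power of two.
long creation = 1;
long admissionApproved = 2;
long matchAnyKeyword = creation | admissionApproved;
System.Console.WriteLine(matchAnyKeyword); // 3 goes into the matchAnyKeyword attribute
```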



Let's use this in the configuration for the service. Here we are interested in the events with a maximum level of EventLevel.Informational, and specifically in the two keywords specified above.


Since we are using the Windows Azure sink with the storage emulator, we should be able to see the following events logged in the SLAB table.



Download