
Date: Tuesday, 27 Aug 2013 11:49

Originally posted on: http://geekswithblogs.net/mnf/archive/2013/08/27/methods-to-verify-are-datatables-or-datasets-the-same.aspx

 
I wanted to verify whether the DataTables in two DataSets are the same. I found a few similar implementations on StackOverflow, but the one that I selected
(http://stackoverflow.com/questions/7517968/how-to-compare-2-datatables/7518025#7518025) didn't work: it unexpectedly returned false when comparing two cells with the same values

          tbl1.Rows[i][c]     2     object {long}
          tbl2.Rows[i][c]     2     object {long}
          tbl1.Rows[i][c] == tbl2.Rows[i][c]     false     
I found that Equals should be used instead of ==.
          Equals(tbl1.Rows[i][c], tbl2.Rows[i][c])     true
There are a few articles explaining the difference and the reasons behind it:

'==' operators are overloaded, not overridden, which means that unless the compiler knows to call the more specific version, it'll just call the object version.
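
A minimal illustration of the pitfall (the values are arbitrary, but the behavior is standard C# boxing semantics, exactly what happens with DataTable cells typed as object):

    object a = 2L;  // a long boxed into an object
    object b = 2L;  // a second box with the same value
    Console.WriteLine(a == b);        // false: == on object compares references
    Console.WriteLine(Equals(a, b));  // true: long.Equals compares values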



Below are the tested methods to check whether DataTables or DataSets are the same.
/// <summary>
/// Compare the content of all rows in two tables.
/// </summary>
/// <param name="tbl1">The first table.</param>
/// <param name="tbl2">The second table.</param>
/// <returns>True if both tables have the same dimensions and every cell is equal.</returns>
public static bool AreTablesTheSame(DataTable tbl1, DataTable tbl2)
{
    if (tbl1.Rows.Count != tbl2.Rows.Count || tbl1.Columns.Count != tbl2.Columns.Count)
        return false;

    for (int i = 0; i < tbl1.Rows.Count; i++)
    {
        for (int c = 0; c < tbl1.Columns.Count; c++)
        {
            // Equals, not ==, so boxed cell values are compared by value.
            if (!Equals(tbl1.Rows[i][c], tbl2.Rows[i][c]))
                return false;
        }
    }
    return true;
}
/// <summary>
/// Compare the content of all tables in two DataSets.
/// </summary>
/// <param name="ds1">The first DataSet.</param>
/// <param name="ds2">The second DataSet.</param>
/// <returns>True if both DataSets contain the same tables.</returns>
public static bool AreTablesTheSame(DataSet ds1, DataSet ds2)
{
    if (ds1 == null && ds2 == null)
        return true;
    if (ds1 == null || ds2 == null) // only one is null
        return false;
    if (ds1.Tables.Count != ds2.Tables.Count)
        return false;

    for (int i = 0; i < ds1.Tables.Count; i++)
    {
        if (!DataTableHelper.AreTablesTheSame(ds1.Tables[i], ds2.Tables[i]))
            return false;
    }
    return true;
}
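
A quick usage sketch (LoadSnapshotBefore/LoadSnapshotAfter are hypothetical helpers producing the two DataSets; both methods are assumed to live in the DataTableHelper class, as the DataSet overload implies):

    DataSet before = LoadSnapshotBefore();
    DataSet after = LoadSnapshotAfter();
    bool same = DataTableHelper.AreTablesTheSame(before, after); // true only if every table, row and cell matches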
Author: "Michael Freidgeim" Tags: ".Net Framework"
Date: Saturday, 17 Aug 2013 01:42

Originally posted on: http://geekswithblogs.net/mnf/archive/2013/08/17/2-options-to-write-tests-for--wcf-services.aspx

When writing integration tests for WCF services, you have 2 options to access the SUT (system under test):
1. In-proc – test classes call methods from the application DLLs directly.
2. External – tests call the external service through a client proxy with a specified URL.

If you own the code of the service, the in-proc method is preferred, as it allows you to test units rather than the whole service as a black box.

If your tests are external clients of the service, addressed by URL, you need to have the service installed somewhere, e.g. on a test server, and use that service URL to run your external tests.

I've asked our team that, in the future, when creating new or modifying existing test methods, we use the in-proc approach rather than external clients. A minimal sketch of both approaches follows.
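
Both options, sketched under stated assumptions (IFlightService, FlightService, GetFlights and the endpoint URL are hypothetical illustration names):

    // In-proc: reference the service assembly and call the implementation directly.
    var service = new FlightService();
    var inprocResult = service.GetFlights("SYD", "MEL");

    // External: call a deployed service through a WCF client proxy.
    var binding = new BasicHttpBinding();
    var endpoint = new EndpointAddress("http://testserver/FlightService.svc");
    using (var factory = new ChannelFactory<IFlightService>(binding, endpoint))
    {
        IFlightService client = factory.CreateChannel();
        var externalResult = client.GetFlights("SYD", "MEL");
    }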

Author: "Michael Freidgeim" Tags: "Testing/Debugging/Logging"
Date: Saturday, 17 Aug 2013 01:20

Originally posted on: http://geekswithblogs.net/mnf/archive/2013/08/17/angular-js-vs-knockout-js--quotes-and-links.aspx

Our team needed to choose a JS framework. Based on the research, Angular is the preferred framework. The only essential concern is if you need to support old IE6/7 browsers (but there are also articles on how to support/work around the IE issues).


Knockout supports almost all major browsers: IE6+, Firefox 2.0+, Chrome, Safari and Opera 10+. Link - Knockout : Browser support
Angular supports only A-grade browsers at the moment: IE8 and higher, Chrome, Safari, Firefox, iOS, Android.
"Support for IE6/7 is possible in theory, but in practice we don't run any of our regression tests on it, so it probably does not work."
Link - Browser Support for Angular - Google Groups

  • They are also similar in many important things: data-binding and directives. However, Angular does it better and prettier. You can stick your expression practically anywhere in the document using the {{}} clause, and this alone gives you quite a lot of power and yet readability. You also don't need to use observables for model objects - you use normal objects and Angular does its magic.
  • Angular is more feature-complete. It gives you out of the box a complete solution for almost anything you want in client-side web-app development, including mini-jQuery, dependency injection, redirection and deep-linking, animation and powerful, modern event handling. Knockout doesn't provide any of this.
  • Angular's HTML templating/customization/binding is less limited than KO's - it lets you build directives which are HTML tags, attributes, classes or comments. In KO, you can do it in comments (which looks rather ugly) or in the data-bind attribute. Much less expressive and pretty.
  • Performance: sometimes performance is better in KO, sometimes in Angular (see the other answers for that). But I don't see it as an issue anyway.
  • Angular is built for testability and clean project organization better than any framework I know of (but I don't know all of them).


  • I found Knockout easier to comprehend and get going with compared to Angular, which is always a winner when starting something new. But Knockout is pretty much just about binding, so for things like routing and separation of concerns you need to use other libraries, whereas Angular has all that built in. I think for smaller projects without too much logic required, Knockout is ideal, and I really like the simplicity of SammyJS; however, for larger applications that require dependency injection, a clear separation of concerns and the easy ability to unit test logic, Angular is a winner. In this case the term “pick the right tool for the job” certainly applies.



    Angular.js: Angular.js is a very nice framework. Developed by Googlers, it has some very interesting design choices.

    Pros: Very well thought out with respect to template scoping and controller design. Has a dependency injection system (I am a big fan of IOC). Supports a rich UI-Binding syntax to make things like filtering and transforming values a breeze.

    Cons: Codebase appears to be fairly sprawling and not very modular. Views are not modular enough.

    Knockout.js: Knockout.js is an MVVM framework that receives lots of praise from its supporters. It stresses declarative UI bindings and automatic UI refresh.

    Pros: Binding support. Great documentation and amazing tutorial system.

    Cons: Awkward binding syntax and lacks a solid view component hierarchy. I want to be able to reuse components easily. I also feel like identifying as an MVVM framework is deleterious. Hardly any of these frameworks are MVC; they are of the MV* variety (MVP, MVVM, etc.).

    Additional reasons from my co-worker Krishna Nadiminti 

    Which framework to use?

    A comparison of Angular Vs. Knockout: - http://www.slideshare.net/basarat1/mvvm-knockout-vs-angular

    A sample app some guys did in various frameworks:

    ·         TODO App in Knockout [Not bad – but still one file with too many things in it]

    ·         TODO App in Angular [More files, forces you into thinking about structure and avoiding jQuery]

    ·         TODO Apps in other frameworks (for reference only – I don’t think we can spend days evaluating the best framework – and frankly it does not matter that much.)

     

    About that steep learning curve:

    I’ve worked with both - it took me almost the same time to get a brand new app running in KO vs Angular. Once I looked at Angular, I remembered all the additional frameworks (require, sammy, amplify, …) I needed to learn to make a ‘properly designed’ client-side app with KO as the basis.

     

    Other things we considered in choosing AngularJS:

    Momentum behind the framework (Google!)

    Completeness in terms of client-side application requirements (clean design, testability, data-binding, routing, module loading, ui-widgets)

    Browser support

    Plugging into existing apps

    Playing well with other frameworks/scripts

     

    More links:

    Some pros of AngularJS over KnockoutJS: http://litebyte.net/blog/?p=135




    Angular JS and old versions of IE 
    Author: "Michael Freidgeim" Tags: "CSS/DHTML/JavaScript"
    Date: Sunday, 11 Aug 2013 04:57

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/08/11/postsharp-error-ps0052-the-plug-in-postsharp.patterns.diagnostics.weaver-was-not-found.aspx

    After a merge of source-code branches I got a build error:
    POSTSHARP : error PS0052: The plug-in "PostSharp.Patterns.Diagnostics.Weaver" required by the type "PostSharp.Patterns.Diagnostics.ILogAspect" was not found.
    [C:\Builds\\MAIN_Master\Sources\main\ServiceInterfaces\myProj.csproj]

    I wasn't able to find immediately which code caused the error, and Google also didn't give me an answer.
    After some investigation I found that the psproj file had missing entries for PostSharp.Toolkit.Diagnostics.Weaver.dll and PostSharp.Toolkit.Diagnostics.Weaver.NLog.dll.

    It would be easier if error PS0052 included the name and line from the psproj file:

    <Project xmlns="http://schemas.postsharp.org/1.0/configuration" xmlns:dg="clr-namespace:PostSharp.Patterns.Diagnostics;assembly:PostSharp.Patterns.Diagnostics" ReferenceDirectory="{$ReferenceDirectory}">
      <Property Name="LoggingBackEnd" Value="nlog" />
      <Using File="default" />
      <!-- If the following 2 entries are missing, the PS0052 error occurs. -->
      <Using File="..\..\..\packages\PostSharp.Toolkit.Diagnostics.NLog.2.1.1.12\tools\PostSharp.Toolkit.Diagnostics.Weaver.NLog.dll" />
      <Using File="..\..\..\..\packages\PostSharp.Toolkit.Diagnostics.2.1.1.12\tools\PostSharp.Toolkit.Diagnostics.Weaver.dll" />

      <dg:LoggingProfiles>
        <dg:LoggingProfile Name="Exceptions" OnExceptionOptions="IncludeParameterType | IncludeParameterName | IncludeParameterValue | IncludeThisArgument" OnEntryLevel="None" OnSuccessLevel="None" />
      </dg:LoggingProfiles>

      <Multicast>
        <!-- Add exception logging to everything. -->
        <LogExceptionAttribute xmlns="clr-namespace:PostSharp.Patterns.Diagnostics;assembly:PostSharp.Patterns.Diagnostics" AttributeTargetAssemblies="MyProj.ServiceInterfaces" AttributeTargetTypes="MyProj.ServiceInterfaces.*" AttributeTargetMembers="*" />
      </Multicast>
    </Project>
    Author: "Michael Freidgeim" Tags: "Visual Studio/TFS, PostSharp"
    Date: Sunday, 11 Aug 2013 04:45

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/08/11/misleading-compiler-error-is-a-type-but-is-used-like.aspx

    The following C# line calling an AutoFixture method:

    fixture.Create<List<List<FlightItem>>();

    caused Error 63: 'System.Collections.Generic.List<System.Collections.Generic.List<BusinessEntities.FlightItem>>' is a 'type' but is used like a 'variable'.

    It wasn't obvious that a closing '>' was missing.
    I believe the compiler could recognize this and provide a better error message.
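
    For comparison, the corrected call just adds the missing bracket:

    fixture.Create<List<List<FlightItem>>>(); // three closing '>' to match the three '<'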
    Author: "Michael Freidgeim"
    Date: Sunday, 11 Aug 2013 04:40

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/08/11/my-favorite-visual-studio-tools.aspx

    Recently I set up Visual Studio on a new machine, and I wanted to install a few tools that I use all the time.

    The list includes:
    Resharper,
    GhostDoc (aka Ctrl-Shift-D)
    http://visualstudiogallery.msdn.microsoft.com/46A20578-F0D5-4B1E-B55D-F001A6345748
    checkin - an alternative to the VS 2012 pending-changes window (http://stackoverflow.com/questions/12204845/vs2012-return-to-a-normal-tfs-checkin-window)
    Postsharp

    See also my old post Tools to debug/Trace ASP.NET applications (many of them are outdated already).
    Author: "Michael Freidgeim" Tags: "Visual Studio/TFS"
    Date: Saturday, 13 Jul 2013 12:40

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/07/13/error-409-webservicebindingattribute-is-required-on-proxy-classes.-after-creating.aspx

    After creating a new build type, 32bitDebug (based on Debug), according to How to: Create and Edit Configurations,

    I received the following error during compile:
    Error 409: WebServiceBindingAttribute is required on proxy classes. C:\TFS\myProject\SGEN MyAssemblyName

    The instructions from the SO answer http://stackoverflow.com/questions/354503/webservicebindingattribute-is-required-on-proxy-classes

    "In the properties for the project, on the Build tab, set "Generate Serialization assembly" to Off"

    fixed the problem.
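
    The same switch can be set directly in the .csproj; a minimal sketch (GenerateSerializationAssemblies is the standard MSBuild property behind that UI option; the PropertyGroup condition is illustrative):

    <PropertyGroup Condition=" '$(Configuration)|$(Platform)' == '32bitDebug|AnyCPU' ">
      <!-- Off disables the SGen serialization-assembly step that raised error 409 -->
      <GenerateSerializationAssemblies>Off</GenerateSerializationAssemblies>
    </PropertyGroup>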

    See also http://msdn.microsoft.com/en-us/library/system.web.services.webservicebindingattribute(v=vs.110).aspx
    Author: "Michael Freidgeim" Tags: "Visual Studio/TFS"
    Date: Sunday, 09 Jun 2013 02:11

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/06/09/number-of-unit-test-projects-in-visual-studio-solution.aspx

    Some time ago I had a discussion with a co-worker about how to organize test projects.
    Should we have a single test project that does all sorts of things and references every project?
    It is good to have one integration-test DLL, but for unit tests, what is the point of merging everything into one?

    In an ideal world I agree that small independent projects are better. Unfortunately, we have solution-size limitations.
    From http://stackoverflow.com/questions/5197192/which-is-better-unit-test-project-per-solution-or-per-project?lq=1:
    "However, Visual Studio performance quickly degrades as the number of projects increases. Around the 40 project mark compilation becomes an obstacle to compiling and running the tests, so larger projects may benefit from consolidating test projects."

    From http://msdn.microsoft.com/en-us/library/bb668953.aspx:
    • Single solution. If you work on a small system, create a single solution and place all of your projects within it.
    • Partitioned solution. If you work on a large system, use multiple solutions to group related projects together. Create solutions to logically group subsets of projects that a developer would be most likely to modify as a set, and then create one master solution to contain all of your projects. This approach reduces the amount of data that needs to be pulled from source control when you only need to work on specific projects.
    • Multiple solutions. If you are working on a very large system that requires dozens of projects or more, use multiple solutions to work on sub-systems, but for dependency mapping and performance reasons do not create a master solution that contains all projects.

    At the moment we decided to go with one huge integration-test project and one huge unit-test project.
    And we are constantly trying to keep the number of projects in the main solution reasonable (not too many). Unfortunately this number is quite big - 70+.
    Author: "Michael Freidgeim" Tags: ".Net Framework, Testing/Debugging/Loggin..."
    Date: Sunday, 02 Jun 2013 11:51

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/06/02/office-customization-tips.aspx

    I've posted below a few Office customization tips that I prefer to set up when using a new computer.

    Display File Path in Excel


    Steps to display the file path of the current open file (Excel 2007):

    1. Right-click on the ribbon.
    2. Choose "Customise quick access toolbar".
    3. Select "All commands".
    4. Then choose "Document Location".
    5. Click "Add", and it will appear on the right.



    MS Word 2007/2010 - display path and filename in menu bar

    http://www.technologyquestions.com/technology/microsoft-office/116356-office-2007-display-path-filename-menu-bar.html


    To add the Document Location command to your Quick Access Toolbar (QAT):

    - Click the More (or Customize) command at the end of the QAT, and then click More Commands.
    - In Choose Commands From, select Commands Not In the Ribbon.
    - Locate Document Location, select it, and then click Add to add it to your QAT.


    Prompt to open a Microsoft Office Word document as read-only


    http://office.microsoft.com/en-us/accounting-help/prompt-to-open-a-microsoft-office-word-document-as-read-only-HP001173084.aspx


    1. In a Word file, click the Microsoft Office Button, and then click Save As.
    2. Click Tools, and then click General Options.
    3. Select the Read-only recommended check box.
    4. Click OK.
    5. Click Save. If prompted, click Yes to update the existing file with the new read-only setting.


     
    Author: "Michael Freidgeim" Tags: "General Tips"
    Date: Sunday, 02 Jun 2013 11:08

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/06/02/serialization-error-when-property-is-declared-as-base-class-but.aspx


     
    I received a quite generic error message:

    Type 'MyclassType' with data contract name 'MyclassType:http://schemas.datacontract.org/myNamespace' is not expected. Consider using a DataContractResolver or add any types not known statically to the list of known types - for example, by using the KnownTypeAttribute attribute or by adding them to the list of known types passed to DataContractSerializer.
    Type: System.Runtime.Serialization.SerializationException, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089

    After investigation I found that the class that I tried to serialize had a property declared as the base class, but at runtime a derived class was assigned, and serialization was unable to resolve it.
    The fix was simple: add a KnownType attribute to the container class.

    [KnownType(typeof(MyclassType))]
    public class Mycontainer
    {
        MyBaseclass PropertyOfmyClass { get; set; }
        .......
    }

    public class MyclassType : MyBaseclass
    { .... }

    Unfortunately, the serialization-time error message specified neither the name of the container class nor the name of the property, which makes the error harder to fix.

    Author: "Michael Freidgeim" Tags: ".Net Framework, Web Services/WCF"
    Date: Sunday, 02 Jun 2013 04:29

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/06/02/response-for-rest-method-was-truncated-because-default-maxitemsinobjectgraph-was.aspx

    We have a REST service with the attributes [WebGet(UriTemplate = "...", BodyStyle = WebMessageBodyStyle.Bare, ResponseFormat = WebMessageFormat.Xml)].
    Normally it worked fine, but for particular data it had a huge response that was truncated.

    The size returned in a few attempts in the IE browser was 2196456; in Chrome it was slightly different: 2195397.


    After a Google search I found http://forums.asp.net/post/4948029.aspx, which has a number of suggestions:

     

    For a WCF service that will transfer a large amount of data in operations, here are the configuration settings you need to check:

    1) The maxReceivedMessageSize attribute of the relevant <binding> element (in your case, that's the webHttpBinding)

    #<webHttpBinding>
    http://msdn.microsoft.com/en-us/library/bb412176.aspx

    2) The <readerQuotas> settings (under the <binding> elements), which control maxArrayLength, maxStringLength, etc.

    #<readerQuotas>
    http://msdn.microsoft.com/en-us/library/ms731325.aspx

    3) The DataContractSerializer behavior, which has a MaxItemsInObjectGraph property. You can configure it via the ServiceBehavior of your WCF service.

    #DataContractSerializer.MaxItemsInObjectGraph Property
    http://msdn.microsoft.com/en-us/library/system.runtime.serialization.datacontractserializer.maxitemsinobjectgraph.aspx

    4) And if your WCF service is hosted in an ASP.NET/IIS web application, you also need to enlarge the "maxRequestLength" attribute of the <httpRuntime> element (under the <system.web> section).

    #httpRuntime Element (ASP.NET Settings Schema)
    http://msdn.microsoft.com/en-us/library/e1f13641.aspx
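
    To see where each of these settings lives, here is a minimal web.config sketch (the binding/behavior names and all limit values are illustrative assumptions, not our production settings; note the actual readerQuotas attribute is maxStringContentLength):

    <system.web>
      <!-- 4) request size limit for the hosting ASP.NET application, in KB -->
      <httpRuntime maxRequestLength="102400" />
    </system.web>
    <system.serviceModel>
      <bindings>
        <webHttpBinding>
          <!-- 1) maximum message size, in bytes -->
          <binding name="largeWebHttp" maxReceivedMessageSize="10485760">
            <!-- 2) reader quotas for long strings and large arrays -->
            <readerQuotas maxStringContentLength="10485760" maxArrayLength="10485760" />
          </binding>
        </webHttpBinding>
      </bindings>
      <behaviors>
        <serviceBehaviors>
          <behavior name="largeGraphBehavior">
            <!-- 3) object-graph size limit for the DataContractSerializer -->
            <dataContractSerializer maxItemsInObjectGraph="2196456" />
          </behavior>
        </serviceBehaviors>
      </behaviors>
    </system.serviceModel>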

    After a few attempts my colleague found that our problem was caused by

    #DataContractSerializer.MaxItemsInObjectGraph Property
    http://msdn.microsoft.com/en-us/library/system.runtime.serialization.datacontractserializer.maxitemsinobjectgraph.aspx

     

    <behavior name="MyOperationBehavior">
      <dataContractSerializer maxItemsInObjectGraph="2196456" />
    </behavior>

    Author: "Michael Freidgeim" Tags: "ASP.NET, Web Services/WCF"
    Date: Sunday, 02 Jun 2013 04:17

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/06/02/upgrading--postsharp-from-ver-2.1-to-new-version-3.0-yet-again.aspx

    I was upgrading our solutions from PostSharp 2 to PostSharp 3. The small solution based on the cache attribute from http://cache.codeplex.com/ was upgraded without any problems.

    Upgrading my main solution by installing the PostSharp NuGet package also went quite well.

    The only annoying thing was that the installer added a RequiresPostsharp.cs file to all projects that already had the SkipPostSharp=true setting, and I had to remove them manually.
    The issue was reported at
    http://support.sharpcrafters.com/discussions/problems/1220-requirespostsharpcs-breaking-the-build-when-skippostsharptrue and
    http://support.sharpcrafters.com/discussions/problems/234-postsharp-2175-nuget-package-restore-and-skippostsharp
    but Gael unfortunately considers this behavior "by design".

    More work was needed to convert the psproj files from PostSharp.Toolkit.Diagnostics ver 2.1 to the new PostSharp.Patterns.Diagnostics 3.0.
    There was no documentation. I've only found a short notice at the bottom of
    http://doc.postsharp.net/postsharp-3.0/Default.aspx##PostSharp-3.0.chm/html/1530becd-2bf4-470e-9e8e-31c16fdd01ec.htm

    PostSharp Toolkits 2.1 need to be uninstalled using NuGet. Instead, you can install PostSharp Pattern Libraries 3 from NuGet. 
    Namespaces and some type names have changed. 

    The uninstall for 2.1 suggested removing the NLog NuGet package, which we use regardless of PostSharp.

    I've run (per
    https://nuget.org/packages/PostSharp.Patterns.Diagnostics/3.0.26 and
    https://www.nuget.org/packages/PostSharp.Patterns.Diagnostics.NLog/):

    Install-Package PostSharp.Patterns.Diagnostics
    Install-Package PostSharp.Patterns.Diagnostics.NLog

    The install of PostSharp.Patterns.Diagnostics.NLog didn't like the latest version of NLog, but Gael fixed it recently (http://support.sharpcrafters.com/discussions/problems/1211-nlog-weaver-version-error).

    The installs didn't change the content of the psproj files, and I had to update them manually:
    1. Deleted the old references to the DLLs and inserted a dg:LoggingProfiles element:

    <!--<Using File="..\..\..\..\packages\PostSharp.Toolkit.Diagnostics.NLog.2.1.1.12\tools\PostSharp.Toolkit.Diagnostics.Weaver.NLog.dll"/>
    <Using File="..\..\..\..\packages\PostSharp.Toolkit.Diagnostics.2.1.1.12\tools\PostSharp.Toolkit.Diagnostics.Weaver.dll" /> -->

    2. On advice from Gael, I removed the Task element and changed <Data Name="XmlMulticast"> into simply <Multicast>.
    3. I also replaced the namespace and DLL names in the LogAttribute xmlns properties with "clr-namespace:PostSharp.Patterns.Diagnostics;assembly:PostSharp.Patterns.Diagnostics".
    The psproj file becomes similar to the following and seemed to work:

    <?xml version="1.0" encoding="utf-8"?>
    <Project xmlns="http://schemas.postsharp.org/1.0/configuration" xmlns:dg="clr-namespace:PostSharp.Patterns.Diagnostics;assembly:PostSharp.Patterns.Diagnostics">
      <Property Name="LoggingBackEnd" Value="nlog" />
      <Using File="..\packages\PostSharp.Patterns.Diagnostics.3.0.26\tools\PostSharp.Patterns.Diagnostics.Weaver.dll" />
      <Using File="..\packages\PostSharp.Patterns.Diagnostics.NLog.3.0.26\tools\PostSharp.Patterns.Diagnostics.Weaver.NLog.dll" />
      <dg:LoggingProfiles>
        <dg:LoggingProfile Name="Exceptions" OnExceptionOptions="IncludeParameterType | IncludeParameterName | IncludeParameterValue | IncludeThisArgument" OnEntryLevel="None" OnSuccessLevel="None" />
      </dg:LoggingProfiles>
      <Multicast>
        <LogAttribute xmlns="clr-namespace:PostSharp.Patterns.Diagnostics;assembly:PostSharp.Patterns.Diagnostics" AttributeTargetAssemblies="Applications.MyApp" AttributeTargetTypes="Applications.MyApp.MyCustomer" AttributeTargetMembers="*" OnExceptionLevel="Warning" OnExceptionOptions="IncludeParameterValue" />
      </Multicast>
    </Project>

    It was deployed to the CI test environment, where we noticed delays and timeouts. I found that, despite only OnExceptionLevel and OnExceptionOptions being specified, the new LogAttribute generated verbose trace information, which caused a severe performance hit.

    4. I had to change LogAttribute to LogExceptionAttribute and remove the OnExceptionLevel and OnExceptionOptions properties.
     
    Author: "Michael Freidgeim" Tags: ".Net Framework, PostSharp"
    Date: Saturday, 25 May 2013 00:20

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/05/25/addifnotnullt-collection-extensions.aspx

    I want to post a few recently created collection extensions that let you write in one line what otherwise takes 2 or more:
    public static void AddIfNotNull<T>(this IList<T> coll, T newItem) where T : class
    {
        if (newItem != null)
        {
            coll.Add(newItem);
        }
    }

    public static void AddRangeIfNotNullOrEmpty<T>(this List<T> coll, IEnumerable<T> newItems) where T : class
    {
        if (!newItems.IsNullOrEmptySequence())
        {
            coll.AddRange(newItems);
        }
    }

    public static void AddIfNotContains<TKey, TValue>(this Dictionary<TKey, TValue> dictionary, TKey key, TValue value)
    {
        if (!dictionary.ContainsKey(key))
        {
            dictionary.Add(key, value);
        }
    }
    The methods use

    // requires System.Linq for Any()
    public static bool IsNullOrEmptySequence<T>(this IEnumerable<T> c)
    {
        return (c == null || !c.Any());
    }
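
    A quick usage sketch (variable names are illustrative):

    var names = new List<string>();
    string missing = null;
    names.AddIfNotNull(missing);           // no-op: nothing is added
    names.AddIfNotNull("first");           // adds "first"

    var lookup = new Dictionary<string, int>();
    lookup.AddIfNotContains("key", 1);
    lookup.AddIfNotContains("key", 2);     // ignored: the key is already present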

    I've also found a few extensions that could be useful in https://pikacode.com/Barankin/Fabrika-dveri/file/default/CRM/Common/Extensions/LinqExtensions.cs
    Author: "Michael Freidgeim" Tags: ".Net Framework, Helper Functions"
    Date: Saturday, 04 May 2013 03:18

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/05/04/override-tostring-using-json-serialization-or-something-else.aspx

    When creating a new data class, it's a good idea to override the ToString() method to output most of the data.
    It will help to see details in logs.

    The only exception is if the class has sensitive data like a credit-card number or password.
    For DataContract classes, just use:

    public override string ToString()
    {
        // Use JSON as the simplest serializer
        string sRet = this.ToExtJsJsonItem();
        return sRet;
    }
     
    Sometimes it is useful to create extensions to standard .NET classes.
    E.g. in a CookieHelper class I've created:

    public static string ToJsonString(this HttpCookie httpCookie)
    {
        string sRet = httpCookie.ToExtJsJsonItem();
        return sRet;
    }
     
    The actual extension we are using was written by Nikita Pinchuk:

    public static string ToExtJsJsonItem(this object item)
    {
        // Serialize the object's data-contract members to JSON in memory.
        DataContractJsonSerializer serializer = new DataContractJsonSerializer(item.GetType());
        using (MemoryStream ms = new MemoryStream())
        {
            serializer.WriteObject(ms, item);
            StringBuilder sb = new StringBuilder();
            sb.Append(Encoding.UTF8.GetString(ms.ToArray()));
            return sb.ToString();
        }
    }
     
    For non-DataContract classes the extension method may not work; it can cause an error:
    A first chance exception of type 'System.Runtime.Serialization.InvalidDataContractException' occurred in System.Runtime.Serialization.dll

    In this case you can try JSON.NET or the JsonValue types (NuGet package JsonValue).

    Sometimes I am using

    string sRet = this.XmlSerializeSafe();

    But it also doesn't work for all types, e.g.:
    MyClass cannot be serialized because it does not have a parameterless constructor.

    In some places we use LinqPad's Dump an arbitrary object To Html String, but we found it is too heavy for logs; plain text is easier to read than HTML.

    I haven't tried ServiceStack.Text's C# .NET extension method T.Dump() yet.
    Author: "Michael Freidgeim" Tags: "Testing/Debugging/Logging"
    Date: Saturday, 04 May 2013 03:03

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/05/04/deploy-pdb-files-into-production-to-ease-exception-report-investigation.aspx

     

    For a long time I believed that PDBs, as a part of debugging information, should not be included in production deployments. Recently my colleague suggested copying them to simplify exception investigations.

    The following SO discussion convinced us that it is a good idea (at least for web sites):
    http://stackoverflow.com/questions/933883/are-there-any-security-issues-leaving-the-pdb-debug-files-on-the-live-servers

    "These files will not be exposed to the public if kept in the right places (website\bin)."

    So we decided to deploy the PDBs into production.

    BTW, if you include PDBs with your deployments, you don't need to store them in a symbol server, as is suggested in http://www.wintellect.com/CS/blogs/jrobbins/archive/2009/05/11/pdb-files-what-every-developer-must-know.aspx

    However, we found that PDBs were not generated for all DLLs. After some analysis, we believe that MS changed the default settings starting from VS 2008 (or maybe since VS 2005) to generate PDB-only debug info even in release mode. This is why older projects had generation of PDBs disabled.

    To change the setting in Visual Studio, there is an option in "Project Properties", "Build", "Advanced...": change "Debug Info:" to pdb-only.

    The screenshots are available in the post http://callicode.com/Homeltpagegt/tabid/38/EntryId/24/How-to-disable-pdb-generation-in-Visual-Studio-2008.aspx

     

    Related links:
    The article http://blog.vuscode.com/malovicn/archive/2007/08/05/releasing-the-build.aspx compares different options for debug and release and confirms that in 2007 pdbonly was the default for the release configuration of Visual Studio:

    /optimize+ /debug:pdbonly (release configuration of visual studio)

    The article Include .pdb files in Web Application Publish for Release mode (VS2012) wasn't applicable for us, but may be useful for someone else.
    Author: "Michael Freidgeim" Tags: "Deployment, Testing/Debugging/Logging"
    Date: Saturday, 16 Mar 2013 02:00

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/03/16/default-enum-initialisation-by-ms-create-unit-tests-wizard.aspx

    VS 2012 doesn't show the Create Unit Tests wizard. However, it can still be used - see
    Where is the “Create Unit Tests” selection?

    I've found the wizard creates a quite funny default for an enum: calling the constructor.

    PaymentType paymentType = new PaymentType(); // TODO: Initialize to an appropriate value

    I would prefer to have the first/default enum value, e.g.:

    PaymentType paymentType = PaymentType.None;

    I should suggest it to the ALM Rangers, who started a new test-generation project.

    Another funny default is for numeric primitives:

    Decimal totalAmount = new Decimal(); // TODO: Initialize to an appropriate value
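
    For what it's worth, both generated defaults are just verbose spellings of the type's zero value (PaymentType is the enum from the example above):

    Console.WriteLine(new PaymentType() == default(PaymentType)); // True: the enum's zero value
    Console.WriteLine(new Decimal() == 0m);                       // True: same as 0m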
     
    Author: "Michael Freidgeim" Tags: "Testing/Debugging/Logging"
    Date: Saturday, 16 Mar 2013 01:43

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/03/16/wcf-transactions-are-not-supported-by-azure.aspx

    We have a service operation for which it is very important to ensure that the client receives the status that was determined at the end of the operation.
    If the client receives the response, the server status should be "completed". Otherwise (in case of a communication error), the server status should be rolled back and stay as "pending". The possible technical solutions were discussed, and WCF Transactions support with 2PC (two-phase commit) was selected. We implemented the service operation with transaction commit/rollback support and asked our clients to use it.
    Our main client runs on Azure. It was a big disappointment when Readify consultant Himanshu Desai advised that WCF Transactions are not supported by Azure.

    I did a quick check on the Internet and didn't find that this is a well-known issue.
    Below are a few quotes to describe the limitation:

    2PC in the cloud is hard for  all sorts of reasons. 2PC as implemented by DTC effectively depends on the coordinator and its log and connectivity to the coordinator to be very highly available. It also depends on all parties cooperating on a positive outcome in an expedient fashion. To that end, you need to run DTC in a failover cluster, because it’s the Achilles heel of the whole system and any transaction depends on DTC clearing it.

    The bottom line is that Service Bus, specifically with its de-duplication features for sending and with its reliable delivery support using Peek-Lock (which we didn’t discuss in the thread, but see here and also here) is a great tool to compensate for the lack of coordinator support in the cloud

    The Azure storage folks implement their clusters in a very particular way to provide highly-scalable, highly-available, and strongly consistent storage – and they are using a quorum based protocol (Paxos) rather than classic atomic TX protocol to reach consensus. 

    URL: Transactions in Windows Azure (with Service Bus) – An Email Discussion

    In the current release, only one top level messaging entity, such as a queue or topic can participate in a transaction, and the transaction cannot include any other transaction resource managers, making transactions spanning a messaging entity and a database not possible.

    Has Windows Azure any kind of distributed transaction mechanism in order to include any remote object creation in an atomic transaction including other domain-specific operations?  
    The alternative solution suggested by Himanshu Desai is to have an operation start a process on the server and then poll in a loop on the client until the final status is received from the server.
     
    Author: "Michael Freidgeim" Tags: "Web Services/WCF"
    Date: Saturday, 16 Mar 2013 01:27

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/03/16/2010-version-of-tf-tfs-command-tool-unable-to-determine.aspx

    We upgraded to VS 2012 and TFS 2012 a month ago.
    I have a backupShelve.cmd that used to work before the upgrade, but when I ran it yesterday it caused the error:

    Unable to determine the workspace. You may be able to correct this by running 'tf workspaces /collection:TeamProjectCollectionUrl'

     
    The backupShelve.cmd file is the following:

    call "C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\vcvarsall.bat" x86
    tf shelve   /replace /comment:"Current backup" CurrentBackupMT01 /noprompt
    pause

    After some time I noticed that the batch file refers to the version 10.0 folder instead of "Microsoft Visual Studio 11.0".

    After I changed the folder name in the script, tf started to work.
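
    The corrected first line, assuming the default VS 2012 install location:

    call "C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\vcvarsall.bat" x86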
    Author: "Michael Freidgeim" Tags: "Visual Studio/TFS"
    Date: Saturday, 16 Feb 2013 04:45

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/02/16/entlib-editor-corrupts-config-files.aspx

    I tried to use the Microsoft Enterprise Library (EntLib) editor, as was suggested in http://weblogs.asp.net/sukumarraju/archive/2011/11/07/configuring-wcf-service-to-utilise-enterprise-library-logging-application-to-log-data-to-database.aspx, but after the changes all comments in the config files were removed.

    Always consider moving any Enterprise Library configuration to a separate file before editing.
    Author: "Michael Freidgeim" Tags: "Testing/Debugging/Logging, .Net Framewor..."
    Date: Saturday, 16 Feb 2013 04:24

    Originally posted on: http://geekswithblogs.net/mnf/archive/2013/02/16/static-methods-not-always-bad-for-testability.aspx

    Some time ago I posted a few links about What is testable code?

    Reading those links, someone could get the impression that any static method is bad for testability. However, that is a wrong impression: static methods without external dependencies are good for testing.

    http://programmers.stackexchange.com/questions/5757/is-static-universally-evil-for-unit-testing-and-if-so-why-does-resharper-recom
    "There is nothing wrong with static methods, and they are easy to test (so long as they don't change any static data). For instance, think of a Maths library, which is a good candidate for a static class with static methods."

    "Static methods which hold no state and cause no side effects should be easily unit testable. In fact, I consider such methods a "poor-man's" form of functional programming; you hand the method an object or value, and it returns an object or value. Nothing more. I don't see how such methods would negatively affect unit testing at all."
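
    A minimal sketch of the idea (hypothetical names, with the test assumed to sit inside an MSTest [TestClass]; the point is that a pure static method needs no mocks at all):

    public static class MathsHelper
    {
        // No state, no side effects: the result depends only on the input.
        public static decimal ApplyGst(decimal amount)
        {
            return amount * 1.1m;
        }
    }

    [TestMethod]
    public void ApplyGst_AddsTenPercent()
    {
        Assert.AreEqual(110m, MathsHelper.ApplyGst(100m));
    }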

    Alternatively, you can mock anything - as implemented by MS Fakes, TypeMock, JustMock and Moles. They rely on .NET's Profiling API, which can intercept any of your CIL instructions.
     
    See related links  
    http://stackoverflow.com/questions/9677445/mock-framework-vs-ms-fakes-frameworks
    Isolating Code under Test with Microsoft Fakes
     

    Author: "Michael Freidgeim" Tags: "Testing/Debugging/Logging"