
Date: Monday, 11 Mar 2013 14:23

I was fortunate enough to work on a team for the past year producing an eBook that covers the Microsoft Fakes Framework, which shipped as part of Visual Studio 2012.  Note that Fakes is, or will be, available in the Premium edition of VS2012 with Update 2.

Download the guide: http://vsartesttoolingguide.codeplex.com/releases/view/102290


Mike Fourie was the lead on this, and without Mike, this wouldn’t be where it is today…


The team was made up of ALM Rangers from Microsoft and partners, providing a broad based set of experiences that helped shape the guide towards real world scenarios.

Brian Blackman, Carsten Duellmann, Dan Marzolini, Darren Rich, David V. Corbin, Hamid Shahid, Hosam Kamel, Jakob Ehn, Joshua Weber, Mehmet Aras, Patricia Wagner, Peter Provost, Richard Albrecht, Richard Fennell, Rob Jarratt, Shawn Cicoria, Waldyr Felix, Willy-Peter Schaub

Here’s the TOC:



Chapter 1: A Brief Theory of Unit Testing

  • Software testing

  • The fine line between good and flawed unit testing

Chapter 2: Introducing Microsoft Fakes

  • Stubs
  • Shims
  • Choosing between a stub and a shim

Chapter 3: Migrating to Microsoft Fakes

  • Migrating from Moles to Microsoft Fakes
  • Migrating from commercial and open source frameworks

Chapter 4: Miscellaneous Topics

  • Targeting Microsoft .NET Framework 4
  • Adopting Microsoft Fakes in a team
  • You can’t Fake everything!
  • Verbose logging
  • Working with strong named assemblies
  • Optimizing the generation of Fakes
  • Looking under the covers
  • Refactoring code under test
  • Removing Fakes from a project
  • Using Fakes with Team Foundation Version Control
  • Using Microsoft Fakes with ASP.NET MVC

Chapter 5: Advanced Techniques

  • Dealing with Windows Communication Foundation (WCF) service boundaries
  • Dealing with non-deterministic calculations
  • Gathering use-case and other analytical information
  • Analyzing internal state
  • Avoiding duplication of testing structures

Chapter 6: Hands-on Lab

  • Exercise 1: Using Stubs to isolate database access (20 – 30 min)
  • Exercise 2: Using Shims to isolate from file system and date (20 – 30 min)
  • Exercise 3: Using Microsoft Fakes with SharePoint (20 – 30 min)
  • Exercise 4: Bringing a complex codebase under test (20 – 30 min)

In Conclusion


Author: "cicorias"
Date: Friday, 08 Mar 2013 17:01

Over time, you may end up with lots of sites running in IIS Express.  I like things neat and tidy, and periodically, I’ll run a little cleanup command as follows:

From PowerShell:

$appCmd = "C:\Program Files (x86)\IIS Express\appcmd.exe"

$result = Invoke-Command -Command {& $appCmd 'list' 'sites' '/text:SITE.NAME' }

for ($i = 0; $i -lt $result.Length; $i++) {
    Invoke-Command -Command {& $appCmd 'delete' 'site' $result[$i] }
}
Author: "cicorias"
Date: Friday, 01 Mar 2013 00:11

The CAML for the query easily enough includes a ParentID reference.  However, if you’re spelunking around in SP 2013 using the OData services, you might have a hard time finding the ParentID field.

However, if you just issue the query:

https://<server>/<mp>/web/_api/Web/Lists/getByTitle('TaskListName')/Items/?$filter=ParentID eq 101

You’ll be able to retrieve all Tasks that have task #101 as their parent.
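As a quick sketch of how that filter travels on the wire (the host, managed path, and list name below are placeholders, and ParentID takes an unquoted value since it is an integer field), from script you could compose the request URL like this:

```javascript
// Rough sketch (not from the original post): composing the REST URL for the
// ParentID filter. "server", "mp", and "TaskListName" are placeholders.
var base = "https://server/mp/web/_api/Web/Lists/getByTitle('TaskListName')/Items/";
var filter = "ParentID eq 101";

// Spaces in the $filter expression must be percent-encoded on the wire.
var url = base + "?$filter=" + encodeURIComponent(filter);

console.log(url);
// A GET against this URL (with the appropriate auth and Accept headers)
// returns the child tasks of task 101.
```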

Author: "cicorias"
Date: Friday, 01 Feb 2013 00:05


IE6 – WinXP, IE7 – Vista, IE8 – Win7, IE9 – Win7, and IE10 – Win8…


Author: "cicorias"
Date: Saturday, 03 Nov 2012 12:07

If you haven’t heard, TFS Service has gone live at http://tfs.visualstudio.com/.

While the old DNS name works, at some point it may be retired.

So, Jesse Houwing has a post/script that makes it easy.


Here’s the script as well.


Get-ItemProperty -Path HKCU:\Software\Microsoft\VisualStudio\*\TeamFoundation\Instances\*.tfspreview.com Uri | %{set-itemproperty -Path $_.PSPath Uri -Value ( $_.Uri -Replace ".tfspreview.com/", ".visualstudio.com/" )}
Get-ItemProperty -Path HKCU:\Software\Microsoft\VisualStudio\*\TeamFoundation\Instances\*.tfspreview.com\Collections\* Uri | %{set-itemproperty -Path $_.PSPath Uri -Value ( $_.Uri -Replace ".tfspreview.com/", ".visualstudio.com/" )}
Get-ChildItem -Path HKCU:\Software\Microsoft\VisualStudio\*\TeamFoundation\Instances\*.tfspreview.com | Rename-Item -NewName { $_.PSChildName -Replace ".tfspreview.com$", ".visualstudio.com" }
Author: "cicorias" Tags: "TFS"
Date: Friday, 02 Nov 2012 11:38

The upcoming Fall Release of VS2012 has some really great updates, bringing back OData, SPA, and now SignalR.

If you get a chance, take a look at Scott Guthrie’s keynote session where he goes through it (SignalR – along with a demo of New Relic – which rocks).

SignalR is a hub/client model that leverages JavaScript to provide real-time broadcast and point-to-point callbacks from the hub to clients.  So, you can make some really cool sh**t with it.

Here’s a simple chat that was done with just a few lines of code:


Other links




Good overview of what’s included here:


Author: "cicorias" Tags: ".NET, ASP.NET, SignalR"
Date: Sunday, 19 Aug 2012 13:20

Sometimes you can’t do a clean install of the OS and move to Win8 RTM.  One of my machines, which I’ll be using for an ongoing project, is in that “perfect” operating mode – except for the RC versions of Visual Studio 2012.

The following links provide the upgrade paths as needed (note that with Win8, you must move to the RTM of Win8 to get VS2012 RTM).

How to uninstall Visual Studio 2012 Release Candidate


Upgrading from Visual Studio 2012 RC to RTM


Author: "cicorias" Tags: "Visual Studio"
Date: Thursday, 05 Jul 2012 18:24

If you’re using Win8, no doubt you’ve run into the charms bar.  There’s a feature that lets you extend your application via Share links.

Details on the HOW are here:

Adding share (Metro style apps using JavaScript and HTML)


So, Digital Folio has taken their shopping tool to Win8 and enabled some really cool ways to take advantage of it.  I was fortunate enough to help out the folks there a while back on some other things, but their app is a nice shopper’s aid.

Digital Folio for Windows 8 | Instant Price Comparisons from Major Retailers on the Products You Want

Author: "cicorias" Tags: "Metro, Win8"
Date: Wednesday, 20 Jun 2012 12:14
Author: "cicorias" Tags: ".NET, WIF, Security, ADFS"
Date: Thursday, 12 Apr 2012 12:58

When you’re running under x64 you have to modify one additional spot in the registry to disable this warning – which clearly should only be done by folks who know what they’re doing.

NOTE: affecting the registry can be harmful – do so at your own risk.

Windows Registry Editor Version 5.00



Author: "cicorias" Tags: "Visual Studio, Development, Troubleshoot..."
Date: Thursday, 22 Mar 2012 21:22

When you’re working with WIF and the WSTrustChannelFactory, when you call the Issue operation you can also request a RequestSecurityTokenResponse as an out parameter.

However, what can you do with that object?  Well, you could keep it around and use it for subsequent calls with the extension method CreateChannelWithIssuedToken – or can you?

public static T CreateChannelWithIssuedToken<T>(this ChannelFactory<T> factory, SecurityToken issuedToken);


As you can see from the method signature it takes a SecurityToken – but that’s not present on the RequestSecurityTokenResponse class.

However, through a little magic you can get a GenericXmlSecurityToken by means of the set of extension methods below – just call rstr.GetSecurityTokenFromResponse() and you’ll get a GenericXmlSecurityToken as a return.

public static class TokenHelper
{
    /// <summary>
    /// Takes a RequestSecurityTokenResponse and pulls out the GenericXmlSecurityToken usable for further WS-Trust calls.
    /// </summary>
    public static GenericXmlSecurityToken GetSecurityTokenFromResponse(this RequestSecurityTokenResponse rstr)
    {
        var lifeTime = rstr.Lifetime;
        var appliesTo = rstr.AppliesTo.Uri;
        var tokenXml = rstr.GetSerializedTokenFromResponse();
        var token = GetTokenFromSerializedToken(tokenXml, appliesTo, lifeTime);
        return token;
    }

    /// <summary>
    /// Provides a token as an XML string.
    /// </summary>
    public static string GetSerializedTokenFromResponse(this RequestSecurityTokenResponse rstr)
    {
        var serializedRst = new WSFederationSerializer().GetResponseAsString(rstr, new WSTrustSerializationContext());
        return serializedRst;
    }

    /// <summary>
    /// Turns the XML representation of the token back into a GenericXmlSecurityToken.
    /// </summary>
    public static GenericXmlSecurityToken GetTokenFromSerializedToken(this string tokenAsXmlString, Uri appliesTo, Lifetime lifetime)
    {
        RequestSecurityTokenResponse rstr2 = new WSFederationSerializer().CreateResponse(
            new SignInResponseMessage(appliesTo, tokenAsXmlString),
            new WSTrustSerializationContext());

        // NOTE: the tail of this call was truncated in the original post; the remaining
        // arguments below are reconstructed from the GenericXmlSecurityToken constructor signature.
        return new GenericXmlSecurityToken(
            rstr2.RequestedSecurityToken.SecurityTokenXml,
            new BinarySecretSecurityToken(rstr2.RequestedProofToken.ProtectedKey.GetKeyBytes()),
            lifetime.Created.HasValue ? lifetime.Created.Value : DateTime.MinValue,
            lifetime.Expires.HasValue ? lifetime.Expires.Value : DateTime.MaxValue,
            rstr2.RequestedAttachedReference,
            rstr2.RequestedUnattachedReference,
            null);
    }
}

Author: "cicorias" Tags: ".NET, WCF (Indigo), Federation, Security..."
Date: Saturday, 04 Feb 2012 14:50

Many companies, ISVs, and solutions have concerns about data in the cloud.  With PKI based encryption, Trust Services provides key management for your publishers/subscribers and a simplified SDK set of classes to abstract the encryption and decryption process.  Both managed classes and a PowerShell add-in are provided...

Learn More about Microsoft Codename "Trust Services" - TechNet Articles - Home - TechNet Wiki

Author: "cicorias" Tags: "Azure, Security"
Date: Thursday, 19 Jan 2012 20:39

Windows Azure Fieldnote


Windows Azure Drives [1] provide a means to represent file-based (disk drive) persistent storage for the various role types within Windows Azure Compute. Each of the roles within Windows Azure can mount and utilize a drive for persistent storage – storage that survives reboot, reimaging, and updated deployments of a role instance.

When mounting a VHD as a CloudDrive, there is no means to control the drive letter assignment directly through the CloudDrive managed classes that are provided with the Windows Azure SDK.


Many solutions today require the use of standard Windows File IO based access. Instead of refactoring solutions to leverage the storage options available in the PaaS part of the Windows Azure platform, solutions deployed to Windows Azure can mount a Virtual Hard Disk (VHD), persisted in a storage account, inside of a running instance. That Page Blob backed VHD is then represented through Virtual Disk Services and the Windows Cloud Drive services to the running instance as a disk drive, addressable through File IO using a drive letter.

While a persistent drive option is available, the drive letter assignment is determined at runtime during the mounting process. This potentially presents a problem for existing solutions, codebases, and libraries that require a setting to be established prior to runtime – for example, an application configuration setting that provides a full path, including the drive letter, to a location for read/write access for File IO.


The following solution takes advantage of the Virtual Disk Services through the DiskPart.exe operating system utility to first identify which volume the VHD is mounted as, select that volume, and re-assign it to the target drive letter.

The original idea for the approach comes from this blog post here: http://techyfreak.blogspot.com/2011/02/changing-drive-letter-of-azure-drive.html

While there is a COM interface available that could be wrapped via an interop layer, for simplicity the choice was made to initiate a process to take the actions required for remapping the drive letter. Additionally, while there is an existing managed interop assembly available (Microsoft.Storage.Vds), it is undocumented and unsupported.

The example scenario presented does the following:

1. Leverages a Windows Azure Web Role (could be a Worker Role or VM Role as well)

2. Implements a Windows Console application that:

a. Is a Startup task – in elevated mode and background

b. Runs elevated in order to affect Virtual Disk Services

c. At startup:

    • Mounts the VHD from Windows Azure Storage
    • Detects whether it is mounted on the target drive letter and re-assigns as needed **

d. Then, continuously (every 30 seconds):

    • Checks if the drive is mounted on the target drive letter
    • If not, reassigns the drive letter **

** Drive Letter reassignment is done through a System.Process startup object that runs Diskpart.exe with a “select volume” and “assign drive letter” command sequence.
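For illustration (the letters here are hypothetical), if the VHD came up as F: and the target letter is M:, the generated script file handed to DiskPart.exe /s would contain:

```
select volume = F
assign letter = M
```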


The sample solution contains the following:

1. Windows Azure Web Role – simple MVC3 application that just lists the mapped CloudDrives using the CloudDrive.GetMountedDrives() method

2. CloudDriveManager class library – helper class that provides the CloudDrive management actions leveraged by the caller (either Console or other code)

3. CloudDriveManagerConsole – Windows console application intended to be a startup project, running in elevated mode in order to affect the assigned drive letter

4. CloudDriveManagerRole – implementation of Microsoft.WindowsAzure.ServiceRuntime.RoleEntryPoint – which allows this class to be used from within a Windows Azure Web or Worker role – however, that role entry point would need to be elevated (via the “Runtime” and “NetFxEntryPoint” Elements)

5. Logger – simple logger class that writes to a Queue for debugging purposes

6. ResponseViewer – simple WPF application that reads Queue messages so you can view log messages from your cloud instances – purely for debugging purposes

7. TestListDrives – simple Windows console application that lists the mapped CloudDrives – usable from within the Role instance by using Remote Desktop and connecting to the instance

Instance Initialization

During role startup, Windows Azure will execute the Task defined in the Service definition in background mode and elevated (running as system). Inside of the console application, the implementation of OnStart does the following:

public override bool OnStart()
{
    try
    {
        // the body was truncated in the original post; per the text below,
        // startup mounts the drives
        MountAllDrives();
    }
    catch (Exception ex)
    {
        _logger.Log("fail on onstart", ex);
    }
    return true;
}

void MountAllDrives()
{
    try
    {
        var driveSettings = RoleEnvironment.GetConfigurationSettingValue(DRIVE_SETTINGS);
        string[] settings = driveSettings.Split(':');
        CloudStorageAccount account = CloudStorageAccount.FromConfigurationSetting(STORAGE_ACCOUNT_SETTING);
        string dCacheName = RoleEnvironment.GetConfigurationSettingValue(DCACHE_NAME);
        LocalResource cache = RoleEnvironment.GetLocalResource(dCacheName);
        int cacheSize = cache.MaximumSizeInMegabytes / 2;
        _cloudDriveManager = new CloudDriveManager(account, settings[0], settings[1][0], cache);
    }
    catch (Exception ex)
    {
        _logger.Log("fail on mountalldrives", ex);
    }
}


Mostly, the startup routine calls into the custom class CloudDriveManager, which provides the simple abstraction to the Windows Azure CloudDrive managed class.

The custom CreateDrive method calls the CloudDrive create drive method in a non-destructive manner – and, for this sample, creates the initial VHD in storage if it does not already exist.

Mounting calls the managed classes CloudDrive.Mount along with calling into a custom VerifyDriveLetter method.

public void Mount()
{
    _logger.Log(string.Format("mounting drive {0}", _vhdName));
    _cloudDrive = _account.CreateCloudDrive(_vhdName);

    var driveLetter = _cloudDrive.Mount(_cacheSize, DriveMountOptions.Force);
    _logger.Log(string.Format("mounted drive letter {0}", driveLetter));

    var remounted = VerifyDriveLetter();
}


Within VerifyDriveLetter there’s some logic to validate the current state of the mounted drives. And then verification if the mounted drive is the intended drive letter.

public bool VerifyDriveLetter()
{
    _logger.Log("verifying drive letter");
    bool rv = false;

    if (RoleEnvironment.IsEmulated)
    {
        _logger.Log("Can't change drive letter in emulator");
        return true; // can't remap in the emulator; treat as done
    }

    try
    {
        if (string.IsNullOrEmpty(_cloudDrive.LocalPath))
        {
            _logger.Log("verifydriveLetter: Not Mounted?");
            throw new InvalidOperationException("drive is not mounted");
        }

        if (!char.IsLetter(_cloudDrive.LocalPath[0]))
        {
            _logger.Log("verifydriveLetter: Not a letter?");
            throw new InvalidOperationException("verifydriveletter - not a letter?");
        }

        if (IsSameDrive())
        {
            _logger.Log("is same drive; no need to diskpart...");
            return true;
        }

        char mountedDriveLetter = CurrentLocalDrive(_vhdName);
        RunDiskPart(_driveLetter, mountedDriveLetter);

        if (!IsSameDrive())
        {
            var msg = "Drive change failed to change";
            throw new ApplicationException(msg);
        }

        _logger.Log("verifydriveletter done!!");
        rv = true;
        return rv;
    }
    catch (Exception ex)
    {
        _logger.Log("error verifydriveletter", ex);
        return rv;
    }
}



The IsSameDrive method validates if the current mapped drive is indeed the planned drive letter. If not, it will return “false”.

bool IsSameDrive()
{
    char targetDrive = _driveLetter.ToString().ToLower()[0];
    char currentDrive = CurrentLocalDrive(_vhdName);

    string msg = string.Format(
        "target drive: {0} - current drive: {1}",
        targetDrive, currentDrive);
    _logger.Log(msg);

    if (targetDrive == currentDrive)
    {
        _logger.Log("verifydriveLetter: already same drive");
        return true;
    }
    return false;
}



Finally, the RunDiskPart method initiates the action of spawning a new process with the dynamically created DiskPart script file that selects the existing volume name (by drive letter) and assigns the target drive letter.

void RunDiskPart(char destinationDriveLetter, char mountedDriveLetter)
{
    string diskpartFile = Path.Combine(_cache.RootPath, "diskpart.txt");

    if (File.Exists(diskpartFile))
        File.Delete(diskpartFile);

    // write the "select volume" / "assign letter" command sequence
    string cmd = "select volume = " + mountedDriveLetter + "\r\n" + "assign letter = " + destinationDriveLetter;
    File.WriteAllText(diskpartFile, cmd);

    //start the process
    _logger.Log("running diskpart now!!!!");
    _logger.Log("using " + cmd);
    using (Process changeletter = new Process())
    {
        changeletter.StartInfo.Arguments = "/s" + " " + diskpartFile;
        changeletter.StartInfo.FileName =
            System.Environment.GetEnvironmentVariable("WINDIR") + "\\System32\\diskpart.exe";
        changeletter.Start();
        changeletter.WaitForExit();
    }
}



Output and Results

As an example of the interaction and how the drive appears within the running Windows Azure Role, the following screen shots illustrate the results.

Program Startup

At program startup the drive is initially mounted by the Console application – immediately the drive is mounted as the F: drive.  The startup code verifies whether this is the intended drive – as shown below in the logs, it isn’t, so the code initiates the RunDiskPart method, setting M: as the mapped drive.



The following shows how a Windows Azure Drive appears to the operating system, in Windows Explorer, after the custom code reassigns the drive letter – the drive is selected below.



Within the custom MVC3 application, which simply lists the mounted Windows Azure drive (and runs in a separate, non-elevated process), the drive appears as a regular operating system drive – accessible for File IO as required, using the intended drive letter.


Forced Letter Change

The following shows what happens if the drive letter is intentionally changed – in this example, I just initiate a DiskPart set of commands to assign the mounted drive the letter L:


As you can see in the Windows Explorer window the letter now appears as L: for the WindowsAzureDrive.

Within approximately 30 seconds (which is the value used in the Run method by the custom code) VerifyDriveLetter detects it’s not the intended drive and initiates a change.



And the below image shows the drive again, appearing as the M: drive:



Future Options

Since capabilities in the Windows Azure platform change over time, the ability to dictate the specific drive letter to be used may become available. Until then, this approach, by means of the Windows Azure Drive and the Virtual Disk Services abstraction provided by the platform, offers a means to accommodate codebases and application logic that are dependent upon predetermined drive letters.


[1] Windows Azure Drives http://www.windowsazure.com/en-us/develop/net/fundamentals/cloud-storage/#drives

[2] Virtual Disk Service http://msdn.microsoft.com/en-us/library/windows/desktop/bb986750(v=vs.85).aspx

[3] CloudDrive Storage Client http://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storageclient.clouddrive.aspx

[4] Diskpart.exe http://technet.microsoft.com/en-us/library/cc770877(v=WS.10).aspx

[5] Task element http://msdn.microsoft.com/en-us/library/windowsazure/gg557552.aspx#Task

[6] Runtime element http://msdn.microsoft.com/en-us/library/windowsazure/gg557552.aspx#Runtime

[7] NetFxEntryPoint element http://msdn.microsoft.com/en-us/library/windowsazure/gg557552.aspx#NetFxEntryPoint


Solution File: MountXDriveSameLetter.zip

Author: "cicorias" Tags: "Utilities, Cloud, Windows Azure"
Date: Tuesday, 06 Dec 2011 23:00

When you’re debugging security related things, sometimes you need to take a look at the thread identities user token.

When you’re inside of Visual Studio 2010, in the Watch window you can enter ‘$user’ and you’ll get the same output as in windbg with !token -n



Author: "cicorias" Tags: ".NET, ASP.NET, Troubleshooting, Tricks"
Date: Wednesday, 30 Nov 2011 15:25

If you’re like me, having those PDF versions for offline review is great.  It was a pain before, as I had to individually print web pages to single PDFs using tools.

Now, TechNet can track a “book” of topics for you, and then generate HTML or PDF for you to download – personal publishing.

Roll-your-own techdocs for free - TONYSO - Site Home - TechNet Blogs

Author: "cicorias" Tags: "Tricks"
Date: Friday, 14 Oct 2011 14:43

Wow – I still have my K&R book from a class I took at AT&T.  Cut my teeth on nix…

Dennis Ritchie, Father of C and Co-Developer of Unix, Dies | Wired Enterprise | Wired.com

Author: "cicorias"
Date: Thursday, 13 Oct 2011 12:57
Author: "cicorias" Tags: "Security, ADFS"
Date: Thursday, 06 Oct 2011 22:38

While the development server in Visual Studio 2010 is great for most work, it does have one shortcoming: if you start adding content types that are not part of the base set of known MIME types built in, it won’t emit the proper Content-Type header in the response to the client/browser.

For example, for MP4 files the development web server emits application/octet-stream (or something like that) out of the box.  What we really need is video/mp4.

Now, with IIS Express, you can easily switch over to use that and just add the correct mapping to the <staticContent> section of the web.config when you’re running in integrated mode.  Such as follows:

<system.webServer>
  <modules runAllManagedModulesForAllRequests="true" />
  <staticContent>
    <mimeMap fileExtension=".mp4" mimeType="video/mp4" />
    <mimeMap fileExtension=".m4v" mimeType="video/m4v" />
  </staticContent>
</system.webServer>


However, with the Visual Studio 2010 built-in web development server, you can’t affect the MIME type support through configuration.

For this, a simple NuGet package is available that provides a simple HttpModule to affect the ContentType on the response headers.  It reads the Web.config for the site and will honor the <staticContent> section above – this all happens only when NOT running in integrated pipeline mode.



Sample Solution and Source here: SampleMimeHelper.zip

The HttpModule makes use of dynamic loading via the PreApplicationStartMethod attribute and the DynamicModuleUtility helper that is part of the Microsoft.Web.Infrastructure.DynamicModuleHelper namespace.


using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Diagnostics;
using System.Configuration;
using System.Web.Configuration;
using System.Xml.Linq;
using Microsoft.Web.Infrastructure.DynamicModuleHelper;

[assembly: PreApplicationStartMethod(typeof(MimeHelper), "Start")]

/// <summary>
/// Summary description for MimeHelper
/// </summary>
public class MimeHelper : IHttpModule
{
    static Dictionary<string, string> s_mimeMappings;
    static object s_lockObject = new object();

    public static void Start()
    {
        // register the module dynamically; only needed when NOT running
        // in the integrated pipeline (i.e. the VS development server)
        if (!HttpRuntime.UsingIntegratedPipeline)
            DynamicModuleUtility.RegisterModule(typeof(MimeHelper));
    }

    static string GetMimeType(HttpContext context)
    {
        var ext = VirtualPathUtility.GetExtension(context.Request.Url.ToString());
        if (string.IsNullOrEmpty(ext)) return null;

        string mimeType = null;
        s_mimeMappings.TryGetValue(ext, out mimeType);

        return mimeType;
    }

    static void CreateMapping(HttpApplication app)
    {
        if (null == s_mimeMappings)
        {
            lock (s_lockObject)
            {
                if (null == s_mimeMappings)
                {
                    string path = app.Server.MapPath("~/web.config");
                    XDocument doc = XDocument.Load(path);

                    var s = from v in doc.Descendants("system.webServer").Descendants("staticContent").Descendants("mimeMap")
                            select new { mimeType = v.Attribute("mimeType").Value, fileExt = v.Attribute("fileExtension").Value };

                    s_mimeMappings = new Dictionary<string, string>();
                    foreach (var item in s)
                        s_mimeMappings.Add(item.fileExt, item.mimeType);
                }
            }
        }
    }

    public void Dispose() { }

    public void Init(HttpApplication context)
    {
        context.EndRequest += new EventHandler(context_EndRequest);
    }

    void context_EndRequest(object sender, EventArgs e)
    {
        try
        {
            HttpApplication app = sender as HttpApplication;
            // build the extension-to-MIME map from web.config on first use
            CreateMapping(app);
            string mimeType = GetMimeType(app.Context);

            if (null == mimeType) return;

            app.Context.Response.ContentType = mimeType;
        }
        catch (Exception ex)
        {
            // the catch body was empty/truncated in the original post
            Debug.WriteLine(ex);
        }
    }
}
Author: "cicorias" Tags: ".NET, ASP.NET, Code, VS2010"
Date: Wednesday, 21 Sep 2011 21:54

Keith Dahlby has a good post on creating a fake SPContext.  Here’s the link and the code:

NOTE: This is not production-safe code – use at your own risk…


public static SPContext FakeSPContext(SPWeb contextWeb)
{
  // Ensure HttpContext.Current
  if (HttpContext.Current == null)
  {
    HttpRequest request = new HttpRequest("", contextWeb.Url, "");
    HttpContext.Current = new HttpContext(request,
      new HttpResponse(TextWriter.Null));
  }

  // SPContext is based on SPControl.GetContextWeb(), which looks here
  if (HttpContext.Current.Items["HttpHandlerSPWeb"] == null)
    HttpContext.Current.Items["HttpHandlerSPWeb"] = contextWeb;

  return SPContext.Current;
}
Author: "cicorias" Tags: "SharePoint"
Date: Wednesday, 21 Sep 2011 19:15

I wanted the ability to simply time methods and write to a log/trace sink.  A very simple approach that I ended up using was to provide a method that takes an Action delegate, which is the method body to be timed.

The following is what I came up with (this is my reminder…):

using System;
using System.Diagnostics;
using System.Reflection;

class Program
{
    static void Main(string[] args)
    {
        TestMethod1();
    }

    private static void TestMethod1()
    {
        LoggingHelper.TimeThis("doing something", () =>
        {
            Console.WriteLine("This is the Real Method Body");
        });
    }
}

public static class LoggingHelper
{
    public static void TimeThis(string message, Action action)
    {
        string methodUnderTimer = GetMethodCalled(1);
        Stopwatch sw = Stopwatch.StartNew();
        LogMessage(string.Format("started: {0} : {1}", methodUnderTimer, message));
        action(); // invoke the body being timed
        LogMessage(string.Format("ended  : {0} : {1} : elapsed : {2}", methodUnderTimer, message, sw.Elapsed));
    }

    private static string GetMethodCalled(int stackLevel)
    {
        StackTrace stackTrace = new StackTrace();
        StackFrame stackFrame = stackTrace.GetFrame(stackLevel + 1);
        MethodBase methodBase = stackFrame.GetMethod();
        return methodBase.Name;
    }

    static void LogMessage(string message)
    {
        Console.WriteLine("{0}", message);
    }
}

Author: "cicorias" Tags: ".NET, Code"