
Date: Monday, 16 Jan 2012 17:38

I just finished reading this article on NoSQL security. It raised a couple of concerns:

1. Systems require security. Hard stop. No ifs, ands, or buts about it. Designing security from the start is a hard task. At Microsoft there are several practices and resources we use to design and evaluate security. These have been in place for ~10 years and have been refined. My assessment is it’s working. Look at the security track record for SQL2K5 through SQL2K8R2. Pretty darn solid, especially when compared to Oracle. We spent a significant amount of time reviewing existing product functionality and in designing and testing new functionality. It’s simple ignorance to release new infrastructure software, much less anything else, without designing for security.

2. It’s additional ignorance like the following that causes me serious concern; from the same article:

“While [James Phillips, co-founder and senior vice president of products for Couchbase, a NoSQL platform firm] agrees that there is still an experience gap with such a new technology, he believes that some of the security concerns should be at least a little quelled if organizations consider the typical use case for NoSQL. He believes that these data stores usually contain less sensitive information than the typical relational database and that they tend to have limited touch points to other applications within the enterprise network.”

What in the world does “less sensitive information” mean? Phillips goes on to say:

"They're using the technology not as a database per se, like you would consider perhaps an enterprise data store where you're collecting and aggregating lots of the business data of the organization that other apps are going to tie into," he says. "Rather, if I'm building a social network or social game or building a very specific web application that has certain functionality, it tends to sit behind the firewall and it ties to this application and usually isn't available for other parts of organization to tap."

The belief that security attacks only come from outside the organization is pure ignorance. The belief that security concerns will be quelled if the implementers of the technology consider the typical use case for NoSQL is another example of ignorance. And finally the characterization that only social network and social game companies are going to use NoSQL and that they don’t hold sensitive data is flabbergasting; think of the Sony incident.

My intent isn’t to personally attack Mr. Phillips. I’m sure there are many people in the software industry that use the same spin when it suits them or their company’s product. Imagine if Microsoft or IBM took the same stance as Couchbase. Couchbase considers themselves a platform company. Their lack of outward concern for security is proof they are not a platform company.

I want to state again that I have nothing personal toward Mr. Phillips or his company. I’ve never met him or used his company’s product. But he is simply doing NoSQL and his company a disservice by attempting to downplay the lack of security in NoSQL.

As I said, adding security after the fact is extremely hard. But this doesn’t mean we should give up or position the technology as not needing it. As an IT professional it’s your responsibility to understand the benefits and limitations of all technology you implement, do a security review, and ensure it meets the business requirements. Hard Stop!

Author: "Dan Jones MSFT" Tags: "Security, NoSQL"
Date: Thursday, 13 Oct 2011 16:51


The presentation and scripts can be found here. Two questions came up which I couldn’t answer on the spot: 1) what is the load precedence of profiles and 2) what PowerShell books would I recommend.


Profile Load Precedence


The profiles are listed in load order. The most specific profiles have precedence over less specific profiles where they apply.

  • %windir%\system32\WindowsPowerShell\v1.0\profile.ps1
  • %windir%\system32\WindowsPowerShell\v1.0\Microsoft.PowerShell_profile.ps1
  • %UserProfile%\My Documents\WindowsPowerShell\profile.ps1
  • %UserProfile%\My Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1


PowerShell Book Suggestion

Learn Windows PowerShell in a Month of Lunches by Don Jones (no relation)

Windows PowerShell in Action, Second Edition by Bruce Payette

Author: "Dan Jones MSFT" Tags: "PowerShell, SQL PASS, SQL Server 2012"
Date: Wednesday, 28 Sep 2011 16:54

I’ve been blogging here and here about testing Denali CTP3. Here is a recent picture of the SQL Server team – though it’s not everyone on the team. These are the people behind the magic of SQL Server. By downloading Denali CTP3, testing, and filing bugs you become a virtual member of this team. How cool would it be to say that you were a member of the SQL Server Team for Denali!


Author: "Dan Jones MSFT" Tags: "SQL Server, SQL Server Code Name "Denali..."
Date: Tuesday, 20 Sep 2011 14:19

At the end of August we released CTP3 (Community Technology Preview) of the next release of SQL Server codenamed Denali.

You can download it here.

This is a feature-rich CTP and deserving of your attention. I can’t discuss where this falls in the overall release schedule – we haven’t made any official announcements and I’m not making one here; that’s beyond my pay grade.

Download, install, exercise it and give us feedback!

Here are some areas to delve into and how to give us feedback.

Author: "Dan Jones MSFT" Tags: "SQL Server, SQL Server Code Name "Denali..."
Date: Sunday, 21 Aug 2011 18:28

Bugs reported by the community have played a huge role in the amazing quality of past releases of SQL Server. In SQL Server 2008 we fixed over 1,000 bugs submitted by the SQL Server community prior to release. And in SQL Server 2008 R2 we fixed over 300 community-submitted bugs prior to the release.

Now is the time to test new features and send us feedback!

Download SQL Server "Denali" CTP3 - here!

Microsoft Connect is the place to submit bugs. This tool connects right into our bug tracking database, Team Foundation Server.

To promote your participation in testing SQL Server Code Name “Denali” we are running the Feedback Challenge.

Starting on Friday August 19, we will be sending SQL Server gift packs valued at $30 to each person who submits bugs or provides feedback on how we can improve SQL Server Code Name “Denali” up through August 31, 2011 (or to the first 300 respondents).

Here is a list of areas where we want your feedback:

  • Beyond Relational
  • Development & Management
  • Project Crescent
  • Analysis Services
  • Product Update
  • Data Quality Services
  • Master Data Services
Author: "Dan Jones MSFT" Tags: "Denali, SQL Server, SQL Server Code Name..."
Date: Tuesday, 19 Jul 2011 16:18

There’s a great new feature in the next release of SQL Server (codename “Denali”) which integrates the latest product updates into the initial installation experience. This feature came online in the CTP3 release.

We need your help testing this feature.

Peter Saddow, of the Deployment Platform team, has a comprehensive blog post detailing how to test this feature, therefore I won’t repeat it here.

Thanks in advance for helping make “Denali” the best release of SQL Server yet.

Author: "Dan Jones MSFT" Tags: "SQL Server, SQL Server Code Name "Denali..."
Date: Wednesday, 06 Jul 2011 17:17

Today is my 13th day in the new job as the community lead for the Business Platform Division. It has been a full 13 days – in a very good way. I’ve been working closely with my team to get educated on community and to begin to formulate our community strategy. This includes identifying which communities we should engage with and how, the use of various existing technologies (Twitter, LinkedIn, MSDN Forums, Stack Overflow, and others) and new technologies we need to invent.

As we’re in the early stages I have yet to reach any conclusions on which communities we should engage with and how we should engage.

This means it’s the perfect time for you to share your thoughts and insights. Some of the areas I’d like to hear from you on include: Which communities do you spend your time in? Which hashtags do you follow? What LinkedIn groups are you part of? Where would you like us to engage? What would you like to see us do more/less of?

I’m a big believer in listening to customers and users who ultimately make a community strong and vibrant. While I can’t respond to each, I promise you I read each and every comment. Thank you in advance for sharing.

Author: "Dan Jones MSFT" Tags: "SQL Server, Community, Business Platform..."
Date: Monday, 27 Jun 2011 18:28

I was recently talking to a co-worker and I characterized MVPs as evangelists, among other things. I got a very strong reaction - negative. I know full well this posting is going to rub some MVPs the wrong way – like petting a cat against the grain! However, I like stirring things up a bit to see what people really think or you can simply say I’m a glutton for punishment.

For whatever reason the words evangelism and evangelist, in the tech community, have been cast in a negative light. But I happen to really like them to describe certain people in the SQL Server community, namely MVPs.

Merriam-Webster defines evangelist as an enthusiastic advocate. The Encarta World English Dictionary defines Evangelism as great enthusiasm, fervor or zeal for a particular cause.

Isn’t this a huge part of what MVPs are? After all why would they spend their own money traveling to conferences both big (SQL PASS) and small (SQL Saturday)? Why would they go on SQL Server themed cruises? Why would they spend as much time as they do blogging and answering questions on #sqlhelp?

I’ve said here before I believe MVPs do a fantastic and selfless job in the community. They are an incredibly important part of the community. Their technical and operational knowledge of the product is top notch. Their passion shines through on a daily basis.

Take the characterization as an evangelist as a compliment. That’s the way it’s intended.

Author: "Dan Jones MSFT" Tags: "SQL Server, MVPs"
Date: Wednesday, 15 Jun 2011 20:43

I spent the first eight years of my career in enterprise IT for a Fortune 15 company. Back then IT was a vibrant work environment. Money was flowing into IT to fund resources, software, hardware and vendors. Today it’s a different story. The mantra for the past 11 years has been “do more with less”. It’s good business sense to tighten budgets when the business starts to see warning signs of market stagnation. However, if the business does not begin to fund new projects to look at new technologies it will find itself in a difficult situation.

The first problem is being on outdated hardware and software. This doesn’t just exist on the desktop but also in the data center. There is still an incredible number of systems running Windows XP (I don’t have any specific numbers so this is anecdotal evidence). Windows XP was released at the end of 2001. I also talk to a number of people who have laptops or desktops that are four or more years old. Back in the data center the story is a bit better. Again this is anecdotal evidence but I’m seeing the instances of Windows Server 2003 dwindle and I see almost no Windows Server 2000. As for SQL Server, while there is still a somewhat hefty number of databases on SQL Server 2000, that number is dropping pretty rapidly, though not as fast as I’d like to see. I’m not saying that companies have to be on the bleeding edge of technology, but being on any technology that’s more than six or seven years old is plain negligent.

The second problem is outdated applications. When I was in IT our primary mission was to support the business. This entailed updating existing applications to support changing business requirements and writing new applications in support of new business opportunities and needs. There have been plenty of articles and blog posts on the consumerization of IT. This effectively means that, for the first time in the history of IT, consumers have access to more cutting-edge technology than what IT is making available to solve business problems. I blame this on the extremely tight IT budgets and the lack of vision by company leaders to see the value of funding IT.

So far I haven’t said a word about pilot projects – the title of this posting. Before 2000 my company used pilot projects to try out new technologies (hardware and software). We would start with small targeted projects using well defined metrics to track success. Some of them were successful and resulted in broader rollout and usage. Others, the majority, were harvested for learning and then shelved. The point isn’t to track your success rate but rather to keep trying out new things. Almost every consumer products company (think P&G) does this with new products and they’re not afraid to shoot poor performing products in the head, quickly.

I like the mantra of “fail fast”. It sets the right mindset that failures are going to happen but without them change and innovation doesn’t happen. In IT this is equivalent to running pilot projects. Take Windows Azure, for example. Many of the survey results I see ask the wrong sets of questions which I believe lead ITDMs (IT Decision Makers) down the wrong path and cloud their thinking (pun intended). Every company should either have Cloud Computing already baked into their IT strategy or at the very least have a handful of pilot projects trying out different cloud providers to solve various business problems.

Today I wouldn’t go work for a company that had its head in the sand regarding cloud. And I certainly wouldn’t work for a company that was still running Windows XP. Both speak volumes about the IT leadership and the value (or lack thereof) the company places on technology.

Author: "Dan Jones MSFT" Tags: "General"
Date: Friday, 10 Jun 2011 16:41

We are looking for feedback on three items for SQL Server Code Name “Denali”. First, the supported OSes. Second, the supported upgrade paths. Third, the way the installer handles unsupported OSes and upgrade paths. Specifically we want to know if these will slow down your adoption of SQL Server Code Name “Denali”.

1) The current support matrix for OSes is as follows:

  • Windows Vista SP2 or later
  • Windows Server 2008 SP2 or later
  • Windows Server 2008 R2 SP1 or later
  • Windows 7 SP1 or later

2) Denali will support upgrading from these SQL Server versions:

  • SQL Server 2005 SP4 or later
  • SQL Server 2008 SP2 or later
  • SQL Server 2008 R2 SP1 or later

3) The installer is going to block installation on unsupported OSes and it will block unsupported upgrade paths.

If the current support matrix for OSes, the current support matrix for upgrades, or the installation blocks will delay your adoption of SQL Server Code Name “Denali”, please provide this feedback to us! You can do so by adding a comment below or submitting the feedback through SQL Server Connect.

Author: "Dan Jones MSFT" Tags: "SQL Server, SQL Server Code Name "Denali..."
Date: Monday, 06 Jun 2011 23:32

I’ve seen three basic patterns for handling forgotten web site passwords:

  1. Send a change password link to the email address on file
  2. Ask one or more challenge questions (or personal information) to unlock the change password screen
  3. Send the password, in plain text, to the email address on file

There are different variations of these and other patterns do exist, but these are the predominant ones I’ve encountered. I don’t have any stats on how prevalent each is or how secure each is; however, I have my opinion.

Keep in mind that no password reset system is 100% foolproof. If someone really wants to get into your account they can probably hack it, although a single account is very likely not worth the effort. For example, an email account can be hacked and the email generated by patterns 1 and 3 could be intercepted. Through a bit of social engineering and research the answers for the second pattern could be had. Again, this is probably not worth the effort for a single account.

Of the three, though, if I had to pick the one I like least it’s the third. Having my password sent to me in clear text is disturbing for two reasons. First, anyone sniffing the network could intercept the password. Again this probably isn’t worth the time for a single account. The more disturbing aspect is how the password is stored in the site’s repository. Specifically, I have no idea if the password is being stored in plain text or if the site is using a two-way encryption method.

Both methods of managing passwords are simply bad practice. Why do I say that? It’s simple: if the password can be recovered in clear text – whether stored unencrypted or behind reversible encryption – the site is subject to an attack on all its accounts. If I were a hacker (regardless of the color of my hat) these are the sites I would target. Rather than going after a single account at a time, these sites let me go after all of the accounts.

As a website user there isn’t a whole lot you can do to protect your account. Probably the best thing you can do, which I do, is utilize different passwords for different sites. This adds to your password management burden but this way if one account is compromised your other accounts have better odds of remaining safe and sound.
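The safe alternative to pattern 3 is for the site to store only a salted one-way hash, so there is nothing readable to email back. The post doesn’t prescribe an implementation; here is a minimal sketch in Python (function names are mine, purely illustrative):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Return (salt, digest); only this pair is stored, never the password."""
    if salt is None:
        salt = os.urandom(16)  # unique salt per account defeats rainbow tables
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse")
```

A forgotten password then has to be reset (pattern 1 or 2), never recovered, because the site itself cannot reverse the hash.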

Author: "Dan Jones MSFT" Tags: "General, Security"
Date: Friday, 27 May 2011 16:34

Policy-Based Management (PBM) was introduced with SQL Server 2008 – it’s a declarative system for ensuring the system configuration hasn’t drifted from the original configuration intent.  You can read more about Policy-Based Management in SQL Server Books Online, here.

To help people get started with PBM and to promote best practices we provide a set of policies that are installed with the product. On an x64 machine you can find them in this directory: C:\Program Files (x86)\Microsoft SQL Server\100\Tools\Policies. Here you’ll find policies for Analysis Services, the Database Engine, and Reporting Services. The policies for Analysis Services and Reporting Services focus on surface area configuration; the policies for the Database Engine are a more comprehensive set.

The Database Engine policies can be executed in ad hoc fashion or they can be imported into the Database Engine instance and configured for automation. You can learn more about the Best Practice policies in SQL Server Books Online, here.

Whether you are new to PBM or a seasoned veteran, you can leverage the best practice policies as-is or as a starting point for your custom policies.

Author: "Dan Jones MSFT" Tags: "Policy, SQL Server 2008, SQL Server, SQL..."
Date: Thursday, 31 Mar 2011 18:16

Caveat: I don’t write code for a living. But I do know how to get things done, usually using brute force.

SQLPS.exe is a decent environment, but sometimes I want to work in the default PowerShell environment. If I want to work with SQL Server in the default PowerShell shell, though, I need to load the SQL Server snapins into my session. Just because I’m at MS doesn’t mean I intuitively know all of the answers, though I can usually find someone who does. Sometimes, though, I like to try to figure it out on my own – to feel the pain of a real user.

I’ll cut to the chase. There are probably many blog postings and articles on this already but getting a few more to pop-up in the search results doesn’t hurt. So here it goes. There are two SQL Server Snapins you need to load into your PowerShell session: SQLServerProviderSnapin100 and SQLServerCmdletSnapin100. These ship with SQL Server 2008 and SQL Server 2008 R2.

The Provider snapin is explained here. The Cmdlet snapin is explained here. Now depending upon what you’re doing in your script you may need to load one, the other, or both. I generally just load both so I don’t surprise myself when I attempt to do something and it fails. You can also add the loading to your PowerShell profile or keep it in each of your scripts. I personally like to keep it in my scripts so that when I share scripts with other people (or move them to another machine) everything just works. In other words it makes the scripts more portable.

Enough talk, here’s what you add to your scripts. I’m expecting feedback on how to simplify the logic!

# Load SqlServerProviderSnapin100
if (!(Get-PSSnapin | ?{$_.Name -eq 'SqlServerProviderSnapin100'})) {
    if (Get-PSSnapin -Registered | ?{$_.Name -eq 'SqlServerProviderSnapin100'}) {
        Add-PSSnapin SqlServerProviderSnapin100
        Write-Host "Loading SqlServerProviderSnapin100 in session"
    } else {
        Write-Host "SqlServerProviderSnapin100 is not registered with the system." -BackgroundColor Red -ForegroundColor White
    }
} else {
    Write-Host "SqlServerProviderSnapin100 is already loaded"
}

# Load SqlServerCmdletSnapin100
if (!(Get-PSSnapin | ?{$_.Name -eq 'SqlServerCmdletSnapin100'})) {
    if (Get-PSSnapin -Registered | ?{$_.Name -eq 'SqlServerCmdletSnapin100'}) {
        Add-PSSnapin SqlServerCmdletSnapin100
        Write-Host "Loading SqlServerCmdletSnapin100 in session"
    } else {
        Write-Host "SqlServerCmdletSnapin100 is not registered with the system." -BackgroundColor Red -ForegroundColor White
    }
} else {
    Write-Host "SqlServerCmdletSnapin100 is already loaded"
}

Author: "Dan Jones MSFT" Tags: "SQL Server 2008, PowerShell, SQL Server ..."
Date: Monday, 28 Mar 2011 20:09

The March 2011 issue of Database Trends and Applications has an article that highlights the results of a new survey of DBAs and DBA Managers that reveals complacency results in lax oversight of sensitive information. You can read the article here. While every aspect of the research findings is disturbing, what I found most disturbing is the amount of real “production” data that is used outside of production systems.

I find this so disturbing because it means the likelihood that my personal information is living in development and test systems is pretty high. Data obfuscation techniques have been around for a long time. I did a Bing search on “SQL Server data obfuscation”, the first result back is this article by John Magnabosco. It’s a good article that explains how you can build your own data obfuscation capabilities so data can be safely moved outside the production environment. At the end of the article he mentions a product from Red Gate called SQL Data Generator. I haven’t used this product but Red Gate offers a free trial and a single user license is under $300 (according to their web site).
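The core idea of obfuscation is simple to sketch. Here is an illustration in Python (the column names and the secret are hypothetical, not taken from Magnabosco’s article): each sensitive value is deterministically replaced with an opaque token, so joins and distributions in the test copy still behave like production, but the original values can’t be read back.

```python
import hashlib

SECRET = "rotate-me-outside-source-control"  # hypothetical masking key

def mask(value):
    """Deterministically replace a sensitive value with an opaque token."""
    digest = hashlib.sha256((SECRET + value).encode()).hexdigest()[:12]
    return "user_" + digest

# e.g. scrubbing an email column while copying production rows to test
row = {"email": "jane.doe@example.com", "balance": 1023.50}
scrubbed = dict(row, email=mask(row["email"]))
```

Non-sensitive columns (the balance here) pass through untouched, which is what keeps the test data realistic.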

There are a lot of DBAs out there who take their craft and their role as data steward very seriously. They implement strict standards for accessing and handling production data. Then there are those who don’t, either because they don’t take their craft seriously or because they don’t know any better. These are the DBAs who need training, coaching and mentoring. Which type of DBA are you?

Remember, your personal data is under the management of both types of DBAs!

Author: "Dan Jones" Tags: "SQL Server, Security"
Date: Wednesday, 15 Dec 2010 18:26

I recently botched an answer about why one should use SQL Configuration Manager (SQLCM) over Service Control Manager (SCM) to change service accounts and/or service account passwords for SQL Server. Let me attempt to redeem myself.

If you’re running SQL Server 2008 or later and you’re running on Windows Vista or later (Win7 or Win2K8), all resources (Folders, Files, Reg Keys, etc.) are ACL’d using the Service SID. Therefore, regardless of the account the service is running as, it will always have access to the necessary resources. SQLCM, however, does a bit of magic under the covers when you change the password on a service account to avoid a service restart. There was a bug in SQLCM that blocked this behavior but it was fixed in a CU (SQL Server 2008 R2 CU 4 to be exact) and here’s the KB Article on the fix.

If you’re running SQL Server 2005 or running on earlier versions of the OS (pre Vista; WinXP & Win2K3), ACLing is done via groups and the group membership is maintained through SQLCM. Therefore, changing the service account through SCM won’t update the group membership and you’ll run into permission issues.

I hope this clarifies the difference between using SQLCM and SCM. And if I’ve botched it for a second time, I prefer my crow medium-well.

Author: "Dan Jones" Tags: "SQL Server"
Date: Monday, 13 Dec 2010 02:31

Most people I’ve spoken to have at least heard of the Express edition. The next question I ask is where they’re running it: development or production? The answer is usually both. But when I drill into the production use it turns out that most of the time Express is running in production because of a 3rd party application, not because of an in-house developed app. When I question why it hasn’t been used for in-house apps the response is one of three things. The first reason is they have no idea why and realize that maybe they should consider it. The second reason is they have ample capacity on existing servers/instances and to keep things simple they add the database to the existing environment. The last reason I hear has to do with the limits placed on Express. Let’s take a look at each of these in a little more detail:

Just Don’t Think of Using Express

This one baffles me. I can understand if you need features that aren’t available in Express (transparent data encryption, data/backup compression, etc.) but if Express is the right tool why not use it? It’s part of your toolbox – know your tools!

Existing Excess Capacity

This is a compelling reason to ignore Express. You’ve paid for an existing SQL Server license so why not get the highest utilization you can. Enough said.

Express Limits 

This one is interesting in that I find people with outdated information. The limits for SQL Server Express 2008 R2 are:

  • Database Size: 10 GB
  • Processors: 1
  • Memory: 1 GB

Now I agree there aren’t too many tier-1 (mission critical) applications that fall into this category of resource usage. However, the number of non-tier-1 applications grossly outnumbers the count of tier-1 apps, and a lot of non-tier-1 applications do fall within these limits.

Why is any of this interesting or important? SQL Server Express is the exact same codebase as the other editions. It’s fully tested and fully supported. All of the familiar tools (Management Studio, SQLCMD, and PowerShell) work against it. It’s good for development environments and even more importantly it’s completely capable for production environments. If you don’t even consider Express you may find yourself introducing yet another RDBMS into your environment, and in the long run managing multiple RDBMSes will be more expensive. Obviously if you have no other option, for example, you’re using a framework or app that doesn’t yet support SQL Server, you’ve got to do what you’ve got to do. Just don’t make a decision out of ignorance. Express is a solid edition of SQL Server for all environments: development, test, and yes, even production.

Author: "Dan Jones" Tags: "SQL Server, SQL Server Express"
Date: Wednesday, 08 Dec 2010 14:41

How many databases in the world do you think are storing your personal information? Tens? Hundreds? Thousands? I have no clue what the answer is but my guess is it’s closer to thousands than tens. Why is this an interesting question?

In my line of work I speak with lots of DBAs and I’m absolutely shocked how many times I hear a DBA say they never change the password on service accounts or admin accounts. I had one DBA admit they hadn’t changed an admin password in almost ten years! The reason almost always given is “it’s hard”.

To be blunt this is ignorant, lazy, unprofessional and borderline negligent. I won’t apologize for being harsh and I’m sure some readers will come away offended by this; a risk I’m willing to take given the seriousness of the topic. DBAs are highly skilled and well paid professionals – relatively speaking – and they should take the responsibility of data steward as seriously as a heart attack.

I’m sure some of you will fire back that we (Microsoft) should provide better tools for this. I don’t disagree, but this is not an acceptable excuse for poor security practices. A simple search, using your favorite search engine, will yield thousands of results for how to change service accounts and passwords, and there are even lots of sample scripts (VB, PowerShell, etc.). There is no excuse for sticking your head in the sand and repeating “it’s hard” over and over. You’re sitting on a ticking time bomb with no clue when it’ll go off. I hope you keep your resume up to date.

This isn’t a global indictment of all DBAs. There are lots of DBAs who approach their responsibility with seriousness and professionalism. They rotate service accounts and change passwords on a regular interval (45 days, 60 days, 90 days, etc.). I’m certain the first time they did this it was painful but each time it became more automated, easier, and took less time.

There are exceptions to every rule so if you’re following a different practice for securing logins, kudos! But if you have logins that have or are allowed to have stale passwords I urge you, no, I beg you to take immediate action! Take action before it’s too late – before you have a security breach. I’m sure you’re like me and you don’t want your data in unauthorized hands.

Finally, if you’re one of these DBAs with lax security policies you better hope I never end up managing your group; your first task will be to update your resume. Some things deserve zero tolerance.

Author: "Dan Jones" Tags: "SQL Server, Security"
Date: Monday, 06 Dec 2010 16:54

I often hear people say “I’m not going to use Microsoft stuff because I don’t want to become a victim of vendor lock-in.” They often choose “open source” alternatives for pieces of the stack (web server and database, to name a few). This isn’t necessarily a bad thing so long as they’ve done their homework, landed on the right design and chosen the runtime that best meets their needs. However, this seems to rarely be the case. First, they assume that if they use one MS component they have to use them all. And second, once they choose a stack they implement a poor design that locks them into that particular stack.

Let’s take for example the DB layer. Either you don’t know that MS has a free edition of SQL Server (Express) or you don’t care and you’re just going to use MySQL or PostgreSQL because that’s what everyone else seems to be using. That’s fine; you should feel comfortable with your decision, and the best way to achieve that is to set forth some decision criteria, evaluate each option against those criteria, and pick the one that ranks highest. Ok, you’ve got your DB engine picked, you picked your web server (Apache) and you’re going to code in PHP.

You have three choices for how to access the DB from your code: 1) straight in-line SQL which is platform specific, 2) use PDO (more on that below) or 3) write a full blown data abstraction layer.

The first one will get you started the fastest, and as long as you never want to switch to a different backend you should be fine. But be careful: there are things like SQL injection that you’ll have to guard against. The third one, a custom abstraction layer, will be the most expensive; if you’re going this route you probably have some very specific requirements you need to code for. Or you just like writing data abstraction layers for fun.

The second one, PDO, should be the first choice for all PHP developers. From the PDO manual: PDO provides a data-access abstraction layer, which means that, regardless of which database you're using, you use the same functions to issue queries and fetch data. You get a few super nice benefits: PDO manages transactions (though the underlying RDBMS must support them), and using prepared statements gives two more: better performance on queries that need to be executed multiple times with the same or different parameters, and implicit guarding against SQL injection.
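The post doesn’t include a PDO snippet, but the prepared-statement idea it describes can be sketched in Python’s DB-API (sqlite3 is used here purely as a stand-in backend): the placeholder keeps the value out of the SQL text, so a hostile input is bound as data rather than executed.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# A value crafted to break naively string-concatenated SQL
hostile = "x'; DROP TABLE users; --"

# Bound as a parameter, it is stored verbatim and executes nothing
conn.execute("INSERT INTO users (name) VALUES (?)", (hostile,))
rows = conn.execute("SELECT name FROM users WHERE name = ?", (hostile,)).fetchall()
```

PDO’s prepare()/execute() in PHP gives the same guarantee, with the added benefit of statement reuse for repeated queries.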

Microsoft recently released an update to the SQL Server PHP driver that includes support for PDO. You can download it here. Ignore the SQL Server information on the PHP site, it’s outdated and for whatever reason The PHP Group is reluctant or dragging their feet to get it updated. Makes no sense to me why they wouldn’t want to have a pointer to the latest and greatest.

While I’d like to see all PHP developers use SQL Server I’d much rather see them leverage PDO so that when they finally do see the light and want to try out Express or SQL Azure it’s a much simpler exercise. Your application is far more valuable if you can easily satisfy a client’s requirement to use a particular backend.

Author: "Dan Jones" Tags: "SQL Server, PHP, PDO"
Date: Monday, 22 Nov 2010 01:14

I’m testing out some new stuff and I need your help. Do you have a schema that has a bunch of complex dependencies between objects, such that it’s a real pain when you need to alter it – you have to figure out the dependencies, unhook them, make your change, and then wire everything back together? If so, can you send me your schema and some of the painful changes you had to make? Using the Email Blog Author link, send me the CREATE script for the starting schema along with the ALTERs that were a pain. I don’t have anything to offer you for your assistance other than the satisfaction that you helped out the SQL Server team. Thanks in advance!

Author: "Dan Jones" Tags: "SQL Server"