You've got to look at it as a bit of a play on words. It could be invalid logic, or, the way I have the title, In Valid Logic.
As of now, I will no longer be blogging on qgyen.net. It has been a good run.
All my old content will remain live though.
One thing I've been trying to do for a long time is find a new domain name to move my blog to. One that is catchy, easy to remember, short, a .com, and so on. It is way harder than it sounds, really.
Qgyen is kind of old and outdated. It has all my Google juice, but that is mainly because I've had it for ~8-9 years. It is just a word I made up. No one knows how to pronounce it, I don't give it out to people via word of mouth since it sounds dumb when I say it, and no one can spell it. Looking at my Google Analytics, oftentimes people get to my blog by going to Google and searching for "ken robertson". Even Rich Mercer has told me he gets to my site by Googling my name.
If people you know (friends/coworkers) get to your site by Googling your name, you need an easier domain!
But finding a good domain these days isn't easy. First, my name is too long, and even then, the .com is taken (I have the .name, though, but who uses .name?). I had thought of a few others, but when thinking about the names later, they sounded too corny or just didn't seem fitting.
The two main contenders I came up with were linkedlabs.com and explosivethoughts.com. Linked Labs just sounded kind of catchy, but I am not a lab, and I'm not linked to any labs; the words just sound good together. Then I got explosivethoughts.com... it sounded catchy at the time. Kind of like "this idea is so hot, it'll explode". The tagline I came up with and put on the stub site was "lighting the fuse on bright ideas", and if I posted some program I wrote, I could put "an explosive thought" in the credits. This domain was basically the result of staying up too late one weekend watching the movie Accepted, which had a guy who wanted to learn how to blow stuff up with his mind (hence, explosive thoughts). I looked up the domain and got it, but a day or so later, I started thinking that maybe I wouldn't want my blog/site associated with the word "explosive".
Still trying to come up with something.
I've been doing some reading about Google's announcement of their "App Engine" platform for hosting scalable applications. Some of the reactions so far are pretty interesting. Particularly of note, I see a number of places saying it will be the "Amazon killer" or "hosting killer", and even that it is "Geocities 2.0". In my opinion, all of them are wrong.
Google isn't going to kill anything. The hosting market is very vast, with diverse offerings, new consumers every day, and players always coming and going. They'll add a new dynamic for sure, but they'll be perfect for some, and won't be a good fit for others.
Easy scalability is now a real need. Up until recently, sites that needed to scale usually followed this (or a similar) route: come up with an idea, start the site, begin to catch on, get VC money (or money from somewhere), then build (or pay someone for) a scalable infrastructure. Now, a super small team (even a single developer) can build an application that goes from zero to hero in mere weeks, especially with the advent of sites like Facebook.
Google vs. Hosting Companies
Google's target is different than your average hosting company's. Google has specifically called out that they're targeting developers and scalable web applications. At your regular $3.95/mo hosting company like GoDaddy, Dreamhost, or other small companies, this isn't something the customers are after. Many of those customers aren't developers, or they have more generic needs like wanting a small site, blog, forum, etc. Google could make it easier for them to deploy, but I don't foresee Google targeting this. Common hosting is often high support and low profit. Those customers don't really need BigTable, cloud computing, or anything like that.
Then there are your larger-scale hosts who do big deployments, managed support, etc. Places like Rackspace, OrcsWeb, Engine Yard, or BitPusher (the main ones I know of). Often, the people who go to them are larger-scale sites who might benefit from what Google offers, but they could also be turned off by the restrictions or need functionality outside of them. Some of the constraints, like no direct disk access, no background processing, being locked into BigTable/GQL, etc., could be too much. Google understandably places restrictions, since that is how they can offer the cloud they do, but any time you place a restriction, you eliminate some people who can use it. It is a game of give and take. High-end managed hosting will still thrive. Often, their biggest asset is the level of personal support they can offer, especially when they have targeted expertise (OrcsWeb does Windows, Engine Yard primarily does Ruby). I don't see Google being able to match that.
Google vs. Amazon
Some people have said Google App Engine could kill Amazon EC2/SimpleDB/etc., but the reality is they focus on different markets. There is some crossover, and both have their own restrictions, but one fills needs the other can't. Google is only web apps and specifically says they aren't offering virtual machines or grid computing. Amazon offers virtual machines and can do web app hosting, but it has restrictions around guaranteed availability and storage. Google promises simple web apps; Amazon doesn't, but you get a full system with Amazon. Companies that need background processing will definitely still use Amazon, and that likely represents a good portion of its usage. Companies like SmugMug offload a lot of their image processing to EC2, and something like that could not be solved by GAE. With Amazon's recent availability zones and elastic IPs, it is certainly possible to have high-availability web apps. Google is more technically restrictive, while Amazon has a higher technical barrier. Additionally, Amazon is more diverse, with EC2, S3, SimpleDB, and SQS, which can be used together or separately.
Someone in a blog comment claimed uptime with Google would be better, but that is pure speculation. All systems fail, in one way or another. Making uptime claims for GAE when it has been out less than 24 hours is very optimistic. EC2 has its issues, but Amazon makes them clear, and well-designed systems on EC2 should adapt just fine. GAE will almost certainly, at one point or another, have an issue: degraded performance, a momentary outage, etc.
Ongoing impact
Probably the most powerful offering with GAE, and hopefully one that develops further, is the database. Scaling the database is often the biggest issue for large sites. The web tier is almost never the bottleneck; the database is. Scaling a database can be very costly, especially with traditional RDBMSes. In the future, we'll likely see much more of BigTable and products aimed at making distributed, scalable databases easy. Microsoft recently announced SQL Server Data Services, and I expect much more to be coming.
There will likely be other "cloud computing" offerings from others. Some speculate Microsoft will enter the space; others have suggested HP. I think there is still more to come.
I hate databind, I hate postback, MVC please deliver.
Alright, it doesn't exactly have rhythm, but it popped into my head after wrestling with a little databind/postback-related bug this morning.
I know the iPhone SDK is basically old news now, but having recently got my iPhone, I have a few thoughts of my own.
First, I have downloaded it; however, I haven't used it other than messing with the simulator for web development. Initially, I was disappointed with the lack of an interface builder, though they did recently update it with Interface Builder support. Hopefully some of the sample code gets updated to show how to do it that way. I've done interfaces-by-code, and it is not really fun. My experience was with Qt in C++ on Linux back around 2000/2001.
What I really think will make a difference is the iTunes App Store. There have been a lot of complaints about it: about the application process, the $99 fee, and the somewhat selective approvals of applicants. The blog posts I've read about it pointed mostly to the approval processes still being put into place, so the initial 'rejections' were more of a postponement. The $99 sucks, true, but I don't think it is that bad. Compared to the development tools I've bought on my own before, it isn't bad, and its cost can get made up by the benefits of the iTunes App Store.
So what makes the store so awesome? Have you ever tried to find and download applications for a mobile device before?
I hadn't downloaded much on my Blackberry, but back when I had various Windows Mobile/Smartphone devices, it was painful. Really, mobile applications are in a sad state for distribution. Sure, you can easily install them with ActiveSync, but I'd find a lot of the programs by digging around various forum sites, sometimes with the latest version buried on page 24 of a thread spanning a year or so. Others were on websites that looked like crap and didn't inspire trust in putting the app on your phone. And there was a lot of old material, such as application list sites put together by someone a year ago, littered with AdSense trying to make some dough. On many developers' sites, you would find outdated, abandoned programs, not updated for the latest version, or designed for one device vs. another, since devices vary so much (i.e., XV6700 vs. Moto Q).
When I purchased my iPhone recently, I had another glimpse of this when looking around for information on iPhone apps. I was mostly looking for iPhone web applications, and there were a bunch of iPhone application list sites, some with pretty old apps, links that no longer worked, etc. I was also researching exactly what jailbreaking did, but the information was spread across forums/blogs linking to each other.
It sucks. It is not customer friendly. The App Store changes all that. All applications are located in one central place, easy to find, plainly simple to deploy. There is an entrance fee (the $99), so you know the developers are likely more serious. The 30% cut on sales does suck, yes, but you can still have free applications, so there is still a market for hobbyists.
Companies that actually want to make money off their applications get a great tool as well. It provides several crucial things for them. It helps their potential customers find their programs by acting as a directory. It serves as a deployment and notification system by handling installation and upgrades (i.e., push the new version to iTunes and everyone gets it). And finally, it solves e-commerce for them. Yes, 30% sounds like a lot, and it is a little on the big side, but the App Store eliminates the need for their own store/shopping cart system, credit card processing, merchant account, or the need to go through someone else for that. Additionally, customers should be much more trusting, as I'd expect Apple to handle billing them.
Overall, I hope there is more progress like this in the mobile arena.
Adorable photo of Nick we got back today. Yup, my son is a looker. Gotta run, he is on my lap, desperately reaching for the keyboard.
A while ago, I had signed up to receive just an electronic copy of my cell phone bill. So earlier this evening, I was trying to log onto my Verizon Wireless account to look at last month's bill and was having some trouble. I'd log in but would get a "System Failure" message right away and would basically be stuck.
So I called them, and the customer support rep walked me through logging in, but they just weren't understanding what I was saying. It went kind of like this:
- Me: When I log in, it takes me to this 'System Failure' message
- Them: Well, when you first log in, do you see the sidebar which shows your phone?
- Me: No, I'm at a 'System Failure' message, there is no sidebar
- Repeat above
- Them: Well, log out and close your Internet Explorer (note: I didn't want to confuse them even more by being on a Mac)
- Repeat from beginning
- Them: Well, it shows our site is up and working, so it must be a problem with your server (note: I really hate when non-technical people say that kind of thing)
So eventually, they got the bright idea of trying to log in as me, so they asked me for my username, which I gave, and then asked me for my password. I was like, "Excuse me? I am not giving that to you."
Had they never heard of phishing? I was shocked that they even asked. How many times have I read security notice emails about "we will never ask you for your password"? Shouldn't that kind of thing be at the top of any customer service rep's DON'T list these days?
A few weeks ago, I made a post evaluating options for hosted Subversion services. At the time, I was basically drawn to an empty conclusion. Now, I'm relabeling it "hosted source control" rather than "hosted Subversion", since my end goal is to have a nice solution for both Subversion and git. The more I look at git, the more interesting it looks, but mainly for non-Windows development, since using it on Windows isn't quite as refined.
I tried using git with hosted-projects.com, since they offer WebDAV space according to their site, though I couldn't get it working. I ran litmus, a WebDAV testing program, against it, and it failed pretty badly. So that was out.
So I was considering the likelihood that I might just need to set up my own virtual server for it on one of my servers. Then just a few days ago, as I was looking at using a git plugin for Trac, I found a program called Warehouse which was almost exactly what I was looking for. It is an application you install rather than a service, and it allows you to manage repositories and browse the source. It doesn't have the bloat of wiki/forums/tickets/etc., but it has a plugin architecture, so it could eventually support those if plugins were available. And better yet, on its forums, there was talk of git support in the next version.
Around this time though, I got a beta invite to GitHub. I had signed up there around the time of my previous post, but at the time, I thought it mainly did public/open source hosting; it does allow private repositories, though. You can't create/manage your own users, but rather just grant access to other users on the site. It is appealing, but again, there is no clear Subversion AND git solution as of yet. They also announced their post-beta pricing recently, which looks pretty decent. One main advantage of git is its portability. You can move it, sync it in multiple places, or do just about anything you want.
Shortly after my first post, Josh Frappier, the co-founder of Unfuddle, contacted me and I talked with him some. First, as I've mentioned before, I find it very appealing when those in charge of the companies whose services I use contact me. He was a very nice guy, and talking with him made me like Unfuddle even more. He was curious why I hadn't considered Unfuddle more, and it was mainly because I was after personal repositories and Unfuddle has a 1-1 relationship between repositories and projects. I'm not sure if all the stuff he was talking about is public knowledge, but in short, he said they're working on some things that will make Unfuddle much more suitable.
Where do things lie now? As of yet, still undecided. Short term, it looks like using GitHub for git and Warehouse on my own is where I'm at. Long term, it looks like I'll have two options: run it myself and handle both with Warehouse, or have it hosted and managed by someone else with Unfuddle. Only time will tell. Part of it is that git only went "mainstream" recently. As it catches on more, I'm sure more options will crop up.
In the meantime, I'm going to continue evaluating options. I may just try using git 100% for personal stuff, even on Windows, simply to see whether the tools under Windows really do interfere too much. You can graphically view a repository on Windows with QGit, and there is also a cygwin-less port of git called msysgit in the works. So it might not be as bad as one would think.
I had previously mentioned that I had moved my site over to Mono; however, last night I actually moved it back to IIS. The main reason is that I was experiencing some CPU spikes where Mono would take 100% of the CPU. I tried tracking it down over the weekend but wasn't able to find the cause, and meanwhile, my site's availability was going up and down. Since I wasn't able to narrow down the cause, and wasn't able to dedicate a whole lot of time to it just yet, I figured I might as well switch back to IIS for the time being so I can get it resolved without having my site down, or having to go onto the server and kill/recycle the Mono process.
Hopefully in the next couple of days, I'll get it nailed down. I had a list of potential causes and have crossed off a few of them. This morning, I think I might have figured it out, but I haven't had a chance to test it yet.
So there you have it. My blog is running on Gentoo Linux 2007.0, using nginx + FastCGI with a build of Mono from Subversion (from today), using VistaDB, and also running a Ruby-based process monitor called god, which is set up to make sure the service stays available and watches memory and CPU consumption.
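For anyone curious, the god half of that setup is only a few lines. It follows god's standard watch DSL and looks roughly like this; the watch name, paths, port, and thresholds here are placeholders rather than my actual config:

```ruby
# Watch the Mono FastCGI process: restart it if it dies, or if it starts
# eating memory/CPU the way it did during the spikes.
God.watch do |w|
  w.name     = "mono-fastcgi"                       # placeholder name
  w.interval = 30.seconds
  w.start    = "fastcgi-mono-server2 /applications=/:/srv/www/blog " \
               "/socket=tcp:127.0.0.1:9000"         # assumed path and port
  w.pid_file = "/var/run/mono-fastcgi.pid"          # assumed pid file

  w.start_if do |start|
    start.condition(:process_running) do |c|
      c.running = false
    end
  end

  w.restart_if do |restart|
    restart.condition(:memory_usage) do |c|
      c.above = 150.megabytes
      c.times = [3, 5]        # 3 of the last 5 checks over the limit
    end
    restart.condition(:cpu_usage) do |c|
      c.above = 50.percent
      c.times = 5
    end
  end
end
```

The cpu_usage condition is exactly what saves me from the 100% CPU spikes: instead of the site hanging until I notice, god bounces the process after a few bad samples.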
What does it take to get Graffiti up on Mono? Really, not a whole lot. First, you need Mono. Graffiti currently requires Mono 1.9, and it looks like the Mono 1.9 Preview 4 was just released today. Then download and extract Graffiti, go into its main directory, and to test it locally, just fire up xsp2. xsp2 is basically a mini web server, kind of like the lightweight built-in web server you can fire sites up with in Visual Studio. Nothing else is really needed. By default, Graffiti will use the small sample VistaDB database.
You would more than likely need Preview 4 to use all of Graffiti's functionality. There were some bugs in Mono that would have made Preview 3 a requirement at minimum, but if you have a Graffiti license, there was also a bug with some of the methods our licensing uses; it was fixed either the day of the Preview 3 release or the day before, so the fix might not have made it in.
Probably the most common configuration for a live site would be Apache and mod_mono; however, I like to be a little different. I run some little Rails apps from this same virtual server, and in the Rails community, Apache is often viewed as a little bloated. A popular combo is nginx, and since nginx supports FastCGI, I figured I might as well try to get the two to work together. Lighttpd is another popular web server, but I already had nginx in place, so I figured I would just keep that.
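The nginx half is mostly a stock FastCGI block. A sketch of the shape of it, with the server name, root, and port as placeholders, and assuming fastcgi-mono-server2 has been started separately against the same socket:

```nginx
server {
    listen      80;
    server_name blog.example.com;           # placeholder
    root        /srv/www/graffiti;          # assumed install path
    index       index.aspx;

    location / {
        # Pairs with something like:
        #   fastcgi-mono-server2 /applications=/:/srv/www/graffiti \
        #                        /socket=tcp:127.0.0.1:9000
        fastcgi_pass   127.0.0.1:9000;
        include        fastcgi_params;
        fastcgi_param  PATH_INFO "";
        fastcgi_param  SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
```

The SCRIPT_FILENAME/PATH_INFO params are the part that tripped me up the longest; without them, the Mono FastCGI server can't map requests back to the .aspx files on disk.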
I'll probably post some more details soon, including my configuration scripts for nginx and for god. I also think I might try to do some basic benchmarks: set up a couple of virtual machines on a spare system and test Graffiti's basic performance under IIS6, IIS7, nginx, lighttpd, and Apache + mod_mono. It could prove interesting, so maybe I'll make it a weekend project.
Lately, I have been doing some late-night hacking to experiment with some Ruby/Rails apps. For a long time, I'd been neglecting Rails; I had intentions of learning it but always put it off. I'd been reading some books on it, but I tend to learn new languages by doing rather than reading.
Recently, I started using Skitch to take screenshots on OS X, but was disappointed with their Skitch.com service for posting photos online. It was okay, but it didn't have configurable thumbnail sizes for embedding and always included a tagline below the image.
It offered posting photos online via WebDAV, so I thought: what if I wrote a quick Rails app to handle the WebDAV PUT request? Then I could have my own page to go along with each image, with customizable thumbnail settings, various pre-formed content, and no taglines. Normally, Skitch posts the image and then copies the image location to the clipboard (it knows the location, though unfortunately it won't let you change what gets returned), so I can paste that URL into a browser and add ".view" to go to a custom page with options for the image:
Right now, it has a regular thumbnail/link, one for a thumbnail/larger image in a lightbox pop-up, and a thumbnail/link in BBCode. I have global thumbnail defaults and individual image overrides. If the thumbnail would be bigger than the original, it doesn't scale it up and instead just does an IMG tag for the image itself. You can also upload images over the web interface, as opposed to always having to use Skitch. And it has authentication.
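The interesting bits are small. A stripped-down sketch of the PUT handling and the no-upscaling thumbnail rule in plain Ruby (the names and paths are made up for illustration; the real version lives inside a Rails controller):

```ruby
require "fileutils"
require "tmpdir"

# Where uploads land; a temp dir here just to keep the sketch self-contained.
UPLOAD_DIR = File.join(Dir.tmpdir, "skitch_uploads")

# Skitch pushes the screenshot up as a WebDAV PUT. All we really need to do
# is take the request body and store it under the request path's basename.
def handle_webdav_put(request_path, body)
  FileUtils.mkdir_p(UPLOAD_DIR)
  dest = File.join(UPLOAD_DIR, File.basename(request_path))
  File.binwrite(dest, body)
  dest
end

# Thumbnail rule from the post: scale down to fit within max_dim,
# but never scale up past the original image's size.
def thumbnail_size(width, height, max_dim)
  return [width, height] if [width, height].max <= max_dim
  scale = max_dim.to_f / [width, height].max
  [(width * scale).round, (height * scale).round]
end
```

So a 100x50 image with a 200px thumbnail setting stays 100x50, while a 400x200 image comes back as 200x100.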
Nothing really super fancy, and it would likely take a Rails expert a few hours to whip up, but it was a good amount of experimenting for me over the course of a few nights, slowly adding this and that, getting it working on OS X, then Windows, then OS X again, and then on Linux. There are some things to be cautious of when changing environments, especially with the uploaded files (in dev, it was an UploadedTempFile, on Linux it was an UploadedStringIO; on Windows I could remove it when done, on OS X/Linux I couldn't).
I'm not about to pull a Mike Gunderloy and jump ship from .NET to Ruby, though I definitely find the "convention over configuration" approach of Rails refreshing. I think ASP.NET will get a lot better with ASP.NET MVC, and I'm impressed with what I've seen of it so far. But I think .NET still has a couple of lessons it could learn from Rails, such as ease of deployment, easy configuration and definition of environments, and programmatic database migrations.
A big part of it has to do with Windows developers' aversion to the command line. When I used to run Linux full time, I'd always have 2-3 console sessions open, especially when coding. When developing on Windows? Most likely none, unless I need to do a ping or something; any console window is probably just a quick test console application running in debug mode. The command line has so much more power than the GUI, as the command line can be nicely organized while the GUI gets cluttered with add-ons, widgets, doodads, and task bar icons.
Perhaps I should do a short series on some of my thoughts on how .NET could adopt some of the things that make Rails so easy.
As I've mentioned before, I've been a huge fan of Windows Media Center Edition. MCE, in my opinion, is just about the holy grail in terms of DVR experiences. The UI is very rich and responsive, the platform is very extensible, and it is quite feature-full.
However, recently I made the decision to move away from cable towards DirecTV. There are a couple of motivators:
First, the coming analog switch-off next year means my current tuners would be obsolete, and I'd need to buy newer Digital Cable Tuners for the box, including paying fees for the CableCARDs to power them. But did I really want to stay with Comcast? I mean, with my own system, I couldn't leverage their on-demand offerings without paying for a normal receiver to accompany my MCE box. So I was wondering: did I want to put more money into it?
Secondly, in keeping with my New Year's resolution to simplify my life, going to DirecTV made sense. With MCE, it was my box, and my problem when it didn't work right. I was having issues with OTA HDTV and had been dragging my feet on buying a new, better antenna. Dragging them for like 9 months. I only ever thought of getting a new one at night when the thing was acting up, and that was never the time I wanted to research antennas and order one. By moving to DirecTV, if it doesn't work, it's their problem. Call them, they come out, fix the dish, replace the tuner, etc. They own the hardware. One less thing that would rest on me or that I'd be a bottleneck on.
Why DirecTV vs. Comcast Digital Cable? Well, several reasons. The internet is littered with poor opinions of the Comcast DVRs, and I wanted a good DVR. I was dead set on a dual-tuner HD DVR and didn't want one everyone thought sucked. DirecTV is more invested in their own DVRs as well. My dad has the same DVR package I got and likes his a lot, and I've liked it and the service when at their house. They put out regular updates, look for customer feedback, and even let customers use beta releases through their online support forums. I never heard of that with Comcast. And the whole recent wave of commercials claiming Comcast has more HD channels than DirecTV? I don't think so. I counted them. Comcast: 28, DirecTV: 45. Sure, the important ones are likely on both. But really, cable advertising has always bothered me. Speaking of advertising, Comcast also seems to like littering their interface with ads. Bleh.
So what about DirecTV vs. MCE? There are a few pluses and a few minuses. The interface is much more fluid and responsive in MCE; DirecTV has a delay between pressing a button and getting a response. But its picture quality is great. And there are some things I really like, such as: if you watch 20 minutes of a show and then decide to record it, it will save the 20 minutes you've already watched. MCE didn't do that.
One thing I did find an issue with is DirecTV's support for pre/post recording padding. Normally, I like to record 2-3 minutes before and after a program, since some tend to run over a bit (FOX/American Idol). MCE supported that, and it knew the core minutes (i.e., 8:00-8:59) were required while the padding was optional. DirecTV's DVR doesn't see it that way; it requires all of it. To put it into perspective, take this scenario. You have two tuners. You record one show from 8:00-10:00 on FOX, and with padding, it is 7:55-10:05. Then you want to record something on NBC from 8:00-9:00, so it marks it as 7:55-9:05. Then you want to record another show on CBS from 9:00-10:00, but the DVR won't let you, because a tuner is still busy from 9:00-9:05. MCE would recognize that the padding is optional and that it needs the tuner for something else, so it would have recorded NBC from 7:55-9:00 and CBS from 9:00-10:05, since that made sense. DirecTV doesn't. Because of this, their pre/post recording padding is essentially useless. Hopefully I can send some feedback and this makes it into a future update.
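The MCE behavior amounts to treating padding as a soft constraint: only the core window is required, and padding gets trimmed when tuners would collide. A rough sketch of that scheduling rule (times in minutes, all names invented, and certainly not how MCE actually implements it):

```ruby
# A recording's core window is required; the pad on each side is optional.
# start_t/end_t hold the actual scheduled window once placed.
Recording = Struct.new(:name, :core_start, :core_end, :pad, :start_t, :end_t)

# Place each recording on the first tuner whose *required* core windows
# don't conflict, then trim optional padding where windows would collide.
def schedule(recordings, tuner_count)
  tuners = Array.new(tuner_count) { [] }
  recordings.sort_by(&:core_start).each do |r|
    r.start_t = r.core_start - r.pad
    r.end_t   = r.core_end   + r.pad
    tuner = tuners.find do |t|
      t.none? { |o| o.core_start < r.core_end && r.core_start < o.core_end }
    end
    next unless tuner                 # hard conflict: core windows clash
    tuner.each do |o|                 # everything already placed airs earlier
      o.end_t   = [o.end_t, r.core_start].min  # give up o's trailing padding
      r.start_t = [r.start_t, o.end_t].max     # and r's leading padding
    end
    tuner << r
  end
  tuners.flatten(1).to_h { |o| [o.name, [o.start_t, o.end_t]] }
end
```

Running the exact scenario from the post (minutes past 7:00pm, two tuners: FOX core 60-180, NBC core 60-120, CBS core 120-180, all with 5 minutes of padding), FOX keeps its full 7:55-10:05 window, NBC gets trimmed to 7:55-9:00, and CBS records 9:00-10:05, which is what MCE used to do.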
Additionally, with an update last week, I now have the DirecTV On Demand beta accessible on my box, and it is pretty nice. It doesn't yet have a full wealth of shows available, though they have a good list of channels signed on, and I like the interface and the way it is set up. You can browse a global directory or narrow it down to a specific channel, such as viewing the On Demand page for Discovery or TLC.
Overall, I am impressed and glad I made the switch.
I decided to give Quicken Online a try. I've been missing out since dropping Money and really do need to know where my account is heading. Mainly, I need to be aware of bills I've queued and checks I've written that haven't posted yet, plus a bit of a forward projection so I know ahead of time if I might need to transfer some money from another account before a big grocery shopping trip or a weekend away.
The problem? Well, I am no longer on Windows, so I can't use Microsoft Money unless I use it from within a virtual machine. I was thinking of trying Quicken for the Mac, but figured I'd give Quicken Online a try. They had a free trial, so why not?
Well, my trial lasted about 30 seconds.
The first strike is that they don't list Wells Fargo as one of the banks they support. Come on. Wells Fargo isn't some small bank no one has heard of. They are one of the big names.
But the main thing that turned me off is the lack of direction. Okay, I typed in Wells Fargo, it said it couldn't find them, so I clicked the link about not being able to find my bank, and all it said was to send them feedback.
But what do I do right now? There is no cancel button, since this screen came up before I even got into the application, and there is no skip, or enter later, or anything. They have created a barrier for me before I even enter the application.
Don't needlessly create barriers for users. Offer them direction. Feedback is good, but it doesn't do anything for me right now. Their feedback page doesn't even promise a response, so in essence, my trial is over and I won't be using their service.
I recently decided to look into using a hosted Subversion service. I used to always use Subversion locally, using a method I've mentioned before; however, recently I was going to be doing some coding while out of town and wanted to be able to make regular check-ins, which I couldn't since all my repositories were local. So I decided to look into actually using a live Subversion server.
Initially, I thought I'd simply set up a Subversion server on my own servers and go with that, but I wanted an interface for managing the repositories, so I could easily create new ones or access/browse them through the web, and I didn't want to have to set all that up on my own. Additionally, there are a number of inexpensive Subversion hosting services, so sometimes it just isn't worth the effort.
I couldn't use something like Google Code, because most of these are personal projects and not open source. My three main contenders are hosted-projects.com, Beanstalk, and Code Spaces. Briefly, here are the pros/cons so far.
hosted-projects.com:
- Unlimited repositories (I prefer individual repositories over a generic "projects" repository)
- Includes WebDAV space (see end)
- Includes Trac install with each repository
- Butt ugly. Seriously, their website is unappealing and the control panel is even worse. It makes me question how much they've invested in their business and how they'd handle failure/failover/recovery.
- While Trac is nice, it is a bit overkill for personal projects. I just want to view source/revisions, don't need tickets and a wiki. Can use Basecamp/Unfuddle if I needed those features.
Beanstalk:
- Rich interface, very nicely done
- Just does source/revision viewing, but offers integration with Basecamp for more functionality (not Unfuddle, unfortunately)
- Costs more
- Limits based on number of repositories. I could group projects, but don't really want to.
Code Spaces:
- Good interface; offers work items, forums, and a wiki, similar to Trac, but busier for basic change-log viewing. The code browser is a bit too busy too, too AJAXy.
- Unlimited repositories with their "Small Team" package, which is only a little bit more than Beanstalk
- Though you can get unlimited repositories, for the package level I am looking at, it is the most expensive of the three.
- While the interface isn't horrendous like hosted-projects', I don't like the overuse of AJAX and modals.
Overall, I am pretty torn. Plain, simple, and unlimited (while ugly) with hosted-projects, or a strong interface but limited repositories with Beanstalk. That is pretty much where I am at. If Beanstalk had unlimited repositories, it'd be a no-brainer; if they were to double the repositories, I'd definitely go there. For me, disk space is unimportant. I really don't need that much, so with Beanstalk the limiter is repositories rather than space; I'd be just as happy with half the disk space and twice the repositories.
I did mention something about WebDAV, though that is a very minor point. One system I am particularly looking to move towards is git. I am thinking of just skipping a hosted Subversion solution and going straight to git. Git is basically a distributed version control system, where you have the full history local to you and can commit and work entirely while disconnected. Then you merely have multiple branches of the same code that can get merged together. In my recent case of doing some coding while out of town, I was at my parents', and they have (until they move in 2 months) crappy satellite internet, since they are out of reach of regular utilities. Their connection is barely okay for web browsing, horrid for RDC, and I wouldn't trust it much with version control since it isn't always consistent or sustainable. With git, I could sync to my laptop before I leave, make check-ins while gone, totally offline, then merge back when I get home. With a Tortoise-like clone being developed for git, it will likely be a lot easier to use on Windows soon, and it looks like there is a git bundle for TextMate, so I'd be set in my OS X + TextMate bliss.
If I went with Git, I wouldn't really need a central repository site like hosted-projects, since you don't create a central repository that you then load everything into. Git doesn't have to be online at all. What would be nice, though, is a simple web application to browse the history (even though I could do all that locally).
Recently, I ran across an issue with a particular way I was using Membership.GeneratePassword. I had written some code to handle some automated deployment and was using Membership.GeneratePassword to generate a password for a SQL user and then add it to the connectionStrings section of the web.config.
GeneratePassword takes two parameters: one for the length, and one for the number of non-alphanumeric characters (characters other than 0-9, a-z, and A-Z). I was calling it as GeneratePassword(8, 0), so I wanted an 8-character password with only alphanumeric characters.
However, I eventually ran into an issue with it. It generated a password with ';' in it, which broke the format of the connection string. Even though I was specifying 0 for numberOfNonAlphanumericCharacters, it was still inserting a couple of non-alphanumeric characters. After testing, I found it was averaging 2-3 non-alphanumeric characters per password; I just hadn't noticed until one of them actually broke something.
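For context, the generated password ends up inside a web.config entry roughly like the following (the names and password here are made up). Since ';' is the key/value separator in a connection string, a ';' inside the password terminates the Password token early:

```xml
<connectionStrings>
  <!-- A ';' in the generated password (here "k3;Xw9q2Z") ends the
       Password value prematurely and breaks parsing. -->
  <add name="AppDb"
       connectionString="Server=localhost;Database=AppDb;User Id=app_user;Password=k3;Xw9q2Z;" />
</connectionStrings>
```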
I dug into the documentation for Membership.GeneratePassword and found that the description for numberOfNonAlphanumericCharacters reads:
The minimum number of punctuation characters in the generated password.
This contrasted with what IntelliSense showed me when I initially wrote the code.
So while I specified I didn't want any non-alphanumeric characters, since the parameter is actually just a minimum, there is still a chance for them to slip in.
In examining the code through Reflector, I found that it generates the password in a kind of two-pass method. It first runs through and generates a password of the given length, allowing a random chance of non-alphanumeric characters to be included. It then makes a small second pass to ensure the given minimum has been met: if it needs additional non-alphanumeric characters, it randomly replaces characters until the minimum is reached.
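The behavior can be illustrated with a simplified sketch in Python. This is my reading of the decompiled logic, not the actual .NET source, and the exact punctuation set is an assumption:

```python
import random
import string

# Punctuation set assumed to match what GeneratePassword draws from.
PUNCTUATION = "!@#$%^&*()_-+=[{]};:>|./?"

def generate_password(length, min_non_alphanumeric):
    """Simplified sketch of the two-pass behavior (not the real .NET code)."""
    alphabet = string.ascii_letters + string.digits + PUNCTUATION
    # Pass 1: every position draws from the FULL alphabet, so punctuation
    # can appear no matter what minimum was requested.
    chars = [random.choice(alphabet) for _ in range(length)]
    # Pass 2: if the requested minimum isn't met, randomly replace
    # characters with punctuation until it is.
    non_alnum = sum(1 for c in chars if c in PUNCTUATION)
    while non_alnum < min_non_alphanumeric:
        i = random.randrange(length)
        if chars[i] not in PUNCTUATION:
            chars[i] = random.choice(PUNCTUATION)
            non_alnum += 1
    return "".join(chars)

# Even when 0 is requested, pass 1 lets punctuation slip into most passwords.
sample = [generate_password(8, 0) for _ in range(1000)]
with_punct = sum(1 for p in sample if any(c in PUNCTUATION for c in p))
```

Because pass 1 samples from the full alphabet, passing 0 only skips pass 2; it never excludes punctuation, which matches the 2-3 stray characters per password I was seeing.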
This makes sense, but it poses the problem that the method is not useful for generating simple alphanumeric passwords. In a case like mine, where I intend to use it in a connection string, it has the potential to break the syntax.
In the end, it would be nice if one of two things happened:
- Update the IntelliSense documentation to reflect that the parameter is a minimum, not the exact number of characters.
- Update the method to treat the parameter as a minimum, but to emit only alphanumeric characters when 0 is passed. As it is now, GeneratePassword(8, 0) and GeneratePassword(8, 1) are nearly identical, as the probability of getting a truly alphanumeric password is low. With this suggested change, if you don't want non-alphanumeric characters, use 0; if you do, use 1 or more. If you don't care, perhaps they could add a GeneratePassword(length) overload that maps to a minimum of 1 (to encourage stronger passwords).
For now, I just switched to my own base-62 random string generator and will make a mental note about the behavior.
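A generator like that is simple enough to sketch. Here's the idea in Python (the function name is my own; `secrets` stands in for whatever random source you trust):

```python
import secrets
import string

# Alphabet of 62 characters: digits plus upper- and lowercase letters.
BASE62 = string.digits + string.ascii_letters

def random_base62(length):
    """Random string drawn only from [0-9A-Za-z], so it can never
    contain ';' or quotes and is safe inside a connection string."""
    return "".join(secrets.choice(BASE62) for _ in range(length))

password = random_base62(8)
```

Since the alphabet is fixed up front, there is no second pass and no chance of a stray punctuation character.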
Recently, I have encountered some poor customer service interactions that I'm sure Seth Godin would have a field day with. Too often, companies now rely on overly simplified, generic templated responses to customer service questions without paying attention to what the customer was actually after. I've had two run-ins with this as of late.
First, just a few days ago, with eBay. I've been clearing my closet of spare hardware and have one item I've now tried to sell twice, and both times, within a few hours of the auction ending, I've gotten an email saying there was bidding without the owner's permission and that my auction had been removed and fees refunded. The first time, I figured shucks and reposted it. Then it happened a second time, so I've now wasted two weeks trying to sell it. I contacted eBay and first had to fight just to find the place to actually send them a question, since they love their FAQs. Then, when I asked what tools they make available to sellers to help guard against this, they sent me a generic response a few hours later about how to report fraudulent activity on my account. They didn't even read my short, one-paragraph question. There wasn't any fraud on my account; I was asking how to protect against it from others.
I also had a run-in recently with Yahoo. I run my own server, and when I've tried to email some people with Yahoo accounts, I've been getting bounced emails. The bounce message directs me to a page with information on the error, and that page says that if you run your own server, you can fill out a form to register it. Well, the link to that form was broken and led to a "page not found" error. So I emailed them to let them know: I said I was getting this bounce, went to the page, tried to register my server, and the form isn't there. Lo and behold, I get a generic response saying that if I am getting that error, I should go to this page (the one I was already on). Zero help. I replied and explained I'd been there and that one of the links on the page is broken. No response.
With eBay, I was expecting them to respond saying there was nothing they could do, and I would have been totally fine with that; I figured it was worth asking. But instead, they responded with something that offered no help at all.
As a consumer, I don't want to be treated like cattle. If I am having to contact you, it is because something is broken already. Don't let poor customer service exacerbate the problem even more. I am not asking for a personal handler to walk me through it, but I would like someone to actually read my message and send me a pertinent response.