Upgrading WiFi in a Lenovo T61 ThinkPad to a Windows 8.1 Compatible card via USB Drive

The stock WiFi card (an Intel 4965 AGN) in my ThinkPad crashes every 30 minutes under Windows 8.1 (the crash blames netwlv64.sys every time).

It also seems not to like the 5 GHz band (from an Asus RT-N66U), though that was true under Win 7 as well. So I upgraded to an Intel 7260, which is performing flawlessly.

BEFORE Changing the WiFi card

  1. Download the Middleton BIOS iso.
  2. Download rufus from pendrivelinux.com
  3. Follow the instructions at pendrivelinux to format the usb drive as a bootable DOS disk.
  4. Copy the files from the Middleton BIOS iso to the usb drive. (either mount it or use 7-zip to extract the iso contents).
  5. Make sure your battery is near fully charged (battery icon is green not yellow on thinkpad indicator).
  6. Disable the TPM chip in the bios.
  7. Suspend BitLocker if it's on.
  8. Boot from the USB drive.
  9. Run the flash utility (lcreflsh.bat).
  10. Wait for several minutes.

Changing the WiFi card

  1. Follow the instructions here for removing the wifi card.
  2. Insert your replacement wifi card.
After rebooting, the TPM can be re-enabled and BitLocker resumed.
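For the BitLocker steps (step 7 before flashing and the resume just mentioned), the commands look roughly like this from an elevated command prompt, assuming C: is the system drive:

manage-bde -protectors -disable C:
manage-bde -protectors -enable C: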

Upgrading to an Intel Core i7-4770k CPU: UEFI, USB 3 and Windows 8.1

Finally got around to upgrading from an Intel Core i7-920 to a Core i7-4770K. These are roughly 3 processor generations apart (Nehalem microarchitecture to Haswell microarchitecture respectively). As has been Intel's custom for as long as I've been building PCs, the switch necessitated a motherboard upgrade: the 920 uses socket LGA1366 while the 4770K requires LGA1150.

I decided to stick with an Asus (pronounced uh-soos) motherboard and picked up their Z87-Plus. The board being replaced, the P6T, was also an Asus board. Come to think of it, so was its predecessor (the A8N-sli). Before that it was an Intel board. Even though I don't do much overclocking I love the attention to detail that Asus puts into its products. And its website is well organized - always easy to find drivers. A few years back a quick comparison of their website to the website of their competitors (Gigabyte, MSI, etc...) sent me flying into their arms. They're not the least expensive boards but I've had good experience with them.

I've fallen in love with the front-panel header connector they ship with their boards. You plug the front-panel connectors (power LED, reset switch, PC speaker, etc...) into the connector (pictured below) then plug the connector into the board. It's a lot easier to swap out motherboards because you don't have to reconnect the sometimes lilliputian shunts onto single pins.

Front Panel Connector
The processor itself is tiny. Back in the Pentium and Pentium 2 days the processor was huge. Size-wise I recall installing one that was somewhere between the size of an audio cassette and a VHS tape (closer to the latter than the former). Processors these days are not much bigger than a postage stamp, though their fans seem to have gotten larger.

CPU and stock fan
One of the reasons I went with the Z87-Plus is that it comes with a UEFI firmware config (aka BIOS). UEFI is the next generation of computer firmware, the successor to BIOS with an emphasis on speed and security. UEFI includes a sophisticated menu system so there's no longer a need for add-on board ROMs to daisy chain prompts and increase the length of time it takes to boot. Beyond that, BIOS writers have richer libraries and greater access to machine resources. The UEFI BIOS on this thing blows me away - it's a mouse-driven modern graphical user interface instead of the standard text based interface that has been a staple of BIOS for over 2 decades.
UEFI BIOS for Asus Z87-Plus
And it displays each of the settings that have been changed in a confirmation dialog before saving them!
UEFI BIOS Confirmation Dialog (apologies for the fuzzy picture)
Performance-wise this thing is a beast. Even though it only has 8 Gigs of RAM Windows 8.1 consistently boots in under 10 seconds. The motherboard itself completes POST so quickly that I've had to turn on a 5 second pause (another nifty BIOS option) so that I have a chance to enter the BIOS if necessary before POST completes. There are all sorts of optimizations this BIOS offers; you can turn off any (or all) of the USB or SATA ports. You can disable initialization of pretty much every connected device. There's something called "hardware turbo" mode that is so fast that I had to turn it off (again because it was nearly impossible to enter the BIOS when POST completes in under a second).

As a pc hobbyist since the mid 90s it amazes me how much easier it has become to build your own PC. I haven't cut myself on an add-on card or connector in going on 10 years. :)



MSBuild: To Exec or Task?

So I needed to execute a custom task, mainly to transform some XML, as a part of the build. The build has largely migrated to msbuild. I've worked with make, ANT and other build systems before and they more or less all provide the same functionality: A way to specify a dependency relationship that ultimately ends in telling the compiler/linker what and when to compile/link.

MSBuild seems to take the object oriented metaphor a little deeper as it relates to XML. That is, both the verbs of the system (Targets and Tasks in msbuild-speak) and the nouns of the system (Properties and Items) are represented as elements. The best way to get a handle on these concepts is to peruse "MSBuild Concepts"

One of the first gotcha moments (there are always gotcha moments with build systems) I've encountered has been related to the Exec task. It's a built-in task that ostensibly executes any command passed to it. Except that it seems not to like managed executables. Or parameters with whitespace (yep, 2013 and whitespace is still dangerous). It'll happily open notepad.exe (even without a fully qualified path) but chokes on a "Hello World" C# console app. But not on a C++ "Hello World" app. And the exit code is usually -21474.... which to any programmer looks a lot like something in the neighborhood of -(2^31) and is usually indicative of a bug lurking somewhere.

Fortunately it's pretty straightforward to roll your own Task. I love it when they provide both an interface and a handy default implementation so that you only have to override the behavior in which you're interested.
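To make that concrete, here's a minimal sketch of a custom task (the class and property names are made up for illustration); Microsoft.Build.Utilities.Task supplies the default ITask implementation, so the only behavior you override is Execute():

    using Microsoft.Build.Framework;
    using Microsoft.Build.Utilities;

    // Task (the base class) provides the ITask plumbing; we only override Execute().
    public class TransformXmlTask : Task
    {
        [Required]
        public string InputFile { get; set; }

        [Required]
        public string OutputFile { get; set; }

        public override bool Execute()
        {
            Log.LogMessage(MessageImportance.Normal,
                "Transforming '{0}' into '{1}'", InputFile, OutputFile);
            // ... perform the XML transform here ...
            return true; // returning false fails the build
        }
    }

Wire it into the project with a UsingTask element pointing at the compiled assembly and invoke it from a Target just like any built-in task.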

If you're doing anything other than calling cmd.exe intrinsics (e.g., echo, dir, etc...) then I highly recommend doing it inside a custom task.

Removing Printer Drivers in Windows

This can be done with the command:

printui /s /t2

Make sure to remove the printer from devices and printers first.

Make sure to remove the right drivers (e.g., a printer that was using a compatibility print driver will require deleting that compatibility driver as well as the device-specific driver).

GhostDoc, Conventions and Writing Self-Documenting Code

Given the C# method signature:

        private void CreateImageStreamAndAddImage(
            FileInfo imageInfo,
            int ImageBufferSize,
            ImageStreamInfo streamInfo,
            IPictureUploadService pictureUploadService)

GhostDoc will generate the following xml doc comment:

        /// <summary>
        /// Creates the image stream and add image.
        /// </summary>
        /// <param name="imageInfo">The image info.</param>
        /// <param name="ImageBufferSize">Size of the image buffer.</param>
        /// <param name="streamInfo">The stream info.</param>
        /// <param name="pictureUploadService">The picture upload service.</param>

which, IMHO, is almost perfect! (add should be adds). This conceptually simple trick, a bit of parsing, stemming and grammar, goes a long way towards encouraging method names that clearly communicate what the method does.

This wouldn't be nearly accurate enough if it weren't for the fact that C# (well, .NET in general) ships with naming guidelines. They're illustrated in the language spec, documentation examples and the design guidelines for class library developers.

The more team development I do the more I appreciate the benefit of conventions designed to aid readability. Team members come and go but the code is always there. Documentation comments tend to rot (easily corrected with GhostDoc but still) but method names last until changed.

Clarity in method naming strikes me as akin to clarity in writing. A well written explanation can be orders of magnitude* easier to understand than a poorly written one. It's just as easy to muddle concepts in code as it is in prose - perhaps this is why it's called writing source code. Throw in teams comprised of members from different language backgrounds and these conventions become even more important (as does one's appreciation for the hurdles being overcome by amazing devs writing in a non-native language!).

Conversational Search

Although Google is my main search tool Bing has gotten much better over the years. These days when I use it, mainly while at work, I'm finding it very good at keyword based searches. So if I'm looking up a specific class in the .net framework and I know the name (or most of the name) then it's great at getting me directly to the URL I want.

This applies to local destinations as well. If I know, roughly, the name of the place I want to go then bing finds it very quickly. And I've grown to like the beautiful artwork you see when bing loads.

But when it comes to more exploratory search, where I don't have an exact keyword that I know will locate the result, Google really shines and has gotten way better over the years. These days, more and more, it seems I can just ask my question in conversational English and Google finds it.

For example, I use OneNote extensively but something that annoys me about its copy/paste functionality is that it stuffs these "from ..." footers whenever you paste a link. 99 times out of 100 I don't really care about the source URL. Especially if the link is included in the pasted text.

I suspected that there's an option for controlling this but didn't want to wade through all of the options to find it. Glanced at a few of the option settings, didn't see anything obvious. So I go to Google. I start typing "OneNote copy paste" and 5 search suggestions appear. The last one "OneNote copy paste from link to original" looks promising so I try it. At the bottom of the first screen full of results is a link to a page titled "OneNote Stop Including Link to Original Source" which is exactly what I'm looking for. Options -> Advanced -> Include link to source when pasting from the Web. Uncheck that and we're done.

Musings on Lock Free algorithms: Trading Generality For Speed. Trading Simplicity for Complexity.

While talking with a coworker (much more senior than me - one of the awesome things about working at Microsoft) the subject of lock free algorithms came up. It was in the context of operating systems but got me to thinking about the issue in a more general way.

I am by no means an expert on lock-free programming. Only once in my professional career have I ever converted a lock based approach to a lock free approach (and yes it was both frustrating and exhilarating). But I've read a bit about it and played with some of what I've read out of curiosity.

At one level they seem to trade generality for speed (and, often, memory). In the same way that a general purpose sort can't* do better than O(n log(n)) but specialized sorts (e.g., radix sort, postman's sort) can (some are even linear-time!).

The specialized sorts take advantage of the structure of the thing being sorted (e.g., must be integral), hardware limitations (word-size and max addressable memory), uniqueness, etc... While it's true that these sorts can sometimes be adapted to a different class of elements (e.g., by tagging the element with the underlying sort element) doing so comes at the cost of increased memory (e.g., the width of the tag) and increased complexity. The adaptations still depend on the underlying constraints on the sort to achieve the performance gain.

Similarly, lock free algorithms tend to come with similar constraints: the expectation of a certain word size, uniqueness of index or uniqueness via algorithm (e.g., a GUID). Serializing (lock-based) access to a resource is conceptually quite simple. Limiting access to that resource to consumers that intrinsically can't collide (either by giving them a unique index or generating a guaranteed* unique GUID) increases the complexity of exclusive access at a cost of increased memory and/or increased computation (e.g., generating the GUID, indexing the consumer). But it also makes for much faster access as consumers no longer need to queue up waiting for a lock.
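As a toy illustration (not production code), here's a lock-based counter next to a lock-free one built on a compare-and-swap retry loop. In practice Interlocked.Increment does exactly this for you; the explicit loop is just to show the shape, and it only works because the shared state is a single word the hardware can exchange atomically:

    using System.Threading;

    public class Counters
    {
        private readonly object _gate = new object();
        private long _lockedCount;
        private long _lockFreeCount;

        // Lock-based: simple and general, but every caller queues behind the lock.
        public long IncrementWithLock()
        {
            lock (_gate)
            {
                return ++_lockedCount;
            }
        }

        // Lock-free: a compare-and-swap retry loop. No queuing, but it depends on
        // the shared state fitting in a single atomically-exchangeable word.
        public long IncrementLockFree()
        {
            while (true)
            {
                long current = Interlocked.Read(ref _lockFreeCount);
                long next = current + 1;
                if (Interlocked.CompareExchange(ref _lockFreeCount, next, current) == current)
                {
                    return next;
                }
            }
        }
    }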

The more I think about it, the more I think lock free programming can be characterized by 2 propositions:
  1. Trading Generality for Speed.
  2. Trading Simplicity for Complexity.
As an aside, this is another wonderful exception to the general rule of thumb that simplicity is always preferable to complexity.

Win 8.1 Dispatch: Of Clouds and Virtuous Cycles

Finally getting around to upgrading to Windows 8.1. I very rarely upgrade before the first service pack, especially for workaday machines, but a driver issue made upgrading a little more urgent than normal. Once the primary workstation has been upgraded I've tended to shortly thereafter upgrade all home machines to make working remotely easier. Plus the habits of an OS tend to become second nature; I don't want to have to maintain two different modalities (one for work, one for home).

I started the upgrade (Windows 7 to Windows 8.1) of the home desktop box at 10:34pm. By 10:49 pm the entire process was done. I don't mean "you can see the desktop but can't do anything with it" done. I mean "logging onto facebook to post how fast this was" done. The desktop box is pretty well stocked (core i7 920, ssd-only storage, 12 gigs of ram) but it's by no means a beast.

Over the past few months I've gradually migrated all documents, pics, videos and the like to Skydrive (also a fan of gdrive and google docs but that's a post for another day). Ditto for programs that I've become accustomed to having installed on every machine: Notepad++ (formerly Ultraedit), Paint.net, 7zip, etc... Because the bits I care about are in the cloud I can finally feel comfortable wiping and reinstalling the OS without fear of a herculean, days-long-til-it-feels-right chore to get back up and running.

In a sense this lowered barrier to reinstalling the OS mirrors the recovery oriented trend in software engineering. Skydrive and the internet in general are providing the persistent store. Since Skydrive is limited* to 25gb I don't keep the bigger application bundles (office, windows itself, Cakewalk Sonar, etc...) on it but download them (and store them on a 1TB freeagent drive plugged into the wifi router) as needed.

The less time it takes to reinstall the OS and key apps the more likely I am to use this as a solution for the inevitable problems that creep up from time to time with any complex piece of software. 10 years ago wiping and reinstalling the OS + Apps was at least a 4 hour time commitment (usually longer) that lingered on for weeks while hunting down the various apps and settings it took to make the system feel like home. Under those constraints it makes sense to spend a few hours hunting down the cause of a glitch because a few hours is shorter than a few days.

Since then I've reliably installed windows 8.1 from USB 4 times in under 15 minutes. I expect there to be far fewer glitches that *have* to be tracked down (unless it's one that captures my interest). And over time I expect it to counteract the "don't install any software unless it's from the manufacturer" ethos that has crept into windows land because the OS + Apps situation was so precarious (of course race-to-the-bottom preloaded TSRs* didn't help).

I upgraded my work laptop last week but over the weekend discovered that it's still using a spin drive! (7200 RPMs but still, a spin drive!). So I swapped in an 80 GB SSD that was lying around and proceeded to reinstall (as an aside, keeping Intel chipset series 4/5/6/7/8+ drivers on a USB thumb drive really helps here). Lo and behold Windows recognized the name of the machine (I used the same name) and offered to speed up the process by using the most recently saved settings for it! Not just desktop background or bookmarks/favorites either. It even remembered the File History backup location (on the freeagent plugged into the wifi router).

Ordinarily I'd be wary of using a drive so small (80 gigs isn't very much after Windows and Office) but another consequence of cloud storage is that I'm realizing I need much less space on the laptop. The things I care about are either on Skydrive (and I've started making it the default documents location - another awesome Win 8 feature) or can be downloaded from other sites (e.g., the app store, cakewalk.com, etc...) as needed. Skydrive is limited to 25 gigs and, because its contents have to be downloaded, is somewhat self-limiting. Less space on the laptop makes the wipe-reinstall scenario even faster. A virtuous cycle :)

Data point for why lines of code should be treated as an expense

CopyIcon vs DuplicateIcon

Because orphaned features take on a life of their own, sowing confusion, undermining test fidelity and basically making it harder on the main bottleneck in software development: human comprehension.

PowerShell Cheat Sheet for Programmers

PowerShell's docs seem mainly oriented towards System Administrators. To make it a little more congenial to the way programming languages are often described in books, I'm making this cheat sheet.

Syntax, Operators and Expressions

Variables and Parameters

Literals and Quoting

Functions

Keywords

Includes/Imports

Debugging

Error Handling

Quick Hit: Taming Notepad++’s Language Menu

W00t! Notepad++’s language support is awesome. It supports so many languages that navigating the languages list (e.g., if the file extension doesn’t match the registered extension) is a bit of a pain. Fortunately its contents are editable!


Now to remove the ones we don’t use here…

The Task Machine Gun

It happens all the time. The hallway discussion. The quick chat at lunch. The regularly scheduled meeting. Upon reflection you realize it happens pretty much every time you talk to this person. Every encounter results in a new list of tasks for you to work on.

You've been hit by a Task Machine Gun. Its reason for existence is to spray task bullets all over the place in hopes that a few hit the mark. The Task Machine Gun doesn't feel like it's being productive unless task bullets have flown.

There are many circumstances where the Task Machine Gun is exactly what you need. Take a production line. Each station in the line has been simplified as much as possible. To meet the goal of producing X widgets per hour you basically need a Task Machine Gun to make sure that the simple procedure is executed frequently. These days most of these jobs are done by machine.

Delivering an always available, outrageously scalable set of software services is pretty much the opposite of a production line. The Task Machine Gun in this circumstance can do quite a bit of harm because each task takes on a life of its own, living on long after its reason for being has been forgotten.

What do you do when you realize there's a Task Machine Gun loose?

Duck! Figuratively of course.


GMail's Categories

GMail's new category system* has definitely grown on me. There are a lot of things I miss from Outlook but auto-categorization is awesome. There is a small learning curve which can be shortened by reading the 1 sentence description of the new categories (from the desktop website - hover over them and +5 for simple discoverability in the User Interface!).

The more I think about it, the more it strikes me that this system has a very real advantage over the fixed rules approach. The problem with those rules is having to maintain them as they inevitably become out of date (distribution list name changes, email address changes, etc...). 

The built in categories are great: Priority, Updates, Social and Promotions (aka spam-lite). For example: my apartment complex sends out a "delivery for you" email whenever a package arrives. I want to know about that immediately so I've dragged those into the Priority inbox. I've only had to do this 2 or 3 times before the system learned to automatically put them there. It somehow figured out that I don't really care too much about the "buy stuff from this vendor we sold ad space to" type messages (they go in Promotions). 

When it miscategorizes an email it's easy enough to correct by moving it to the right category. This can be done on phone but I've found that it's faster for me to recategorize a bunch of messages at once on desktop.

I expect mass mail campaigns to optimize for (or against) gmail's new system. Given that it learns from me dragging and dropping stuff into the right category it should be harder to game the system for long.

Since I've given up on manually categorizing** high frequency information streams I kind of appreciate the balance they struck: broad applicability without overwhelming granularity. Once a list has 7 or more elements in it IMHO it becomes unmanageable for most and has to be generalized/massaged/abstracted back into some set of 7 or fewer buckets so that humans can manage it without too much overhead. Kudos for cutting that magic number in half (due to User Interface concerns I'd guess).

While it takes some manual categorization up front to get the system from (a totally anecdotal guesstimate of) 67% accuracy to well above 95%, the work is easy enough, and required infrequently enough, that it's far outweighed by the substantial benefit of having a personal software agent helping me sift the wheat from the chaff.

Editing a Wiki from a smartphone

I recently listened to a podcast about a model of consciousness that relies on quantum mechanics. The theory, called Orchestrated Objective Reduction, posits (not uncontroversially) that consciousness arises when the wave function collapses in the apolar hydrophobic regions of microtubules in the brain.

That's a mouthful but, being a digital citizen, I figured I'd weigh in on the wikipedia page. As usual I'm on my smartphone. Coincidentally almost all of my listening and news reading is done on my phone. As was this post.

First problem: How do I get to the talk page? This is traditionally where you start before making edits to a page. I couldn't find a link to it on the mobile site.

Manually inserting Talk: before the last part of the URL does the trick. Seems like there should be a "talk" or "discuss" link somewhere...

Second problem: How do you take part in the discussion?

It looks like the convention is to pick a section, edit it, then add your comments at the bottom, followed by some moniker (your signature) to delimit the end of your comment. Egads, somebody introduce these guys to the year 2000. Even phpboard would be a better discussion system...

Bridges to Somewhere: Design Patterns wikis

So there's a bit of infrastructure that's critical but somewhat neglected because of more pressing problems. As the list of more pressing problems gets shorter (that's successful engineering - happy dance!) tending to this piece of infrastructure becomes more pressing.
 
Let's call this piece of infrastructure AbInitioWidget (AIW).
 
The grand vision is to get rid of AIW. Fold its functionality into a different piece of infrastructure. Let's call this other piece of infrastructure InitioWidget (IW). IW has had heapings of engineering cycles dedicated to it and is in much better shape.
 
Alas, the path to the grand vision of folding AIW into IW is long and fraught with peril. The kind of peril that requires the expertise you'd really only find in the original authors/devs. Without that kind of expertise each break/fix cycle will take much longer as the current devs reverse engineer the intent of whatever code path they're tasked with fixing.
 
Don't get me wrong, I too am seduced by the idea that IW could operate ex-nihilo. It's a self-referential beauty that any Computer Science geek would love. But the costs are so high that it's wise to adopt a fallback strategy (or 3) in the event that the grand vision does not come to pass.
 
While discussing one fallback strategy, moving the care and feeding of AIW from BadErrorProneHardToDiagnoseAIWCareAndFeeder (BadCareAndFeeder for short) to the care and feeder responsible for IW (GoodCareAndFeeder), I found myself thinking through design patterns that might help.
 
The first that came to mind was the Bridge pattern. Since we'd like to change AIW from being maintained by BadCareAndFeeder to GoodCareAndFeeder this struck me as being visually like a bridge. This is incorrect. The bridge pattern decouples abstraction from implementation so that both can evolve independently. While it may be involved in the Bad-to-Good-CareAndFeeder migration project it doesn't actually capture what I was looking for - a higher level abstraction than the classical Gang-of-Four Design Patterns.
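For reference, a minimal sketch of the Bridge pattern reusing the (made-up) names from this post; the point is that the widget and its care-and-feeder vary independently, which is a different concern from migrating one to the other:

    // The implementor hierarchy: who actually does the care and feeding.
    public interface ICareAndFeeder
    {
        void Maintain(string widgetName);
    }

    public class BadCareAndFeeder : ICareAndFeeder
    {
        public void Maintain(string widgetName) { /* the error-prone legacy path */ }
    }

    public class GoodCareAndFeeder : ICareAndFeeder
    {
        public void Maintain(string widgetName) { /* the well-tended path */ }
    }

    // The abstraction hierarchy: the widget delegates to whichever implementor
    // it was handed, so either side can evolve (or be swapped) independently.
    public abstract class Widget
    {
        protected readonly ICareAndFeeder CareAndFeeder;

        protected Widget(ICareAndFeeder careAndFeeder)
        {
            CareAndFeeder = careAndFeeder;
        }

        public abstract void Run();
    }

    public class AbInitioWidget : Widget
    {
        public AbInitioWidget(ICareAndFeeder careAndFeeder) : base(careAndFeeder) { }

        public override void Run()
        {
            CareAndFeeder.Maintain("AbInitioWidget");
        }
    }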

Fortunately Wikipedia has wonderful sections on design patterns. Two of my favorites are:
  1. The wiki for the foundational Gang-of-Four book "Design Patterns: Elements of Reusable Object-Oriented Software"
  2. The "Computer Science Design Patterns" wikibook.
    1. Not quite as polished as GoF.

Ode to Notepad++

Slowly making the transition away from UltraEdit to Notepad++. UltraEdit is awesome and has been my main editor for probably going on a decade now. I've bought 3 upgrades.

Unfortunately it doesn't play nice with the revision control system at work. It takes a while to exit after editing a file during an interactive resolve session. And by a while I mean longer than Outlook! Outlook is notorious for its process taking forever to exit after you've exited the application.

Anyway, I'm finding all sorts of nifty stuff with Notepad++. Like being able to stuff a string into a bunch of lines at once. Alt-Click-drag (column mode) across the text you want to replace on all of the lines and start typing - presto! The text is inserted at the same location in all of the selected lines! Great for manually transforming text from copy/paste across different systems.

Another nifty feature is highlighting every word that matches the currently selected word. Great for finding things in log files.

Unlike UltraEdit, which threw everything but the kitchen sink into the app all at once, Notepad++ seems to be relatively minimalist. It relies on extensions to add extra functionality. So, for instance, when I wanted to navigate back and forth from whichever file and line I've been at previously (CTRL+-/CTRL+SHIFT+-), I was disappointed that it wasn't built into the app. My disappointment was premature - just looking up that plugin (there's a built-in minimalist plugin manager) exposed me to all the other plugins out there. The plugin (Location Navigate) installed quickly and worked immediately.

I still keep UltraEdit installed, mainly for things that I don't know how to do in Notepad++ yet. But that list is diminishing. And Notepad++ has already replaced it as my %EDITOR%.

From Tolerating Faults To Being Recovery Oriented

Many years ago I got into the habit of rebooting my work PC at least weekly. I don’t recall the exact circumstances which prompted this; after all, one of the bragging points for an operating system is how long you can go between reboots.

But I was raised back in the late Win 3.1 and Windows 95 days. I once had a job walking to each computer in every lab we maintained, floppy disks in hand, to update network card drivers. I digress, but the point is that back in the early days rebooting was a daily, if not multiple-times-daily, affair.

Another motivating factor was the transition from CGI to FastCGI back in the late 1990s. Instead of starting a massive executable to handle a single request, FastCGI allowed a single process to handle multiple requests. 

Porting a massive (at the time, to me at least) executable was quite the education. This executable:

  1. Was riddled with the simplifying assumption that it will only ever process a single request at a time.
  2. And was therefore full of global variables (some even initialized statically/at-compile-link-time).
  3. Was exceptionally brittle as these assumptions weren't explicit.
  4. Was developed by brilliant but inexperienced engineers and therefore re-discovered some of the errors a basic CS education (formal or informal) helps you avoid.
One of the things I learned back then was that even after it was mostly successful (single process successfully handling multiple requests without crashing) part of its success came from limiting the number of requests a single process handled before being recycled.
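A rough sketch of that idea (all names here are illustrative, not the original code): the worker exits cleanly after a fixed number of requests, and the host starts a fresh process before accumulated bad state has a chance to bite.

    // Recycle the worker after a fixed number of requests, before the observed
    // error rate climbs.
    public interface IRequestSource
    {
        object Next();
    }

    public class RecyclingWorker
    {
        // Tuned from the observed relationship between requests handled and errors.
        private const int MaxRequestsPerProcess = 500;

        private int _handled;

        public void Run(IRequestSource source)
        {
            while (_handled < MaxRequestsPerProcess)
            {
                object request = source.Next();
                Handle(request);
                _handled++;
            }
            // Exit cleanly here; the host (a FastCGI process manager, a service
            // supervisor, etc.) starts a fresh process, discarding any bad state
            // that accumulated in globals along the way.
        }

        private void Handle(object request)
        {
            // ... application logic ...
        }
    }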


This brings us back to the title of this post. It struck me that limiting the number of requests was basically an admission that errors were unavoidable: they were going to happen, so we would build in a way to firewall off many of them by starting from scratch on a regular basis.

Strictly speaking an algorithm that doesn't behave deterministically is, usually, incorrect. The lack of determinism is due to one or more errors (aka bugs) that must be rooted out. At least according to accepted wisdom.

Well it turns out that another way to increase the determinism of the system is to figure out when, on average, the error rate goes unacceptably high. Then start over before getting there. As an aside this strikes me as analogous to the trend in 20th century mathematics to prove the correctness of a given construct in the limit as opposed to correctness always and everywhere.

This is a profound alternative with implications for the design of large software systems. And by large I mean complex since size in-and-of-itself creates a kind of complexity. This alternative, Recovery Oriented Computing, basically acknowledges that at some point the complexity of a software project exceeds some threshold beyond which errors are guaranteed.

Once this is accepted, and it is a painful thing to accept for me at least, all sorts of other often overlooked considerations come to the fore. Things like:

  • How long does this thing take to start after a crash?
  • How easy is it for us to figure out when we should restart to avoid crashing?
  • Maybe we should emphasize architectural principles that optimize towards minimizing the length of time it takes to start (and restart).
  • Ditto for shutdown/stop.

Fare Ye Well, You Engineering Prophet

My boss recently changed teams (but thankfully not companies). I didn't get to work for him for very long but it was obvious almost immediately that he was one of those sui generis types that can make the difference between success taking a year and success taking 3+ years.

Over the years I've worked in several different companies: small, medium and large. A variety of industries: e-commerce/retail, professional services automation, telecom, ip video and have even been fortunate enough to work in the semiconductor industry - the source of that awesomeness that makes life as a programmer possible. But I digress...

There are always a few, and I emphasize few, that get it. From the 10ft up-close view, to the 100ft bit-of-distance view, to the 30,000ft god's eye view. And perhaps most importantly, they get what is and isn't important to focus on. Sometimes these people are in the right place at the right time and take a project that used to launch at 3am (because it was so buggy that that much time was needed to put out fires before go-live - every month) and turn it into a project that can ship multiple times per month, with no one arriving before 7am and fires being the exception instead of the norm.

He's one of those kinds of people. An Engineering Prophet.

Designing For The Future

Agility has been a long and hard fought lesson learned over the past few decades. In CS classes we're told about ancient DOD style projects with names like Wooden Man, Stone Man and Iron Man. The pattern is pretty clear: Requirements Definition, Design, Implementation, Maintenance and Disposal (though we don't learn much about the last 2 as they're naturally somewhat difficult to teach given the time constraints of a semester).

When this theory hits practice what I've found is that the value of detailed up-front design is inversely related to how far into the future you're trying to design. Put another way: across many disciplines, not just our own, our ability to see into the future is *verifiably* not very good. Excellent works on the failure of prediction include Daniel Kahneman's "Thinking, Fast and Slow" and Nate Silver's "The Signal and the Noise: Why So Many Predictions Fail - but Some Don't", among many others.

So, it's reasonable to ask, why shouldn't software be developed like other engineering disciplines? If you spend more time up-front in design then aren't you both being conscientious and efficient (problems caught in design are much cheaper to fix than in later stages)? And isn't the alternative (assuming there's only one) a recipe for death marches?
 
But, to choose another engineering discipline, don't mechanical engineers have the same problem? This is true but software, being soft, operates in an artificial environment. If an architect thought the Eiffel Tower would look better if only the Pyramids of Giza were off in the distance, possibly just ahead of the snow-capped peaks of the Swiss Alps, no one would take him seriously. The Alps aren't movable (yet), nor are the pyramids (yet).
 
Yet an analogous request in software, precisely because it's soft, can't be dismissed outright because software lacks (usually) such hard (and obvious) constraints. Built a program that knows how to read log files? Excellent! We've got tons of database logs that we need to parse. It wasn't designed for reading database log files? No worry, it's software, we can change it.

Essentially the softness of software, its malleability, guarantees that the requirements as defined will be incomplete (=incorrect) the second after the ink dries (assuming you still print them out). The Pyramids of Giza can be moved to France, the mountains can be twisted into a more photogenic orientation and nearly any idea is not only possible but a wonderful morass of design into which we software engineers happily, but usually unprofitably, dance.

A natural response to this is the impulse to "nail down the requirements and don't allow them to change". Unfortunately malleability is possibly the biggest value proposition that software brings to the table. Software is valuable precisely because it can be easily changed without having to reorient an assembly line, stamp out possibly millions of replacement widgets, ship them, track them, etc...

So we have this intrinsic tension. Software is valuable because it's easy to change. Accommodating change requires design. Design is essentially trying to predict the future. Prediction is hard because of change and change, as a fundamental value proposition, is and will always be the norm in software. Ergo, we should expect our ability to predict the future to be even worse for software than other disciplines (for which malleability is a less significant part of their value proposition).

So what do you do when you must see into the future to satisfy requirements well, but that future is almost certainly guaranteed to have changed by the next time you look? Look more frequently and don't spend too much time looking too far into the future.

This, to me, is the reason Agile gained prominence in the mid-90s. It's also, IMHO, the driving force behind shorter release cycles, earlier feedback and many of the other practices associated with Agile. We can't predict the future since, in our business, that future does not really exist yet. It's created in the iteration cycles during which stakeholders mold and shape the product; the molding and the shaping itself molds and shapes the future as stakeholders discover what they like, what they don't like, what they thought they'd like but don't, what they don't like but works so well they'll take, etc...

The further into the future we engage in detailed design the more time and energy we spend on a future that will be very different by the time we get there. The lack of obvious artificial constraints means that few blind alleys can be discarded outright. The future is often so different when we get there that instead of helping ourselves by designing for our expected future, we've designed ourselves into a corner with loads of technical debt created to get out of that corner.

Unit Tests, Robocopy and Shallow Mirroring

To increase the fidelity of a set of unit tests I needed to mirror a directory structure. The catch is that there are huge files that I don't want to copy. My first thought was to write a program that only copies the first N bytes of each file while mirroring the directory structure.

On a lark I figured I'd check if robocopy could do something like this. Turns out that it can! Dozens of gigs of bits conserved! Thanks robocopy!

robocopy srcdir destdir /E /CREATE
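(For reference: /E copies the subdirectory tree, including empty directories, and /CREATE creates the directories and zero-length copies of the files instead of copying their contents.)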

Thread view in Touchdown Exchange Android Client?

Finally getting around to reading through the manual for the exchange client I use on android (touchdown). Found a nifty feature I've wanted for ages: Thread view!

I find it much easier to keep the context of discussions by having them listed together.

Now if only they gave me an easy way to either save a .eml file or attach a message in response to another message. When there are lots of recipients it's a pain to have to add them and breaks the thread you're trying to update...

Getting Around “Strong name validation failed” During Development Cheatsheet

Is this DLL a strong named assembly?

sn.exe -v someAssembly.dll

 

To find out for all dlls in the current directory

del sn-output.txt
for %i in (*.dll) DO sn -v %i >> sn-output.txt
findstr /c:"is a delay-signed or test-signed assembly" sn-output.txt

How do I turn off strong name validation for someAssembly.dll?

sn -Vr someAssembly.dll

How do I turn strong name validation back on for several assemblies at once?

Use the command prompt FOR loop:

for /F "skip=6 usebackq delims=," %i IN (`sn -Vl`) DO @sn -Vu %i.dll

What's currently excluded from strong name validation on my machine?

sn -Vl
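And to remove every exclusion on the machine in one shot (worth double-checking against sn -? on your SDK version):

sn -Vx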







Why Go To War On Magic Numbers?

Magic numbers and, more generally, magic literals have been verboten in software development for decades. Just ran across a few more reasons why this advice is still sound.

By changing from “ThisMagicLiteral” to SomeClass.ThisMagicLiteral you not only get all the benefits of compile-time checking, you also make it easier to remove code that depends on the magic literal.

This might not seem like much of a benefit – especially since it can often take more time to type SomeClass.ThisMagicLiteral than “ThisMagicLiteral”. But given that the original authors may well be long gone by the time a given piece of functionality needs to be removed anything that makes removal less brittle is a big plus.

I’m also finding that the process of naming the literal itself aids in discoverability. Maybe it shouldn’t be SomeClass.ThisMagicLiteral. Maybe it should be SomeClass.TheFileThatThisMagicLiteralPointsTo. Or SomeClass.TheRegistryKeyThatThisMagicLiteralPointsTo. That extra bit of context might be enough to trigger an association in the mind of the developer maintaining the code (without access to the original developers).
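A quick before/after sketch using the names from this post (the registry path value and the readRegistry delegate are made up for illustration):

    public static class SomeClass
    {
        // Naming the literal captures what it points to, not just its value.
        public const string TheRegistryKeyThatThisMagicLiteralPointsTo =
            @"SOFTWARE\Contoso\Widget\InstallPath"; // hypothetical example value
    }

    public class Consumer
    {
        public string LookUpInstallPath(System.Func<string, string> readRegistry)
        {
            // "Find all references" on the constant now finds every consumer,
            // which is exactly what you want when it's time to decommission it.
            return readRegistry(SomeClass.TheRegistryKeyThatThisMagicLiteralPointsTo);
        }
    }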

Disposal/decommissioning is a big deal but an often forgotten part of the software development lifecycle. Decommissioning is a lot easier if magic literals are replaced with constants. Lots of dependencies may have crept into the source. Often these dependencies aren’t even known to whoever happens to be maintaining the main block of code.

A Strikethrough Button in OneNote 2013

Where, oh where, is the Strikethrough button in OneNote 2013? How am I supposed to tell at a glance which tasks are complete on a page full of tasks?

The bad news: It’s not in the “basic text” tab of the Ribbon.

The good news: You can customize the Ribbon! In the screenshot below I’ve added a “MyGroup” group to the Basic Text tab of the Ribbon.

To open this editor, click the “Customize Quick Access Toolbar” button (by default it’s at the top of the screen in the left corner next to the OneNote icon) then choose “More Commands”.

strikethrough in onenote

Return to digital music making

Getting back into digital music making. Picked up an m-audio keystation 88 from Guitar Center. Very compact board – great feel for a semi-weighted controller keyboard.

Anywho, last I did this I used Sonar. So I grabbed Cakewalk Music Creator Touch 6 (that’s a mouthful of a title isn’t it?). Sonar is still very band/rock oriented – the default project is 8 audio tracks. I don’t plan on recording anything live. So after fiddling around for a while with the soundcard/devices tab I finally stumbled upon the “insert synth track” command.

Insert Synth Track did the trick. Nice to see MC6 (touch) shipping with more than one synth!

New google maps is delicious!

Love the simplified ui. LOVE the contacts integration - finally searching for mom returns my mother's address! Kinda the whole reason I bothered entering it...

The half-hamburger bars on the lower left are unobtrusive but possibly too much for my tastes, though as an engineering call I can see that there's enough subjectivity about it that either size is fine.

The layers are much simpler to select - LOVE IT! Wasn't obvious to me what the public transit layer does though. It doesn't have any stops listed as far as I can tell. It draws blue lines presumably along routes but I don't see any indication of meaning or use. Will check the docs but I like trying consumer apps out without reading the docs just to get a feel for the UX.

I never used latitude and honestly wish they'd just integrate fb location data but its removal seems to have decluttered the UI. Imho that's good :)

How Easy Is It To Spin Up A Linux VM On Windows Azure?

After watching Mark Russinovich’s presentation about IaaS on Azure from TechEd North America 2013 I figured I’d try it out to see if it really is as easy as he makes it seem.

After about an hour or so I can confidently report that it is very easy; I’ve got an OpenSuse 12.3 linux VM running with 28Gigs of space to play around with! This is a stock image from the gallery. I went with OpenSuse because I used to run that distro several years back. And because there wasn’t a debian image in the gallery :)

logged into vm

Setting up the VM itself is trivially easy in the new azure portal. It’s a wizard with 5 steps. Most of the time setting this up was spent creating an SSH certificate and configuring a client terminal program (PuTTY).

The portal wizard has a link to instructions for creating the cert and importing it though I deviated a bit when setting up PuTTY. Instead of using PuTTYgen to import the cert created via openssl.exe (windows build of openSSL) I just opened up an SSH session to the linux VM and instructed PuTTY to accept the site’s cert.
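For reference, the certificate-creation step boils down to an openssl command along these lines (the file names are placeholders; the exact switches are in the instructions the wizard links to):

openssl req -x509 -nodes -days 365 -newkey rsa:2048 -keyout myPrivateKey.key -out myCert.pem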

first vm being provisioned

BitLocker Turning Off the Startup PIN for Workstations

So your desktop/workstation is using BitLocker encryption. The machine itself has a TPM (this has to be built in, can’t be added later).

But it somehow keeps asking you to type a PIN every time you reboot the machine. This is extremely inconvenient if you frequently reboot and frequently access your machine remotely. To turn it off:

1. Open an admin command prompt and type the following:

manage-bde -status

Look under the “Key Protectors” list. It probably says TPMAndPin. You want that to just say TPM. To change it:

manage-bde -protectors -add c: -tpm

Replace c: with your system drive if it isn’t c:.

Internet Explorer 64-bit: What is it good for?

About the only use I get out of 64-bit mode Internet Explorer is opening and searching large documents. Like the 30 megs of spew emitted by static code analyzers. Searching in 64-bit mode is so much faster that it’s worth the WinKey + “ie 64” (without the quotes) key combo that it takes to start it.

Windows 7 installs in 5 mins (4:49 actually) via USB 3.0 Key!

That's amazing!

Adding the Intel SATA drivers to the boot image (boot.wim) appears to have worked. Windows Setup recognized the new drive (as well as the onboard 32gb ssd).

Unfortunately adding the intel USB 3.0 drivers to the boot image does not appear to have worked. Windows Setup immediately asked me to browse to the drivers. And couldn't see the usb 3.0 key. Fortunately there was a copy of the drivers on another drive. Setup was able to use those drivers and recognized the USB 3.0 drive after I manually browsed to them. Have a question out on an internal DL - will update if a solution is found.

As an aside I can't believe an order (for the SSD) placed at 8:30pm Thursday night was delivered by 4pm the next day (friday). If ONTRAC (amazon's delivery service) is the future of delivery then I fear for the fate of brick-and-mortar electronics retailers.

Adding Intel Rapid Storage Tech drivers to usb 3.0 key for Windows 7 OS installation

So the Deployment Image Servicing and Management (DISM) tool is awesome. It allows you to manipulate Windows Images (.wim files) without having to boot into the image.
For Windows 7 x64 (downloaded from msdn subscription) boot.wim has 2 images. These images are at index 1 (WinPE) and index 2 (Windows Setup).
After following the steps in this article I became curious about DISM.exe. The technical reference for dism.exe is here.
My system uses Intel Rapid Storage SATA drivers that don’t ship with Win 7. So I’d like to see if dism can be used to add the SATA drivers as well (why not? they’re just another set of drivers…).
Copied the files from files\irst\drivers\x64\* to winpe\irst in keeping with the layout recommended in this article. Replaced the filenames in the article's commands with the corresponding .inf files for the Intel drivers. Let’s see if it works!
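If it works, the DISM invocations should look roughly like this (index 2 is the Windows Setup image on my media; the mount and driver paths are from my layout, adjust to taste):

dism /Mount-Wim /WimFile:sources\boot.wim /Index:2 /MountDir:C:\winpe\mount
dism /Image:C:\winpe\mount /Add-Driver /Driver:C:\winpe\irst /Recurse
dism /Unmount-Wim /MountDir:C:\winpe\mount /Commit

(Repeat for index 1 if WinPE needs the drivers too.)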

How to install Windows 7 from a USB 3.0 Thumb Drive (FAST installation!)

This is the best article I've found yet on how to take advantage of USB 3.0 when it's needed most - during OS installation!

Serves as a great starter tutorial on using the Windows Image format (.WIM). The image (sources\boot.wim) is mounted first. Then drivers are added. Then it's saved and unmounted.

Just remember to change the filenames to match the filenames for your usb 3.0 drivers. For my Fujitsu UH572 Ultrabook they are iusb3hub.inf and iusb3xhc.inf.

How to install Windows 7 from a USB 3.0 Thumb Drive (FAST installation!)

Why I still use UltraEdit when Visual Studio 2012 is such an Excellent Editor

While Visual Studio has come a long way as an editor (not to mention as an IDE) there are still a few features in UltraEdit that keep me coming back (currently on v17.20):

  1. right click the tab for any open file and you can copy its path into the clipboard. This is really handy when you need to type in an outrageously long path as an argument to a command.
  2. quick open (ctrl+q). Often I’ll use this to quickly open log files. The paths are long but included in troubleshooting emails.
  3. reload changed files. Visual Studio will prompt you but UltraEdit can do it automatically (with a configurable polling interval). It can also automatically scroll to the end of changed files – a great way to keep an eye on the spew.
  4. compare files. I never sprang for UltraCompare because the comparison functionality built into UltraEdit has been more than sufficient for my needs.
  5. Regular Expressions. As of Visual Studio 2012 this gap has finally been bridged but even as late as VS 2010 there was still no way to use .net/PERL (pcre) regular expression syntax in the various find dialogs. That’s ridiculous – good thing VS 2012 provides this feature.

Opening a command prompt in a SmartCard security context

In a computing environment with very high security, access to many resources requires a physical credential. I’ve only seen this in 2 places: back while working for a bank and at Microsoft. I imagine a similar story in defense related work but I’ve never done defense related contracting so can’t speak from experience.

Anywho, physical credentials are great. Except when you lose them. Or leave them in the computer. Especially if the credential serves double duty; it’s your way to enter the building and to access secured resources.

Windows has a wonderful feature that lets you start a command prompt with the credential. As long as that command prompt remains open it has access to secured resources. So you can take your physical credential out, leave the window open and do what you need to in that command prompt window.

Enter the “runas” command. Introduced in Windows 7 or Vista IIRC, it lets you run a command under different security contexts. One of those contexts is SmartCard. So I created a shortcut on the desktop with the following command:

C:\Windows\System32\runas.exe /smartcard "C:\Windows\System32\cmd.exe /k cd C:\Users\XXX\YYY && C:\Users\XXX\YYY\YYY.cmd"



This opens a command prompt, asks for your credential password then runs the command prompt under the smartcard security context. In this case there’s a bat (.cmd) file that sets up the target command prompt with a bunch of stuff not relevant to this discussion. The /k option to cmd.exe keeps the window open.

Striping the boot volume across 2 SSDs for non-storage experts

If 1 SSD is good then surely 2 must be better? :)

For a variety of reasons, one of which is to minimize compile/build times, I decided to try striping (RAID-0) the boot volume across 2 SSDs. My dev box is an HP Z420.

I’m a PC hobbyist not a storage expert. The answers to these questions might be obvious to someone more conversant with storage parlance. Posting this with hopes of helping other PC hobbyists enjoy maximum performance with minimum frustration.

Comments/fixes/errata welcome.

Can I get away with not using RAID at all?

The idea here was to use Windows 7 Dynamic Disks. This is very easy to set up and requires no extra hardware. Unfortunately Windows can’t boot from a dynamic disk.

Ok, How about Intel RAID since it’s built into the motherboard?

This *will* work but presents another problem. I’m striping SSDs. SSDs need TRIM support to extend their lifespan. Intel added TRIM support for its RAID solution (called Rapid Storage Technology) but only for series 7 chipsets. The HP Z420 ships with a C600 chipset which I presume is series 6. So no luck.

What’s this LSI RAID stuff in the “BIOS”?

The firmware config (aka “BIOS”) has support for optional add-in RAID cards. One of which is the LSI 9212-4i. Apparently this falls somewhere between a standalone RAID controller add-in card and software RAID. And it’s relatively cheap – about $130

WTF is the LSI 9212-4i HBA?

While tracking this down I encountered an unfamiliar acronym (initialism really but no one uses that word properly). HBA is short for Host Bus Adapter. It’s a way of splitting the RAID implementation across the motherboard and an add-in module without requiring a full standalone RAID controller add-in.

Great, Does it support TRIM?

According to the LSI website TRIM is supported for LSI HBAs using an IT firmware (as opposed to an IR firmware). 2 new unfamiliar acronyms:

  • IT – Initiator Target
  • IR – Integrated RAID

I have no idea what these mean – that’s a Wikipedia surf session for another time. Turns out that the LSI 9212-4i HBA supports BOTH!

Both? At the same time? How quantum mechanically confusing!

Does it support both at the same time? Or is it in one mode or the other? Does it support both for UEFI and traditional BIOS?

Apparently it ships with IR firmware (usually) but that can be overwritten with IT firmware by following these instructions.

Turning on Visual Studio 2010 Code Analysis in a machine-independent manner

So you want to turn on Visual Studio 2010’s Code Analysis option but someone on the team rightly points out that this adds a machine specific directory to the .csproj file. Machine specific settings shouldn’t be in a shared project file (.csproj is typically shared).

Visual Studio 2010 uses MSBuild. MSBuild has lots of built in (“reserved”) properties. For instance, $(TEMP) will evaluate to the environment variable for temp.

Fortunately Visual Studio defines a property $(DevEnvDir) that, with slight modification, can point to the ruleset file required for Code Analysis in a machine-independent manner.
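A sketch of how that looks in the .csproj; the relative hop from $(DevEnvDir) (which points at Common7\IDE) up to the Rule Sets folder is the part to verify against your own VS 2010 install:

    <PropertyGroup>
      <RunCodeAnalysis>true</RunCodeAnalysis>
      <CodeAnalysisRuleSet>$(DevEnvDir)..\..\Team Tools\Static Analysis Tools\Rule Sets\MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
    </PropertyGroup>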

References

MSBuild Reserved Properties

Visual Studio Integration with MSBuild

Macros for Build Commands

Command Prompts with Date and Time

I spend a lot of time in command prompts and have found it useful to have the date and time associated with a given command. It’s great for comparing “how long does this take” especially after the fact. To add this to your command prompt on Windows 7 (and probably XP and earlier versions), change the PROMPT environment variable ala:

PROMPT=$d $t$_$p$g

I usually set this as a user environment variable. These can be set by typing Start then entering “sys env” (no quotes). The first option should be “Edit the system environment variables”. Add a new user environment variable named PROMPT with the value specified above.

PROMPT syntax is described at the online docs for the prompt command.

While the date and time displayed don’t account for the amount of time it takes to type in the command (with very long commands this can be a few minutes) it’s usually a good indicator.

ReSharper 7

Finally getting around to looking at ReSharper 7. Excited so far by these features:

  1. The new “generate” feature. Type Alt+Ins and a menu pops up offering, among others, to:
    1. Generate a constructor. This was already available in ReSharper 6.
    2. Generate Equality members. Very handy when reference equality simply won’t do.
    3. Generate Equality Comparer! This is way, way, waaaayyy too cool. Even though .NET has had Equality Comparers for ages I still see code littered with someStringVar.ToUpper() calls! This is error prone because it requires everyone to know in advance which case they’ll need to use. For a dictionary this might mean failing to find a key. Equality Comparers explicitly define what equality means for a type (see the snippet after this list).
  2. You can make a bunch of variables public or private by selecting them in the source editor then clicking the hammer icon that appears. This brings up a menu that allows you to change the visibility.
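As a tiny illustration of the Equality Comparer point above, compare normalizing case at every call site with handing the dictionary a comparer once (StringComparer is the built-in comparer; the dictionary contents are made up):

    using System;
    using System.Collections.Generic;

    class ComparerDemo
    {
        static void Main()
        {
            // Fragile: every reader and writer must remember to normalize case.
            var byHand = new Dictionary<string, int>();
            byHand["WIDGET".ToUpper()] = 1; // hope everyone else remembers too

            // Better: equality is defined once, on the dictionary itself.
            var withComparer = new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase);
            withComparer["Widget"] = 1;
            Console.WriteLine(withComparer.ContainsKey("WIDGET")); // True
        }
    }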

So what’s this IntelliTrace thing all about?

Just read through an excellent series on IntelliTrace over at The Ultimate Visual Studio Tips and Tricks blog. It’s a 4 part series that explores the rationale behind a product like IntelliTrace and describes how it can be used.

Noteworthy Points

  • It used to be called “Historical Debugging”
  • It introduces some restrictions on Edit and Continue if you enable call tracing.
  • It has to be on when the process being traced is started. This is because it injects code when a process’s MSIL is JIT-ed into native instructions. This only happens once: the first time a block of code is executed.
  • It’s a .net 2.0+ thing supporting C# and VB.net apps. Once again, sorry C++!
  • It’s pretty much the only way to debug Windows Azure apps.
  • It can be used in IIS via powershell, VS Test Manager, SharePoint.

Observations

The Events tracking reminds me a lot of process monitoring tools in SysInternals. Events of interest are operations that tend to be sources of error:

  • opening the wrong file.
  • writing the wrong value to a registry key.
  • binding the wrong parameter to a sql command.
  • parsing xml files that have errors in them.
  • et cetera…

There are many ways to write code to do any one of these operations. IntelliTrace comes with a predefined list that covers most (if not all). That is, IntelliTrace synthesizes all of the ways to open a file (by path string, by FileStream, etc…) into the concept of an Event (opening a file) and identifies every point in the execution of a program that a file was opened.

In this way IntelliTrace strikes me as operating at a higher level of abstraction than traditional (or live in the parlance of the series) debugging. Traditional debugging is about the call stack, locals, watches and breakpoints. IntelliTrace Events is temporally oriented and is all about groups of operations that tend to be sources of execution errors.

Logic Errors, where a program doesn’t exhibit faulting behavior (e.g., reading a file that doesn’t exist) but does not behave as intended, require the much more heavyweight (with respect to performance impact) IntelliTrace Calls tracing. Calls and their parameters are traced. IntelliTrace provides some filters to weed out sources of noise (e.g., system calls) but for debugging Logic Errors the onus is on the developer, who has knowledge of how the program should behave, to identify the source of an error.

To LINQ or not to LINQ

I’m a big fan of LINQ, especially the query expression form (e.g., from c in cars where …), but there are times when LINQ isn’t appropriate. Like all abstraction layers, unfortunately, it tends to leak the more deeply it’s applied (not unlike metaphors that break down when over-applied).

If you’re building a high-scale service and much of that service’s ability to scale is dependent on a relational database then it seems that time and time again this abstraction leakage becomes more of a problem than the convenience it provides.

In the case of LINQ-to-SQL the problem is that LINQ often translates queries in a very sub-optimal way. This translation is not readily apparent when a human being looks at the source (though it can be determined with tools). This is my first pet peeve about LINQ; by abstracting away the query translation (and thereby shielding the developer from having to learn SQL) the application itself becomes harder to maintain.
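One of those tools is built right into LINQ-to-SQL: the DataContext.Log property echoes the translated SQL as queries execute. A minimal sketch (the Car table, its columns and the connection string are made up for illustration):

    using System;
    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Linq;

    // A made-up table mapping purely for illustration.
    [Table(Name = "Cars")]
    public class Car
    {
        [Column(IsPrimaryKey = true)] public int Id;
        [Column] public string Make;
        [Column] public int Year;
    }

    class Demo
    {
        static void Main()
        {
            var db = new DataContext("...connection string here...");
            db.Log = Console.Out; // echo the generated SQL so the translation is visible

            var recent = from c in db.GetTable<Car>()
                         where c.Year >= 2010
                         orderby c.Make
                         select c;

            foreach (var car in recent)
            {
                // Enumerating the query is what actually sends the SQL.
            }
        }
    }

Seeing the generated SQL this way during development is a decent middle ground, though it still isn’t visible to someone just reading the source.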

If the database itself, with all of its knowledge about the distribution of the data and query access patterns, occasionally comes up with sub-optimal execution plans then how on earth is LINQ-to-SQL, which doesn’t have this information available, going to do as well or better?

For small databases this abstraction leakage isn’t a problem. The increase in developer productivity probably more than makes up for the loss of efficiency.

For large databases this leakage quickly results in slow queries that are difficult to identify since the database only sees the translated query but the source code only shows the LINQ form. LINQ-to-SQL is so similar to SQL that learning SQL is a very small price to pay for the increased efficiency and maintainability of an application.