Monthly Archives: May 2010

Three Years Late, Was It Worth The Wait? Windows 7, Vista Promises Delivered.

Having been married to Microsoft for most of my professional career doesn’t mean I drink the Kool-Aid.

I have had the distinct privilege to grow up in interesting times. I loved DOS. As a BBS operator, DOS was the de facto OS for most BBSes that ran on x86 hardware. Combined with QEMM/DESQview I was a multitasking fool, running many nodes on a single 386 with a ton of RAM, 8 megabytes to be exact.

Other OSes came along and I tried them as well, even running OS/2 for a while. It was the DOS compatibility and multi-instance support I was after; though you could run Windows 3.x apps in it, why bother?

I just didn’t see where Windows was anything near as powerful as my good old DOS prompt. I had used GUIs before and knew that some day it would be the way things went. To put it bluntly though, I hated Windows. In my eyes at the time, it did nothing well. It made my sleek, powerful machine run like a pig. It required me to learn how to do things the Windows way, which slowed me down. I so loathed Windows that I even went so far as to actively refuse to own or install a mouse.

In many aspects my opinion hasn’t changed much. To be honest I blame the “over promise, under deliver” method of development that Microsoft seems to employ with Windows OS development.

Windows 3.11 for Workgroups was modestly noteworthy in my view because it helped bring the internet into homes for the first time. I knew the internet was awesome and powerful, but I didn’t grasp the whole World Wide Web thing immediately either. Not being a graphical guy, I didn’t see what it bought me over any other tools that ran on the internet, until I really saw it running on Windows.

It still wasn’t enough to completely win me over. I was already working with GNU/Linux, going back to DOS/Windows mostly to play games or to develop on as a platform, since that paid the bills. I had been using NT for quite a while as a systems/network/database administrator but still ran Linux at home and for other projects when I could. That changed a bit with the release of Windows 2000. To me it was Windows all grown up: 32 bits, a nice GUI, fairly stable. It had a lot going for it. Plus, Windows Me was such a miserable experience it was an easy choice to go with 2000 Workstation and pretend that particular nastiness just didn’t exist. Though it wasn’t until XP hit that I switched to Windows full time for my day-to-day desktop, going from dual booting to just XP all the way.

That lasted for quite a while…. until Vista. Vista didn’t have a marketing problem, and it didn’t suffer from bad press. It was just fundamentally broken. I don’t care how many Mojave commercials you run; live with it for a while and you will be just as unhappy with it even if it were named Santa Claus.
I do thank MS for releasing Vista though; it turned me back on to Linux and specifically Ubuntu, which I have been using now for the last couple of years. If anyone can close the gap with Windows I think Ubuntu and Mark Shuttleworth have the best chance, unless Apple loses its mind and releases OS X for white boxes.

In the interest of full disclosure, I will always give Windows its due when it comes to ease of configuration and common usage. If it wasn’t for Windows my mom would still be putting stamps on her mail to me.

With that little history lesson, and my obvious bias against Windows, I still always try the latest and greatest from MS. It is in my best interest to do so. I don’t ever want to be too far behind the curve, or to lack something to complain about.

So, with all my gripes and soap-boxing, here I sit typing away on an x86 machine with Windows 7 loaded on it, and I’m happy with it. So happy that I’m not dual booting into anything at the moment, and my laptop has it loaded as well.

Why, you may ask, am I back on the bandwagon? Here is the short list.

 

Performance.

That’s right, as bad as Vista was, 7 doesn’t show any signs of the past sins. My first big ugh moment with Vista was trying to copy files on the network. It just wouldn’t start, or if it did it took forever to finish. I know it was addressed in a patch and later by SP1, but that was a band-aid on a sucking chest wound. Rarely would I come close to gigabit speeds, even though I’m on a managed switch and both ends can easily handle the load. XP came much closer, and when I wasn’t using Samba, Ubuntu just flew over the wire. Windows 7 brought that back in line. When I got near wire speed on my first test run I just assumed it was wrong. It still doesn’t handle lots of small files as well as my Ubuntu setup, but that’s not enough to quibble about.

Visuals.

Vista had them, but at the cost of making your state-of-the-art machine run like last year’s eMachine you bought your mom for 300 bucks. On the other hand, Ubuntu with Compiz was just stunning and ran on my older Pentium M laptop with a Radeon X200 mobile GPU in it. Again, 7 addresses this: it keeps the visuals from Vista and improves on them. I’ve got to say the rotating wallpapers are my favorite feature at the moment. It is still a generation behind Compiz as far as raw stunning visual effects go.
I’ll never forget when a friend of mine was going on about Aero glass and transparencies in Vista; all I did was break out my laptop and tab through the running apps. Once he picked his jaw up he asked how I had gotten Vista to do that….. After he got over the second shock, that it was Linux, I had him trying Ubuntu for himself.
I’ve also attempted to use Stardock to get as close as possible to the same effects on Windows and just had to give up. There were enough annoying crashes and blips to make it not worth my time.

Organization.

I wasn’t sure I was going to like the new fat bar, but it has quickly grown on me. I hate having a million icons on my desktop, but I want the things I use day in and day out to be accessible. With Windows 7 replacing the quick launch with the ability to pin an application to the bottom bar, or in the start menu, you get the best of both worlds: your task bar shows you what is running, acts as your quick launch, and is remarkably uncluttered.
I am also a fan of the mouse-over preview that shows you how many things you have open per group and what is in them, e.g. having multiple browsers open or multiple Management Studio sessions. With the quick preview I can just peek and pick the one I need to work with now, without having to alt-tab through everything.
The focus-and-fade effect you get when you mouse over and then up onto the preview, showing you only that window on the screen, is also a nice touch. I used to always minimize all windows using the shortcut on the quick launch bar, then alt-tab through the list of running programs to find the one I was after; it sucked, but it was fast enough.

Compatibility.

Out of the box I had very few driver issues with 7. It even installed without my help on my Nvidia raid array. There are a couple of drivers missing for my laptop but no real show stoppers. Since Vista took the brunt of that attack I’ll chalk it up as a win in that column for Vista.

Security.

Don’t laugh I mean it. 7, even as a beta and now RC has a better, more polished security model. Not the open range XP was and not the heavy handed style of Vista.
Just to make other Ubuntu/Linux junkies upset, I don’t think it is any more disruptive than having to execute under sudo to install components or do administrative actions.
I do wish there was a bigger push to move stuff out of the kernel space and into user land for security and stability but I think time will fix these issues as well.

Stability.

I still hear you snickering from the above topic but I must push on. Other than the 1.5 BILLION reboots to install software or update drivers I haven’t had any real issues with crashing.
The compatibility run-as model actually worked for me on a couple of apps that didn’t play well under 7 but did just fine on Vista. Also, the fact that you can install an application in this mode made life easier for the legacy stuff I have to have.
Another thing that will make the OS X guys upset is that I haven’t rebooted my laptop since the install was completed. Hibernate actually works, and that is the mode I leave it in. On my new laptop, with 4GB of RAM and a decent SSD, it comes back from hibernate in a flash (no pun intended; oh hell, who am I kidding, of course it was intended). With Vista I was pretty much guaranteed that if I put it into hibernate there was about a 1 in 3 chance I’d have to ditch the saved image and reboot clean.

This all adds up to a better user experience and enhanced productivity without a steep learning curve. I don’t feel like this was rushed out the door and then crammed down our collective throats as the pinnacle of operating systems.

If you haven’t tried it, do so. I think you will be pleasantly surprised.

What I’ve Read and Recommend to Others – SQL Server 2008 Part 1

I read, a lot. I’ve been a prolific reader all my adult life.

I used to split my reading between tech books and my regular relaxing reading, but since I got into audio books several years ago I pretty much just read tech books now. Sometimes I’ll listen to a book and read a manual at the same time, breaking from one or the other if I need to really focus on a particular passage.

This allows me to really chew through a large amount of text in a pretty short amount of time.

I also have a method of digesting the information as well.

When I read a large technical volume I usually do it in three passes.

First pass is a scan of the entire book marking things of immediate need or interest for detail re-read later.

Second pass is the detail look at my notes from the first pass.

Finally, I re-read the whole thing and take notes on the stuff I may have skipped the first time through because I “thought” I already knew it, or because it didn’t apply to my core skill sets.

I recently posted a short list of authors I always read and now I’m going to follow that up with specific texts with some notes.

Eventually, I’ll expand some of these into single book reviews, ones I’ve come back to over and over or that I found extremely interesting.

 

Microsoft® SQL Server® 2008 Internals (Pro – Developer)

Kalen took over Inside SQL Server full time for the 2000 release and hasn’t looked back. Inside SQL Server 2000 was a must-have on any bookshelf. 2005 saw a shift, breaking the book into several volumes and inviting the industry’s best to write about the fields they were established experts in.

This time around, that theme has been carried forward and several noted experts lend a hand in this volume. SQL Server as a product is impossible for a single person to be an expert in from end to end. Even the core engine is beginning to grow to a point that just being an expert in a single aspect may be enough to establish a career.

With the inclusion of Paul Randal, Kimberly Tripp, Conor Cunningham and Adam Machanic, this book is an exceptional read. The access to other experts and team members on the SQL Server team at Microsoft lends this book a level of authority that any other book in this space simply cannot match.

Again, Kalen gives us a solid walk through the core engine, what has changed and how it works. I was personally happy to see Conor do his chapter on the optimizer.

 

Inside Microsoft® SQL Server® 2008: T-SQL Querying

Itzik is a giant in the query world. He has a grasp of the T-SQL language that few can match let alone surpass. Time and again I reach for his work to solve a problem or just to learn something new.

With this volume some other noteworthy people have joined in. Lubor I have known for quite a while; he is a staple figure at Microsoft and is also a big thinker.

Steve Kass, long time SQL Server MVP and a scary smart fellow himself, adds his wealth of knowledge and his ability to communicate to this book.

I have a feeling this will become a “must have” and a classic in its own right.

 

Microsoft® SQL Server® 2008 T-SQL Fundamentals (PRO-Developer)

Another Itzik book, this one teaches you the foundations of T-SQL, including previous versions of SQL Server, and wraps that up with the new stuff in 2008.

Never think you are too smart to read a foundation book no matter what point you are in your career. This is how you can catch new ideas, theories and techniques that “what’s new in…” books can miss.

Plus, Itzik’s style and clarity may actually have you re-learning something you thought you already understood.

 

Pro SQL Server 2008 Relational Database Design and Implementation

 

I’ve known Louis for a number of years and have read every book of his that I know of.

He has a down to earth conversational style that is easy to read. Also, having Kevin Kline work on this book is just icing on the cake. I have been a fan of Kevin’s work since the 7.0 days.

This book is good for someone who has been doing some design and development work and really wants to start digging in deeper. He makes no assumptions about what you do or don’t know; he starts at an introductory level and walks you through what it takes to do design work and turn that into a usable physical model.

I will say he has progressed to cover more material in every new volume, and this latest outing is no exception. Make no bones about it, Louis works for a living. There are a lot of texts that will make you feel stupid or completely inadequate at your job, but Louis makes some of the more complex problems easy to digest by putting them in real-world examples.

I’ve got a lot more books in the pipeline and will be talking about them soon.

I will also be covering 2005 and even 2000 in the coming posts. I know there are some folks still on 2000 and even 7.0, but I don’t have any books that old on my shelf at this point, save one.
Since I do more than just SQL Server or database work in general I’ll be covering those volumes as well and why I like them.
Last but not least, I will also cover books I’ve read that I didn’t find helped move my knowledge forward. Not that these books were bad; they just didn’t fit me, or there was another author who covered the material in a way that made it easier for me to understand.

It’s Beginning to Look A Lot Like Christmas……

 

We got something good in the mail last week!

 

FusionIODuo640

 

Some quick observations:

The build quality is outstanding. Nothing cheap at all about this card. The engineering that has gone into this shows in every way.

It is made up of modules that are screwed down. I can see they really thought this through, so each rev of the card doesn’t require all-new PCBs to be manufactured.

It does require an external source of power via a 4-pin Molex or SATA power connector, period. Make sure your server has one available; even though these are sold by HP, not all HP servers have the required connectors.

PCIe expander bays are few and far between. The issue is that most of these are used to expand desktops or laptops, or are used in non-critical applications, mostly AV or render farms.

http://www.magma.com/products/pciexpress/expressbox4-1u/index.html

This is a nice chassis, but they are currently being retooled and won’t be available for a month or so. It is the only 1U option and it has redundant power.

It exposes two drives to the OS per card. We will initially configure them two per machine in a RAID 10 array for redundancy.

 

More to come!

 

Wes

Understanding File System IO, Lessons Learned From SQL Server

I do more than just SQL Server. I enjoy programming. In my former life I worked with C/C++ and Assembler. As I spent more and more time with SQL Server, my programming took a back seat career-wise. Having that background, though, really helps me day in and day out in understanding why SQL Server does some of the things it does at the system level.

Fast forward several years and I’ve moved away from C/C++ and spent the last few years learning C#.

Now that I work mostly in C#, I do look up solutions for my C# dilemmas on sites like http://www.codeplex.com and http://www.codeproject.com. I love the internet for this very reason: hit a roadblock, do a search, and let the collective knowledge of others speed you on your way. But it can be a trap if you don’t do your own homework.

I write mostly command line or service based tools these days, not having any real talent for GUIs to speak of. Being a person obsessed with performance, I build these things to be multi-threaded; with today’s computers having multiple cores and hyper-threading it just makes sense to take advantage of the processing power. This is all fine and dandy until you want to have multiple threads access a single file and all your threads hang out waiting for access.

So, I did what I always do: ask my best friend Google what the heck is going on. As usual, he gave me several quality links, and everything pointed to the underlying file not being opened in asynchronous mode. Having done a lot of C++, I knew about asynchronous IO, buffered and un-buffered. I could have made unmanaged code calls to open or create the file and pass the safe handle back, but just like it sounds that is kind of a pain to set up, and if you are going down that path you might as well code it all up in C++ anyway.

Doing a little reading on MSDN I found all the little bits I needed to set everything to rights. I set up everything to do asynchronous IO and started my test run again. It ran just like it had before: slow and painful. Again I had Mr. Google go out and look for a solution for me (sometimes being lazy is a bad thing), and he came back with several hits where people had similar issues. I knew I wasn’t the only one! The general solution? Something I consider very, very .Net: use a background thread and a delegate to keep the file access from halting your main thread, so your app “feels” responsive. It is still doing synchronous IO. Your main thread goes along, but all file access is still bottlenecked on a single reader/writer thread. Sure, it solves the issue of the program “freezing” up on file access, but it doesn’t really solve the problem of slow file access that I was actually trying to fix.
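My tools are C#, but the pattern itself is language-agnostic. Here is a minimal sketch of that background-writer idea (in Python purely for illustration; the class and method names are my own invention): the caller never blocks on the file, but every write still funnels through one worker thread, so the IO itself is exactly as serial as before.

```python
import queue
import threading

class BackgroundWriter:
    """Sketch of the "responsive but still synchronous" pattern."""

    def __init__(self, path):
        self._queue = queue.Queue()
        self._file = open(path, "wb")
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def write(self, data):
        # Non-blocking from the caller's point of view: the app "feels" fast.
        self._queue.put(data)

    def _drain(self):
        # ...but the actual file access is serialized right here,
        # one synchronous write at a time.
        while True:
            data = self._queue.get()
            if data is None:
                break
            self._file.write(data)

    def close(self):
        self._queue.put(None)  # sentinel: drain the queue and stop the worker
        self._worker.join()
        self._file.close()
```

The main thread never freezes, but total throughput is no better than a single synchronous writer, which is exactly why this "solution" didn't solve my problem.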

I know that SQL Server uses asynchronous un-buffered IO to get performance from the file system. I did some refresh reading on the MSDN site again and struck gold. Writes to the file system may or may not be asynchronous, depending on several factors. One of them is that if the file must be extended, everything goes back to synchronous IO while it extends the file. Well, since I was working with a FileStream and a newly created file every time, I was pretty much guaranteeing that I would be synchronous no matter what. At this point I dropped back to C++. I started to code it up when I realized I was doing things differently in my C++ version.

I was manually creating the file and doing an initial allocation, growing it out to the size of the file buffer and fixing up the file length on close if need be.

I started up my C++ version of the code and watched all the IO calls using Sysinternals’ Process Monitor. I watched my C++ version, and lo, it was doing asynchronous IO in the very beginning, then switching to synchronous IO as the file started growing. I fired up my instance of SQL Server and watched as the asynchronous IO trucked right along…. until a file growth happened and everything went synchronous for the duration of the growth.

AH HA!

So, taking that little extra knowledge, I manually created my file in C# and set an initial default size, and wouldn’t you know it, asynchronous IO kicked right in until it had to grow the file. I had to do a little extra coding to watch how much free space was left in the file; when I get close I now pause any IO, manually extend the file by some amount, and then start up the writes again, keeping things from going into synchronous mode without me knowing.
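The pre-allocation trick itself fits in a few lines. This is a portable sketch, not my actual code (the real version was C#, where FileStream.SetLength plays the role truncate() plays here, and the helper names below are made up): extend the file to its expected size once, up front, so later writes land inside the existing allocation instead of triggering a file extension.

```python
def preallocate(path, size):
    """Create the file and extend it to its full expected size up front."""
    with open(path, "wb") as f:
        f.truncate(size)  # one extension, paid for at creation time

def write_at(path, offset, data):
    """Writes that stay inside the pre-allocated region never grow the file."""
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(data)
```

In my real code I also watch the remaining headroom and pause to extend the file again before the writes catch up, as described above.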

So, there you go: my little adventure, and how my old skills combined with knowing how SQL Server works helped me solve this problem. Never assume that your new skills and old skills won’t overlap.

Found a Bug in SQL Server 2005

It doesn’t happen often but every once in a while you may be the lucky person to find a previously unknown bug in SQL Server.

It was a normal morning for me, checking the status of our servers going over any failure messages waiting for the day to ramp up. That’s when one of our lead developers came around the corner and told me he had an error when he had tried to create an index on a table he was working on. The more he tried to explain the error the more I started to worry. I had him send me the code and the error statement.

Location:     BtreeMgr.cpp:5372

Expression:   bufferLen > currOffset + ACCESSSOR_OVERHEAD

SPID:         116

Process ID:   5016

Msg 3624, Level 20, State 1, Line 2

A system assertion check has failed. Check the SQL Server error log for details. Typically, an assertion failure is caused by a software bug or data corruption. To check for database corruption, consider running DBCC CHECKDB. If you agreed to send dumps to Microsoft during setup, a mini dump will be sent to Microsoft. An update might be available from Microsoft in the latest Service Pack or in a QFE from Technical Support.

Msg 0, Level 20, State 0, Line 0

A severe error occurred on the current command.  The results, if any, should be discarded.

I had what we like to call in the high availability space a “pucker moment”. This wasn’t your normal I-typed-something-wrong-and-got-an-error kind of problem. This was a real Sev 20 with an assert; the core engine had just puked on something it shouldn’t have.

Like all good DBAs, the first thing I did was run a DBCC on the database this error was generated from.

While that was going on I asked my very good friend, Google, if he had seen this particular assert before. For the first time in a very long time, Google failed me! In the last few years, if I hit this kind of hard error someone else has too, and it is either being fixed in a hotfix or addressed in the next version of SQL Server, but not this time.

So, we have this same schema on another server and the developer tried the exact same code there and had the exact same error.

I had him document the steps he took to get to this point and to his credit the steps were clear, concise and easily reproducible.

The DBCC finished with zero problems detected, which let me calm down a bit, as did the fact that it looked like I had a repeatable test case. When the second database had cleared its DBCC, I set about my task of reproducing the error and trying to find a workaround. Lucky for us it was a simple matter of column organization in the index, and we were able to apply the fix successfully and carry on with life.

I bundled up the documentation I had accumulated, ran the test case to confirm the bug, and sent it off to the powers that be at Microsoft. Since we had a workaround and it wasn’t a show stopper, I didn’t raise it as a critical server-down issue, but Microsoft still worked it in a timely fashion.

So, what was the problem you say? It was an interesting edge condition.

We have a table that contains a composite primary key and the rest is made up of bit fields, a flag table.

We had to add a new column, another bit flag, to the table.

The non-clustered covering index was dropped and the column was added to the end of the table.

The index was updated with the new column at the end of the column list and then *POOF* it blew up.

I think it has to do with two specific things.

First, bit fields are stored in a compact manner where multiple bits share a byte and aren’t truly separate from every other bit field. It would be a huge waste of space to store each bit in its own byte, but it would make things like this index issue less likely to happen.
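To make that first point concrete, here is the packing idea in miniature. This is not SQL Server's actual storage code, just a Python illustration (with invented helper names) of why bit columns are cheap: up to eight flags share a single byte.

```python
def pack_bits(flags):
    """Pack a list of booleans eight to a byte, lowest bit first."""
    packed = bytearray((len(flags) + 7) // 8)
    for i, flag in enumerate(flags):
        if flag:
            packed[i // 8] |= 1 << (i % 8)
    return bytes(packed)

def unpack_bits(packed, count):
    """Recover the original flags from the packed bytes."""
    return [bool(packed[i // 8] & (1 << (i % 8))) for i in range(count)]
```

Nine bit columns need only two bytes of storage this way, which is the space savings; those shared bytes are also why a new bit column isn't as independent as it looks.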

Secondly, we did a column add but didn’t drop, recreate, and repopulate the table in the process, so things at the page level weren’t nice and neat. The underlying clustered index wasn’t affected, but when we tried to add an index back with the new field it couldn’t do it. The fix was simple: change the column order in the non-clustered index, moving the new column up one. We verified that the data was correct both without the index and with it.

I haven’t tried it yet, but I am betting included columns won’t suffer the assert either, since those items don’t have to be sorted in the index.

So there you go! Having been on the software side of things a couple of times, I always find it interesting when I find bugs in others’ products and work the issue to conclusion.

What is your take-away from all of this? Never be afraid to submit a bug report to Microsoft. I have heard people say, in effect, that someone else will hit the bug or has hit it already and will submit it. DON’T RELY ON THE ACTIONS OF OTHERS! Reporting bugs helps the community as a whole and makes the product better. When you assume someone else has worked it, you are putting YOUR production servers in the hands of strangers. If someone has already submitted it and it didn’t turn up in a search, they will let you know, and be very kind about it to boot. You will get peace of mind that it is a bug and is being worked on, and you may keep someone else from stumbling onto this without the knowledge to fix it or work around it.

Wes