Being Thankful

20 November 2013

We’re quickly moving into the holiday season. My family and I have been amazingly blessed. We have a roof over our heads, we can share meals together, we have kids who make us proud, and we love each other. Honestly, every Christmas season, when asked, I can’t think of a single thing that I want, because everything that I need is fulfilled.


The only thing that makes me somber during these days is the thought of those whose day-to-day living is harder. I’ve lived through times like that: days where I told my little boys that we couldn’t rent a movie because I didn’t have the extra $3 to do it. It wasn’t about teaching a lesson or showing the value of money; those $3 had to go toward food. I know the stress it can cause a family when every day is hard and the thought of doing something special for the holidays is impossible. Those times can occur when people run into unexpected emergencies or make some insanely bad decisions.


My hope is that families who have the overabundance of love that we have will share it with those who don’t have it right now. I already know of people in the community who have made decisions that will positively affect others for the rest of their lives. If we could all just do what we can, whether it’s time or money, I think it could create a genuine change for those who receive it.


When I was younger, I served with a community service organization for about 10 years. Most of our service was for young women and children, and one of the places we worked with often was a shelter where women with drug issues were court-appointed to spend their time. Most of these women were pregnant and/or had young children with them. Being pregnant with my first at that time, I couldn’t imagine that plus being addicted to drugs. We spent time with them, talking about possibilities for the future and playing with the kids who were there. During the time that we served there, we saw many women come and go. A couple of years later, I was having breakfast with a friend. A young woman, appearing very professional, came up to me and asked if I remembered her. I didn’t at first, since the change was so great. She told me how she’d gotten her life back on track, was raising her child and was working at a regular workplace. One thing I remember her saying was, “We could never understand why you guys would want to come and spend time with us. We knew that you had families and other things that you could be doing, but it always made us feel good that you took time for us.” I know that she made the hard choices and that she did the work to pull herself up. I’m just glad that maybe I could be a small stepping stone, or someone who shone a light on a better future. I didn’t spend a dime; I just took the time to listen and have a conversation with someone many people might pass by.


This is the way that I want to express the thanks that I have for everything that I’ve been blessed with. I hope that others do as well.

PASS Summit 2013
After taking a year off, I’m heading back to the PASS Summit in Charlotte. This year will be a little different for me since I’m attending to help represent my company, SQL Sentry. I’m looking forward to seeing some people that I haven’t seen in a couple of years. I’m also looking forward to seeing a bit of Charlotte – SQL Sentry is actually located in Huntersville, close to Charlotte, but I’ve never really been there.

The first year that I went to the Summit, in 2009, I went as a regular attendee. I met some people, attended a ton of sessions and went to some evening events, most notably a very late-night breakfast on the last day of the Summit where I met Allen Kinsel (Blog/Twitter).

As a result of that meeting, I worked on the Program Committee for the next two years. Working on the Program Committee is great: a ton of work and a smidge of stress, but I had the opportunity to meet so many people, including wonderful volunteers. It’s also one of those volunteer experiences where you actually get to see the results of all of your hard work. Unfortunately, since there is still work to do during the Summit, I probably only went to one or two sessions in those two years.

This year, I’ll be working at our booth throughout the Summit. I’ll be demoing our awesome software, spending time with our team and talking with Summit attendees. We’ll probably also spend a decent amount of time trying to get Kevin (Blog/Twitter) into a kilt. I’m excited to talk with the folks who come to our booth, to let them know how SQL Sentry might help them, and even to pass on some knowledge that I may have gathered during my time as a production DBA. That’s one of the things that I love best about my job: the opportunity to help people out. I doubt that I’ll make even one session this year, but I know that, once again, the Summit is going to be a great experience.

Before attending my first Summit, I had read that meeting other database professionals was equal in value to the knowledge that you get from the sessions. Admittedly, I was skeptical. The Summit has a huge number of great sessions by great speakers, and that was the reason I wanted to attend. Now, though, heading into my fourth Summit, where I know that I’m not going to be attending sessions, I’m just as excited as I was at my first. The learning, while not in a session hall, continues in discussions with other professionals and in hearing about the challenges they experience in their jobs.

The word community, as much as it gets passed around, really applies at the Summit. I’m glad to be a part of it and I’m looking forward to participating in this new role. When you have a moment, come by and say hi to me and the rest of our team!

 | Posted by tledwards | Categories: DBAs, Discussion, PASS |

One month in

25 March 2013

Four weeks ago today, I began working with SQL Sentry. Making the decision to take this job sounds like it should have been a cakewalk. I get to work for a great company and work with people that I respect and whose company I enjoy. I get to work from home and take advantage of the soft skills that I wanted to get back to using.

Although this all sounds great, there was one little thing that kept gnawing at the back of my brain: I wouldn’t be a ‘real’ DBA any longer. Actually, I wouldn’t be a DBA at all. I’d heard this from others as well, that my skills would get rusty and that I was becoming a sales drone. I knew that I was incredibly tired of being on call; for most of my career I had been the only DBA, so everything fell on me. I also knew that I needed to switch things up a bit because I was burned out and, unfortunately, I’m still more than a few years away from retirement. At the same time, though, I had worked hard to pull together the skill set that I had, and much of it had come from experiences that I had gone through, so if I lost that, I wouldn’t get it back. At least not anytime soon.

Now here I am, one month in, and I’m finding that I’m actually learning even more about SQL Server. How can that be? Prior to this, I have to admit, I learned what I needed to learn as I needed to learn it. If my company was never going to use Analysis Services, I was probably not going to study much on SSAS. In this job, though, our customers come from all different types of environments, and the metrics that are most important to each of them are incredibly disparate. My head may be on fire, but I’m finding myself studying more so that I will be able to answer the questions that our customers might have. I’ll be attending the SQLskills Immersion Event on Internals and Performance in May, and if that doesn’t make my head explode, then I don’t know what will.

Admittedly, I won’t have to deal with issues in my environment on a day-to-day basis (although I did have a drive go bad on my laptop, causing a ton of my test databases to go suspect), but I need to be a good resource for my customers if they run into issues. That, and I work with some amazingly intelligent people, and I’ve always been just a little competitive. I don’t see my skills getting rusty; I see an opportunity.

 | Posted by tledwards | Categories: Uncategorized |

The next step on my journey

In a little over a week, I’ll be starting my new position at SQL Sentry. I’m excited about this new position for so many different reasons. I get to work with a great group of people, some of whom I’ve known for a while and some of whom I have yet to meet. Having worked solo for so long, I’m excited to be a part of a team. The other huge benefit is that Kevin Kline (Twitter|Blog) is going to be my boss. It’s not very often that you get to work for someone you respect and whose company you enjoy.

The other exciting part of this position is that I’m going to get back to educating. Since I was very young, I’ve wanted to teach, and I even had the chance to teach computer science at the community college here for a few years. The only reason that I left was the huge difference in compensation between teaching IT and practicing it. I was talking with my hubby a few months ago about where I wanted to go career-wise. There were a lot of options, but they all included making use of some of the soft skills that I have, including the opportunity to teach. With SQL Sentry, I’ll have the opportunity to help ensure that our customers know how to use our tools to their advantage. For any of you who have worked with SQL Sentry tools, you know there is a lot going on, and I know well how little free time DBAs have. My hope is to work with Kevin and this team to provide learning content that helps our customers get up to speed as soon as possible.

I’m so pleased that I’m getting this opportunity to work in a position that lets me make use of both my DBA skills and my educational background. Hopefully I’ll get the chance to talk with some of you in the near future!

 | Posted by tledwards | Categories: Uncategorized |

What in the heck?

I know that I haven’t blogged in forever, but I thought this particular bug might hit some of my fellow IT people out there.

A couple of months ago, I was fortunate enough to pick up a consulting gig. That was when I realized that I’d been working off of laptops that came from my employers and didn’t have a decent one of my own. I purchased a Lenovo laptop (that I’m very happy with), and with it came Windows 8. I actually like it for the most part. It’s shiny, and that’s good. The ability to group the tiles and title them really appealed to my OCD. It was great for everything except the reason I purchased it.

When I initially started the consulting work, I was working off of an older laptop that ran Windows Server 2008. I connected via a VPN to a VM that they had set up for me. Everything connected perfectly fine, and I was working away at the desk that I’d set up upstairs. A couple of weeks ago, my new laptop arrived, and at the time it was easier for me to work downstairs. I set up the VPN on my Win 8 machine and everything was great.

Once I moved back upstairs, though, everything went crazy. My VPN connection started disconnecting constantly; I couldn’t keep a connection for more than 5 minutes. I brought up my old laptop, connected to the VPN and everything was fine. I went back downstairs with my new laptop and the connection was fine. By the way, my other internet connections on the Win 8 machine were never affected, only the VPN connection. In case you need help catching up:

  • Old laptop – Windows Server 2008
    • Downstairs – Internet and VPN connections work
    • Upstairs – Internet and VPN connections work
  • New laptop – Windows 8
    • Downstairs – Internet and VPN connections work
    • Upstairs – Internet connections work – VPN disconnects every few minutes

As one last try before going back to my old laptop, Tim put a Win 7 VM on my new laptop last weekend. Guess what? It works fine everywhere. I’ve been working on it all day (upstairs) with no disconnects. Same laptop, same wireless card, same router, same VPN client; just an earlier version of Windows. I’m baffled.

While I hope I’m the only one who’s seen this type of craziness, I doubt it. Maybe this will save someone else from spending a week running up and down the stairs.

 | Posted by tledwards | Categories: Discussion, Miscellaneous |

If you have used the Central Management Server (CMS) feature in SQL Server 2008+, by now you know that you can’t add the SQL Server instance that hosts the CMS as a registered server on that CMS. That is, you can’t directly add it. There is, however, a workaround, as long as you connect to your servers via TCP/IP. To work around this limitation in the CMS functionality:

  1. Right-click on the CMS and click “New Server Registration”

  2. In the “Server name:” text box, type the IP address of the server, a comma, and the port number that the instance is listening on. In this example, the test server’s IP address is 192.168.1.1 and the instance that I want to add, which also houses the CMS, is listening on port 1433.

  3. In the “Registered server name:” text box, type the name of the instance (in this case, Test) and voilà, we have now been able to register the CMS instance on the CMS.
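
If you’re not sure which TCP port the instance is listening on, one quick way to check (assuming SQL Server 2008+ and that your current connection is already over TCP/IP) is the CONNECTIONPROPERTY function:

    -- Returns the IP address and TCP port of the current connection;
    -- these are the values to type into the "Server name:" text box.
    SELECT CONNECTIONPROPERTY('local_net_address') AS ip_address,
           CONNECTIONPROPERTY('local_tcp_port') AS tcp_port;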

Over the last couple of years, I have met a number of you who, like me, are required to struggle daily to make CommVault work as the enterprise backup solution for your SQL Server databases. Given that, I thought I would share one of the issues that we have run into, one that any third-party product that uses Microsoft’s Volume Shadow Copy Service (VSS) could possibly run into as well. To be fair, I have to give the credit for the research and most of the write-up of this issue and solution to one of the DBAs who works for me (whom I am sure many of you know), Samson Loo (twitter: @ayesamson).

Problem:

I/O operations are frozen on one or more databases due to CommVault issuing a “BACKUP DATABASE WITH SNAPSHOT” command, and they remain frozen until the operation completes successfully or is cancelled. (This appears to be known behavior of VSS. If you wish to dig further into how VSS works, I would suggest reading these articles: http://msdn.microsoft.com/en-us/library/windows/desktop/aa384615(v=vs.85).aspx and http://msdn.microsoft.com/en-us/library/aa384589(v=VS.85).aspx.)

This particular message gets logged in the SQL Server error log whenever any backup service makes use of the SQL Server Virtual Device Interface to back up the database with a snapshot. Microsoft Backup (ntbackup.exe), Volume Shadow Copy Services, Data Protection Manager, Symantec Backup Exec and other third-party tools, in addition to CommVault, can cause this message to be logged.

If ntbackup.exe is configured to take a backup of a drive that happens to house SQL Server data files, then the command “BACKUP DATABASE WITH SNAPSHOT” is issued to ensure the backup is consistent since the data files are in use. During this time, the I/O for the database that is currently being backed up is frozen until the backup operation is complete.

The message that you will typically see logged is:

    Error:

I/O is frozen on database master. No user action is required. However, if I/O is not resumed promptly, you could cancel the backup.

    Note: In the example error message above, the database master was referenced, but this could be any database on your instance that is being backed up.
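
If you want to see how often these freezes occur and which databases they hit, you can search the SQL Server error log for the freeze and resume messages. Here is a quick sketch using xp_readerrorlog, an undocumented but commonly used extended stored procedure (the parameters are the log number, the log type where 1 = SQL Server error log, and a search string):

    -- Search the current SQL Server error log for I/O freeze messages...
    EXEC master.dbo.xp_readerrorlog 0, 1, N'I/O is frozen';

    -- ...and for the corresponding resume messages.
    EXEC master.dbo.xp_readerrorlog 0, 1, N'I/O was resumed';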

Solution:

Disable VSS in the subclient by following the steps below:

  1. Open the CommVault client (often called the “Simpana CommCell Console”)
  2. Navigate to the target subclient
  3. Right-click anywhere in the white area
  4. Select Properties
  5. Uncheck “Use VSS” and click OK

Again, extreme thanks go out to Samson Loo (twitter: @ayesamson) for providing most of this content!

These days, I live and die by OneNote.  I read a ton of technical blogs, and when I come across a great script, I save it to OneNote.  I take notes from meetings in OneNote, and I even save videos and webcasts that I feel are especially pertinent to what I do in OneNote.  I have a ton of notebooks in OneNote, each with a bunch of sections and pages (in fact, my OneNote notebooks are about 15GB in size!).  The problem I have always had, though, was that unless I wanted to search through my OneNote notebooks (and, I have to say, Microsoft certainly has included a very capable search functionality in this product), it was hard to find specific things.  There didn’t seem to be a way to sort your OneNote sections and pages; basically, they just showed up in the order you created them unless you sorted them manually (but who has the time for that!).

This was a problem until I came across this little lifesaver tool that makes keeping my OneNote notebooks tidy and in order.  It is a little program called the “OneNote 2010 Sort Utility.”  You can read more about this little golden nugget here.

If you decide that this little free utility might make your life easier, you can download it here.

And by the way, if you are using Microsoft Office 2010 Professional and you haven’t tried OneNote 2010 to organize your life (or at least your personal life), I strongly recommend giving it a spin. At first, it may seem a little daunting, just like being stared at by that blank screen when writing your first SSIS package. But rest assured, there is help out there and a fairly active community of users. Once you understand its metaphor of a physical binder (if you are my age, you might even insert “Trapper Keeper” here), with notebooks for different subjects, sections within each notebook and pages within the sections, and the fact that you can actually print documents to OneNote 2010 as well as attach any kind of file, it becomes one of those tools that is hard to live without. In fact, it integrates so well with Outlook that if you have OneNote installed, your Outlook meetings will have a OneNote button on them; clicking that creates a page that contains all of the information from the Outlook invitation and then lets you take meeting notes. I could go on and on (and, in fact, have, because I intended this blog post to really only be about the OneNote 2010 Sort Utility), but OneNote is one of those things that I am quite passionate about because it has saved my bacon a number of times. At any rate, if you don’t use OneNote or want to know how to use OneNote, here are some links to get you started (some of these might apply to OneNote 2007, which some of you may still be on, but the concepts generally also apply to OneNote 2010):

http://blogs.msdn.com/b/chris_pratley/archive/2009/03/10/i-heart-onenote.aspx

http://blogs.office.com/b/microsoft-onenote/

http://office.microsoft.com/en-us/onenote-help/getting-started-with-onenote-2010-HA010370233.aspx

http://office.microsoft.com/en-us/onenote-help/basic-tasks-in-onenote-2010-HA101829998.aspx

http://www.onenotepowertoys.com/


A couple of years ago, I wrote a blog post comparing a couple of different methods and products for SQL Server disaster recovery. Over the last couple of weeks, my company has had the opportunity to test the pre-release version of the latest version of Double-Take. I want to make it clear: this blog post is not an endorsement or criticism of the product, just some first impressions.

Some History

About four years ago, my company chose Double-Take as a disaster recovery solution because we were in the middle of relocating our data center across the country and we now owned a hotsite. This product, mind you, was chosen by a group of people that completely excluded any DBAs. We stumbled through what seemed to be a very kludgey installation process and finally got it to work on a couple of servers. We then proceeded to install this on all of our servers and were able to successfully use this to transfer all of our SQL Server instances to new hardware in our new data center. Compared to many of the options available four or so years ago, this was considered a big win.

Once the new data center was set up, we proceeded to attempt to get Double-Take installed between the new servers in our data center and the servers in our hotsite. For many of our servers, the installation went as expected (at least based on our installation experiences from the data center move), and we quickly got Double-Take mirroring up and running on several servers. The problem came when we tried to use Double-Take to mirror several mission-critical servers that happened to sit in our DMZ. Because we were mirroring a server from our production data center that sat in the DMZ, we also had to mirror to a hotsite server that sat in a different DMZ. This exposed a huge weakness in the Double-Take product. Try as we might, we could not get the two servers talking across the two DMZs, because Double-Take depended on WMI calls. The ports used by WMI are dynamic, so you could not predict which of the almost 65,000 ports it would choose. That is not a good thing for a DMZ, and our network group was not going to open up all 65,000 ports in the DMZ (and rightfully so) for these two servers just to get Double-Take to work.

Today

Fast forward four years, and our DR strategy for our DMZ servers hadn’t really progressed much. That is, until we pressed our account management team at Vision Solutions (the company that now owns Double-Take), as we were very tempted to just drop all of the licenses because of the limitations of the software. After meeting with a couple of their engineers, we received a pre-release version of Double-Take 6, which has thankfully removed all dependence on WMI. With Double-Take 6, we only have to open up a maximum of four ports to get it to mirror an instance across the two DMZs. The test installation, after a couple of hiccups (this is pre-release software, after all), went fairly well, and it is looking promising. We still need to do a comparison against servers running SQL Server 2012 to test its AlwaysOn capabilities against those of Double-Take and compare the costs to see which works best for us in the long run, but for now, I think we finally have a DR solution for our DMZ in Double-Take. And even if the AlwaysOn technology in SQL Server 2012 proves to be just as powerful or more so, there is no way that I will be moving 160+ SQL Server instances to SQL Server 2012 any time soon. So here is hoping for continued success with Double-Take as a DR solution in our environment.

 | Posted by tledwards | Categories: Administration, DBAs, HA/DR, Uncategorized |

As many of you have probably noticed, I haven’t blogged in quite a while due to work commitments, health issues and various family commitments (I don’t want to go on too long here with excuses), but I decided a perfect first blog post back might be built around a stored procedure that a friend and his consulting group have so graciously shared with the community. I am, of course, speaking of Brent Ozar’s sp_Blitz stored procedure, intended to help a DBA see what they are inheriting when someone dumps a new server on them. I took a different twist on this and figured that it might be a great tool to use periodically on all of the SQL Servers in an environment by creating a report around it.

Some Background

I work as the lead DBA in an environment with over 160 SQL Server instances (a number that seems to grow by at least five or so every quarter) and somewhere in excess of 2,000 databases, ranging in size from what some of you might consider laughably small to some rather large data warehouse databases, many of which are mission-critical to our business. To manage this environment, I lead a team of two other DBAs. One started with the company the week after I did, and the other, a junior DBA, has been with us just over a year. We have a great team, but even with three DBAs, it is hard to be proactive without some tools to let you know what is going on. Unfortunately, we don’t have the budget for some of the major monitoring tools, as the cost for our environment would be rather substantial. Needless to say, it is left to me and my team to be creative and build our own tools and instrumentation. That is where Brent’s sp_Blitz script comes in. With a report written around it that my junior DBA can go through on a weekly or monthly basis, we can be much more proactive about some of the more basic or fundamental settings that someone who shouldn’t have access inevitably changes without our knowledge.

The Report

So, the report itself is pretty simple. It does require that you have a server with linked servers to all of your servers (we have a centralized DBA server that we use for this), and the sp_Blitz script, which can be downloaded from here, has to be installed on each of those servers. This is a perfect use for the SQL Server 2008 Central Management Server feature that we have set up on our DBA monitoring server. What I have done in the report is create two datasets: one that queries a table we maintain with an inventory of all of our SQL Servers, which feeds the “Server Name” report parameter, and a second that actually runs the sp_Blitz stored procedure on the server chosen from the dropdown. Brent has a great video on exactly what his script does at http://www.brentozar.com/blitz/. This report just gives you a format that you can run off of your Reporting Services site, or even schedule to run automatically in a Reporting Services subscription and have emailed to you or posted to a document library on a SharePoint site if you are running Reporting Services in SharePoint integrated mode. This report does require that your Reporting Services service is at least 2008 R2 in order to work. One of the nice things about this report is that the URLs that Brent provides in the output of his stored procedure are active links in the report, so if you click in a URL cell, you will be taken to the page on Brent’s site that explains the Finding. A sketch of the linked server setup follows, and below that are some screenshots of the report in collapsed and expanded form (all private information has been blacked out to protect the innocent, or at least those who sign my paycheck):

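If you need to create those linked servers on the monitoring server, a minimal sketch for one managed instance might look like this (the server name is a placeholder; the “rpc out” option is what later allows the report to execute the remote stored procedure):

    -- Create a linked server to one managed instance (name is a placeholder)
    -- and enable RPC Out so remote stored procedures can be executed through it.
    EXEC master.dbo.sp_addlinkedserver
        @server = N'Server01',
        @srvproduct = N'SQL Server';

    EXEC master.dbo.sp_serveroption
        @server = N'Server01',
        @optname = N'rpc out',
        @optvalue = N'true';
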
Figure 1 Collapsed Version of Report

Figure 2 Expanded View of Report

Setting Up The Report

So, to use the report, which is freely downloadable at the end of this blog post, all you need to do is go into the Data Source for the report and change it to the name of your monitoring SQL Server (or at least a server that has linked servers to all of the servers that you want to manage with this report), replacing the text <Type Your Monitoring Server Here> with the name of your monitoring server.

The next step is to make sure that you have a table on your monitoring server that has an inventory list of all of the servers from your environment and replace the <ServerName.database.schema.tablename> text in the query in the Servers Dataset with the pertinent information for your environment. See below:
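
Roughly, the two dataset queries end up looking like the sketch below. The four-part inventory table name is the placeholder mentioned above, @ServerName is supplied by the “Server Name” report parameter, and the remote call assumes that sp_Blitz exists in master on each server and that the linked servers have the “RPC Out” option enabled:

    -- Servers dataset: feeds the "Server Name" report parameter.
    -- Replace the placeholder with your inventory table's four-part name.
    SELECT ServerName
    FROM <ServerName.database.schema.tablename>
    ORDER BY ServerName;

    -- sp_Blitz dataset: runs sp_Blitz on the selected server through its
    -- linked server. In SSRS, @ServerName comes from the report parameter;
    -- for testing in SSMS, uncomment the DECLARE below.
    -- DECLARE @ServerName sysname = N'Server01';
    DECLARE @sql nvarchar(400);
    SET @sql = N'EXEC ' + QUOTENAME(@ServerName) + N'.master.dbo.sp_Blitz;';
    EXEC (@sql);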


From here, it is just a matter of deploying the report to your Reporting Services server and making sure that Brent’s stored procedure has been created on all of the servers that you wish to monitor.

The report can be downloaded here (you will need to go to Brent’s site, mentioned earlier in this blog post, to get the latest version of his sp_Blitz script). I hope that you find this to be one of the many helpful tools in your tool chest for keeping your environment in check.