We’re quickly moving into that holiday season. My family and I have been amazingly blessed. We have a roof over our heads, we can share meals together, we have kids who make us proud, and we love each other. Honestly, every Christmas season, when asked, I can’t think of a single thing that I want, because everything that I need is fulfilled.
The only thing that makes me somber during these days is the thought of those whose day-to-day living is harder. I’ve lived through times like that – days when I told my little boys that we couldn’t rent a movie because I didn’t have the extra $3 to do it. It wasn’t about teaching a lesson or showing the value of money – those $3 had to go toward food. I know the stress it can cause a family when every day is hard and the thought of doing something special for the holidays is impossible. Those times can come when people run into unexpected emergencies or make some insanely bad decisions.
My hope is that, if families have the overabundance of love that we have, they share it with those who don’t have it right now. I know of people in the community who have already made decisions that will positively affect others for the rest of their lives. If we could all just do what we can, whether it’s time or money, I still think that it could create a genuine change for those who receive it.
When I was younger, I served with a community service organization for about 10 years. Most of our service was for young women and children, and one of the places we worked with often was a shelter where women with drug issues were court-appointed to spend their time. Most of these women were pregnant and/or had young children with them. Being pregnant with my first at that time, I couldn’t imagine that plus being addicted to drugs. We spent time with them, talking with them about possibilities for the future and playing with the kids that were there. During the time that we served there, we saw many women come and go. A couple of years later, I was having breakfast with a friend. A young woman, appearing very professional, came up to me and asked if I remembered her. I didn’t at first, since the change was so great. She told me how she’d gotten her life back on track, was raising her child and working at a regular workplace. One thing I remember her saying was, “We could never understand why you guys would want to come and spend time with us. We knew that you had families and other things that you could be doing, but it always made us feel good that you took time for us.” I know that she made the hard choices and that she did the work to pull herself up. I’m just glad that maybe I could be a small stepping stone, or someone who shined a light that showed her a better future. I didn’t spend a dime; I just took the time to listen and have a conversation with someone many people might pass by.
This is the way that I want to express the thanks that I have for everything that I’ve been blessed with. I hope that others do as well.
This post about learning opportunities for new DBAs has been moved to blogs.sqlsentry.com.
Posted by loriedwards
Four weeks ago today, I began working with SQL Sentry. Making the decision to take this job sounds like it should have been a cakewalk. I get to work for a great company and work with people that I respect and whose company I enjoy. I get to work from home and take advantage of the soft skills that I wanted to get back to using.
Although this all sounds great, there was one little thing that kept gnawing at the back of my brain. I wouldn’t be a ‘real’ DBA any longer. Actually I wouldn’t be a DBA at all. I’d heard this from others as well – that my skills would get rusty and that I was becoming a sales drone. I knew that I was incredibly tired of being on call – for most of my career I had been the only DBA, so everything fell on me. I also knew that I needed to switch things up a bit because I was burned out and, unfortunately, I’m still more than a few years away from retirement. At the same time, though, I had worked hard to pull together the skill set that I had and much of that had come from experiences that I had gone through, so if I lost that, I wouldn’t get it back. At least not anytime soon.
Now here I am, one month in, and I’m finding that I’m actually learning even more about SQL Server. How can that be? Prior to this, I have to admit, I learned what I needed to learn as I needed to learn it. If my company was never going to use Analysis Services, I was probably not going to study much on SSAS. In this job, though, our customers come from all different types of environments, and the metrics that are most important to each of them are incredibly disparate. I’m finding myself studying more so that I’ll be able to answer the questions that our customers might have. I’ll be attending the SQLSkills Immersion Event on Internals and Performance in May, and if that doesn’t make my head explode, then I don’t know what will.
Admittedly, I won’t have to deal with issues in my own environment on a day-to-day basis (although I did have a drive go bad on my laptop, causing a ton of my test databases to go suspect), but I need to be a good resource for our customers if they run into issues. That, and I work with some amazingly intelligent people, and I’ve always been just a little competitive. I don’t see my skills getting rusty – I see an opportunity.
Posted by tledwards
In a little over a week, I’ll be starting my new position at SQL Sentry. I’m excited about this new position for so many different reasons. I get to work with a great group of people – some of them I’ve known for a while and some I have yet to meet. Having worked solo for so long, I’m excited to be part of a team. The other huge benefit is that Kevin Kline (Twitter|Blog) is going to be my boss. It’s not very often that you get to work for someone you respect and whose company you enjoy.
The other exciting part of this position is that I’m going to get back to educating. Since I was very young, I’ve wanted to teach. I even had the chance to teach computer science at the community college here for a few years. The only reason I left was the huge difference in compensation between teaching IT and practicing it. I was talking with my hubby a few months ago about where I wanted to go career-wise. There were a lot of options, but they all included making use of some of the soft skills that I have, including the opportunity to teach. With SQL Sentry, I’ll have the opportunity to help ensure that our customers know how to use our tools to their advantage. For any of you who have worked with SQL Sentry tools, you know there is a lot going on, and I know well how little free time DBAs have. My hope is to work with Kevin and this team to provide learning content that helps our customers get up to speed as soon as possible.
I’m so pleased that I’m getting this opportunity to work in a position that lets me make use of my DBA skills and make use of my educational background. Hopefully I’ll get the chance to talk with some of you in the near future!
Posted by tledwards
These days, I live and die by OneNote. I read a ton of technical blogs, and when I come across a great script, I save it to OneNote. I take notes from meetings in OneNote, and I even save videos and webcasts that I feel are especially pertinent to what I do in OneNote. I have a ton of notebooks in OneNote, each with a bunch of sections and pages (in fact, my OneNote notebooks are about 15GB in size!). The problem I’ve always had, though, was that it was hard to find specific things unless I wanted to search through my notebooks (and, I have to say, Microsoft has included very capable search functionality in this product). There didn’t seem to be a way to sort OneNote sections and pages; they just showed up in the order you created them unless you sorted them manually (but who has the time for that!).
This was a problem until I came across this little lifesaver tool that makes keeping my OneNote notebooks tidy and in order. It is a little program called the “OneNote 2010 Sort Utility.” You can read more about this little golden nugget here.
If you decide that this little free utility might make your life easier, you can download it here.
And by the way, if you are using Microsoft Office 2010 Professional and you haven’t tried OneNote 2010 to organize your life (or at least your personal life), I strongly recommend giving it a spin. At first, it may seem a little daunting, just like being stared at by that blank screen when writing your first SSIS package. But, rest assured, there is help out there and a fairly active community of users.

Once you understand its metaphor of a physical binder (if you are my age, you might even insert “Trapper Keeper” here :)), with notebooks for different subjects, sections within each notebook and pages within the sections – plus the fact that you can print documents to OneNote 2010 as well as attach any kind of file – it becomes one of those tools that is hard to live without. In fact, it integrates so well with Outlook that if you have OneNote installed, your Outlook meetings will have a OneNote button on them; clicking it creates a page that contains all of the information from the Outlook invitation and then lets you take meeting notes.

I could go on and on and, in fact, have, because I intended this blog post to really only be about the OneNote 2010 Sort Utility, but OneNote is one of those things that I am quite passionate about because it has saved my bacon a number of times. At any rate, if you don’t use OneNote or want to know how to use it, here are some links to get you started (some of these might apply to OneNote 2007, which some of you may still be on, but the concepts generally also apply to OneNote 2010):
A couple of years ago, I wrote a blog post comparing a couple of different methods and products for SQL Server disaster recovery. Over the last couple of weeks, my company has had the opportunity to test the pre-release version of the latest version of Double-Take. I want to make it clear: this blog post is not an endorsement or criticism of the product, just some first impressions.
About four years ago, my company chose Double-Take as a disaster recovery solution because we were in the middle of relocating our data center across the country and we now owned a hotsite. This product, mind you, was chosen by a group of people that completely excluded any DBAs. We stumbled through what seemed to be a very kludgey installation process and finally got it to work on a couple of servers. We then proceeded to install this on all of our servers and were able to successfully use this to transfer all of our SQL Server instances to new hardware in our new data center. Compared to many of the options available four or so years ago, this was considered a big win.
Once the new data center was set up, we then proceeded to attempt to get it installed between the new servers in our new data center and the servers in our hotsite. For many of our servers, the installation went as expected (at least based on our installation experiences from the data center move), and we quickly got Double-Take mirroring up and running on several servers. The problem came when we tried to use Double-Take to mirror several mission-critical servers that happened to sit in our DMZ. Because we were mirroring a server that sat in our production data center’s DMZ, we also had to mirror to a hotsite server that sat in a different DMZ. This exposed a huge weakness in the Double-Take product. Try as we might, we could not get the two servers talking across the two DMZs, because Double-Take depended on WMI calls, and the ports used by WMI are dynamic – you could not predict which of the almost 65,000 ports it would choose. That is not a good thing for a DMZ, and our network group was not going to open up all 65,000 ports in the DMZ (and rightfully so) for these two servers just to get Double-Take to work.
Fast forward four years, and our DR strategy for our DMZ servers hadn’t really progressed much. That is, until we pressed our account management team at Vision Solutions (the company that now owns Double-Take), as we were very tempted to just drop all of the licenses because of the limitations of the software. After meeting with a couple of their engineers, we received a pre-release version of Double-Take 6, which has thankfully removed all dependence on WMI. With Double-Take 6, we only have to open up a maximum of four ports to mirror an instance across the two DMZs. The test installation, after a couple of hiccups (this is pre-release software, after all), went fairly well, and it is looking promising. We still need to do a comparison against servers running SQL Server 2012 to test its AlwaysOn capabilities against those of Double-Take and compare the costs to see which works best for us in the long run, but for now, I think we finally have a DR solution for our DMZ in Double-Take. And even if the AlwaysOn technology in SQL Server 2012 proves to be just as powerful or more so, there is no way that I will be moving 160+ SQL Server instances to SQL Server 2012 any time soon. So here is hoping for continued success with Double-Take as a DR solution in our environment.
As many of you have probably noticed, I haven’t blogged in quite a while due to work commitments, health issues and various family commitments (I don’t want to go on too long here with excuses), but I decided a perfect first blog post back might be one built around the stored procedure that a friend and his consulting group have so graciously shared with the community. I am, of course, speaking of Brent Ozar’s sp_BLITZ stored procedure, intended to help a DBA see what they are inheriting when someone dumps a new server on them. I took a different twist on this and figured that it might be a great tool to use periodically on all of the SQL Servers in an environment by creating a report around it.
I work as the lead DBA in an environment with over 160 SQL Server instances (a number that seems to grow by at least five or so every quarter) and somewhere in excess of 2,000 databases, ranging in size from what some of you might consider laughably small to some rather large data warehouse databases, many of which are mission critical to our business. To manage this environment, I lead a team of two other DBAs. One started with the company the week after I did, and the other, a junior DBA, has been with us just over a year. We have a great team, but even with three DBAs, it is hard to be proactive without some tools to let you know what is going on. Unfortunately, we don’t have the budget for some of the major monitoring tools, as the cost for our environment would be rather substantial. Needless to say, it is left to me and my team to be creative and build our own tools and instrumentation. That is where Brent’s sp_BLITZ script comes in. With a report written around it that my junior DBA can go through on a weekly or monthly basis, we can be much more proactive about some of the more basic or fundamental settings that someone who shouldn’t have access inevitably changes without our knowledge.
So, the report itself is pretty simple. It does require that you have a server with linked servers to all of your servers (we have a centralized DBA server that we use for this), and the sp_BLITZ script, which can be downloaded from here, has to be installed on each of those servers. This is a perfect use for the SQL Server 2008 Central Management Server feature that we have set up on our DBA monitoring server. What I have done in the report is create two datasets: one that queries a table we maintain with an inventory of all of our SQL Servers, which feeds the “Server Name” report parameter, and a second that actually runs the sp_BLITZ stored procedure on the server chosen from the dropdown. Brent has a great video on exactly what his script does at http://www.brentozar.com/blitz/. This report just gives you a format that you can run off of your Reporting Services site, or even schedule to run automatically in a Reporting Services subscription and have it emailed to you or posted to a document library on a SharePoint site if you are running Reporting Services in SharePoint integrated mode. This report does require that your Reporting Services instance is at least 2008 R2 in order to work. One of the nice things about this report is that the URLs that Brent provides in the output of his stored procedure are active links in the report, so if you click a URL cell, you will be taken to the page on Brent’s site that explains the Finding. Below are some screenshots of the report in collapsed and expanded form (all private information has been blacked out to protect the innocent, or at least those who sign my paycheck):
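As a rough illustration, the two datasets might look something like the queries below. This is only a sketch: the inventory table name (DBA_Inventory.dbo.ServerList) and the linked server name (PRODSQL01) are hypothetical placeholders, and it assumes sp_BLITZ was created in master on the managed servers.

```sql
-- Dataset 1 (sketch): feeds the "Server Name" report parameter from
-- the inventory table maintained on the monitoring server.
-- DBA_Inventory.dbo.ServerList is a placeholder name.
SELECT ServerName
FROM DBA_Inventory.dbo.ServerList
ORDER BY ServerName;

-- Dataset 2 (sketch): runs sp_BLITZ on the chosen server through its
-- linked server entry. In the actual report, the four-part name is
-- driven by the @ServerName parameter; PRODSQL01 is shown here as a
-- hypothetical example, and sp_BLITZ is assumed to live in master.
EXEC [PRODSQL01].master.dbo.sp_BLITZ;
```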
Figure 1 Collapsed Version of Report
Figure 2 Expanded View of Report
Setting Up The Report
So, to use the report that is freely downloadable at the end of this blog post, all you need to do is go into the Data Source for the report and change it to the name of your monitoring SQL Server (or at least a server that has linked servers to all of the servers you want to manage with this report), replacing the text <Type Your Monitoring Server Here> with the name of your monitoring server:
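If a managed server isn’t already set up as a linked server on the monitoring box, a minimal registration might look like the sketch below (PRODSQL01 is a hypothetical placeholder; the ‘rpc out’ option needs to be enabled so the report can EXEC the stored procedure remotely):

```sql
-- Sketch: register a managed instance as a linked server on the
-- monitoring server. PRODSQL01 is a hypothetical server name.
EXEC sp_addlinkedserver
    @server     = N'PRODSQL01',
    @srvproduct = N'SQL Server';

-- Enable RPC so remote EXEC calls (like running sp_BLITZ) will work.
EXEC sp_serveroption
    @server   = N'PRODSQL01',
    @optname  = 'rpc out',
    @optvalue = 'true';
```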
The next step is to make sure that you have a table on your monitoring server that has an inventory list of all of the servers from your environment and replace the <ServerName.database.schema.tablename> text in the query in the Servers Dataset with the pertinent information for your environment. See below:
From here, it is just a matter of deploying the report to your Reporting Services server and making sure that Brent’s stored procedure has been created on all of the servers that you wish to monitor.
The report can be downloaded here (you will need to go to Brent’s site mentioned earlier in this blog post to get the latest version of his sp_BLITZ script). I hope that you find this to be one of the many helpful tools in your tool chest to keep your environment in check.
Some of you who are my age will recognize the reference in the title as a line from the movie “Top Gun.” Most of you will probably look at the title and think that this blog post is going to be about project management. Unfortunately, you may be disappointed to learn that it is really more of a personal blog post – one about life management.
Not to be confused with the impact of spending a summer in Tucson (www.flickr.com/photos/nickdouglas/58786813)
Much of this year, I have pretty much felt like the title. This last week and weekend, I came to realize the impact that moving at this pace for as long as I have has had on me and, more importantly, my family. This weekend was the first time in a long time that I have truly taken a weekend off from work. Initially, I did it more out of exhaustion and truly being burned out, but I came to realize that I got a lot more than rest out of it. It was the first time in a long time that I truly took the time to laugh with and thoroughly enjoy my family without things like work, studying for MCITP exams, the PASS Virtual Chapter that I am a leader of, etc. nagging at me in the back of my mind. I discovered that you need to be very careful not to let outside responsibilities and activities take over your life and cause you to take your family for granted. Luckily, I have an extremely wonderful and supportive wife and great kids who have been very understanding throughout this hectic year. Such a support structure is a gift that we have to be very careful not to overuse.
In this time where job security is probably at its lowest level in several generations, we have to be careful to leave time for our families and loved ones while also trying to hold on to our jobs. It is easy to lose focus and not give the proper amount of time and attention to those we love because they are not the proverbial squeaky wheel when we have things like projects, training, work travel, conferences, etc. tugging on us. It is a difficult balancing act to be sure, but one that I believe will pay dividends over and over the better we become at it. The thing we have to realize is that our loved ones will probably be the last ones to call us on this, so we have to make sure to be vigilant in keeping things balanced. Because of this, I have decided that even though resolutions are made at the beginning of the year, I am going to start mine early and resolve to try to cut down on the outside activities that have kept me from fully enjoying my family and managing the balance in my life. I think that not only will everyone in my family be better for it, but that I will be more productive and happy in the activities that I do decide to continue engaging in.
Posted by tledwards
| Tagged: Personal
I ran across an installation issue with SQL Server 2008 on a Windows Server 2008 server the other day that baffled me a little bit. I was installing an additional instance of SQL Server 2008 on a server that already had a SQL Server 2008 instance and right before the installation completed, it died with the error: “A MOF Syntax error occurred.” Further investigation into the Setup Bootstrap logs gave this detail:
An error occurred while processing item 1 defined on lines 14 – 16 in file D:\Program Files\Microsoft SQL Server\MSSQL10.TMS_MODELING\MSSQL\Binn\etwcls.mof.transformed:
2010-05-18 13:41:02 Slp: Compiler returned error 0x800700a4. Error Number: 0x800700a4, Facility: Win32
2010-05-18 13:41:02 Slp: Description: No more threads can be created in the system.
2010-05-18 13:41:02 Slp:
2010-05-18 13:41:02 Slp: Sco: Compile operation for mof file D:\Program Files\Microsoft SQL Server\MSSQL10.TMS_MODELING\MSSQL\Binn\etwcls.mof.transformed failed. Exit code 3
2010-05-18 13:41:02 Slp: Configuration action failed for feature SQL_Engine_Core_Inst during timing ConfigNonRC and scenario ConfigNonRC.
Much investigation on the internet turned up a lot of people that have been having this issue, but very few answers. After many installs and uninstalls, I finally tried the following, which seemed to work:
- I ran setup.exe as an administrator (right-click on setup.exe and click “Run as administrator”), even though I am a local administrator on the box.
- I installed SQL Server using the Network Service account instead of the normal domain service account.
- The installation succeeded and I just went into Configuration Manager and changed the service account to the domain account after the installation.
The SQL Server community never ceases to amaze me. The number of people that are willing to take time out from their jobs and families to volunteer is especially impressive.
I’ve had the good fortune to be able to volunteer for the Program Committee this year. My job is to pull together special projects and whatever other grunt work Allen thinks up for me. A number of volunteers have put great work into our current project. It has multiple steps and has required a ton of coordination between the volunteers – but it is all coming together. It’s something that’s been needed for a while, and now it’s going to be a reality. I’d name names, but I know that I’d forget someone. So thank you to everyone who’s helped out.
A big (virtual) cake for all of you!
It’s not just me, though. Tim’s in the process of re-starting the Performance VC. He mentioned the need for volunteers through our blog and Twitter, and Blythe Morrow (Blog/Twitter) put out a call for volunteers on the PASS blog. He’s been overwhelmed by the number of people that have asked to help out.
For all of you that volunteer for PASS – kudos to you! For those of you that are thinking of volunteering, but haven’t yet, get ahold of Tim or me or go here for additional volunteer opportunities.