/*Note – I have yet to see if this breaks anything – this post is for observational use only.  If it does break replication, I’ll add some notes to the bottom of this post */

 

Late last week, we had the fabulous opportunity to see just what happens when the dual power supplies for your SAN are plugged into the same PDU and that PDU gets powered off.  As you can imagine, this caused all sorts of neat things to occur.  Fortunately, the situation was remedied fairly quickly, but not before breaking replication for one of my databases.

 

The error that I received was: “The concurrent snapshot for publication ‘xx’ is not available because it has not been fully generated…”.  There were probably other routes I could have taken to fix this issue, but I decided to reinitialize the subscription from a snapshot and went ahead and created the snapshot.  This particular database is a little under 500GB, so I knew that this would take a while.

 

The next morning when I came to work, I could see that the snapshot had completed and the subscription had been reinitialized.   I knew that it would take some time for all of the commands that had queued up in the distribution database to be written to the subscription and so I went about my day.

 

Later, I noticed that the number of undistributed commands didn’t seem to be going down at all and was, in fact, growing.  This is a fairly heavily utilized database with writes occurring very frequently, but it still seemed odd.  I did some querying to try to figure out just how far behind we were but, to my surprise, the subscription was completely up to date.  This is where it started to get strange.
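For anyone wanting to run a similar check, this is the kind of query I mean – a sketch using sp_replmonitorsubscriptionpendingcmds (available from SQL Server 2005 on), run on the distributor.  All of the server, database, and publication names below are placeholders:

```sql
-- Run on the distributor.  PUBSRV, PubDB, SUBSRV, SubDB and 'xx' are
-- placeholder names -- substitute your own publisher, publication and
-- subscriber.
USE distribution;

EXEC dbo.sp_replmonitorsubscriptionpendingcmds
    @publisher         = N'PUBSRV',
    @publisher_db      = N'PubDB',
    @publication       = N'xx',
    @subscriber        = N'SUBSRV',
    @subscriber_db     = N'SubDB',
    @subscription_type = 0;   -- 0 = push, 1 = pull
```

The pendingcmdcount column in the result gives an estimate of how many commands are still waiting to be delivered to that subscription.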

 

I queried the MSdistribution_status view to get an idea of what was still outstanding and saw the following:

Article_id  Agent_id  UndelivCmdsInDistDB  DelivCmdsInDistDB
1            3        0                    2130
2            3        0                    3295
3            3        0                    8275
4            3        0                    2555
5            3        0                    8310
6            3        0                    2521
1           -2        2130                 0
2           -2        3295                 0
3           -2        8275                 0
4           -2        2555                 0
5           -2        8310                 0
6           -2        2521                 0
1           -1        2130                 0
2           -1        3295                 0
3           -1        8275                 0
4           -1        2555                 0
5           -1        8310                 0
6           -1        2521                 0

 

You’ll notice that, for agents -1 and -2, the number of undelivered commands for each article is the same as the number of delivered commands for agent 3.
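For reference, this is roughly the query involved – a sketch against the distribution database (assuming the default database name of distribution):

```sql
-- Run on the distributor.  MSdistribution_status summarizes delivered
-- and undelivered commands per article and per distribution agent.
USE distribution;

SELECT  article_id,
        agent_id,
        UndelivCmdsInDistDB,
        DelivCmdsInDistDB
FROM    dbo.MSdistribution_status
ORDER BY agent_id DESC,
         article_id;
```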

 

When I queried the MSDistribution_Agents table, I saw three distribution agents.  One was named identically to the distribution agent job.  The other two had subscriber_ids with negative numbers and the subscriber_db was ‘virtual’. 
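If you want to look for them in your own distributor, a query along these lines will show the agents (a sketch, again assuming the default distribution database name):

```sql
-- Run on the distributor.  Rows with a negative subscriber_id and a
-- subscriber_db of 'virtual' are the virtual/anonymous subscriptions.
USE distribution;

SELECT  id,
        name,
        subscriber_id,
        subscriber_db,
        anonymous_subid
FROM    dbo.MSdistribution_agents;
```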

 

I did some digging around on some unnamed search engine to try to figure out what virtual subscriptions were.  To be honest, there wasn’t  a lot of information to be found.  I did find a question on Eggheadcafe.com from 2007 where it seemed like the person was having a similar issue.  Reading through, it looked like it had something to do with the settings for allowing anonymous subscriptions and immediately synchronizing the snapshot.  I looked at my publication and both those were set to true.

 

Of course, I posted a question on Twitter at the same time and Hilary Cotter [Blog/Twitter], stated that the virtual subscriptions “are to keep your snapshot up to date, ignore them”.  That seemed to be consistent with what I was reading, but it still seemed odd that they appeared to be holding onto undistributed commands.  This, in turn, was causing my distribution database to grow.

 

Since, at this point, I figured that I was going to have to tear the whole thing down and rebuild it, I thought that I’d try to modify the publication first.  I knew that I didn’t want to allow anonymous subscriptions and I didn’t need to immediately synch from a snapshot, so I ran the following commands:

 

exec sp_changepublication @publication = 'xxxxx', @property = N'allow_anonymous', @value = 'false'

exec sp_changepublication @publication = 'xxxxx', @property = N'immediate_sync', @value = 'false'

 

I reran my query of the MSdistribution_status and voila, the undistributed commands were gone.  When I queried MSDistribution_agents, there was still a virtual subscriber, but it was no longer filling up my distribution database with undelivered commands.

 

I still don’t know what caused this – it seems like something occurred when I reinitialized the snapshot.  So far, replication is continuing to run without errors.  I’d be interested to hear if anyone has run into a similar issue.

Unfortunately, it has been a while since I have posted a blog up here, but having spent several hours last night on Twitter, along with a number of esteemed members of the SQL Server community, trying to educate a blogger, John Dunleavy [Twitter/Blog], about the proper way to credit authors when you use their work inspired me to get this post up.  Unfortunately, bloggers and web site operators using other people’s work without properly crediting them is becoming an increasingly frequent occurrence.  Earlier in the day yesterday, Aaron Bertrand [Twitter/Blog] had a very similar issue with a SQL Server MVP (you can read Aaron’s blog post and the associated comments here).

So, the point of my post today is not really to rehash the issue of plagiarism (intended or unintended).  Rather, it is to discuss why most of us give back to the community in the form of informative blog posts, volunteerism, and answering questions on the forums, and to extend that a little bit, if I may, by offering the services and knowledge of the SQL Server community to educate new bloggers about how to get started, what is acceptable, and what is not.

 

First off, why do we do what we do?  Speaking for myself and, I think, many members of the SQL Server community, we do this because a) we were all in a position where we were just starting out and needed help; b) someone helped us, answered our questions, and we feel honored to be able to do the same; and c) we take great pride in having one of the most open, collaborative, and philanthropically motivated communities in the world of technology.

These services are provided free of charge to anyone who visits any of the hundreds of great SQL Server blogs out there.  The number of books that you would have to purchase, or expensive courses that you would have to attend, to get anywhere near the content that is freely available on blogs in the SQL Server community would set you back many, many thousands of dollars – and it still wouldn’t provide you with all of the benefit of the experience that this community writes from.

Through our discussions last night, some barbs were thrown our way such as “how can you help if your[sic] linin[sic] your pockets” and “You guys are being selfish.”  I would like to address those, as I believe there must be a huge misconception about the motivation behind what we do.  First off, there is no “pocket lining” going on here.  Speaking for our site, http://sqlservertimes2.com, we pay for the domain registration and hosting out of our own pockets which, in my eyes, is an investment back into the community.  We receive no revenue from our site, as there is no advertising and no services are sold from the site.

Now, that is just our choice.  I want to be clear that there is absolutely nothing wrong with hosting ads or offering services from your site, if you so choose, to help pay the bills, as long as the content of your site is original or you have at least obtained the permission of the original authors or copyright owners to host non-original material.  Our motivation is strictly a collaborative one.
Lori and I post issues that we have come across in our jobs and the solutions that we have come up with to solve them.  Throughout our careers, we have relied heavily on others’ blog posts for our professional development and feel honored to now be able to participate in that and provide something back.  Our compensation is solely the feedback we receive from readers that lets us know that we provided something that saved someone some time somewhere down the road, nothing more, nothing less.

 

So, where does this leave us?  As I have said many times, I think the SQL Server community is one of the greatest technical communities around.  One of the main reasons for this is the lack of egos and the willingness to share.  The majority of us are not insecure and welcome new bloggers with open arms because all of us are constantly learning.  If a day goes by where I haven’t learned something new, it was not a very good day in my book, and the more people out there sharing knowledge, the more likely I am to learn something new.  This is a very forgiving community.  If you are a blogger who has plagiarized or is plagiarizing the work of others because, for some reason, you didn’t realize that it was wrong, or if you don’t know how to get started blogging, reach out to the community and ask for help.  There are many of us who will gladly help you start sharing your own knowledge and gifts with the community; all you have to do is be willing to understand that plagiarism is stealing and wrong, and be open to feedback from the group.  I know most technical professionals are proud and do not like to admit being wrong, but many times being wrong is the first step toward learning.

It would be so nice if something made sense for a change.


A couple of weeks ago, we had a database that had become corrupted due to a disk failure.  After running DBCC CHECKDB, it appeared that there was one table with a corrupted clustered index.  Paul Randal (Twitter/Blog) verified that this was the case and that it was, unfortunately, non-recoverable without data loss.  Fortunately, we had backups and this database was a replication subscriber, so between the backup and the publication, we were able to rebuild the database without losing any data.  We then had to rebuild a kind of goofy replication setup, but that’s another story.

 

As I was doing some validation of the data that was contained in the backups and what needed to be pulled down from the publication, I realized that there was at least one other table that was corrupted.  This is where it became confusing.  I went back through the output from the CHECKDB and noticed that the table in question was not mentioned.  At this point, I matched up all of the tables that were listed in the CHECKDB output against the actual tables in the database and found that there were three tables in the database that were not listed in the DBCC output.  I ran DBCC CHECKTABLE against the missing tables and, while two of them came back with no errors, one was definitely corrupted.  The CHECKTABLE command actually “terminated abnormally” on the corrupted table.  I went back through the error logs and dump files that had been generated at that time.  Aside from the references to the CHECKTABLE that was run, the only object referenced was the table that had shown up in the CHECKDB output.
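For what it’s worth, the per-table check is just a plain CHECKTABLE – the table name below is a placeholder:

```sql
-- Check a single table when you suspect CHECKDB skipped it.
-- 'dbo.SuspectTable' is a placeholder name; substitute your own table.
DBCC CHECKTABLE (N'dbo.SuspectTable')
    WITH NO_INFOMSGS, ALL_ERRORMSGS;
```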

 

As a note, this is a SQL Server 2008 SP1 database.  I know that, when running CHECKDB from SSMS, only the first 1000 errors are displayed, but there were only 274 errors mentioned in the output, so I didn’t think that was the issue.  I asked Paul Randal (mentioned above) and his thought was that perhaps the metadata had also been corrupted and, because of that, CHECKDB may not have seen those tables as existing.

 

We’ve recovered from the incident, but it was a curious experience.  Up until now, I’ve gone off of the CHECKDB output, but this gives me reason to think that there might be tables that aren’t included. I am interested to know whether anyone else has ever run into a similar incident.  Hopefully it was just an (odd) learning experience.

SQL Search – a review

2 February 2010

SQL Search, a new (free!) SSMS plugin from Red Gate, was released yesterday, February 1st.  This tool allows you to search for text within SQL Server objects.  I’ve had occasions where I needed to search legacy stored procedures for deprecated code (backup log … with no_log – egad!).  There have also been times where database servers were renamed and any references to the previous server name had to be changed.  Prior to the beta version of this tool, I’d used the syscomments system table as a way to find text within stored procedures, but the results definitely weren’t pretty.
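As a comparison, this is roughly what the old syscomments approach looks like – a sketch searching for the no_log string.  Note that syscomments stores definitions in 4000-character chunks, so a match that straddles a chunk boundary can be missed; on 2005 and later, sys.sql_modules avoids that problem:

```sql
-- Legacy approach: search object definitions in syscomments.
-- Definitions are split into 4000-character chunks, so this can miss
-- matches that span a chunk boundary.
SELECT DISTINCT OBJECT_NAME(c.id) AS object_name
FROM   dbo.syscomments AS c
WHERE  c.[text] LIKE '%no_log%';

-- On SQL Server 2005+, sys.sql_modules holds the full definition in one row:
SELECT OBJECT_SCHEMA_NAME(m.[object_id]) + '.' + OBJECT_NAME(m.[object_id]) AS object_name
FROM   sys.sql_modules AS m
WHERE  m.[definition] LIKE '%no_log%';
```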

 

SQL Search allows you to enter a piece of text, specify either all objects or specific object types, and choose whether you want to search all databases or a single database.

 

In the example below, I searched for the term ‘Reject’ occurring within all objects on the Reporting database.  The list of objects, their types and how the match was made is displayed in the upper half of the window.  When the object that is highlighted in the top of the screen is a stored procedure or function, the text is displayed in the bottom half of the screen, with the text highlighted.


 

One of the most helpful features is the link in the middle of the screen to ‘Select object in Object Explorer’.  Just as it sounds, a click will take you directly to that object.

 

In working with it, I haven’t noticed many downsides.  It takes a bit of time the first time a database is searched, as SQL Search indexes it, but after that, searches seem to go pretty quickly.  The first time that you search a database on a server, SQL Search needs to be opened within the context of that server, but after that point, the server is added to the dropdown box of selectable servers.

 

SQL Search seems to be a helpful tool and, as I mentioned before, it’s free.  To download, just click here to go to the Red Gate site.

 | Posted by tledwards | Categories: Administration, DBAs | Tagged: |

I was tagged by TJay Belt (Twitter/Blog) in this latest series of blog stories.  I believe that it was started by Paul Randal (Twitter/Blog), carried on by Tom LaRock (Twitter/Blog), and then went viral.  Since ‘New Year’ seems to be synonymous with ‘everything going to heck in a handbasket’, it’s taken me a while to respond, but here goes.

 

You may ask yourself, well, how did I get here?

I’ll start by saying that if anyone had told me when I was in college that I’d be a DBA (or anything computer related, for that matter), I would have fallen down laughing.  My step-father was a biomechanical engineer and one of my main goals in life was not to be a geek like I thought he was.  I majored in Communications with a minor in English.  At the time of my graduation I had never touched a computer or even wanted to.  So, how did I get to be a DBA?  Sheer coincidence.

 

IBM

Back in my kid-free days, I worked for IBM.  I actually had to use a computer (odd for me), but my responsibilities were working with IBM’s resellers and the maintenance plans they resold.  It was all about soft skills and I spent a ton of time on the phone with resellers.  All of the information that we gathered was stored in a (wait for it…) DB2 database.  After a while, I took on the responsibility for putting together reports.  While there was definitely no administration going on, it was kind of fascinating to play with all of that data.  That all stopped, though, for my next life-changing event.

 

And they looked so sweet...


Kids

I left my job at IBM just before I gave birth to my first child and became a stay-at-home mom.  Around the time my second child was born, I started to feel the desire to go back to school.  The odd thing is that the field that I was drawn to was computer science.  I’m not sure if it was due to some strange chemical imbalance or the need to spend time with something that actually had logic behind it, but I began my computer science degree shortly after my youngest son turned one.

Going back to school with two little ones running around was definitely a challenge.  Getting to the end of an 800-line assembly language project and having my son smack his hand on the keyboard, deleting it, helped me learn the value of saving and saving often.  I’m sure that trying to learn recursion while dealing with a cranky toddler helped my ability to persevere.  Eventually, though, I completed the program and became a computer science instructor.  Teaching was and still is the field that provides me with the greatest amount of satisfaction.  I enjoyed it immensely and felt that I was good at it.  Unfortunately, though, by that time I was a single mother of two boys and job satisfaction doesn’t exactly pay the bills.

 

My first *real* job

After leaving my teaching position at the college, I was able to get a job teaching the medical staff at our local hospital the new order entry/documentation application.  I knew that this had to be temporary and that I needed to become a part of a more technical division.  During the process of keeping our training environment up to date, I ended up interfacing with our DBA group on a regular basis.  One of the DBAs left and that provided me the opportunity to join the team.  Our lead DBA was pure awesomeness and provided me with a good, solid platform of knowledge.  That was back in 2003; I completed my MCDBA in 2005 and the rest is, well, the rest is now.  Still working, still learning.

 

It was a crazy, twisted road to get here and I’m looking forward to the road ahead.  I’m not tagging anyone with this, but I’m thankful to TJay for giving me the chance to share my story.

Several weeks ago on Twitter, Colin Stasiuk (Blog/Twitter) asked if anyone had a script to pull back job statuses from the SQL Server Agent.  I had been doing some work on a SQL Server DBA console and had written some PowerShell scripts to give me various pieces of information, so I put together a PowerShell script that could be run as a SQL Agent job to periodically report the statuses of SQL Agent jobs.  Eventually, I think Colin went with a different solution, but I figured I would go ahead and post the PowerShell script that I came up with.  This solution has been tested against SQL Server 2000/2005/2008.

 

This first script is just a SQL script to create the SQL Server table that the PowerShell script writes the status information to and can be downloaded here, LastJobStatus_Table.sql.

CREATE TABLE [dbo].[LastJobStatus](
	[ServerName] [nvarchar](128) NULL,
	[Job_Name] [nvarchar](128) NULL,
	[Run_Date] [datetime] NULL,
	[Job_Duration] [time](7) NULL,
	[Run_Status] [varchar](50) NULL,
	[Sample_Date] [datetime] NULL
) ON [PRIMARY]

 

The next is the PowerShell script that does all of the work bringing back the SQL Agent job statuses.  It takes a parameter of Server, which is the name of the SQL Server that you want job statuses from.  Make sure that you change the name “MgtServer” to whatever the name is of the SQL Server where you intend to store the results from this script.   You’ll also need to change the root directory for where your scripts are loaded to match your environment. This script can be downloaded here, Get-LastJobStatusServer.ps1.

param([string]$Server=$(Throw "Parameter missing: -Server ServerName"))
[void][reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")

##Point to library files
$scriptRoot = "D:\DBAScripts"

# Change the script root directory to match your environment
#$scriptRoot = Split-Path (Resolve-Path $myInvocation.MyCommand.Path)
. $scriptRoot\LibrarySmo.ps1	#Part of Chad Miller's SQLPSX project on CodePlex
. $scriptRoot\DataTable.ps1	    #Included with the scripts for this blog, also from a CodePlex project (http://www.codeplex.com/PSObject)

Set-Alias -Name Test-SqlConn -Value D:\DBAScripts\Test-SqlConn.ps1

##Define variables

## open database connection
$conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=$server;
Initial Catalog=master; Integrated Security=SSPI")
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText= "	CREATE TABLE #JobsRun (	ServerName nvarchar(128),
											Job_Name nvarchar(128),
											Run_Date datetime,
											Job_Duration time(7),
											Run_Status varchar(50),
											Sample_Date datetime
										  );
					insert into #JobsRun
					select	@@SERVERNAME AS ServerName
							,j.name Job_Name
							,(msdb.dbo.agent_datetime(jh.run_date,jh.run_time)) As Run_Date
							,substring(cast(run_duration + 1000000 as varchar(7)),2,2) + ':' +
									substring(cast(run_duration + 1000000 as varchar(7)),4,2) + ':' +
									substring(cast(run_duration + 1000000 as varchar(7)),6,2) Job_Duration
							,case when run_status = 0
										then 'Failed'
								when run_status = 1
										then 'Succeed'
								when run_status = 2
										then 'Retry'
								when run_status = 3
										then 'Cancel'
								when run_status = 4
										then 'In Progress'
							end as Run_Status
							,GETDATE() As Sample_Date
					FROM msdb.dbo.sysjobhistory jh
						join msdb.dbo.sysjobs j
							on jh.job_id = j.job_id
					where	step_id = 0
					and		enabled = 1
					order by cast(cast(run_date as char) + ' ' +
								substring(cast(run_time + 1000000 as varchar(7)),2,2) + ':' +
								substring(cast(run_time + 1000000 as varchar(7)),4,2) + ':' +
								substring(cast(run_time + 1000000 as varchar(7)),6,2)  as datetime) desc

					delete from MgtServer.DBA_Console.dbo.LastJobStatus where ServerName = '$server'
					-- Change 'MgtServer' to the name of whatever SQL Server in your
					-- environment will house the LastJobStatus table that stores the
					-- results of this script
					insert into MgtServer.DBA_Console.dbo.LastJobStatus (ServerName, Job_Name, Run_Date, Job_Duration, Run_Status, Sample_Date)
					select	jr.ServerName,
							jr.Job_Name,
							jr.Run_Date,
							jr.Job_Duration,
							jr.Run_Status,
							jr.Sample_Date
					from	#JobsRun jr
					where	Run_Date = (	select	max(jr1.Run_Date)
											from	#JobsRun jr1
											where	jr1.Job_Name = jr.Job_Name)
					drop table #JobsRun; "
$cmd.ExecuteNonQuery()
$conn.Close()

 

There are references in the above script to a LibrarySMO.ps1 script that can be obtained from CodePlex (see the comments in the script for URL) and a DataTable.ps1 script (also from CodePlex, but included in the download file for this blog, for your convenience.)

# Taken from out-dataTable script from the PowerShell Scripts Project
# http://www.codeplex.com/PsObject/WorkItem/View.aspx?WorkItemId=7915

Function out-DataTable {

  $dt = new-object Data.datatable
  $First = $true  

  foreach ($item in $input){
    $DR = $DT.NewRow()
    $Item.PsObject.get_properties() | foreach {
      if ($first) {
        $Col =  new-object Data.DataColumn
        $Col.ColumnName = $_.Name.ToString()
        $DT.Columns.Add($Col)       }
      if ($_.value -eq $null) {
        $DR.Item($_.Name) = "[empty]"
      }
      elseif ($_.IsArray) {
        $DR.Item($_.Name) = [string]::Join(";", $_.value)
      }
      else {
        $DR.Item($_.Name) = $_.value
      }
    }
    $DT.Rows.Add($DR)
    $First = $false
  } 

  return @(,($dt))

}

Going with your gut

11 January 2010

This post is a response to Tim Ford’s Whose Blog Is It Anyway  challenge.  The opportunity to use the words: pony, nude and bong in a blog post about an actual experience was too much to pass up.

 

I’m a DBA and I’m logical – coldly logical, if you listen to K. Brian Kelley – but I’m here to tell you that sometimes you just have to go with your gut.  Most of us have an inner voice that clues us in on the things that we know but can’t rationalize or aren’t ready to deal with yet.  That’s the voice that lets you know that the URL in your son’s internet history with the word ‘PussyCat’ probably isn’t a site featuring live, nude cats.  Not everyone trusts that voice – just ask George “Let’s Have Padme Die Of A Broken Heart Instead Of Anakin Crushing Her To Death” Lucas – that would have been a far more awesome scene.

 

I’m here to talk about such a time, early in my career.   I had a great DBA to learn from, but he had moved on to another position.  I felt pretty firm in my knowledge and knew that, whatever came up I could fix or handle by simply using some magical tool, library or bong.

 

That’s when I ran into it – the problem that I couldn’t fix, but was going to cause me pain.  On a Friday evening I started seeing error messages in the SQL Server error logs that indicated that we were having disk issues.  Of course the error message didn’t read ‘Your disks are failing’, but everything that I was reading seemed to indicate that.  One thing that bears noting – this was the central order entry/documentation application for the hospital that I worked at – there was no acceptable downtime.  I contacted our system administrator who did some research and let me know that all of the disks looked fine.  Now, this was an SA whose knowledge I respected – he’d been in the field forever (not like ENIAC forever, but pretty close).   He alluded to the fact that I was still a newbie and probably didn’t diagnose the problem correctly.  At this point it was around noon on Saturday and even though I was getting tired of looking around, I figured that I’ve started, so I’ll finish.  I wasn’t able to find any information that was clearer than I had provided before, but I knew, in my gut, that we were about to have problems – big problems.  I called the SA again and tried to encourage him to look a little more closely.  I asked him to pretend that the RAID array was a horse stable.  From the outside, it might sound pretty happy.  On the inside it might look good initially, but as soon as he looks down he’s going to be very sad about the pony.  For some reason, that analogy didn’t work…

 

As the afternoon progressed, I kept with my gut feeling and bugged the heck out of that SA.  The disks actually gave up the ghost and, 40 hours later (non-stop), with our application vendor and Microsoft on the line, we finally got the disks replaced and the application back online.  At the post-mortem, once everyone had gotten a few hours of sleep, that same SA wanted to know how I knew what the problem was, since none of the errors were absolutely clear about the cause.  I just told him that I knew in my gut that something bad was about to happen.  He said, “Well, if you’d told me that in the beginning, I would have done more research!”

 

While the story above is true, some of the particulars have been changed to protect the innocent.


I was tagged by Jorge Segarra (Blog/Twitter), who had been tagged by Thomas LaRock (Blog/Twitter), in this latest series of blog posts about goals and a theme word for 2010.  I was going to try to remain blissfully ignorant about being tagged, but then Tim went and posted his goals.  So I guess I’m on the line now.  My theme word for this year is:

 

Recharge

While there are many things that I want to accomplish this year, I don’t know that (m)any of them will occur until I can figure out a way to recharge.  I’m typically a self motivated type of person, but it seems like, during the previous year, I’ve hit the wall.

 

I’m not entirely sure what has caused this, but I’m guessing that it is some combination of the cyclical nature of job satisfaction, having a boatload of things going on at home and the disconnect between the amount of things that I would like to learn and the amount of free time that I have.

 


Is there a human connector on that thing?

I realize that there is no magic button that will instantly recreate the hunger for knowledge that I had when I began learning to be a DBA.  What I can do, though, is set some goals, work hard to follow through on them and be patient.   My hope is that in the process of achieving these goals, I’ll rejuvenate my love of this career path.

 

Goals

Pick one or two topics to focus on

I have at least three SQL Server books sitting on my desk and more at home that I haven’t done much more than flip through.  Rather than setting a goal to read all 3000 pages (doable, but daunting), I’m going to pick a couple of subjects to focus on and learn them as thoroughly as possible.  This is ongoing – if it’s March and I know everything there is to know about database corruption (or whatever it is I end up focusing on), I’ll move on to the next subjects.

 

Blog more

My first love is teaching.  It invigorates me and gives me purpose.  Blogging provides me an arena to hopefully teach people that are learning to be DBAs and the chance to share what I’ve learned.

 

Become more involved with PASS

As I’ve mentioned in previous posts and as Tim mentioned in his goals for 2010, we’ve talked often about starting a PASS chapter in Tucson.  This ties into my love of teaching and will help us to connect with folks locally who have similar interests.  I would also like to take part in other committees within PASS as needed.    This will definitely require a balancing act with work and family, so I’ll be taking baby steps to ensure that I don’t shortchange other areas in my life.

 

What does this all mean?

None of these individual goals are earth-shattering and that’s intentional.  I have a tendency to swing for the bleachers, but end up hitting to the pitcher and it makes me grumpy.  My hope here is that I make some good, solid line drives and then I’ll be set up to hit it out of the park.

 

I’m tagging a couple of people that have unknowingly helped me to recharge (some thank you, eh?) 

TJay Belt (Blog/Twitter)

Wendy Pastrick (Blog/Twitter)

Kendal Van Dyke (Blog/Twitter)

A few days ago, in his blog post Goals and Themeword for 2010, Jorge Segarra (Blog/Twitter) tagged Lori and me to write a blog about our goals and theme word for 2010.  While the title of my blog is somewhat sarcastic, it really reflects facing a year that will be full of immense opportunity and challenges as the result of a successful 2009.

 

In 2010, I face opportunities on all fronts in my life, professional and personal, so here I outline some of those and end with what I feel will have to be my theme word for the year.

 

Professional

On the professional front, 2010 is going to be a year full of many opportunities.  The biggest challenge will be to take full advantage of these opportunities without letting everything else slip.  Here are the major opportunities, as I see them:

  1.  It is starting off with me shifting my focus in my position with my employer from more of a support role (production DBA) to more of a strategic role, leading a team of DBAs and System Architects and striving to make them stronger as a team as they take on what seems to be an impossible list of projects.  As anyone who has made this shift knows, the key to doing it is learning to delegate effectively; however, as most who have made this transition also know, you never get completely out of the support role.  As part of this, I have the unique opportunity to mentor someone who has recently shown a great deal of aptitude on our help desk and help mold him into a junior DBA, something I know will be extremely exciting and rewarding to participate in.
  2. As I move out of my DBA comfort zone, I will be required to learn a ton of technologies that my team will be responsible for that I haven’t really had to worry about up to this point.  Some of these technologies (SharePoint, SCOM, PowerShell, SSRS) I find to be extremely exciting opportunities and a great chance for me to learn and grow.  Others (e.g., SSIS) are technologies that I have tried to avoid and would rather not deal with, but circumstances dictate that I must, and I know they will end up being growth opportunities for me nonetheless.
  3. Getting to attend the 2010 SQL PASS Summit.  Unfortunately, I missed the 2009 PASS Summit and, consequently, missed many great opportunities to learn and network.  I have promised myself that this year I am going if I have to beg, borrow or steal (okay, that might be a bit of an overstatement, but you get my point :) ).
  4. Starting a PASS chapter in Tucson, AZ.  This is something that Lori and I have talked about for a while and that we are going to be very passionate about in 2010.

 

Of course, this new focus and need to learn these new technologies means a huge investment of time which leads me to my next challenge/opportunity – work/life balance. 

 

Personal

Throughout 2009, one of the biggest things that I never thought I could get right was balancing the needs of my job with the needs of my family.  In 2010, this challenge will become exponentially more difficult.  If 2009 taught me anything, it is that I need to go into 2010 with some sort of system or plan to make sure that I give my family the time that they deserve while still living up to my work commitments.  This is a challenge that I am still working on cracking.  Some of the personal opportunities and challenges I face in 2010, other than spending more time with my family, include:

  1. Dealing with some personal issues of one of our kids as he strives to find out who he is on the way to adulthood.  We have had some challenges with this over the last year and are seeking some supplemental help, but the challenge will be to define and stick to a plan that will help our son become a happy, productive, well-adjusted adult.
  2. Getting more involved in our church.  This has actually been on the list for a while now, but it needs to become a priority.  This is where we lead by example, not only for our church, but for our family as well, and is something that I see as essential for us to get to where we need to be spiritually.  I know that we have been blessed with many great gifts and talents in our family and it is time that we use those to give back.
  3. Continuing to grow the relationships that we have cultivated with our many friends in the SQL Server community.  I have to say that getting to network with the SQL Server community around the world in 2009 via virtual conferences and social media was one of the most unexpected and rewarding professional experiences of 2009 and, probably, my career.  Most people would put this as a professional goal, but as I have interacted with many of you, I see the friendships we are cultivating as much more than professional connections and feel blessed to have them.
  4. Blogging more.  Again, this could go either way, professional or personal, but I consider it a personal goal, as it isn’t something that is really required by my employer (or something that probably a lot of my coworkers even know I do) and our blogs aren’t always technical in nature.  If you had told Lori and me at this point last year that we would start a blog in 2009 and be syndicated by SQLServerPedia, we would have laughed and said that would be ridiculous because we couldn’t come up with enough to write about that anyone would want to read.  Fortunately, we did get the blog off the ground in the last few months of 2009 and wrote some articles that people had some interest in, so the next challenge for us is to keep putting out content from our professional and personal experiences that, hopefully, people will want to continue to read.  In this way, we can feel like we are contributing something back to the great SQL Server community that has helped us out so much.

 

Theme word

So, all of this leads to my theme word for 2010: management!  In order to have a shot at accomplishing all of these goals, it is going to require management: management of time, priorities, expectations, and resources.  This is probably going to be the biggest challenge that I have faced so far, but if I am successful, the rewards will be great and will have an impact on my life and my family that will pay dividends for many years to come.

Here is hoping that all of you have a successful and happy 2010!

The gifts we have already

14 December 2009

Christmas is less than two weeks away, but, for me, it’s Christmas all year long.  I’d like to take this opportunity to give thanks for all of the gifts that I’ve received.  They weren’t wrapped in shiny paper and, typically, weren’t under a tree, but they were gifts nonetheless and far more valuable than anything that I might unwrap.

 

My family – I love my family.  I was watching our two youngest decorate the tree this weekend, with holiday music playing in the background, and was awestruck by how fortunate I am.  It’s not always easy and usually not perfect, but I am thankful for them every day.  My hope is that I can be a good wife, mother and daughter and never take them for granted.

 

My husband – I know that he’s included in the family part above, but he deserves an extra thank you for any number of reasons.  Before we met, if someone had asked me to describe my perfect partner,  that description would have paled when compared with Tim.  I was watching Mike Walsh’s interview with Brent Ozar about blogging.  They were talking about the SQL Server community and how important it is because most people can’t go home to their partner and bounce work issues off of them.  How lucky I am that I can!  But even if I stopped being a DBA tomorrow, Tim would still be the best partner I could ever imagine.

 

Grandmothers – Also typically lumped in with the family group, but I needed to give grandmothers their own space.  Tim and I both lost our maternal grandmothers this year and, for both of us, they were the last of our grandparents.  We’re both thankful that we had all of this time with them and that our children had the opportunity to get to know their great-grandmothers.  I hope someday to be as good of a grandmother as they were.

 

Employment – I have to admit that I’m not always as thankful for my job as I should be.  Especially in the current economic environment, I need to be grateful that I have a job where I get to do what I enjoy.  This position has given me many opportunities to become a better DBA, and I’m grateful.  My employer also provided the opportunity to attend the PASS Summit, which was definitely one of the highlights of this year.

 

SQL Server Community – Hopefully you’ve had an opportunity to read some of my other posts on the SQL Server community.  Tim and I are continually amazed by the true sense of community that has been fostered by so many SQL Server professionals.  I’m indebted to many of you for all of the learning and support that you’ve provided to me this year.  I feel fortunate that I was able to meet so many of you in person at the PASS Summit.  My wish for next year is that Tim can take part as well.

 

My faith – I have no doubt that all of the gifts that I’ve mentioned above and all of the gifts that were not mentioned were given to me by God.  Without my faith, I would not only be without many of these gifts, but I would not have the capacity to value them as much as I do.   On my own, I am definitely not worthy of everything that I have, but I am blessed each and every day.  I recognize this and give thanks for it.

 

I hope that all of you have a wonderful holiday season.

Posted by tledwards | Categories: Personal