I was tagged by TJay Belt (Twitter/Blog) in this latest series of blog stories.  I believe that it was started by Paul Randal (Twitter/Blog), carried on by Tom LaRock (Twitter/Blog) and then went viral.  Since ‘New Year’ seems to be synonymous with ‘everything going to heck in a handbasket’, it’s taken me awhile to respond, but here goes.

 

You may ask yourself, well, how did I get here?

I’ll start by saying that if anyone had told me when I was in college that I’d be a DBA (or anything computer related, for that matter), I would have fallen down laughing.  My step-father was a biomechanical engineer and one of my main goals in life was not to be a geek like I thought he was.  I majored in Communications with a minor in English.  At the time of my graduation I had never touched a computer or even wanted to.  So, how did I get to be a DBA?  Sheer coincidence.

 

IBM

Back in my kid-free days, I worked for IBM.  I actually had to use a computer (odd for me), but my responsibilities were working with IBM’s resellers and the maintenance plans they resold.  It was all about soft skills and I spent a ton of time on the phone with resellers.  All of the information that we gathered was stored in a (wait for it…) DB2 database.  After a while, I took on the responsibility for putting together reports.  While there was definitely no administration going on, it was kind of fascinating to play with all of that data.  That all stopped, though, with my next life-changing event.

 

And they looked so sweet...


Kids

I left my job at IBM just before I gave birth to my first child and became a stay-at-home mom.  Around the time my second child was born, I started to feel the desire to go back to school.  The odd thing is that the field that I was drawn to was computer science.  I’m not sure if it was due to some strange chemical imbalance or the need to spend time with something that actually had logic behind it, but I began my computer science degree shortly after my youngest son turned one.

Going back to school with two little ones running around was definitely a challenge.  Getting to the end of an 800-line assembly language project and having my son smack his hand on the keyboard, deleting it, helped me learn the value of saving, and saving often.  I’m sure that trying to learn recursion while dealing with a cranky toddler helped my ability to persevere.  Eventually, though, I completed the program and became a computer science instructor.  Teaching was and still is the field that provides me with the greatest amount of satisfaction.  I enjoyed it immensely and felt that I was good at it.  Unfortunately, by that time I was a single mother of two boys, and job satisfaction doesn’t exactly pay the bills.

 

My first *real* job

After leaving my teaching position at the college, I was able to get a job teaching the medical staff at our local hospital the new order entry/documentation application.  I knew that this had to be temporary and that I needed to become part of a more technical division.  In the process of keeping our training environment up to date, I ended up interfacing with our DBA group on a regular basis.  One of the DBAs left and that provided me the opportunity to join the team.  Our lead DBA was pure awesomeness and provided me with a good, solid platform of knowledge.  That was back in 2003; I completed my MCDBA in 2005 and the rest is, well, the rest is now.  Still working, still learning.

 

It was a crazy, twisted road to get here and I’m looking forward to the road ahead.  I’m not tagging anyone with this, but I’m thankful to TJay for giving me the chance to share my story.

Several weeks ago on Twitter, Colin Stasiuk (Blog/Twitter) asked if anyone had a script to pull back job statuses from the SQL Server Agent.  I had been doing some work on a SQL Server DBA console and had written some PowerShell scripts to give me various pieces of information, so I put together a PowerShell script that could be run as a SQL Agent job to periodically report the statuses of SQL Agent jobs.  Eventually, I think Colin went with a different solution, but I figured I would go ahead and post the PowerShell script that I came up with.  This solution has been tested against SQL Server 2000/2005/2008.

 

This first script is just a SQL script that creates the SQL Server table the PowerShell script writes the status information to; it can be downloaded here: LastJobStatus_Table.sql.

CREATE TABLE [dbo].[LastJobStatus](
	[ServerName] [nvarchar](128) NULL,
	[Job_Name] [nvarchar](128) NULL,
	[Run_Date] [datetime] NULL,
	[Job_Duration] [time](7) NULL,
	[Run_Status] [varchar](50) NULL,
	[Sample_Date] [datetime] NULL
) ON [PRIMARY]

 

The next is the PowerShell script that does all of the work of bringing back the SQL Agent job statuses.  It takes a parameter of Server, which is the name of the SQL Server that you want job statuses from.  Make sure that you change the name “MgtServer” to the name of the SQL Server where you intend to store the results from this script.  You’ll also need to change the root directory where your scripts are located to match your environment.  This script can be downloaded here: Get-LastJobStatusServer.ps1.

param([string]$Server=$(Throw "Parameter missing: -Server ServerName"))
[void][reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")

##Point to library files
$scriptRoot = "D:\DBAScripts"

# Change the script root directory to match your environment
#$scriptRoot = Split-Path (Resolve-Path $myInvocation.MyCommand.Path)
. $scriptRoot\LibrarySmo.ps1	#Part of Chad Miller's SQLPSX project on CodePlex
. $scriptRoot\DataTable.ps1	    #Included with the scripts for this blog, also from a CodePlex project (http://www.codeplex.com/PSObject)

Set-Alias -Name Test-SqlConn -Value D:\DBAScripts\Test-SqlConn.ps1

##Define variables

## open database connection
$conn = New-Object System.Data.SqlClient.SqlConnection("Data Source=$server;
Initial Catalog=master; Integrated Security=SSPI")
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText= "	CREATE TABLE #JobsRun (	ServerName nvarchar(128),
											Job_Name nvarchar(128),
											Run_Date datetime,
											Job_Duration time(7),
											Run_Status varchar(50),
											Sample_Date datetime
										  );
					insert into #JobsRun
					select	@@SERVERNAME AS ServerName
							,j.name Job_Name
							,(msdb.dbo.agent_datetime(jh.run_date,jh.run_time)) As Run_Date
							,substring(cast(run_duration + 1000000 as varchar(7)),2,2) + ':' +
									substring(cast(run_duration + 1000000 as varchar(7)),4,2) + ':' +
									substring(cast(run_duration + 1000000 as varchar(7)),6,2) Job_Duration
							,case when run_status = 0
										then 'Failed'
								when run_status = 1
										then 'Succeed'
								when run_status = 2
										then 'Retry'
								when run_status = 3
										then 'Cancel'
								when run_status = 4
										then 'In Progress'
							end as Run_Status
							,GETDATE() As Sample_Date
					FROM msdb.dbo.sysjobhistory jh
						join msdb.dbo.sysjobs j
							on jh.job_id = j.job_id
					where	step_id = 0
					and		enabled = 1
					order by cast(cast(run_date as char) + ' ' +
								substring(cast(run_time + 1000000 as varchar(7)),2,2) + ':' +
								substring(cast(run_time + 1000000 as varchar(7)),4,2) + ':' +
								substring(cast(run_time + 1000000 as varchar(7)),6,2)  as datetime) desc

					-- Change 'MgtServer' to the name of the SQL Server in your environment that will
					-- house the LastJobStatus table which stores the results of this script
					delete from MgtServer.DBA_Console.dbo.LastJobStatus where ServerName = '$server'
					insert into MgtServer.DBA_Console.dbo.LastJobStatus (ServerName, Job_Name, Run_Date, Job_Duration, Run_Status, Sample_Date)
					select	jr.ServerName,
							jr.Job_Name,
							jr.Run_Date,
							jr.Job_Duration,
							jr.Run_Status,
							jr.Sample_Date
					from	#JobsRun jr
					where	Run_Date = (	select	max(jr1.Run_Date)
											from	#JobsRun jr1
											where	jr1.Job_Name = jr.Job_Name)
					drop table #JobsRun; "
$cmd.ExecuteNonQuery()
$conn.Close()
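
Once the job has been collecting for a while, a quick query on the management server shows the last status gathered for each job.  This is a minimal sketch that assumes the LastJobStatus table lives in a DBA_Console database on “MgtServer”, as the three-part name in the script above implies:

-- Run on the management server ('MgtServer' in the script above)
SELECT	ServerName,
		Job_Name,
		Run_Date,
		Job_Duration,
		Run_Status,
		Sample_Date
FROM	DBA_Console.dbo.LastJobStatus
ORDER BY ServerName, Job_Name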

 

There are references in the PowerShell script above to a LibrarySMO.ps1 script that can be obtained from CodePlex (see the comments in the script for the URL) and a DataTable.ps1 script (also from CodePlex, but included in the download file for this blog for your convenience).

# Taken from the out-DataTable script from the PowerShell Scripts Project
# http://www.codeplex.com/PsObject/WorkItem/View.aspx?WorkItemId=7915

Function out-DataTable {

  $dt = new-object Data.DataTable
  $First = $true

  foreach ($item in $input){
    $DR = $DT.NewRow()
    $Item.PsObject.get_properties() | foreach {
      # Build the column list from the first object that comes down the pipeline
      if ($first) {
        $Col = new-object Data.DataColumn
        $Col.ColumnName = $_.Name.ToString()
        $DT.Columns.Add($Col)
      }
      if ($_.value -eq $null) {
        $DR.Item($_.Name) = "[empty]"
      }
      elseif ($_.value -is [array]) {
        # Flatten array-valued properties into a semicolon-delimited string
        $DR.Item($_.Name) = [string]::Join(";", $_.value)
      }
      else {
        $DR.Item($_.Name) = $_.value
      }
    }
    $DT.Rows.Add($DR)
    $First = $false
  }

  # Wrap the DataTable so PowerShell doesn't unroll it into a stream of DataRows
  return @(,($dt))

}

Going with your gut

11 January 2010

This post is a response to Tim Ford’s Whose Blog Is It Anyway challenge.  The opportunity to use the words pony, nude and bong in a blog post about an actual experience was too much to pass up.

 

I’m a DBA and I’m logical – coldly logical, if you listen to K. Brian Kelley – but I’m here to tell you that sometimes you just have to go with your gut.  Most of us have an inner voice that clues us in on the things that we know but can’t rationalize or aren’t ready to deal with yet.  That’s the voice that lets you know that the URL in your son’s internet history with the word ‘PussyCat’ probably isn’t a site featuring live, nude cats.  Not everyone trusts that voice – just ask George “Let’s Have Padme Die Of A Broken Heart Instead Of Anakin Crushing Her To Death” Lucas – that would have been a far more awesome scene.

 

I’m here to talk about such a time, early in my career.  I had a great DBA to learn from, but he had moved on to another position.  I felt pretty firm in my knowledge and knew that, whatever came up, I could fix it or handle it by simply using some magical tool, library or bong.

 

That’s when I ran into it – the problem that I couldn’t fix, but that was going to cause me pain.  On a Friday evening I started seeing error messages in the SQL Server error logs indicating that we were having disk issues.  Of course the error message didn’t read ‘Your disks are failing’, but everything that I was reading seemed to indicate that.  One thing that bears noting – this was the central order entry/documentation application for the hospital that I worked at – there was no acceptable downtime.  I contacted our system administrator, who did some research and let me know that all of the disks looked fine.  Now, this was an SA whose knowledge I respected – he’d been in the field forever (not like ENIAC forever, but pretty close).  He alluded to the fact that I was still a newbie and probably hadn’t diagnosed the problem correctly.  At this point it was around noon on Saturday and, even though I was getting tired of looking around, I figured that I’d started, so I’d finish.  I wasn’t able to find any information that was clearer than what I had provided before, but I knew, in my gut, that we were about to have problems – big problems.  I called the SA again and tried to encourage him to look a little more closely.  I asked him to pretend that the RAID array was a horse stable.  From the outside, it might sound pretty happy.  On the inside it might look good initially, but as soon as he looks down he’s going to be very sad about the pony.  For some reason, that analogy didn’t work…

 

As the afternoon progressed, I kept with my gut feeling and bugged the heck out of that SA.  The disks actually gave up the ghost and, 40 hours later (non-stop), with our application vendor and Microsoft on the line, we finally got the disks replaced and the application back online.  At the post-mortem, once everyone had gotten a few hours of sleep, that same SA wanted to know how I knew what the problem was, since none of the errors were absolutely clear about the cause.  I just told him that I knew in my gut that something bad was about to happen.  He said, “Well, if you’d told me that in the beginning, I would have done more research!”

 

While the story above is true, some of the particulars have been changed to protect the innocent.


I was tagged by Jorge Segarra (Blog/Twitter), who had been tagged by Thomas LaRock (Blog/Twitter), in his post about his goals and theme word for 2010.  I was going to try to remain blissfully ignorant about being tagged, but then Tim went and posted his goals.  So I guess I’m on the line now.  My theme word for this year is:

 

Recharge

While there are many things that I want to accomplish this year, I don’t know that (m)any of them will occur until I can figure out a way to recharge.  I’m typically a self-motivated type of person, but it seems like, during the previous year, I hit the wall.

 

I’m not entirely sure what has caused this, but I’m guessing that it is some combination of the cyclical nature of job satisfaction, having a boatload of things going on at home and the disconnect between the number of things that I would like to learn and the amount of free time that I have.

 


Is there a human connector on that thing?

I realize that there is no magic button that will instantly recreate the hunger for knowledge that I had when I began learning to be a DBA.  What I can do, though, is set some goals, work hard to follow through on them and be patient.   My hope is that in the process of achieving these goals, I’ll rejuvenate my love of this career path.

 

Goals

Pick one or two topics to focus on

I have at least three SQL Server books sitting on my desk and more at home that I haven’t done much more than flip through.  Rather than setting a goal to read all 3000 pages (doable, but daunting), I’m going to pick a couple of subjects to focus on and learn them as thoroughly as possible.  This is ongoing – if it’s March and I know everything there is to know about database corruption (or whatever it is I end up focusing on), I’ll move on to the next subjects.

 

Blog more

My first love is teaching.  It invigorates me and gives me purpose.  Blogging provides me with an arena to, hopefully, teach people who are learning to be DBAs and a chance to share what I’ve learned.

 

Become more involved with PASS

As I’ve mentioned in previous posts and as Tim mentioned in his goals for 2010, we’ve talked often about starting a PASS chapter in Tucson.  This ties into my love of teaching and will help us to connect with folks locally who have similar interests.  I would also like to take part in other committees within PASS as needed.    This will definitely require a balancing act with work and family, so I’ll be taking baby steps to ensure that I don’t shortchange other areas in my life.

 

What does this all mean?

None of these individual goals are earth-shattering and that’s intentional.  I have a tendency to swing for the bleachers but end up hitting to the pitcher, and it makes me grumpy.  My hope here is that I hit some good, solid line drives and then I’ll be set up to hit it out of the park.

 

I’m tagging a few people who have unknowingly helped me to recharge (some thank you, eh?):

TJay Belt (Blog/Twitter)

Wendy Pastrick (Blog/Twitter)

Kendal Van Dyke (Blog/Twitter)

A few days ago, in his blog post Goals and Themeword for 2010, Jorge Segarra (Blog/Twitter) tagged Lori and me to write a blog post about our goals and theme word for 2010.  While the title of my post is somewhat sarcastic, it really reflects facing a year that will be full of immense opportunity and challenges as the result of a successful 2009.

 

In 2010, I face opportunities on all fronts in my life, professional and personal, so here I outline some of those and end with what I feel will have to be my theme word for the year.

 

Professional

On the professional front, 2010 is going to be a year full of many opportunities.  The biggest challenge will be to take full advantage of these opportunities without letting everything else slip.  Here are the major opportunities, as I see them:

  1. It starts off with me shifting my focus in my position with my employer from more of a support role (production DBA) to more of a strategic role, leading a team of DBAs and System Architects and striving to make them stronger as a team as they take on what seems to be an impossible list of projects.  As anyone who has made this shift knows, the key to doing it is learning to delegate effectively; however, as most who have made this transition also know, you never get completely out of the support role.  As part of this, I have the unique opportunity to mentor someone who has recently shown a great deal of aptitude on our help desk and help mold him into a junior DBA, something I know will be extremely exciting and rewarding to participate in.
  2. As I move out of my DBA comfort zone, I will be required to learn a ton of technologies that my team will be responsible for and that I haven’t really had to worry about up to this point.  Some of these technologies (SharePoint, SCOM, PowerShell, SSRS) I find to be extremely exciting opportunities and a great chance for me to learn and grow.  Some of the technologies I will be required to learn (e.g., SSIS) are technologies that I have tried to avoid and would rather not deal with, but circumstances dictate that I must, and I know they will end up being growth opportunities for me nonetheless.
  3. Getting to attend the 2010 SQL PASS Summit.  Unfortunately, I missed the 2009 PASS Summit and, consequently, missed many great opportunities to learn and network.  I have promised myself that this year I am going if I have to beg, borrow or steal (okay, that might be a bit of an overstatement, but you get my point :) ).
  4. Starting a PASS chapter in Tucson, AZ.  This is something that Lori and I talked about for a while and that we are going to be very passionate about in 2010.

 

Of course, this new focus and need to learn these new technologies means a huge investment of time which leads me to my next challenge/opportunity – work/life balance. 

 

Personal

Throughout 2009, one of the biggest things that I never seemed to get right was balancing the needs of my job with the needs of my family.  In 2010, this challenge will become exponentially more difficult.  If 2009 taught me anything, it is that I need to go into 2010 with some sort of system or plan to make sure that I give my family the time that they deserve while still living up to my work commitments.  This is a challenge that I am still working on cracking.  Some of the personal opportunities and challenges I face in 2010, other than spending more time with my family, include:

  1. Dealing with some personal issues of one of our kids as he strives to find out who he is on the way to adulthood.  We have had some challenges with this over the last year and are seeking some supplemental help, but the challenge will be to define and stick to a plan that will help our son become a happy, productive, well-adjusted adult.
  2. Getting more involved in our church.  This has actually been on the list for a while now, but it needs to become a priority.  This is where we lead by example, not only for our church, but for our family as well, and is something that I see as essential for us to get to where we need to be spiritually.  I know that we have been blessed with many great gifts and talents in our family and it is time that we use those to give back.
  3. Continuing to grow the relationships that we have cultivated with our many friends in the SQL Server community.  I have to say that getting to network with the SQL Server community around the world in 2009 via virtual conferences and social media was one of the most unexpected and rewarding professional experiences of 2009 and, probably, my career.  Most people would put this as a professional goal, but as I have interacted with many of you, I see the friendships that are developing as much more than professional connections and feel blessed to have been able to have these friendships.
  4. Blogging more.  Again, this could go either way, professional or personal, but I consider it a personal goal as it isn’t something that is really required by my employer (or something that probably a lot of my coworkers even know I do) and our blogs aren’t always technical in nature.  If you had told Lori and me at this point last year that we would start a blog in 2009 and be syndicated by SQLServerPedia, we would have laughed and said that would be ridiculous because we couldn’t come up with enough to write about that anyone would want to read.  Fortunately, we did get the blog off the ground in the last few months of 2009 and wrote some articles that people had some interest in, so the next challenge for us is to keep putting out content from our professional and personal experiences that, hopefully, people will want to continue to read.  In this way, we can feel like we are contributing something back to the great SQL Server community that has helped us out so much.

 

Theme word

So, all of this leads to my theme word for 2010: management!  In order to have a shot at accomplishing all of these goals, it is going to require management: management of time, priorities, expectations and resources.  This is probably going to be the biggest challenge that I have faced so far, but if I am successful, the rewards will be great and will have an impact on my life and my family that will pay dividends for many years to come.

Here is hoping that all of you have a successful and happy 2010!

The gifts we have already

14 December 2009

Christmas is less than two weeks away, but, for me, it’s Christmas all year long.  I’d like to take this opportunity to give thanks for all of the gifts that I’ve received.  They weren’t wrapped in shiny paper and, typically, weren’t under a tree, but they were gifts nonetheless and far more valuable than anything that I might unwrap.

 

My family – I love my family.  I was watching our two youngest decorate the tree this weekend, with holiday music playing in the background, and was awestruck with how fortunate I am.  It’s not always easy and usually not perfect, but I am thankful for them every day.   My hope is that I can be a good wife, mother and daughter and never take them for granted.

 

My husband – I know that he’s included in the family part above, but he deserves an extra thank you for any number of reasons.  Before we met, if someone had asked me to describe my perfect partner,  that description would have paled when compared with Tim.  I was watching Mike Walsh’s interview with Brent Ozar about blogging.  They were talking about the SQL Server community and how important it is because most people can’t go home to their partner and bounce work issues off of them.  How lucky I am that I can!  But even if I stopped being a DBA tomorrow, Tim would still be the best partner I could ever imagine.

 

Grandmothers – Also typically lumped in with the family group, but I needed to give grandmothers their own space.  Tim and I both lost our maternal grandmothers this year and, for both of us, they were the last of our grandparents.  We’re both thankful that we had all of this time with them and that our children had the opportunity to get to know their great-grandmothers.  I hope someday to be as good of a grandmother as they were.

 

Employment – I have to admit that I’m not always as thankful for my job as I should be.  Especially with the current economic environment, I need to be grateful that I have a job where I get to do what I enjoy.  This position has given me many opportunities to become a better DBA and I’m grateful.   They also provided the opportunity to attend the PASS Summit which was definitely one of the highlights of this year.

 

SQL Server Community – Hopefully you’ve had an opportunity to read some of my other posts on the SQL Server community.  Tim and I are continually amazed by the true sense of community that has been fostered by so many SQL Server professionals.  I’m indebted to many of you for all of the learning and support that you’ve provided to me this year.  I feel fortunate that I was able to meet so many of you in person at the PASS Summit.  My wish for next year is that Tim can take part as well.

 

My faith – I have no doubt that all of the gifts that I’ve mentioned above and all of the gifts that were not mentioned were given to me by God.  Without my faith, I would not only be without many of these gifts, but I would not have the capacity to value them as much as I do.   On my own, I am definitely not worthy of everything that I have, but I am blessed each and every day.  I recognize this and give thanks for it.

 

I hope that all of you have a wonderful holiday season.


We’ve been in the process of setting up reporting servers for some of our larger clients.  We’re also in the process of setting up DEV and QA environments (yes, Colin, I know, I know).  This involves restoring databases onto the new servers.

 

Transferring logins

There is a fairly detailed knowledge base article from Microsoft outlining how to transfer logins between servers.  This post is going to outline the gist of that article.  The first step is creating the sp_hexadecimal stored procedure in the master database.  This stored procedure is used by sp_help_revlogin to create a hashed value for the login password.  The second step is to create the sp_help_revlogin procedure – there is a script for SQL Server 2000 and one for SQL Server 2005/2008.  The output from running the sp_help_revlogin procedure on the source server is a series of CREATE LOGIN statements for all of the existing logins.
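
Once both procedures exist on the source server, generating the transfer script is a one-liner.  A minimal sketch is shown below – the login name, password hash and SID in the commented sample output are made-up placeholders, not real values:

-- Run on the source server after creating sp_hexadecimal and sp_help_revlogin
-- (both scripts come from the Microsoft knowledge base article mentioned above)
EXEC master.dbo.sp_help_revlogin

-- The output is a series of statements roughly of this shape (placeholder values only):
--   -- Login: AppLogin
--   CREATE LOGIN [AppLogin] WITH PASSWORD = 0x0100... HASHED,
--       SID = 0x1234..., DEFAULT_DATABASE = [master],
--       CHECK_POLICY = OFF, CHECK_EXPIRATION = OFF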

 

There are three main benefits to using this method to transfer logins:

  1. It’s fast.  No scripting out individual logins.
  2. If the logins and users are from the same server, the user and login SIDs will match, avoiding any ‘orphaned users’.
  3. The passwords come across with the logins.  Definitely preferable to keeping an Excel spreadsheet tucked away with all of the logins and associated passwords.

 

As with any process that is automatically generated, it’s best to review the scripts carefully before running them on the destination server.  A couple of issues that you need to watch out for are:

  • If a login has a defined default database that doesn’t exist on the destination server, obviously the CREATE LOGIN statement will fail.
  • If there are existing logins on the destination server, there is the possibility that a SID may already be in use.  In that case, you will need to create a new login and map the existing users.

Orphaned Users


One issue that comes up when transferring logins is that the database users may not be mapped to the server logins.  This occurs when there isn’t a corresponding login SID for the user SIDs in the database.  If the database(s) being restored are from a single server and the logins were transferred from that same server, the SIDs will match.  If the SIDs do not match, the database users will not be attached to a server login and the users are said to be ‘orphaned’.

 

 There are some cases where orphaned users can’t be avoided.  In development environments, the databases on the destination server may be consolidated from more than one production server, logins may overlap and, as a result, the login and user SIDs don’t match.  Another instance would be when a production database is restored to a development server and the rights for the users in the development environment are different from their rights in the production environment.  For both of these cases, steps will need to be taken to match database users with their respective logins on the server.

 

 To determine whether there are any orphaned users, run the following stored procedure:

EXEC sp_change_users_login 'report'

 

There are a few ways to handle these orphaned users.  The sp_change_users_login stored procedure can be used to update the connection between logins and users in SQL Server 2000.  While sp_change_users_login has an auto_fix option, that action assumes that the username and login name match.  If they don’t, it will fail.  Using the Update_One option is the safer and preferable way to handle it.  For SQL Server 2005/2008, the ALTER USER statement is the preferred method for mapping users to logins.  Greg Low’s article ‘Much ado about logins and SIDs’ provides a good explanation of these methods.
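
If only one or two users need to be remapped, both approaches are one-liners run in the restored database – the user and login names below are hypothetical placeholders:

-- SQL Server 2000: explicitly map one database user to one server login
EXEC sp_change_users_login 'Update_One', 'AppUser', 'AppLogin'

-- SQL Server 2005/2008: the preferred syntax
ALTER USER AppUser WITH LOGIN = AppLogin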

 

Quick script to map users to logins

 

If you’re sure that the usernames match the login names that they map to, here are a couple of quick scripts that you can run in each database that you restore from your production server to your development server to map the orphaned users – yes, they’re cursors, but they run quickly.

 

SQL Server 2000

DECLARE @Username varchar(100), @cmd varchar(100)
DECLARE userLogin_cursor CURSOR FAST_FORWARD
FOR
SELECT UserName = name FROM sysusers
WHERE issqluser = 1 and (sid IS NOT NULL AND sid <> 0x0)
    AND suser_sname(sid) IS NULL
ORDER BY name
FOR READ ONLY
OPEN userLogin_cursor

FETCH NEXT FROM userLogin_cursor INTO @Username
WHILE @@fetch_status = 0
  BEGIN
    SET @cmd = 'EXEC sp_change_users_login ' + char(39) + 'AUTO_FIX' + char(39) + ', ' + @Username
    EXECUTE(@cmd)
    FETCH NEXT FROM userLogin_cursor INTO @Username
  END
CLOSE userLogin_cursor
DEALLOCATE userLogin_cursor

 

SQL Server 2005/2008

DECLARE @Username varchar(100), @cmd varchar(100)
DECLARE userLogin_cursor CURSOR FAST_FORWARD
FOR
SELECT UserName = name FROM sysusers
WHERE issqluser = 1 and (sid IS NOT NULL AND sid <> 0x0)
    AND suser_sname(sid) IS NULL
ORDER BY name
FOR READ ONLY
OPEN userLogin_cursor
 
FETCH NEXT FROM userLogin_cursor INTO @Username
WHILE @@fetch_status = 0
  BEGIN
    SET @cmd = 'ALTER USER ' + @Username + ' WITH LOGIN = ' + @Username
    EXECUTE(@cmd)
    FETCH NEXT FROM userLogin_cursor INTO @Username
  END
CLOSE userLogin_cursor
DEALLOCATE userLogin_cursor

I went to install SQL Server 2008 on a Windows Server 2008 R2 box today for the first time and was greeted with the error “You must use the Role Management Tool to install or configure Microsoft .NET Framework 3.5.”  The text of this error was about as decipherable as the voice of the adults in the old Peanuts cartoons, so I fired up my old friend Google to find out what to really do.  It seems that Windows Server 2008 R2 ships with .NET Framework 3.5.1 as a feature, and in order to proceed, you need to go to Server Manager (that window that pops up when you log in and stays unless you close it) and enable (really, install) the .NET Framework 3.5.1 feature and any prerequisites.


 


Last week Microsoft released CTP3 for SQL Server 2008 R2.  At first blush, it doesn’t sound like it should be a huge release.  It is really a minor version release (10.5) as the next major release is scheduled to be SQL Server 2011, so one would expect to see maybe some additional or increased functionality, but not really any huge changes. 

 

As it turns out, there are some additional features with this release, but the big change is coming in the editions and pricing of SQL Server.  Microsoft is adding two additional “premium” editions of SQL Server, Datacenter Edition and Parallel Data Warehouse Edition, which are not only going to be more expensive than the Enterprise Edition, but will be required to take advantage of some of the key features of SQL Server 2008 R2, such as multi-server management of more than 25 instances, StreamInsight (complex event processing), Star Join Queries and parallel database copy (Parallel Data Warehouse Edition). 

 

The retail pricing for these two new “premium” editions is set at $57,498 per processor for each, with no option for per-server plus CAL pricing.  Enterprise Edition pricing is set at $28,749 per processor or $13,969 per server with 25 CALs, while Standard Edition pricing is set at $7,499 per processor or $1,849 per server with 25 CALs.

 

The Enterprise Edition will be required to take advantage of new features such as PowerPivot, multi-server management of up to 25 instances and Master Data Services.

 

As part of SQL Server 2008 R2, Microsoft has announced that features such as backup data compression (which can reduce the size of backups by up to 60%) will be available in the Standard Edition where it was previously only available in Enterprise Edition.
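
As a quick illustration, enabling backup compression is just one extra option on the BACKUP statement – the database name and backup path below are placeholders:

-- Backup compression requires Enterprise Edition on SQL Server 2008,
-- but is available in Standard Edition starting with SQL Server 2008 R2
BACKUP DATABASE [YourDatabase]
TO DISK = N'D:\Backups\YourDatabase.bak'
WITH COMPRESSION, INIT, STATS = 10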

PASS Summit Kool Aid

9 November 2009

Last week, I had the opportunity to attend the PASS Summit in Seattle.  As I had mentioned in a previous post, I was fairly anxious about attending because I knew that there were going to be around 2,000 people there and I had met exactly two of them in person.  Yep, 2 out of 2,000.  Let’s just say that I wasn’t worrying about how to fit in time for catching up with those folks.

 

I knew that the PASS Summit would be a great learning opportunity.  I’ve attended Tech-Ed, SQL Server launches and other similar SQL Server events – the learning that occurred at those events was extremely valuable.  In looking at the sessions for the Summit, I knew that it was possible (probable) that my head would explode with newly gained knowledge.  There are plenty of folks that will be blogging about the sessions and all of the excellent speakers – I may be doing that in a future post, but that’s not my focus here.

 

Ohhhh yeeaaaahhhh!

My focus is on the PASS community.  While I knew that there were helpful, friendly people who were already a part of PASS, I never thought that it would pervade the entire conference.  I had the opportunity to meet an incredible number of people – those whom I was familiar with through Twitter, blogs or forums and those whose faces and names were new to me.  In every instance, they were accessible and welcoming.  In turn, these experiences encouraged me to go seek out and introduce myself to others.  This was truly a community in the best sense of the word.

 

Tim and I have been talking about getting more involved and have discussed starting a PASS chapter here in Tucson.  The experiences of last week have made me see that this is not only doable, but necessary.  I’ve supped of the PASS kool aid and it was not only yummy, it’s my new favorite drink.

 

I’m looking forward to keeping in touch with the people that I was fortunate enough to meet and becoming more involved in PASS, both locally and virtually.   My hope is to share this community with others and help it to grow.

 

On a more personal note, there were a few individuals that went above and beyond the call of  (professional) duty last week.  I hope that I’ve let you all know personally how much your thoughts and prayers meant.  Tim and I were pleasantly surprised and touched by your willingness to listen and help.  We truly thank you from the bottom of our hearts.

 

On a completely unprofessional note, I was overjoyed to be a part of the karaoke jollification (yeah, it’s a word) on Thursday night.  I was impressed with the singing (and dancing) talents of this crew.  I’m just hoping that there are no incriminating pictures…