I always thought that Mars was a planet, but apparently it also has to do with multiple pending requests within a single SQL Server connection.  MARS (Multiple Active Result Sets) was introduced in SQL Server 2005 and provided the ability to handle these multiple requests.  Like anything else, though, it has to be used correctly.

Apparently this isn't the only Mars out there

 

About a week ago, I started seeing the following error on one of my servers:

 

DESCRIPTION:  The server will drop the connection, because the client driver has sent multiple requests while the session is in single-user mode. This error occurs when a client sends a request to reset the connection while there are batches still running in the session, or when the client sends a request while the session is resetting a connection. Please contact the client driver vendor.

 

After digging around and talking to some of the developers in house, I found that they were making use of MARS, but not always correctly.  To avoid the error above, "MultipleActiveResultSets=True" needs to be added to the connection string.  Adding that seems to have fixed the issues.
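
As a rough sketch (the server, database, and security settings below are placeholders rather than anything from the actual application), an ADO.NET connection string with MARS enabled would look something like this:

Server=myServer;Database=myDatabase;Integrated Security=SSPI;MultipleActiveResultSets=True;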

I ran across an installation issue with SQL Server 2008 on a Windows Server 2008 server the other day that baffled me a little bit.  I was installing an additional instance of SQL Server 2008 on a server that already had a SQL Server 2008 instance and right before the installation completed, it died with the error:  “A MOF Syntax error occurred.”  Further investigation into the Setup Bootstrap logs gave this detail:

An error occurred while processing item 1 defined on lines 14 – 16 in file D:\Program Files\Microsoft SQL Server\MSSQL10.TMS_MODELING\MSSQL\Binn\etwcls.mof.transformed:

 

2010-05-18 13:41:02 Slp: Compiler returned error 0x800700a4Error Number: 0x800700a4, Facility: Win32

 

2010-05-18 13:41:02 Slp: Description: No more threads can be created in the system.

 

2010-05-18 13:41:02 Slp:

 

2010-05-18 13:41:02 Slp: Sco: Compile operation for mof file D:\Program Files\Microsoft SQL Server\MSSQL10.TMS_MODELING\MSSQL\Binn\etwcls.mof.transformed failed. Exit code 3

 

2010-05-18 13:41:02 Slp: Configuration action failed for feature SQL_Engine_Core_Inst during timing ConfigNonRC and scenario ConfigNonRC.

 

Much investigation on the internet turned up a lot of people that have been having this issue, but very few answers.  After many installs and uninstalls, I finally tried the following, which seemed to work:

 

  • I ran the setup.exe as an administrator (right click on setup.exe and click “Run as administrator”), even though I am a local administrator on the box.
  • I installed SQL Server using the Network Service account instead of the normal domain service account.
  • The installation succeeded and I just went into Configuration Manager and changed the service account to the domain account after the installation.

Unlike the similar Reese’s dilemmas, these results were far less favorable…

Reese's are yummy

 

As many of these issues begin, a developer put together a procedure.  When he ran it locally, it ran in a matter of seconds.  For some reason, though, he wanted the procedure run from a remote server.  When he attempted to run that same procedure from the remote server, it took a number of minutes.  It was apparent that the delay wasn't in returning the result set, as there were OLEDB waits while the query was processing.

 

In looking at the query, it wasn’t anything spectacular.  It dumped some data into a temp table and then joined that table to an existing table and displayed the result.    The problem ended up being the temp tables.  While the query was being executed against the remote server, the temp tables were being created on the server the query was running on.  So, pulling the data into the temp table and joining the two tables all had to occur with data being pulled across the network.

 

The fix was pretty simple – change the procedure to create a table on the remote server and drop it once the procedure was finished.  The procedure ran in the expected time period.
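
The sketch below shows the general idea, not the actual procedure – the linked server, table, and column names are all placeholders, and it assumes the linked server allows remote execution (EXEC ... AT).  Staging and joining the data on the server that owns it means only the final result set crosses the network:

EXEC ('
-- stage the rows in a real work table on the server that owns the data
SELECT col1, col2 INTO dbo.StagingWork FROM dbo.SourceTable WHERE some_filter = 1;

-- do the join where the data lives, so only the result comes back over the wire
SELECT s.col1, e.col3
FROM dbo.StagingWork AS s
JOIN dbo.ExistingTable AS e ON e.col1 = s.col1;

-- clean up once the procedure is finished
DROP TABLE dbo.StagingWork;
') AT [DataServer];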


The SQL Server community never ceases to amaze me.  The number of people that are willing to take time out from their jobs and families to volunteer is especially impressive.

 

I’ve had the good fortune to be able to volunteer for the Program Committee this year.  My job is to pull together special projects and whatever other slave work Allen thinks up for me.  I’ve had a number of volunteers that have put great work into our current project.  This project has multiple steps and has required a ton of coordination between the volunteers – but it is all coming together.  It’s something that’s been needed for a while and now it’s going to be a reality.  I’d name names, but I know that I’d forget someone.  So thank you to everyone that’s helped out.

 

A big (virtual) cake for all of you!

It’s not just me, though.  Tim’s in the process of re-starting the Performance VC.  He had mentioned the need for volunteers through our blog and Twitter, and Blythe Morrow (Blog/Twitter) put out a call for volunteers on the PASS blog.  He’s been overwhelmed at the number of people that have asked to help out.

 

For all of you that volunteer for PASS – kudos to you!  For those of you that are thinking of volunteering, but haven’t yet,  get ahold of Tim or me or go here for additional volunteer opportunities.

Not-for-profit organizations can be awesome and extremely tough at the same time.  About 15 years ago, I joined a not-for-profit community service organization.  Much like PASS, there were local groups, regions and an international level.  On all three levels, this organization was definitely able to make a difference.  I was fortunate to serve at both the club and regional levels and take part in an international event in Japan.  It was definitely a life-changing experience.

 

One of the problems that I noticed while I was serving on the regional board was that the more we were able to accomplish, the more that was expected of us.  That was great and it was exciting to see the possibilities, but the problem we faced was that our resources hadn’t changed.  Because of our tax status, we had to be very careful about how we used the funds raised during our fundraisers, and the vast majority of that went to other non-profit organizations that we supported.  In order to give the regions and clubs more funding, we had a couple of options: raise dues, raise conference attendance costs or have more fundraisers during our conferences.  Raising dues and raising conference attendance costs had the possible side effect of fewer members or fewer people attending conferences.  Additional fundraisers at the conferences took time away from what we were meeting about.  So we made the decision to make better use of what we had and revisit those ideas in the future.

 

I see a similar issue with PASS.  We’re incredibly fortunate to have an organization that provides as many resources as PASS does and with a free membership.  While there is a great staff at PASS, much of what gets done here is because of a community of dedicated volunteers.   I’ve been fortunate enough this year to be a part of the program committee and that has allowed me the opportunity to understand more of what goes on at PASS.

 

There have been many discussions about what PASS does well and less well, along with what it should be doing.  The latest discussions have been about the PASS Summit survey results.  There have been a number of blog posts about it – Brent Ozar (Blog/Twitter), Tom LaRock (Blog/Twitter), Steve Jones (Blog/Twitter) and Andy Warren (Blog/Twitter), to name a few.  I’m not picking on these particular bloggers or even this particular discussion topic.  They all make valid points.

 

The questions that I found myself asking (and answering) are:  Is this survey the best possible survey?  Probably not.  Did it provide PASS with valuable information?  Yes.  Are there people in the community that might be more skilled in writing/interpreting survey results?  Possibly.  Is paying for a company to write and interpret surveys for PASS the best use for our funds?  I don’t know.

 

If I were to look at the wish list for PASS, I’m sure that it would be huge – especially when it comes to items that require funding.  If there are additional things that are needed that will require additional funding, that money needs to come from somewhere.  Do we increase the registration cost at the PASS Summit?  Do we institute dues?  Both of those choices have a direct effect on PASS membership and the members that are able to take part in the PASS Summit.  Short of that, we have to look at either companies or individuals that are willing to donate their time and resources.  Anyone that has volunteered for not-for-profit organizations knows that getting companies and/or people to donate isn’t always the easiest thing.

 

I believe that members should continue to provide (constructive) criticism of PASS when it’s needed.  I don’t believe that there should be a step up or shut up attitude.   But if you can’t volunteer, then understand that PASS can’t continue to grow without also growing its resources.  If you have ideas, provide them.  If you have time, volunteer.  If you find that leprechaun at the end of the rainbow, take him out, steal the pot of gold and donate some of it to PASS.

 

I think most of us will agree that PASS is a pretty amazing organization.  It’s up to us to make it even better.

 

 

The Generate Scripts task in SSMS is insanely handy.  A few quick clicks and you can have tables, stored procedures, users, etc. scripted out.  I’ve used it quite a bit, but I ran into an unusual situation yesterday.

 

I needed to create a database schema and thought I’d use the handy dandy Generate Scripts task.  Popped through the wizard, clicked finish and it errored!  Here was the error message:

 

I was thoroughly confused – I was running SSMS 2008 against a SQL Server 2008 server.  I wasn’t sure where SQL Server 2005 even came into play.

[screenshot: scripts_error]

 

I went through the process again, this time paying closer attention and noticed this:

[screenshot: script_option]

Apparently the Script Wizard defaults to SQL Server 2005.  I changed it to SQL Server 2008 and everything ran as expected.  While I had run this task against other SQL Server 2008 instances, apparently none of them made use of the new data types in 2008 and, as a result, didn’t generate errors.  Now why it would default to SQL Server 2005 is an entirely different question….

 



Okay, maybe I’m being a little sarcastic.  I don’t troubleshoot dynamic SQL very often, so I don’t always see potential issues right away.  For those dear readers who work with it regularly, you should stop reading now – this is all pretty basic – but it took a few minutes out of my day.

 

This is the only dyna- that I like

My troubleshooting methods consist of displaying the command created by the dynamic SQL and seeing if it runs correctly or if I’m missing a quotation mark or something along the way.  There is probably a better way to troubleshoot, but again, I play with it so rarely that I’m stuck in a rut.
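
In practice, that just means printing the command instead of (or before) executing it.  A quick sketch, with a variable name I made up rather than anything from the actual procedure:

DECLARE @sql nvarchar(max);
SET @sql = N'USE [master]; ALTER DATABASE [random_dbname] SET ONLINE;';

PRINT @sql;     -- eyeball the generated command for missing quotes and the like
--EXEC (@sql);  -- only run it once it looks right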

 

Evaluating Dynamic SQL commands

Late last week, a developer sent the following block of dynamic SQL code because he was having issues getting it to work:

EXEC
('
USE [master];
BEGIN
ALTER DATABASE [random_dbname] SET ONLINE;
WAITFOR DELAY ''00:01'';
END
USE [random_dbname];
'
)

 

I followed my normal troubleshooting methods and the generated command ran fine on its own.  Trying to execute it as above, though, I received the following error message:

 

Msg 942, Level 14, State 4, Line 7
 Database 'random_dbname' cannot be opened because it is offline.

 

On first glance, I was confused, because the script clearly brought the database online before trying to use it.  I soon realized, though, that everything within the parentheses is parsed and compiled as a single batch before any of it executes – so the USE statement failed while the database was still offline.  Apparently SQL Server has a shorter memory than I do.

 

Breaking it into two separate statements, like below, accomplishes what needed to happen:

EXEC
('
-- Bring the database online
USE [master];
BEGIN
ALTER DATABASE [random_dbname] SET ONLINE;
WAITFOR DELAY ''00:01'';
END
')

GO

EXEC
('
USE [random_dbname];
/*blah blah blah*/
')

Thinking that this ‘emergency’ had been handled, I went back to my other tasks. 

 

Database Context and Dynamic SQL

As these things happen, though, I received another call – after he ran all of this, the database context remained master.  Fortunately, this was easy to explain.  The database context switch exists only for the duration of the EXEC statement and does not persist after its completion.
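
A quick way to see this for yourself (the database name is a placeholder):

EXEC ('USE [random_dbname]; SELECT DB_NAME() AS inside_exec;');  -- reports random_dbname
SELECT DB_NAME() AS after_exec;  -- back to the original database context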

 

None of this is rocket science or even deep SQL knowledge, but maybe it’ll save a minute or two for some other DBA out there.


The Professional Association for SQL Server is restarting a Virtual Chapter focusing on SQL Server performance.  The goal of the PASS Virtual Chapters is to provide free and timely training focused on a particular SQL Server area or set of functionality (in this case, SQL Server performance).  I am honored to have been chosen to lead this particular Virtual Chapter, but, as you can imagine, this can’t happen without volunteers from the community.  We are looking for individuals to serve on the Performance Virtual Chapter steering committee who are:

 

  • Passionate about SQL Server (and who isn’t right? ;)
  • Interested in helping and serving the SQL Server community
  • Either have a blog or a Twitter presence
  • Willing to put in a couple of hours of work a week on such things as arranging speakers, putting together presentations, etc.  Generally, working to help get good education out to our SQL Server community on performance related topics.

If this sounds like you and you are interested in serving on the Performance Virtual Chapter steering committee, we want you!  Please contact Tim Edwards at sqlservertimes2@gmail.com.


Last year my better half, Tim, suggested that we start a blog. It made sense for any number of reasons, but it scared the heck out of me. I couldn’t imagine that there was anything that I could ever blog about that hadn’t already been posted and probably by someone with much more experience than I have. I have a tendency (as I did today) to go out and search for other blog posts that cover the material that I’m about to write about to ensure that I’m at least adding something new with my post.

 

In school, if you’re assigned a paper on the pastoral imagery used in “The Adventures of Huckleberry Finn”, the instructor knows that there have been several works on that particular subject and assumes that you will be using (and referencing) information from those works. Blogging, for most people though, is not an assignment – it’s something that you make the choice to do. The people that read your blog assume that the ideas, tips and facts that you blog about are yours, unless you attribute them.

 

Over the past few months, there have been numerous tweets and blog posts about bloggers that have been plagiarizing other people’s works. In some cases the posts are lifted word-for-word and in other cases they have been selectively reworded, but they were still identifiable. I have no idea whether it was intentional or whether the authors were uninformed about how to use information from other posts. K. Brian Kelley [Blog/Twitter] wrote a post ‘Avoiding Plagiarism’ a couple of weeks ago. I thought I’d take this opportunity to add a little more information. As a note, I’m not an expert in plagiarism, so if any of you reading this post find errors, please comment and I’ll update this post.

On dictionary.com, the Random House Dictionary definition of plagiarism is:
“1. the unauthorized use or close imitation of the language and thoughts of another author and the representation of them as one’s own original work.
2. something used and represented in this manner.”

 

Reading this definition clarifies the reasons for my fear of blogging. I would never lift language from another blog post, but there have been blog posts that have inspired me (like K. Brian Kelley’s) to write a post. Here are some ways that I handle referencing other works.

 

I think you should read this

Example: Kevin Kline wrote an excellent post about the pains of not being included in a meme. You should read it here: http://kevinekline.com/2010/01/14/goals-and-theme-word-for-2010/
In this case, I have nothing to add, but I want to give my audience the opportunity to read great posts that I’ve come across.

 

You can say it better than I can

Example: PowerShell is fabulous. It’s so awesome that it’s caused some otherwise contentious DBA’s to wander astray. Colin Stasiuk [Blog/Twitter] admitted as much in a recent blog post : “…it’s no secret that I’ve been having an affair on TSQL. I’ve been seeing PowerShell behind her back and I gotta tell ya even after the initial excitement of a new language I’m still loving it. “
I know that I couldn’t have said it better than Colin, so in addition to linking to his post, I quoted his remark. Quotes should be used sparingly – if you find yourself quoting more than a sentence or two, you should probably use the example above.

 

Note: Blogs, whitepapers or other articles that are copyrighted require permission prior to their use. In addition, some online works have posted requirements on how they can be used. Brent Ozar (Blog/Twitter) has a good example of that here.

 

This is what I researched

Example: While the sp_change_users_login has the option to auto_fix logins, that action assumes that the username and login name match. If they don’t, it will fail. Using the Update_One option is a safer and the preferable way to handle it. For SQL Server 2005/2008, the ALTER USER statement is the preferred method for mapping users to logins. Greg Low’s (Blog/Twitter) article ‘Much ado about logins and SIDs’ provides a good explanation for these methods.
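
As a rough illustration (the user and login names are placeholders), the two approaches look like this:

-- SQL Server 2005/2008: map a database user back to its login
ALTER USER [AppUser] WITH LOGIN = [AppLogin];

-- Older approach: remap a single user to a specific login
EXEC sp_change_users_login 'Update_One', 'AppUser', 'AppLogin';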

 

This is probably where unintentional plagiarism occurs most often. If, during your research, you read blog posts, articles, whitepapers, etc. and find useful information, your best bet is to attribute them. If you recall the definition of plagiarism above, it applies to both language and ideas, so if you learned something that you’re passing on in a blog post, or if you’re using that information to validate your ideas, it needs to be cited. Again, keep in mind any copyright laws that might apply.

 

What doesn’t need to be cited

Common knowledge/generally accepted facts

Items that are common knowledge or generally accepted facts do not need to be cited.  Examples of common knowledge are:

  • A table can only have one clustered index
  • SQL Server is an RDBMS
  • Most SQL Server based questions can be answered with “It Depends”

 There is a decent article on common knowledge here.

 

 

Results of personal research

If you’re blogging about an incident that occurred or the results of a test that you ran, they don’t require a citation. That is, unless you did research to solve the incident or used other information to validate your test results.

 

Fair Use

The term ‘Fair Use’ had been bandied about in the recent plagiarism incident. The idea of fair use has no exact definition, but is determined by a set of guidelines. There is a good definition at Plagiarism.org and a good article titled “The Basics of Fair Use” by Jonathan Bailey. According to Plagiarism.org the guidelines look at:

  1. The nature of your use
  2. The amount used
  3. The effect of your use on the original

The definition of fair use is pretty fuzzy and, personally, I wouldn’t want to try to stand behind that argument.  The incident mentioned above definitely fell outside of those guidelines, in my opinion.

 

Public Domain

At some point, works fall out of their copyright term and become part of the public domain.  The Wikipedia article regarding the public domain can be found here.  While the copyright laws no longer apply, those works still require citations.  This point is moot for any SQL Server blogs, since, at this time, there aren’t any works old enough to have fallen out of their copyright term.

 

Conclusion

There is a huge amount of helpful information in blogs. Blogging also provides an opportunity for us to share information and experiences. I think that it’s understood that we learn from other people – just ensure that you credit those people for their hard work.


As DBAs, we are increasingly being asked to manage more and more technology.  Some of that is the result of internal pressures (i.e. taking on additional roles) and some of that is the result of Microsoft bundling an ever increasing array of different technologies within SQL Server.  Dealing with these various technologies has become a weekly issue for me, but it really came to light today when I had a SQL Server Reporting Services 2008 server that was consuming 100% of the CPU.  Doing some poking around, I realized that not only did I not really know anything about this beast called “SQL Server Reporting Services”, but the tools to manage it are extremely lacking (now, that is my opinion coming from a position of complete ignorance about this technology).  I connected to the SSRS 2008 service with SSMS and, from there, I could only view three things:  SSRS jobs, security, and shared schedules.  I determined that none of the shared schedules were responsible for the utilization since nothing was scheduled to run anywhere near the time that the problem started, so that was a dead end.

 

Next, I connected to the Report Service by hitting http://[servername]/reports.  From here, I could look at all of the various reports that had been deployed to the instance, general site settings, security, both instance-wide and at a report level, and I could look at my own subscriptions.  The one thing that seemed to elude me was visibility into what, if anything, users were running on the SSRS server.

 

Frustrated, I connected through SSMS to the database instance that hosts the ReportServer database.  I figured there had to be something in the database I could query to give me some visibility into what my SSRS instance does all day.  Thinking like a DBA, the first thing I did was look under “System Views” in the ReportServer database.  I saw two views, ExecutionLog and ExecutionLog2, so I decided to do a simple SELECT TOP 100 * from each to see what they would give me.  This is where I stumbled upon my gold nugget for the day.  Right there in the ExecutionLog2 system view was all of the information that I had been looking for.  Running the following query, you can get a wealth of valuable information on what reports users are running, when they are running them, what parameters they used, and how long the report took to generate (broken down into data retrieval time, processing time, and rendering time) – all key information for trending the load that your server is under and <gasp> justifying new hardware, if needed.

 

SELECT InstanceName
       , ReportPath
       , UserName
       , RequestType
       , Format
       , Parameters
       , ReportAction
       , TimeStart
       , TimeEnd
       , TimeDataRetrieval
       , TimeProcessing
       , TimeRendering
       , Source
       , Status
       , ByteCount
       , RowCount
       , AdditionalInfo
FROM ExecutionLog2
WHERE CONVERT(VARCHAR(10),timeend,101) >= -- Some start date that you supply
AND CONVERT(VARCHAR(10),timeend,101) <= -- Some end date that you supply
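
As a side note, comparing the converted mm/dd/yyyy strings above can give odd results once the range crosses a year boundary.  A variation I'd lean toward (the dates below are just placeholders) compares the datetime column directly, which also keeps the predicate index-friendly:

SELECT ReportPath
       , UserName
       , TimeStart
       , TimeEnd
       , TimeDataRetrieval
       , TimeProcessing
       , TimeRendering
FROM ExecutionLog2
WHERE TimeEnd >= '20100510' -- start date (placeholder)
AND TimeEnd < '20100517'    -- end date (placeholder), exclusive upper bound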

 

To many of you who regularly use SSRS this may be very remedial, but I figured I would throw this out there for those DBAs who, like me, have to learn this stuff on the fly in a crisis.

 

By the way, as a side note, for those who are curious about why the ExecutionLog2 system view was used instead of the ExecutionLog system view, it appears that the ExecutionLog system view exists for backward compatibility for SQL Server 2005 Reporting Services upgrades.  The ExecutionLog2 system view provides much more information than the ExecutionLog system view.