I am speaking at 24 Hours Of PASS: Growing Our Community.


The PASS organization puts on these all-day webinar events called 24 Hours of PASS.   Usually they have a theme, and the upcoming one's theme is Growing Our Community.  So they asked new up-and-coming speakers to submit sessions, and I, along with 23 others, was lucky enough to be chosen.  The event takes place June 24th-25th.   You can click on the image above for more information.

I will be speaking all about the basics of the SQL Server Agent and all the great things you can do with it.  So please check it out, along with all the other great sessions going on that day.  Also, please register for any session you want to see, because there are a limited number of registrations per webinar.

I have been meaning to blog about this and many other things, but the last couple of months have been some of the busiest I have ever had.  So hopefully I can get back to posting here more regularly starting in July.

“MongoDB is the WORST!”

“MongoDB is the WORST!”

This is a statement (or something similar thereto) that I have heard or read many times.  I am not here to say that this statement is right or wrong.

Though if you ask that person “Why?” and their answer is “Well, just because; haven’t you heard about all the issues it has?” then that person can’t give any specific examples, and THAT I will say is wrong.

I see a tendency for people to dismiss a technology on word of mouth alone.  When people get information second-hand, things can get twisted.


Also, technology changes at a very fast pace.  So even if you find a blog post or article about a problem, check the date of the article.  If it is over a year old, the problem it discusses may no longer be an issue in a newer version.

For example, one thing I read about MongoDB was its concurrency issues, where writes would lock the entire MongoDB instance.   We should note that this got better in version 2.2, when the lock was moved down to the database level.   And within the last couple of months, with the release of 3.0, they have been able to move the lock down to the collection level.

Am I saying that MongoDB is the best?  “No”.   I am not going to say it is the worst either.  I feel that it has its place.   Would I use it to replace an established RDBMS?  Probably not.   Would I use it for something new that could benefit from a NoSQL solution like MongoDB? “Maybe”.   I would want to do more research first.

I use MongoDB as the example; however, you could replace “MongoDB” in this post with many things (Microsoft Azure SQL Database, DynamoDB, and even SQL Server).

I just want to advise people not to dismiss things on hearsay or limited knowledge alone.   Before you start declaring things the “worst” or the “best”, do your own research and don’t just parrot what you hear.



My first year as a speaker

This weekend I received word that I will be speaking at the SQL Saturday in Houston.   That reminded me that I have been speaking about SQL for almost a year now.  Last year’s SQL Saturday in Houston was the first time I spoke about SQL outside of my workplace.

Since that event, I have spoken at a total of three SQL Saturdays (Houston, Austin, Albuquerque), as well as at two user group meetings and in presentations at my workplace.  I will also be speaking at three or four more SQL Saturdays in the next couple of months (Atlanta, NYC, Houston, and hopefully Vancouver).

Thinking back to that first event as a speaker, it seems like a lifetime ago.   I drove down to Houston Friday afternoon for the speaker dinner that night.   I remember meeting the other speakers and having them ask me what I was presenting.    I would answer and almost feel ashamed.   I wasn’t presenting on some sexy new feature or giving some in-depth advanced talk.  All I was talking about was the SQL Server Agent.

The next day, while prepping for my session in the speaker room, I talked to a few more of the speakers.   One brief exchange with David Stein (b | t) helped calm my nerves some.  He told me something to remember: ‘Once you step behind the podium, you all of a sudden become an expert. The people in the room are there for a reason: they want to hear what you have to say.’

My session was right after lunch, and there was a good slate of speakers in that time slot, so I wasn’t expecting a large turnout.   There were only around 10-15 people in the session (which was probably a good thing given my inexperience).   I was so nervous going through the material that I ran through it at light speed and finished about 15 minutes early!   Afterwards I felt like I might have made a mistake trying to become a speaker in the first place.  The room monitor handed me back the session evaluations.   I thanked him and threw them into my backpack with the rest of my things.

At this point I was thinking, “Well, I gave it a shot.”  I felt that I was probably the worst speaker ever.  I got back to the speaker room to decompress a little, opened up my laptop to check my email, and settled down with some snacks.   Eventually I decided it was time to see how bad the reviews were.

I was quite shocked.  There were actually positive comments!  It looked like some people actually got something out of my session.  There was constructive criticism, but there were no comments like “Waste of time” or “Worst session ever”.   Most of the feedback I got was to engage with the audience more.  I realized this was a problem with my session: I was doing everything in real time in SSMS, going through the different parts of the Agent.   That meant a lot of moving the mouse around, pointing and clicking on things, which caused me to look at my computer screen more than at the audience.   As a new, nervous speaker, my natural instinct was to stare at the screen instead of the audience.

So with this constructive feedback in hand, I did a complete overhaul of the session, adding a massive number of screenshots.  This let me click from slide to slide and engage with the audience, talking more in-depth about what each screenshot was showing.

I presented the reworked session a few weeks later at the local Austin SQL group (CACTUSS), for both the North and South meetings.  I was still nervous before giving the presentation, but to a lesser extent than before.   The new deck and style of presentation worked a lot better and allowed for better interaction with the crowd.

My next speaking event was SQL Saturday Austin, where I debuted a new session on parameter sniffing.  This was a session that took me many hours and several late nights to create.  This time I was less nervous than in my previous outings… until my room started to fill up.  I ended up with a packed room of around 50 people!  The session went fairly well.  I still had issues, but I had a good crowd and felt pretty good afterwards.

I again looked at the session evaluations and was shocked by how positive the reviews were.   There was only one review where it seemed the person flat-out hated the session, but they did not give any real reason why, so it is hard to figure out how I could have connected better with that reviewer.    However, there was lots of great constructive feedback, which as a new speaker I appreciate very much.

A week later I went to give the same session at SQL Saturday Albuquerque.  This was my first time going out of state for a SQL Saturday, and I was nervous because I didn’t know many of the people who would be there.   It was a very different experience from the week before, when the event was at home.

I arrived at the hotel Thursday evening to check in, and to my delight a group of SQL family members was down in the lobby.  I knew who most of them were from following them on Twitter, but I had never had any face-to-face contact with them.   One recognized me as a SQL person and introduced me to the group, and that night I went to dinner with them and got to talk and hang out.

The next day I went to Mike Fal’s (b | t) precon on PowerShell.  It was a great precon and I got some great info out of it.  Mike was one of the people I had talked with the night before; he also drove me around a lot over the weekend, which was super nice of him.

Then came Saturday, the day of the event.  I was slotted for the last session of the day, which was good because it gave me extra time to tweak and add some things to my session using some of the great feedback I received from the previous weekend.

When it was time for my session I was again nervous.  I had a much smaller crowd than the previous week, and I think most of them, like me, had their brains already full from all the great information they had received that day.  For some reason things just felt off during the session; things just didn’t seem to come out right. I made my way through it, though, and at the end I was disappointed because I felt like I had taken a step back from the session I gave the week before.

Again the reviews were very positive, but I still felt like I had let the audience down.

At the after party I got a chance to talk more with some great people. I spent a long time talking with John Sterrett (b | t) and Lori Edwards (b | t). After they left I talked with Melody Zacharias (b | t). This event was her first SQL Saturday as a speaker, and she did a great job. She also came to my session and gave me some great advice. She really helped me feel better about the session I gave and about what I was doing. I also got to talk more with Mike Fal, and he gave me more great advice and things to think about. He also told me to get out there and blog.

I am amazed how much I have grown and learned in such a short time. I have gotten a chance to meet and talk with some absolutely wonderful people.

I would like to formally thank all the wonderful people who have helped me over this last year. First, to Nancy Hidy Wilson (b | t), Allen Kinsel (b | t), and the organizers of the SQL Saturday in Houston: they gave me the chance to do my first session, and for that I will be forever grateful. I would also like to thank Grant Fritchey (b | t); something he said during a break in the precon he gave at a SQL Saturday in Dallas was what made me decide to actually try my hand at speaking. I would also like to thank John Sterrett for his feedback on one of my sessions, as well as for offering to review my slide decks and scripts. I also want to thank Meredith Ryan (b | t) and the organizers of SQL Saturday Albuquerque for putting on a wonderful event and having me as a speaker. I would like to thank both Mike Fal and Melody Zacharias for the kind words and advice; you both helped fan my fire for speaking after it dimmed a little following a session I felt was not my best. And last but certainly not least (actually most), I want to thank my wife for all of her support. She has been very understanding with my late nights working on sessions, has helped me practice my sessions, and has put up with me being gone on weekends to give these presentations.

So my advice to anyone thinking about speaking at a SQL event or user group: give it a shot!  The SQL family is very welcoming to new speakers, and there are many people out there happy to lend you a hand along the way. Even if you feel like you don’t have anything new to bring to the table, I will disagree: you bring your own unique view of and experience with things.

The estimated query plan and the plan cache (Part 2)

Continuing from the last post, we are now going to look at what happens with the estimated plan when the plan in cache is invalid.

The specific example we will look at is one where the plan in cache has been invalidated by dropping and re-adding a constraint on a table that is part of the plan.

We will be using the same database ( AdventureWorks2012 ) and stored procedure ( spGetPersonByType ) from the previous post.

First we blow away the plan cache with DBCC FREEPROCCACHE.  Then we run our stored procedure so that a plan is put in cache.
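For reference, the cache-clearing step is just the single command below. Be careful with it: DBCC FREEPROCCACHE flushes every plan for the whole instance, so it belongs on a dev or test box, not in production.

```sql
-- Dev/test only: removes ALL plans from the instance's plan cache
DBCC FREEPROCCACHE;
```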

EXEC [dbo].[spGetPersonByType] @PersonType = N'sp'

Now we have this plan in cache.
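If you want to verify the plan actually landed in cache, here is a quick sketch of a check against the plan cache DMVs, filtering on the procedure name:

```sql
-- Confirm a cached plan exists for the proc and see how often it has been reused
SELECT cp.usecounts, cp.objtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%spGetPersonByType%';
```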


So now, if we pull an estimated or actual plan for this stored procedure, we will get this one with the Nested Loops join.

Next, we drop and re-add an existing constraint on the Person table: the constraint that determines the valid values for the PersonType column.

EXEC sys.sp_dropextendedproperty @name=N'MS_Description' , @level0type=N'SCHEMA',@level0name=N'Person', @level1type=N'TABLE',@level1name=N'Person', @level2type=N'CONSTRAINT',@level2name=N'CK_Person_PersonType'


ALTER TABLE [Person].[Person] DROP CONSTRAINT [CK_Person_PersonType]

ALTER TABLE [Person].[Person] WITH CHECK ADD CONSTRAINT [CK_Person_PersonType] CHECK (([PersonType] IS NULL OR (upper([PersonType])='GC' OR upper([PersonType])='SP' OR upper([PersonType])='EM' OR upper([PersonType])='IN' OR upper([PersonType])='VC' OR upper([PersonType])='SC')))

ALTER TABLE [Person].[Person] CHECK CONSTRAINT [CK_Person_PersonType]

EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'Check constraint [PersonType] is one of SC, VC, IN, EM or SP.' , @level0type=N'SCHEMA',@level0name=N'Person', @level1type=N'TABLE',@level1name=N'Person', @level2type=N'CONSTRAINT',@level2name=N'CK_Person_PersonType'

Now let us pull the estimated plan for the following.

EXEC [dbo].[spGetPersonByType] @PersonType = N'in'

Here we get the Merge Join plan.   This is what I expected to see: since the plan in cache had become invalidated, it isn’t used, and a new plan is generated as if there wasn’t one in cache.


So now let’s pull an estimated plan for the following.

EXEC [dbo].[spGetPersonByType] @PersonType = N'sp'

The plan that comes back from this is a surprise to me.   We get the Merge Join plan again.   In the previous post we saw that when there is no cached plan, the estimated plan comes back as the plan that would be generated for those specific parameter values, which in this case (@PersonType = N’sp’) would be the Nested Loops join plan.

Here we will actually run the stored procedure, using the value ‘sp’ for the parameter, with the Include Actual Execution Plan setting turned on (Ctrl + M).

EXEC [dbo].[spGetPersonByType] @PersonType = N'sp'

We see that the actual execution plan generated is back to the Nested Loops join plan, which is good because it means the plan was not stored in cache when the estimated plan was pulled earlier. However, we do see that subsequent pulls of the estimated plan continue to bring back that same Merge Join plan, no matter what parameter values are sent in.

This is strange behavior, especially given what we saw in the previous post. I plan to look into this further and may revisit it in a later blog post. If you have any insights on this behavior, feel free to comment below.
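One experiment I have not run yet, but which might shed light on this, is explicitly marking the procedure for recompilation and then pulling the estimated plan again to see whether the behavior resets:

```sql
-- Marks the proc so its cached plan is dropped and recompiled on next execution
EXEC sp_recompile N'dbo.spGetPersonByType';
```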

The estimated query plan and the plan cache (Part 1)

While looking at execution plans and parameter sniffing, I started to wonder about estimated query plans and what difference it makes whether or not there is already a plan in cache for the query.

I already knew from previous reading that requesting an estimated plan would not store a plan in the cache.  Then I started to wonder how the estimated plan worked when there was a plan in cache versus when there was not.

So first let us add this stored procedure to the AdventureWorks2012 database.

CREATE PROCEDURE [dbo].[spGetPersonByType]
 @PersonType NCHAR(2) -- Types IN (18484), EM(273), SP(17), SC(753), VC(156), GC(289)
AS
BEGIN
 SELECT     p.[FirstName]
 FROM       [Person].[Person] p
 INNER JOIN [Person].[EmailAddress] e
            ON e.[BusinessEntityID] = p.[BusinessEntityID]
 WHERE p.[PersonType] = @PersonType
END
GO


Next we clear out the plan cache using DBCC FREEPROCCACHE, so we can see the behavior of the estimated query plan when there is no plan in cache.


Now we will highlight the following statement and, rather than running it, use Ctrl+L to bring up the estimated query plan.

EXEC [dbo].[spGetPersonByType] @PersonType = N'sp'
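As an aside, if you prefer T-SQL over the SSMS shortcut, the same estimated plan can be requested with SET SHOWPLAN_XML; the batch is compiled but never executed:

```sql
SET SHOWPLAN_XML ON;
GO
-- Compiled only; the proc does not actually run
EXEC [dbo].[spGetPersonByType] @PersonType = N'sp';
GO
SET SHOWPLAN_XML OFF;
GO
```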

We get back the following execution plan:

We get a simple plan with a Nested Loops join. Now, if we look at the properties of the SELECT, we see a few things. One is that the Parameter Compiled Value for our single parameter was ‘sp’. The other thing to note is that RetrievedFromCache comes back as false. Neither of these is surprising, but it lets me know that the plan generated is the same as if I had run the proc for the first time using ‘sp’ as the parameter value.


Now we will get the estimated query plan for the same proc but with a different parameter value. Again, we don’t want to actually run the procedure yet.

EXEC [dbo].[spGetPersonByType] @PersonType = N'in'

We see that we get a simple plan again, but with a Merge Join instead of a Nested Loops join.


If we look at the properties of the SELECT, we see something very similar to before.  The Parameter Compiled Value has changed from ‘sp’ to ‘in’, because we changed the parameter value passed in.   RetrievedFromCache is still false.  The query hash is the same, but the query plan hash is different: the query is the same, just with different parameter values, and because of those different values the plan that is generated is different.
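Once the procedure has actually been executed (estimated-plan requests alone will not show up here), the same hashes can also be seen outside the graphical plan by querying sys.dm_exec_query_stats:

```sql
-- query_hash is stable for the same query text;
-- query_plan_hash changes when a different plan shape is compiled
SELECT qs.query_hash, qs.query_plan_hash, qs.execution_count
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%spGetPersonByType%';
```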


Now let us actually run the following statement.   Then, after it runs, we will pull the estimated plan for the statement.

EXEC [dbo].[spGetPersonByType] @PersonType = N'sp'

We get the same estimated execution plan with the Nested Loops join as we did previously with this statement.

When we look at the properties, we see one slight difference: RetrievedFromCache is now true.  Since we actually ran the statement, a plan was put into cache, and when we pulled the estimated plan we were able to retrieve it from cache without having to generate a new plan.


Now let’s look at the estimated query plan for the statement with ‘in’ as the parameter value.

EXEC [dbo].[spGetPersonByType] @PersonType = N'in'

So now we no longer get the Merge Join plan for this statement; we get the Nested Loops join plan.   We see the same properties as before, the key one being that the Parameter Compiled Value is ‘sp’, which produces the Nested Loops plan, even though the statement has ‘in’ as the parameter value.

So while this might not be surprising behavior, I wanted to not take it for granted and to look more in depth at what is happening.

That will do it for part 1 on this topic.     Next week I will look at what happens if the plan gets invalidated by updated stats or a schema change.   Before the plan gets rebuilt by an actual run of the query, will the estimated plan pull the old one from cache, or generate a new one that won’t be stored in cache?

New SQL Blogger Challenge – Accepted

A little while ago Ed Leighton-Dick issued a challenge for new SQL bloggers (and even established bloggers).   The challenge is to write a blog post once a week for all of April.  I saw this and knew that it was time to really try my hand at blogging.   I have been trying for a while now, but I would write a post, get sidetracked, and never stick with it for long.

Another thing I have done is move my blog to lancetidwell.com.   I was thinking about using silentdba.com but decided to heed the advice Brent Ozar gives to new bloggers: use your name and not some alias.

So the first post for the challenge will be published April 7th.   I might throw in a few other blog posts that I have been meaning to write for months between the Tuesday challenge posts.

Goals for the Future

I have had a few goals I have wanted to complete, but I haven’t always been the best at keeping them a priority.   Life always gets in the way.  So this post is a public way for me to state these goals and try to hold myself more accountable.

1) Make at least two blog posts a month for a year straight.   Of these posts, one needs to be technical in nature.

2) Give a presentation at a local user group meeting.   This is something I have been meaning to do for the last year but just haven’t pushed myself to do.

3) Submit an abstract to two SQL Saturdays.

4) Present at a SQL Saturday.

5) Learn to Skate.   I am doing this one right now; slowly but surely I am getting more comfortable.

6) Learn to Play Hockey.   This is the next step after I get the skating portion down.  I know this seems like a weird thing for a 34-year-old from the South to start doing.

7) Network better.   As the name SilentDBA suggests, this is something that is uncomfortable for me, but it is something I need to work on.

8) Go back to PASS Summit.  I have already made this one a priority: I paid for it in early December 2014 so I could take advantage of the early bird price, which makes this easier to do than paying the full price.

9) Take the family to Disney World.   Even though a lot of these goals are more professional goals I want to make sure I don’t lose sight of the wonderful family I have.

10) Attend a SQL Cruise.  SQL Cruise seems like great fun in so many ways.  This goal might be a little ways off, but it is definitely something I want to keep in mind.

11?) Get Certifications.  This is something that was on my goal list for a while, but now I am not sure about it since the retirement of the MCM certification.   I didn’t see myself getting an MCM anytime soon, but the way it was retired makes me leery of any certification at this time.