Monday, August 31, 2009

Submitted PASS Board Of Directors Application

Well, after much deliberation, I have decided to submit an application for the PASS Board of Directors.  I embark on this journey as a true learning experience.  Hopefully, I will make it to the interview stage and gain some additional insight into the selection process.  As the process proceeds I will update you all here on my blog.  Wish me luck!

Talk to you soon

Patrick LeBlanc

SQL Down South

SQL Lunch Calendar

We have been working hard trying to get speakers to complete the year for our SQL Lunch schedule. I am excited to announce that the calendar is almost complete. There are still a few openings in December, so if any of you are brave enough to give us fifteen to thirty minutes of your time, email me at  If you are interested in receiving updates about the upcoming SQL Lunches, please go here: and register.

Date: 9/14/2009

Time: 11:30 AM

Presenter: Jeff Cole. Jeff has been designing and building information technology systems for over 20 years. He has experience across many industries including banking & finance, health care and telephony using a variety of mainframe languages as well as the gamut of Microsoft development environments. He has been working lately in the middleware space developing systems on multiple platforms using integration technology to solve complex connectivity problems. Jeff is currently working with Attunity to help customers modernize legacy systems and make the most of their technology investments with an emphasis on the Microsoft technology stack including SQL Server and Integration Services.

Topic: Oracle-CDC for SQL Server Integration Services.

Leverage your SQL Server skills and software to reduce cost and accelerate Oracle data replication, synchronization, and real-time integration and enable operational reporting, business intelligence and data warehousing projects.

Date: 9/28/2009

Time: 11:30 AM

Presenter: Thomas LeBlanc. Thomas is a Database Administrator for Amedisys, Inc. in Baton Rouge, LA. He has been in the IT field for 20 years, starting as a COBOL programmer and graduating to dBase, FoxPro, Visual FoxPro, VB 3 thru 6, and even some .Net (C#). Designing and developing normalized databases has become his passion. Full-time DBA work started about 8 years ago for Thomas while working at a paper mill in St. Francisville, LA, continuing with IEM, then Amedisys. His current team at Amedisys includes 4 other DBAs.

Topic: Historical DMV Information

DMVs are great for finding information about performance without running a trace, but their information is lost once the SQL Server service is restarted. After finding some useful scripts online (thanks, SQL Server community!), I decided to schedule a job to run the night before to store Index Usage, Least Used Indexes, SP Usage, Missing Indexes, and Expensive CPU Usage. We will review the scripts and the historical data captured.
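The nightly snapshot Thomas describes can be sketched roughly as follows. This is a minimal illustration, not his actual script: the history table and its column names are made up here, and it captures only index usage stats for the current database.

```sql
-- Hypothetical history table; names are illustrative only.
CREATE TABLE dbo.IndexUsageHistory
(
    CaptureDate  datetime NOT NULL DEFAULT (GETDATE()),
    DatabaseName sysname  NOT NULL,
    TableName    sysname  NOT NULL,
    IndexName    sysname  NULL,
    UserSeeks    bigint   NOT NULL,
    UserScans    bigint   NOT NULL,
    UserLookups  bigint   NOT NULL,
    UserUpdates  bigint   NOT NULL
);

-- Run from a nightly SQL Agent job: copy the current DMV counters
-- into the history table before a restart can wipe them.
INSERT INTO dbo.IndexUsageHistory
    (DatabaseName, TableName, IndexName,
     UserSeeks, UserScans, UserLookups, UserUpdates)
SELECT DB_NAME(s.database_id),
       OBJECT_NAME(s.object_id, s.database_id),
       i.name,
       s.user_seeks, s.user_scans, s.user_lookups, s.user_updates
FROM sys.dm_db_index_usage_stats s
JOIN sys.indexes i
    ON i.object_id = s.object_id
   AND i.index_id  = s.index_id
WHERE s.database_id = DB_ID();
```

Similar INSERT...SELECT statements against the missing-index and query-stats DMVs would cover the other categories mentioned above.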

Date: 10/12/2009

Time: 11:30 AM

Presenter: Tim Mitchell. Tim is a Microsoft SQL Server consultant, developer, speaker, and trainer.  He has been working with SQL Server for over 6 years, primarily in database development, database administration, data integration, and automation.  He has earned a number of industry certifications and holds a Bachelor's Degree in Computer Science from Texas A&M University-Commerce.

Tim is principal and senior data engineer for Tyleris Data Solutions, a business intelligence and data management consulting firm.  As an active member of the community, Tim has spoken at venues including SQL Saturday and the PASS Business Intelligence SIG, and is an active volunteer and speaker at the North Texas SQL Server User Group in Dallas.  Tim is an author and forum contributor on and has published dozens of SQL Server training videos on

Topic: Deploying Report Builder 2.0 for Self-Service Reporting

For the DBA, creating and maintaining reports can take up a significant amount of time.  Constantly changing requirements, an endless stream of ad-hoc requests, and simple formatting changes all add up to a good deal of maintenance.  To lighten the load, Microsoft offers SQL Server Report Builder 2.0, an easy-to-use reporting tool for building published and ad-hoc reports using SQL Server Reporting Services.  In this SQL Lunch presentation, we'll look at the capabilities of Report Builder 2.0 and how it can be used with SSRS to allow users to create and maintain many of their own reports.

Date: 10/26/2009

Time: 11:30 AM

Presenter: Barry Ralston. Barry is currently Vice President for Technical Solutions with Birmingham-based ComFrame Software. Since joining ComFrame in 2001, his client successes include Aflac, Honda, and the Children's Hospital of Alabama. In addition to speaking at the Alabama .Net Code Camps 1, 4 and 5, Barry has delivered presentations on Business Intelligence with Microsoft technologies at SQL Saturday 1 and 4.

Topic: Accidental DBA and Performance Data Collector

A new feature in SQL Server 2008 may well improve the ability of “accidental DBAs” to monitor, manage, and tune SQL Server.  This presentation will outline how to set up the Performance Data Collector (PDC) and use the included Management Data Warehouse reports to take action on your database server based on information, not ‘gut feel’ or a hunch.

Date: 11/9/2009

Time: 11:30 AM

Presenter: Brandon Fridge. Brandon is a Developer for Lamar Advertising.

Topic: SQLCacheDependency

The SQLCacheDependency class provides an extremely lightweight and simple method of integrating your web app with the browser's built-in caching.

Date: 11/23/2009

Time: 11:30 AM

Presenter: Patrick LeBlanc. Patrick is the Sr. Database Administrator for Lamar Advertising. He is the founder of, leader of the Baton Rouge Area SQL Server User Group, founder of SQL Lunch, and a regular blogger on

Topic: SSIS – Dynamic File Name

SSIS is a very flexible and robust tool. One of its great features is the script task. Using the script task, I will show you how to dynamically build file names.

Friday, August 28, 2009 – Looking for a Website Template

As most of you know I am hosting a bi-weekly Live Meeting event called SQLLunch.  I recently purchased and I am looking of a good template (free of course) so I can quickly bring the site up.  If any of you have any suggestions please send me an email at  I plan on posting recordings of the events on this site and displaying a calendar of upcoming events.

Talk to you soon

Patrick LeBlanc, Founder, and

SQL Down South

SQL Reporting Services 2008 Upgrade – The Call To Microsoft

I recently submitted an article to SQLServerCentral about my adventures in upgrading my existing SSRS 2005 instance to 2008.  In the article I described the steps taken to successfully upgrade a 2005 instance to 2008.  My initial attempt, which I wrote about in the article, was a cut-and-dried upgrade.  The entire process was point, click, and go.  That wasn’t the case when I attempted the upgrade on our actual Development Web Server that hosts our Reporting Services instance.  Towards the end of the install I received the following error:


Fortunately, we have Premier Support with Microsoft.  I called Microsoft around 9 AM CST, and after about 5 hours of digging through the install logs, the Support Engineer solved the problem.  I just have to say, hats off to the engineer; she did an excellent job.  So, I decided to give it another shot on another server.  I, of course, received the same error.  Unfortunately, I had not yet received the summary of work from the engineer, and as a result I was back in the same place: stuck.  I had written down a few steps and decided to give them a try.  According to my notes I had received this error before, and the only thing that needed to be done was to restart the install, choose the Maintenance option, and run a repair.  Once the repair was complete, I was able to finalize the installation and everything was up and running.  Thanks, Microsoft Support.

Talk to you soon

Patrick LeBlanc, Founder and the SQLLunch

SQL Down South

Thursday, August 27, 2009

SQL Saturday #21 – Session Submitted

I just submitted my session for SQL Saturday #21 in sunny Orlando, Florida.  I have been working on a presentation for about a month and I want to test it out.  The title is “Using the CLR to Monitor Disk Space”.  In the session I will show how I used a couple of CLR functions and stored procedures to centrally monitor disk activity across all of our SQL Servers. The cool thing about it is that I have recently added a notification module: if a disk reaches the configured threshold, an email is sent notifying all parties of the occurrence.

I know that there are several tools already being used to perform such an action, but I am always looking for a new way to accomplish an old task.  For me, it's a way to learn about new technologies and show how they can be put to work. 


Talk to you soon

Patrick LeBlanc, Founder and SQLLunch

SQL Down South

Wednesday, August 26, 2009 – The Rebirth and the Script of the Week

About two years ago I started, and initially the response was great.  Within two weeks I had received about 25 submissions.  This set me up for about 6 months of Script of the Week.  Unfortunately, the submissions came to a screeching halt, and I really did not have the time to write a script a week myself.  So I kind of just let it fade. Since I have been blogging and becoming more active in the community, I have been receiving more and more emails about new scripts on the site. Therefore, I have decided to give the site “New Life”. I will be searching the Internet for scripts and attempting to write scripts that are worthy of the title “Script of the Week”.  If I use a script that I did not write, I will give the author full credit.  So if any of you have scripts that you would like to share with our SQL Server community, feel free to email them to or register at and submit them yourself. 


Once a week I will be posting “The Script of the Week” here.  If you are registered with you will receive the script via newsletter from the site.  This week's script was written by Ian Stirk.  The script allows you to inspect cached plans and extract information that can be used to improve the performance of your SQL queries. 

CREATE PROC [dbo].[dba_SearchCachedPlans]
    @StringToSearchFor VARCHAR(255)
AS
/*
Purpose: Inspects cached plans for a given string.

Parameters: @StringToSearchFor - string to search for e.g. '%<MissingIndexes>%'.

Revision History:
03/06/2008 Initial version
*/
BEGIN
-- Do not lock anything, and do not get held up by any locks.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

SELECT st.text AS [SQL]
    , cp.cacheobjtype
    , cp.objtype
    , DB_NAME(st.dbid) AS [DatabaseName]
    , cp.usecounts AS [Plan usage]
    , qp.query_plan
FROM sys.dm_exec_cached_plans cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) st
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) qp
WHERE CAST(qp.query_plan AS NVARCHAR(MAX)) LIKE @StringToSearchFor
ORDER BY cp.usecounts DESC
END

Example Usage of the Stored Procedure:
1. exec dbo.dba_SearchCachedPlans '%<MissingIndexes>%'
2. exec dbo.dba_SearchCachedPlans '%<ColumnsWithNoStatistics>%'
3. exec dbo.dba_SearchCachedPlans '%<TableScan%'
4. exec dbo.dba_SearchCachedPlans '%CREATE PROC%MessageWrite%'


If you want more details about the query go to:  If you want to see more scripts like Ian’s please visit

Talk to you soon

Patrick LeBlanc, Founder and the SQLLunch

SQL Down South

Monday, August 24, 2009

Article Of the Week

I really enjoy reading technical articles. They often enlighten me on new technologies or provide me with a new way to accomplish an old task. On most days I am inundated with new reading materials (books, magazines, web postings, blog postings, T-SQL scripts, etc.) and it is often difficult to sift through it all and find the ones that are really worth reading. However, since I am working on my PhD and consider myself an educator, I have decided to read even more. As a result, I have decided to share one of my favorite articles from the past week's readings with all of you here on my blog. Some of the articles will be new and some will be older, but each article should provide someone with some type of information. So let's get started.

The first article, the first of many to come, was published in the January 2008 issue of MSDN Magazine. Ian Stirk's article, Uncover Hidden Data to Optimize Application Performance, highlights the many uses of dynamic management views and functions. He provides the reader with several queries that assist in solving various performance problems that can be traced back to poorly performing database queries. Several queries are given that can help DBAs identify costly queries by I/O and CPU, fragmented indexes, missing indexes, queries suffering from blocking, and more. If I had to choose my favorite query among the many, I would have to say the one that helps identify missing indexes by database.

Go out and read the article when you have some time, especially if you are having some performance problems. Thanks Ian for a great article. Check back next week for another article.

URL to article:

Talk to you soon

Patrick LeBlanc, Founder

SQL Down South

Friday, August 21, 2009

SQL Profiler – Wild Card Filter on TextData

WARNING: This may be very obvious to most of you. However, several people that I have encountered did not realize this could be done. Therefore, I decided to share this.

For a DBA, the SQL Server Profiler is a must-have tool in his/her arsenal when trying to solve problems. Recently, I gave a presentation on the Profiler at SQL Saturday in Baton Rouge, LA. The topic was “An Introduction to the SQL Server Profiler”. There were approximately 40 attendees, and about 95% of those in attendance were developers. There were a few DBAs in the audience.

I started the talk by discussing concepts pertaining to profiler terminology, trace events, trace categories, and so on. Then I moved on to filtering traces. I discussed a couple of the obvious trace filters, such as SPID, database, and duration. Then, as expected, one of the attendees asked, “Can you filter a trace by text within a query?” Not exactly phrased like that, but once I finished my probing, that is what it summed up to be. One of the DBAs in the audience, to my surprise, immediately said, “You can’t do that”. Hmmmm, I thought.

In my experiences as a DBA, especially at my current place of employment, I have become very proficient in using the profiler. The primary Operational Database that I am responsible for uses mostly Dynamic SQL, which makes my life difficult (sometimes). Therefore, I cannot use the Stored Procedure events and filter on the Object Name or the ObjectID.

I digress. I said to the DBA of little faith, “sure you can”. I opened the Profiler and created a trace that only contained the SQL:BatchCompleted event from the TSQL trace category. I only chose the TextData column from the list of available columns, and of course the SPID must be displayed.


Then I clicked the Column Filters Button and added %select *% and %select*% to the TextData filter under the LIKE choice. The attendee wanted to find all the queries that included a select *. Notice that I added two variations of the request. I did this because it is syntactically correct to issue either statement, and I wanted to ensure that all variations of the request were captured by the trace. Like this:


Notice that I added the %, which is a wildcard character in T-SQL, to the beginning and end of each filter. As expected, it behaved just as it does when used in a T-SQL query. After running a couple of queries that matched the filters, the results were:


As you can see, the trace captured both statements. Surprisingly, not one person in attendance realized that the SQL Profiler was capable of performing such a filter. As always, please provide your comments, thoughts, criticisms, and ideas regarding this.
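For what it's worth, the same wildcard filtering works for a server-side trace via sp_trace_setfilter. The sketch below assumes a trace has already been created with sp_trace_create and that @traceid holds its id (the value 99 is made up); column id 1 is TextData, comparison operator 6 is LIKE, and logical operator 1 ORs the two patterns together:

```sql
-- Assumes an existing server-side trace; the id here is hypothetical.
DECLARE @traceid int
SET @traceid = 99  -- id returned earlier by sp_trace_create

-- Column 1 = TextData, logical operator 1 = OR, comparison 6 = LIKE
EXEC sp_trace_setfilter @traceid, 1, 1, 6, N'%select *%'
EXEC sp_trace_setfilter @traceid, 1, 1, 6, N'%select*%'
```

This is handy when you want the filtered trace running unattended on the server rather than in the Profiler GUI.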

Talk to you soon

Patrick LeBlanc, Founder


Wednesday, August 19, 2009

August SQL Server User Group – Near Real-time Data Warehouse

Well we had a really good meeting this week at our User Group.  There were about 17 physical attendees and 5 virtual attendees.  I spoke on the Near Real-time Data Warehouse.  This is the second time that I have given this talk.  Most of the attendees were very familiar with the concept of warehousing.  However, only a few were actually involved with designing and managing data warehouses. 

There were several questions about why we need a Near Real-time data warehouse.  I ask myself that question every day, especially when there are problems.  The short answer is that our Operational Database is highly normalized; some may even call it an Object-Oriented database.  Reporting against it is a nightmare for our Report Developers. 

I attempted to record the meeting with Community Clips from Microsoft, instead of the built-in Live Meeting recorder.  However, when I changed from my PowerPoint slides to Management Studio, the recording fell apart.  The worst part is that I did not realize it until I tried to watch the video later that evening.  I probably should have done a little more testing before I decided to completely abandon the Live Meeting recording feature.  I am going to give it another try, but on a shorter presentation.

If you are interested in attending our User Group Meeting Virtually, please send me an email at

Talk to you soon
Patrick LeBlanc, founder
SQL Down South

Monday, August 17, 2009

Replication – Accommodating Application Inserts

Recently I have been involved in lots of projects involving replication.  I have done some things to make replication work in some very strange environments.  This posting discusses one of my adventures.  If you have any other suggestions on how this should or could have been done shoot me an email at pleblanc@tsqlscripts.  Please do post your comments here on the blog also.  I am always interested in learning and getting advice from all of you out there in SQL Server Land.

I received a call from a client regarding some replication issues they were having a couple of days ago.  He tried to describe it, but I just couldn’t understand.  Therefore, we scheduled a face-to-face and I was on my way.  This is what I found out:

They have a vendor database that is replicated to one of their local SQL Server 2008 databases in their environment, which I will call VENDOR.  This data must remain in an unchanged state.  No one touches that data.  That data is then replicated to a database, which I will call MERGE, that has several other SUBSCRIPTIONS from various sources, but all the sources are SQL Servers.  The challenge is that users are allowed to enter data into the MERGE database.  Even further, this data may be the same data that is replicated from the VENDOR database or vice versa.  Because of this, replication is breaking due to PRIMARY KEY VIOLATIONS. 

With that said, we had to come up with a solution: one that would accommodate data from the application and data from replication that could potentially violate primary key constraints.  I asked the Application Developer how he handled the violations, even though I had a pretty good idea.  He stated that he did a “check for existence” based on the primary key.  If the row existed, he updated it; if it did not, he inserted a new row.  A very simple solution.  I took his solution and applied it to the Transactional Replication that was already in place.

When configuring Transactional Replication, you are given three methods of statement delivery:

  1. Using an Insert, Update or Delete statement
  2. Using an Insert, Update or Delete statement without a column list
  3. Call a stored procedure

I chose #3.  This allowed me to apply the same “check for existence” methodology that the developer used.  If you accept the default when configuring replication, SQL Server will automatically generate each stored procedure for you.  The following script is an example of a stored procedure that was generated.

CREATE procedure [dbo].[sp_MSins_dboTableOne]
    @c1 int,
    @c2 int,
    @c3 varchar(25)
AS
BEGIN
    insert into [dbo].[TableOne]
        (column1, column2, column3)
    values (@c1, @c2, @c3)
END

The script was then modified to accommodate the Application Inserts, which can be seen in the following example:

CREATE procedure [dbo].[sp_MSins_dboTableOne]
    @c1 int,
    @c2 int,
    @c3 varchar(25)
AS
BEGIN
    /*1*/ if not exists (select * from dbo.TableOne where column1 = @c1) /*1*/
        /*2*/ insert into [dbo].[TableOne]
            (column1, column2, column3)
        values (@c1, @c2, @c3) /*2*/
    else
        /*3*/ update dbo.TableOne
        set column2 = @c2,
            column3 = @c3
        where column1 = @c1 /*3*/
END

The main modification is the addition of an IF NOT EXISTS check and an UPDATE statement, labeled as Items 1 and 3, respectively.  Utilizing the primary key, I was able to check for the existence of the record.  If it did not exist, a new row is added to the table (Item 2).  If it does exist, the row is updated based on the primary key (Item 3).  Since the values are passed as parameters to the insert stored procedure, I was able to identify the primary key and all the other columns and use them in my modifications.

I would use a lot of caution when modifying these procedures.  This database only receives approximately 2000 inserts and updates in a month.  You must ensure that the changes you are making will not increase the latency of your replication.  In other words: test, test, test, test.


Talk to you soon,

Patrick LeBlanc, Founder (

SQL Down South

Thursday, August 13, 2009

Upgrading To Windows 7

Well, I spent most of this past Saturday upgrading my laptop to Windows 7 Ultimate, which I have to say was not a bad experience.  If you are planning to do this, I suggest that you run the Windows 7 Upgrade Advisor, which can be downloaded from here, prior to running the upgrade.  After it completed, a report was provided that stated I might have a few minor issues with the following applications:

  1. SQL Server 2008
  2. Visual Studio 2008
  3. Dell Web Camera Software
  4. Microsoft Virtual PC

Of the four listed, detailed information was only provided for the first.  I clicked on the link, and it stated that I needed to install SQL Server 2008 Service Pack 1 (SP1).  No problem; I downloaded SP1 and installed it.  I left the other three applications as is.  I crossed my fingers and began the upgrade.  The entire process took approximately 3 and a half hours.  It appeared to get stuck on the step that was actually doing the upgrade, but that was just my impatience. Once it completed, I tested the applications in question, and here are the results:


  • SQL Server 2008 Service Pack 1 – Success!
  • Visual Studio 2008 Service Pack 1 – Success!
  • Dell Web Camera Software – Success!  Even though the Upgrade Advisor said it wouldn’t.   
  • Microsoft Virtual PC – Success! (With a few modifications.)  When I initially tried to open Virtual PC I was prompted with the error:



I clicked YES, and then I was prompted with this:


I searched around a bit, but couldn’t find anything.  I clicked CANCEL and the first screen shot from above was shown again.  I clicked NO and this dialogue box appeared:


I clicked OK and Virtual PC was available.  It allowed me to start a Virtual PC and everything worked fine.  But since I can’t accept imperfections, I decided to do a quick web search to see what I could find.  The first page in my results led me here (, which was the answer.  The author stated that all I needed to do was reinstall Virtual PC and all would be well.  My only concern was that I would lose all my existing Virtual PCs.  I read a little further, and the author assured readers that nothing would be lost.  I decided to roll the dice.  I reinstalled Virtual PC and started it up.  Not only did all the prompts go away, but Virtual PC and all my saved PCs were there intact.  Thanks, Virtual PC Guy.


I have been using Windows 7 for about 5 days now, and I really do notice significant improvements in the software.  It appears to run nice and quiet in the background (i.e., using very little resources). 

Talk to you soon

Patrick LeBlanc

SQL Down South

Wednesday, August 12, 2009

Database is in Transition

One of my clients called saying they were receiving the following error when trying to access a database: "Error 952 Database is in Transition". 

So I asked what the last thing was that he had tried to do with the database.  He said he had tried to take it offline, and that it would not go offline. Ahhhhh.  I asked him to run the following query:

SELECT
    r.command Command,
    t.text SQL_Statement,
    r.blocking_session_id Blocking_Session_ID,
    r.total_elapsed_time/1000 Total_Elapsed_Time_Seconds,
    r.cpu_time CPU_Time,
    s.login_name Login_Name,
    s.[host_name] [Host_Name],
    s.[program_name] [Program_Name],
    s.memory_usage Memory_Usage,
    r.status [Status],
    db_name(r.database_id) Database_Name,
    r.wait_type Wait_Type,
    r.wait_time Wait_Time,
    r.reads Reads,
    r.writes Writes,
    r.logical_reads Logical_Reads
FROM sys.dm_exec_requests r
INNER JOIN sys.dm_exec_sessions s
    ON r.session_id = s.session_id
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
WHERE r.session_id <> @@spid

This query returns one row per active request on the SQL Server, and by using CROSS APPLY against the sys.dm_exec_sql_text table-valued function, he was able to see the text of each SQL batch in the result set.  Before I could ask anything about the result, he said, "There is a SPID that is blocked by something from the database that I was trying to take offline."  Unfortunately, I did not get a screenshot of the result set before we fixed the problem.  The blocking SPID was a query running in the database that he was trying to take offline; the blocked SPID was trying to access that same database.  I asked him to look at the text and let me know if we could kill the blocking SPID.  We killed it, and the blocked session was finalized, either committed or aborted; likely aborted, since the database was taken offline immediately after the blocking session was killed.  His problem was resolved.
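In T-SQL terms, the fix looked roughly like this. The session id and database name below are made up for illustration; always check what the blocking session is doing before you kill it:

```sql
-- Kill the blocking session identified by the query above
-- (53 is a hypothetical session_id)
KILL 53;

-- Alternatively, force the offline and roll back any open
-- transactions in the database yourself
ALTER DATABASE SalesDB SET OFFLINE WITH ROLLBACK IMMEDIATE;
```

The ROLLBACK IMMEDIATE option is the blunter instrument: it terminates every open transaction in the database rather than a single chosen session.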

So if you are ever taking a database offline, try to ensure that it is idle first, or you could end up spending a little $$ to get someone to help fix your problem, unless you are a DBA of course :).

Talk to you soon
Patrick LeBlanc
SQL Down South

LIVE MEETING REMINDER - Baton Rouge SQL Server User Group

Reminder:  The Baton Rouge Area SQL Server User Group will be broadcasting its user group meeting via Live Meeting.  If you are interested, please attend.  Here are the details:

Wednesday, August 12, 2009
5:45 PM - 8:00 PM

Attendee URL:



Patrick LeBlanc, MCTS


Patrick is a SQL Server DBA for Lamar Advertising in Baton Rouge, LA.  Patrick has worked with SQL Server since 1999, and has worked with versions 7, 2000, 2005 and 2008.


Near Real-time Data Warehousing 1.0

Keeping your warehouse up to date is usually a nightly process. Well not anymore. In this session I will explain how to use SQL Server and .Net to build a warehouse that populates dimensions and facts based on a time interval.


If you have any questions regarding this please email me at


Talk to you soon

Patrick LeBlanc

SQL Down South

Monday, August 10, 2009

Adding Multiple Columns to a Table With a Single T-SQL Statement

You know, I believe the old saying, “If you don’t use it, you will lose it”.  Well, I write a lot of Data Manipulation Language (DML) T-SQL these days and very little Data Definition Language (DDL).  Since my projects have all completed and rolled to production, I am on the back end writing reports and doing some performance tuning.  Even when I was writing a lot of DDL, I never really had to do what a developer recently asked me to.  The question was, “Can you add multiple columns to a table with a single T-SQL statement?”  I thought to myself, I am sure you can, but I wasn’t quite sure how.  So I took a stab at it:

ALTER TABLE dbo.Table1

     ADD Column1 int

     ADD Column2 int


Sorry, that’s not correct.   I know what you are thinking: you call yourself a DBA?  Well, I really never had to do this before.  So I went directly to my reference of choice, Books Online (BOL).  I filtered my search on ALTER TABLE, scrolled down a bit, and lo and behold, there was the answer:


ALTER TABLE dbo.Table1

     ADD Column1 int,

         Column2 int


A very simple solution: you only need to add the ADD keyword once; then all you need to do is delimit the column definitions with commas.  Maybe the guys at SQLServerCentral can use this as the Question of the Day.  Thanks again, BOL. 


Talk to you soon,

Patrick LeBlanc

SQL Down South

Friday, August 7, 2009

Baton Rouge SQL Saturday – Evaluation Summary

Well, I just finished reviewing the evaluations for my presentation, Introduction to SQL Profiler, which I gave at the Baton Rouge SQL Saturday on August 1st.  The reviews were great, but this makes me wonder if some of the attendees were biased because I was involved in organizing the event.  Maybe they were a little generous on the evaluations.  I am not saying that I am not a good speaker, but of the 40 people in attendance, not one provided me with any constructive criticism. I am not complaining about the good response, but I always enjoy getting both good and bad comments; it helps me prepare better for my next talk.  Our next user group meeting will be broadcast via Live Meeting, so if you want to give me some good constructive criticism on how I can make my presentations better, send me an email at pleblanc@tsqlscripts and I will send you the invite. 


Wait, I spoke too soon. I just found one of my evaluations stuck in a stack with another presenter's. This person is a Microsoft employee and had this to say:

“I suggest you use ZOOMIT to make the presentation better”. 

That is a great comment.  I refrained from using ZoomIt in this presentation because I thought that increasing the font size in Management Studio would suffice.  Apparently not.  Next time, I will make sure I continually poll the audience about presentation and demo visibility.


Talk to you soon

Patrick LeBlanc

SQL Down South

Use SQL Compare by Red Gate to Find a Missing Index

A couple of weeks ago one of our developers sent me an email saying, “I have a query that returns in about 3 seconds in one environment, but it never returns in the other environment.  Can you check for blocking?”

To appease the developer, I checked, and there wasn’t any.  He had a few more ideas, but since the environment in question has very relaxed security policies, I guessed right off what the problem was.  I asked the developer for a copy of the query in question and told him that I would look into it.  I don’t know how many of you use SQL Compare from Red Gate, but if you have to maintain database consistency across several environments, it is a must-have.  Before I started digging too deep into execution plans and query optimizations, I decided to check for missing indexes.  I could have accomplished this task several ways, but it is really easy to compare two database schemas using SQL Compare.  Not only will it show you missing indexes, it will also show differences between indexes that exist on both the Source and Target databases.  The following is a screenshot of SQL Compare. 


The compare in this screenshot was run on a database that contains over 500 objects, and it finished in less than a minute.  From the screenshot you can clearly see that one index and two statistics are missing from the Target database, identified by the yellow highlights and the blue arrows.  The orange highlight notifies me of differences between objects that exist on both databases, in my case indexes. For example, in the above screenshot there is an index named CIX_CalendarDate on both the Source and Target databases.  From the compare I was able to quickly determine that I needed to change that index to a clustered index on my Target database.  Once that change, along with the other discrepancies found using SQL Compare, was made, I was able to resolve the developer's problem within minutes.  Once again, making me look like a SQL Rock Star.  Thanks, Red Gate!
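For readers without SQL Compare, one of the "several ways" alluded to above is a manual comparison of index names through the catalog views. This is only a sketch: SourceDb and TargetDb are stand-in names for two databases on the same server, and matching on names alone will not catch differences in key columns or clustering.

```sql
-- Indexes that exist in SourceDb but are missing from TargetDb,
-- matched by table name and index name.
SELECT s.TableName, s.IndexName
FROM (SELECT OBJECT_NAME(object_id, DB_ID('SourceDb')) AS TableName,
             name AS IndexName
      FROM SourceDb.sys.indexes
      WHERE name IS NOT NULL) s
LEFT JOIN (SELECT OBJECT_NAME(object_id, DB_ID('TargetDb')) AS TableName,
                  name AS IndexName
           FROM TargetDb.sys.indexes
           WHERE name IS NOT NULL) t
    ON  s.TableName = t.TableName
    AND s.IndexName = t.IndexName
WHERE t.IndexName IS NULL;  -- present in Source, absent in Target
```

To compare index definitions rather than just names, you would also join sys.index_columns, which is exactly the tedium a tool like SQL Compare saves you.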

Talk to you soon,

Patrick LeBlanc

SQL Down South

Wednesday, August 5, 2009

Call For Speakers – SQL Lunch Live Meeting

The Baton Rouge Area SQL Server User Group hosts bi-monthly Live Meetings that are 15-30 minutes long and consist of various technology topics, ranging from writing T-SQL queries to data warehouse design tips to developing .Net applications. We are currently looking for speakers to complete our 2009 schedule. If you are up to the challenge and ready to deliver a "No Fluff Just Stuff" presentation, please email me the following:

1. Name
2. Short Bio
3. Abstract
4. Date interested in speaking

If you would like to receive event reminders about upcoming SQL Lunches, click here to register.

Waiting to hear from you.
Patrick LeBlanc
SQL Down South

August 2009 Baton Rouge SQL Server User Group – Near Real-time Data Warehousing 1.0

Fresh off my SQL Profiler presentation at SQL Saturday #17 in Baton Rouge this past Saturday, I am now preparing to speak at our next user group meeting. The topic is Near Real-time Data Warehousing. I gave this talk at the SQL Saturday in Pensacola, and it was well received. However, before I could really get started presenting in Pensacola, one of the participants asked, "Why do you need a Near Real-time Data Warehouse?" I was prepared to answer that question, but instead of using the rehearsed version of the answer I decided to traverse down another path. My answer went like this:

Have you ever actually tried to find a "needle in a haystack"? Of course trying to find that needle would be very difficult and time consuming, and everyone agreed. I went back to my slide deck and revealed a query that consisted of THIRTY-FIVE joins. The result of that query was simply a detailed customer listing, which included columns like first name, last name, address, phone number, city, state, etc. I continued by explaining why the query was so long and complicated. Our developers designed somewhat of an object-oriented database. As a result, we have a database that is at a level of normalization that does not exist (smile). Therefore, our team was posed with the task of de-normalizing the database to make it more report friendly.
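To make the contrast concrete, here is a hypothetical sketch of what the flattening looks like; the table and column names are illustrative, not our actual schema:

```sql
-- Hypothetical sketch: collapse a heavily normalized customer model
-- into a single report-friendly listing. All names are illustrative.
SELECT  p.FirstName,
        p.LastName,
        a.StreetAddress,
        a.City,
        a.State,
        ph.PhoneNumber
INTO    rpt.CustomerListing              -- flattened reporting table
FROM    dbo.Person        AS p
JOIN    dbo.PersonAddress AS pa ON pa.PersonID = p.PersonID
JOIN    dbo.Address       AS a  ON a.AddressID = pa.AddressID
JOIN    dbo.PersonPhone   AS ph ON ph.PersonID = p.PersonID;
```

Our real query joined thirty-five tables to produce that same kind of listing; the flattened table lets report writers hit one table instead.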

The project was initially scoped as flattening the operational tables into a less normalized state, which was to include lots and lots of T-SQL and SSIS packages. However, as the project progressed, someone suggested that we just make it a warehouse and populate it at certain intervals throughout the day. I had built some very small warehouses, but nothing of this magnitude and visibility. Still, I accepted the task. The result of the project is six FACT tables and 23 dimensions that are rebuilt in 15-minute intervals using the following technologies:
1. Transactional Replication
2. Triggers
3. Message Queuing
4. A Custom ETL application

If you want to find out more and you can't make it to our local SQL Server user group (see our local group's website for meeting information), send me an email to receive an invitation to a Live Meeting broadcast of our user group. If you can't make either one, don't worry: I am writing an article on the entire process that hopefully will be published somewhere soon. Please note that this is version 1.0; version 2.0 does not rely on the triggers, message queuing, or custom ETL application. Instead we have incorporated Change Data Capture, which makes life so much easier.
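For the curious, turning on Change Data Capture is only a couple of system procedure calls (SQL Server 2008 Enterprise Edition at the time of writing); the database and table names below are hypothetical:

```sql
-- Sketch: enable CDC on a database and one source table.
-- Database and table names here are hypothetical.
USE SourceDB;
GO
EXEC sys.sp_cdc_enable_db;           -- creates the cdc schema and metadata
GO
EXEC sys.sp_cdc_enable_table
     @source_schema = N'dbo',
     @source_name   = N'Customer',   -- hypothetical table
     @role_name     = NULL;          -- no gating role required
GO
-- The ETL can then pull changes from the generated function, e.g.:
-- SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Customer(@from_lsn, @to_lsn, N'all');
```

Compared with hand-rolled triggers and message queuing, CDC reads the transaction log asynchronously, so it adds no code to the source tables at all.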

Talk soon,
Patrick LeBlanc
SQL Down South

Tuesday, August 4, 2009

SQL Saturday - Top 5 Suggestions from the Attendees

After spending much of the last two days reading through over 200 SQL Saturday event evaluations, I have come up with a GREAT list of suggestions from many of our attendees. Here we go:

1. Minimize the time between sessions. We allowed fifteen minutes between each session. I did notice that at times people were just standing around waiting for sessions to start. I thought this would be a good time to mix and mingle and chat with sponsors and speakers. However, at least 45 people agreed that the time could have been spent on an additional session, possibly repeating some of the more popular sessions at the end of the day.

2. I was asked several times, "Who is that speaker? Where is he from? Who does he work for?" Basically, the attendees really wanted speaker BIOs. I didn't think this was important during the planning of our event. After talking with so many people about each presenter and reading several evaluations asking for BIOs, I have decided that a BIO page will be added to each packet next year. I usually provide a brief introduction of myself before I present, but lately I have kept it very abbreviated.

3. One thing that surprised me was the number of requests for novice sessions. I think we as technology professionals, especially those of us who have been doing this for many years, assume people understand and know things. This is not true. I was asked by one individual who worked the Help Desk for his company, "What does SQL stand for?" He actually knew what it meant, but his point was that we need a SQL 101 to bring people who aren't developers or DBAs up to speed.

4. At every event there are typically three session levels: Beginner, Intermediate, and Advanced. I made the mistake of scheduling several Beginner sessions in the same time slot. Pay close attention to the session level when building your schedule. You should also verify that each presenter has categorized their session at the correct level.

5. This suggestion may seem a little mundane, and some of you may even say that those people were just being picky, but I actually thought this myself after I received my badge. Several of the participants at our event commented that we needed LARGER FONT SIZES on our badges, and I cannot disagree. I like to use a person's name when I am talking, and I found myself squinting to read the name badges.

This is a very short list of the comments and suggestions that we were able to collect from the event evaluations. I am not quite done (stay tuned for more), but I have to send out an email about my talk coming up next week at our user group, "Near Real-time Data Warehousing". Not sure how many will be in attendance, but even if there are only five people there it will be worth it to me. I will talk to you more about it tomorrow.

Talk soon,
Patrick LeBlanc
SQL Down South

SQL Saturday #17 – Baton Rouge

At 5:00 AM on August 1, 2009 I was awakened by a ringing cell phone. It was Mark, the official and dedicated facilities person and speaker for SQL Saturday #17, which was going to be (little did we know) one of the biggest SQL and .Net events ever held in our community (OVER 200 ATTENDEES). It was time to pick up 30 dozen donuts. I jumped up and began my day.

When we arrived at Louisiana State University (LSU) there were several people already there unloading tables, chairs, coolers and other stuff. We had a great team and it definitely paid off. Time moved quickly. Everything was going as planned until… we were doing a final check on the rooms, you know, checking projectors, computers, etc. To our surprise, the projector and computer in the largest room were not working, and the welcome speech was only 45 minutes from starting. I started to get a little nervous, but never really lost my cool. Mark made a call to LSU facilities, and within 15 minutes someone was there and the situation was resolved.

Lesson Learned: To avoid a possible catastrophic situation, like loss of a ROOM, always secure one or two more rooms (if possible) when scheduling the event with your hosting facility.

I digress. With that situation resolved we noticed a long queue at the registration table, which was a good thing. People were gathering in every corner of the building, chatting, eating donuts and drinking coffee. At approximately 9:35 we announced that the welcome speech would be given in a few minutes by yours truly. We were a little off schedule, but since I was speaking I just stated the specifics about the schedule, thanked a few people and dismissed the audience. This was a good idea, because before the first session began we had registered approximately 130 attendees. With that much body heat it seemed a little warmer in the building than it was when we initially arrived.

Lesson Learned: If possible, ask your hosting facility to leave the air conditioning on all night, or to have someone turn it on a couple of hours earlier than normal.

During this time of year, a warm building and a hot day make for very disgruntled individuals. Fortunately, it cooled off quickly and the day went on without a hitch. After the first session, a suggestion was made that we should go around and bring drinks to the speakers, which turned out to be a great idea. Several speakers commented on the gesture and really appreciated it. Around 12:30 lunch began. We served lunch from Subway, which included a sandwich (Turkey, Ham, Roast Beef or Vegetable), Lays chips, cookies, and a soda or water. We did not get one complaint. I guess we wouldn't, since it was free.

At the end of the day, we concluded the sessions with a grand raffle. We gave away several books, gift cards, software and many other items. One of the coordinators of the event actually created a small application that was used to pick the numbers. It would have worked out fine if we had only a few items to give away, but since we had over 40 items we had to speed up the process. I grabbed a bucket with the tickets in it and started calling out numbers. Everyone seemed to enjoy the back and forth between me and the application.

We ended the day at a nice little daiquiri shop not too far from LSU with some Jambalaya and adult beverages. Fun was had by all and the event was a success. Stay tuned; we are meeting this week to discuss: WHAT COULD HAVE BEEN DONE DIFFERENTLY TO TRANSFORM A GOOD EVENT INTO A GREAT EVENT.

Talk soon,
Patrick LeBlanc
SQL Down South