## Lookups in PowerPivot Calculated Columns – Using the RELATED Function

In my previous post we looked at how Calculated Columns work in PowerPivot. One limitation you may have noticed, though, was that all of the calculations used values from that individual table. What if you wanted to look up a value in a second table, based on a value in the first table, and return a value from that second table? Yes Virginia, not only is there a Santa Claus, but there’s also an answer for us in PowerPivot’s RELATED function.

In addition to the standard Excel functions, PowerPivot provides a set of its own functions for working with its data. These new functions are collectively known as Data Analysis eXpressions, or DAX for short. By now you’ve probably guessed that the first function from the DAX toolbox you’ll want to learn is the RELATED function.

Let’s start with the same Excel 2010 workbook we had at the end of the lesson Combining Data from Multiple Sources in PowerPivot. If you recall, we had imported data from the AdventureWorksLT2008 database. To that we added the CountryInfo table, which we’d typed into an Excel spreadsheet.

At the time we used this to get the CountryAbbr column, and you may have wondered why we also included a DiscountRate column. In this lesson, that DiscountRate will come into play.

If you recall from that post, we used PowerPivot’s Manage Relationships feature to create a link that ultimately connected the SalesOrderDetail table to the CountryInfo table. This groundwork enables us to look up values very easily. Here is a simple example. Go to the Add Column area of the SalesOrderDetail table and enter this formula into the fx area:

=RELATED('CountryInfo'[CountryAbbr])

When PowerPivot finishes, the abbreviation for each ship-to country appears in this column. This can be used to “flatten out” some of your data. However, it’s much more useful as part of a calculation. Delete the column we just added (right-click on the column header and pick Delete Column from the menu).

Looking in the CountryInfo table we see the DiscountRate. A value of 0.04 means our US customers get a discount of 4% off their LineTotal. So in our SalesOrderDetail table we want to take the LineTotal and calculate a new value based on the rate, which is stored in the CountryInfo table. Simple enough using the PowerPivot RELATED function.

=[LineTotal] * (1 - RELATED('CountryInfo'[DiscountRate]))

The math is fairly simple: we take the DiscountRate from CountryInfo (for the US, 0.04) and subtract it from 1, giving us 0.96, or 96%. This is then multiplied by the LineTotal, giving us our new discounted LineTotal amount, which I renamed DiscountedLineTotal.
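As a side note, if a row’s country ever had no match in CountryInfo, RELATED would return a blank. A defensive variant of the formula (just a sketch of mine, not from the example above; ISBLANK is a standard DAX function, and treating a missing rate as zero discount is my own assumption) could look like this:

=[LineTotal] * (1 - IF(ISBLANK(RELATED('CountryInfo'[DiscountRate])), 0, RELATED('CountryInfo'[DiscountRate])))

With our data every country has a rate, so both versions produce the same results; the IF simply makes the “no discount” behavior explicit.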

Hopefully these simple examples will give you a glimpse of the immense power RELATED offers. It will allow you to combine data from multiple tables to create in-depth analysis that previously required a specialist in Business Intelligence to create.

## Calculated Columns in PowerPivot

When importing data into PowerPivot, users often find the data is almost, but not quite what they need. Perhaps the name is not quite formatted as they need, or some calculation, not important in the stored data but very important to their work, is missing. For these situations PowerPivot offers Calculated Columns.

Calculated Columns provide a way for users to add that missing information into the source data. The calculations are done on a row-by-row basis; if you want to operate on the entire table, for example to count the number of rows, you will instead need to create a measure in your PivotTable or PivotChart. Measures will be covered in a later post.

Let’s get started by using the same Excel 2010 workbook we ended with in the previous blog post. If you haven’t seen it, please go back and reference my post Combining Data from Multiple Sources in PowerPivot for the full details.

Our first task will be to address our customer names. In the source data, names are broken into five columns: Title, FirstName, MiddleName, LastName, and Suffix. For ease of use we wish to combine these distinct columns into one single column. Assuming you have opened the PowerPivot workbook, select the Customer table from the list of tabs at the bottom. Now go to the right-most column, ModifiedDate. Next to it you’ll see a blank column with the header “Add Column”. Click in it, then go up to the fx box right above the data.

The formula bar:

Into this formula bar we can create some fairly complex expressions. Let’s do one that shows some of the power of text formulas. Into the formula bar enter:

=[Title] & " " & [FirstName] & " " & IF(LEN([MiddleName]) > 0, [MiddleName] & " ", "") & [LastName] & IF(LEN([Suffix]) > 0, " " & [Suffix], "")

As with Excel, formulas need to begin with the equals sign. All literal string values are enclosed in double quote marks. Here we have two: a single space in the form of " " and an empty string in the form of "" (two double quotes right next to each other). The ampersand (&) character is used for concatenation. Column names used in formulas must be enclosed in square brackets [ ]. Finally, notice we’ve leveraged some standard Excel functions: first the LEN function, which returns the length of the passed-in field, then the IF function, which evaluates its first argument (for example, LEN([MiddleName]) > 0). The value after the first comma ([MiddleName] & " ") is returned if the condition is true; otherwise the value after the second comma ("") is returned.

After we press Enter, PowerPivot calculates the values for each individual row in the dataset. The downside is this could take quite a while depending on the size of your data; a hundred million rows will take time even on a fast machine. The benefit is that this is the only time the calculation is done, unless of course the underlying data changes. The values are now calculated and available at analysis time.

You may notice the column name changes from Add Column to CalculatedColumn1. Since this is not a name we’d want to show other users, or work with ourselves, simply right-click on the column header, pick Rename Column, and give the new column a meaningful name. In this example I used FullName.

A quick side note: in the sample data the MiddleName and Suffix columns are not populated very often, as is often true with real data. However, that can make browsing through our data a bit difficult. To validate our calculation, click the drop-down menu triangle next to the MiddleName column, go to the bottom, and uncheck the "Blank" option for data filtering. This will remove from the viewed data all rows that are missing a middle name.

Note this doesn’t delete the rows; it is merely a filtering option in PowerPivot to help you view only the data you want. The other rows are still there. To prove it, just click the menu arrow again and pick the "Clear filter from MiddleName" menu option, and all rows will again be visible. For more information on filtering, see my post Import Filters in PowerPivot. The same filtering tools that apply to the data import process also work once the data is imported.

In addition to textual manipulation, PowerPivot also supports complex math calculations. Let’s do a simple example in the SalesOrderDetail tab. For simplicity, let’s decide that our base profit for any sale is 20 percent of the Line Total. However, for each item ordered we gain an extra 2 percent of profit. We’ll click in the Add Column area of the SalesOrderDetail tab and enter the following calculation:

=(.2 + ([OrderQty] * .02))*[LineTotal]

Now we can rename the column to EstimatedProfit using the rename menu option as described above.

We also have the power of the Excel math functions at our disposal. Let’s do something simple, and decide that we want to round the value of our EstimatedProfit column up to the next whole value. Even if the value was 1.01, it would round up to 2 dollars. To accomplish this we can use Excel’s ROUNDUP function:

=ROUNDUP((.2 + ([OrderQty] * .02))*[LineTotal], 0)

Yields these new results:

As you can see, the values have indeed been rounded up to the next whole value. The 0 at the end of the formula indicates how many decimal places should remain; I specified none so we could see the results in whole dollars.
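If you’d rather keep cents while still always rounding up, the same decimals argument handles it. This variant (my own tweak, not part of the original example) keeps two decimal places:

=ROUNDUP((.2 + ([OrderQty] * .02))*[LineTotal], 2)

With a second argument of 2, a value like 1.011 becomes 1.02 rather than 2, so you round up at the penny level instead of the dollar level.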

We’ve only just begun to explore the value in Calculated Columns. Not only can they fill in missing data, but they can also speed calculations when you reach the Pivot Table stage of your analysis by making aggregations much easier.

## Combining Data from Multiple Data Sources in PowerPivot

Seldom does a user of PowerPivot have all of the data they need in one nice, neat data source. More often than not it will be necessary to import data from a variety of sources and make that data work together. It’s time to start building on what we’ve learned over the last few days to accomplish this feat.

First, launch Excel 2010 and use the PowerPivot import wizard to import the following tables from the AdventureWorksLT2008 database: Address, Customer, CustomerAddress, Product, ProductCategory, SalesOrderDetail, SalesOrderHeader. (Note, for a refresher on importing data please see my blog post, Import Filters in PowerPivot.)

Now we need a second source of data. Follow the instructions in my post Creating Tables in PowerPivot to enter the data below into Excel 2010, then copy and paste it into a new PowerPivot table.

If you recall, when we import data from a relational database, PowerPivot examines the foreign key relationships found in the database to create relationships between the tables it imports. In this situation, though, the CountryInfo data didn’t come from a database; it was pasted in from a manually entered spreadsheet. Thus PowerPivot has no information with which to implicitly create a relationship.

We do want to create one however, so we can link the longer country name in the Address table to the CountryInfo data and thus be able to use the briefer country abbreviations. As PowerPivot was designed to work with many sources of data, it has an easy way to create these relationships.

In the PowerPivot window, click on the Table tab at the very top. All the way to the right you will notice a button group named Relationships. Click the Create Relationship button.

As the above dialog shows, this allows you to create a relationship, or a link between two tables in PowerPivot. Here we are creating a link between the Address table and the CountryInfo table on the CountryRegion field. When complete just click Create to create the relationship.

If you want to verify the relationship was indeed created, or review any of the relationships PowerPivot inferred when it imported the tables from the AdventureWorksLT2008 database, just click the Manage Relationships button in the Table Toolbar’s Relationships group.

On the very first row you’ll see the newly created relationship between the Address and CountryInfo tables. You’ll also see the other relationships that were created during the import process from the SQL Server database. The three buttons at the top let us Create new relationships, Edit existing ones, or Delete ones no longer needed. Note that altering or deleting relationships has no effect whatsoever on the original source data (SQL Server or the Excel 2010 spreadsheet). It only affects the tables as stored in PowerPivot.

Now let’s see the new relationship in action. Close the Manage Relationships window, and on the PowerPivot Home tab create a new PowerPivot table (Pivot Table, Single Pivot Table). Go ahead and put it in a new worksheet.

In the Gemini Task Pane, go to the SalesOrderDetail table and drag the LineTotal field into the Values area. Next, drag the Name field from the Product table into the Row Labels area. Now for the magic: in the CountryInfo table, drag the CountryAbbr field into the Column Labels area. Your pivot table should look something like this:

Because of the relationships that were inferred or that we created, PowerPivot was able to link the data like so:

To validate this for yourself, just return to PowerPivot and look at the Manage Relationships dialog to see all the links.

The need to combine data from many sources is a common task, one that will most certainly be done by users of PowerPivot. Using the techniques shown here, you can create and manage the relationships that will link data from these disparate sources together and leverage the power of PowerPivot.

## Creating Tables in PowerPivot

PowerPivot has the ability to import data from a wide variety of sources. But you could run across a situation where you don’t have that data stored anywhere. Perhaps it’s on a piece of paper, or in a text file, or it’s just in the user’s brain and needs to be typed in. Logically then you would want to create a new table in PowerPivot.

Except you can’t. PowerPivot itself doesn’t provide the ability to create tables and enter data directly into them. Now, before you start the usual rending of garments and gnashing of teeth, plus a little wailing, there is a simple-to-implement solution.

Create a new Excel 2010 workbook. In sheet 1 (or any sheet) let’s enter the following information.

Now highlight the above cells and Copy them to the clipboard. Next, launch the PowerPivot window by going to the PowerPivot tab in Excel 2010 and clicking the PowerPivot window button.

Once PowerPivot is open, if you look in the middle group of buttons you’ll see a set named Paste from Clipboard. The To New Table button should be enabled now that you have data on your clipboard.

Click the To New Table button. When you do, the Paste Preview dialog appears.

This is similar to the preview window you see with the Import Table wizard, though with not quite as much functionality. Here we can view the data and validate that it is correct, which it is. We can also indicate whether the first row contains our column headers; in our case it does, so we can leave that option checked. Click OK to import the data.

Above is our new data, now pasted into PowerPivot. We have the same abilities with it that we have with any other table: we can sort, rename our columns, add new calculated columns, and more. As you will note from the tab at the bottom of the picture, the data was pasted into a table with the rather uninformative name of Table. We can do better than that, so right-click on the Table tab and pick Rename from the menu. Overwrite Table with CountryInfo.

Now you can see how easy it is to create new data from scratch and paste it into PowerPivot. In this example I used a limited number of rows for illustrative purposes, but it’s quite possible to import massive amounts of data. In addition, you can add to your table later; in this example all we would have to do is use Paste Append from the toolbar.

In the next blog post we’ll build on what we’ve learned and look at how to combine data imported from multiple sources.

## Introducing Microsoft PowerPivot

What is PowerPivot? Well according to Microsoft:

“PowerPivot is Microsoft Self-Service Business Intelligence”

I can see from the glazed looks you are giving your monitor that that was about as clear as mud. So let’s step back a bit and first define what exactly Business Intelligence is.

Business Intelligence, often referred to as simply “BI”, is all about taking data you already have and making sense of it: taking a raw jumble of individual facts and transforming it into knowledge you can take informed action on.

In every organization there is already someone who is doing BI, although they may not realize it. Microsoft (and many IT departments) refer to this person as “that guy”. A power user, who grabs data from anyplace he (or she) can get it, then uses tools like Excel or Access to slice it, dice it, and analyze it. This person might be an actual Business Analyst, but more often it’s someone for whom BI is not their main job. Some common examples of people doing their own BI today are production managers, accountants, engineers, and sales managers, all of whom need information to do their jobs better. Let’s look at an illustration that will make it a bit clearer.

In this example, put yourself in the role of a sales manager. You have gotten IT to extract all of your sales orders for the last several years into an Excel spreadsheet. In order to determine how well your sales people are doing, you need to measure their performance. You’ve decided that the amount sold will be a good measure, and use Excel to give you totals.

In BI terms, the column “Total Sales” is known as a measure, or sometimes a fact, as it measures something, in this case the sales amount. The grand total sales amount is often called an aggregation, as it totals up the individual rows of data that IT gave us. But now you might be wondering why Andy’s sales are so low? Well, now you want to dig deeper and look at sales by year.

In BI terms, the names of the sales people are a dimension. Dimensions are often either a “who” (who sold stuff) or a “what” (what stuff did we sell). Places (where was it sold) and dates (when was it sold) are also common dimensions. In this case the sales dates across the top (2007, 2008, 2009) are a date dimension. When we use two or more dimensions to look at our measures, we have a pivot table.

Now we can see a picture emerging. It’s obvious that Andy must have been hired as a new salesperson in late 2008, since he shows no sales for 2007 and a very small amount in 2008. But for Paul and Kimberly we can look at something called trends in the BI world. Kimberly shows a nice even trend, rising slowly over the last three years, and earns a gold star as our top performer.

By being able to drill down into our data, we spot another trend that was not readily obvious when just looking at the grand totals. Paul has been trending downward so fast the speed of light looks slow. Clearly then we now have information to take action on, commonly known as actionable intelligence.

So remind me, why do we need PowerPivot?

As you can see in the above example, “that guy” in your company clearly has a need to look at this data in order to do his job. Not only does he need to review it, he also has the issue of how to share this information with his co-workers. Unfortunately in the past the tools available to “that guy” have had some drawbacks. The two main tools used by our analyst have been either Excel, or a complete BI solution involving a data warehouse and SQL Server Analysis Services.

Excel’s main limitations center on the volume of data needed to do good analysis. Excel limits the number of rows it can store, and for large datasets a spreadsheet can consume equally large amounts of disk space. This makes the spreadsheet difficult to share with coworkers. In addition, mathematical operations like aggregations can be slow. On the plus side, Excel is readily available to most workers, and a solution can be put together fairly quickly.

A full-blown BI solution has some major benefits over the Excel solution. A data warehouse is created, and then SQL Server Analysis Services (often abbreviated as SSAS) is used to precalculate aggregations for every possible way an analyst might wish to look at them. The data is then very easy to share via tools like Excel and SQL Server Reporting Services. While a very robust and powerful solution, it does have some drawbacks. It can take quite a bit of time to design, code, and implement both the data warehouse and the Analysis Services pieces of the solution. In addition, it can be expensive for IT to implement such a system.

Faster than a speeding bullet, more powerful than a locomotive, it’s PowerPivot!

PowerPivot combines the best of both worlds. In fact, it’s not one tool but two: PowerPivot for Microsoft Excel 2010, and PowerPivot for SharePoint 2010. What’s the difference you ask? Good question.

PowerPivot for Microsoft Excel 2010

PowerPivot acts as an Add-on for Excel 2010, and in many ways is quite revolutionary. First, it brings the full power of SQL Server Analysis Services right into Excel. All of the speed and power of SSAS is available right on your desktop. Second, it uses a compression technology that allows vast amounts of data to be saved in a minimal amount of space. Millions of rows of data can now be stored, sorted, and aggregated in a reasonable amount of disk space with great speed.

PowerPivot can draw its data from a wide variety of sources. As you might expect, it can pull from almost any database. Additionally, it can draw data from news feeds, SQL Server Reporting Services, and other Excel sheets; data can even be typed in manually if need be.

Another issue that often faces the business analyst is the freshness of the data. The information is only as good as the date it was last imported into Excel. Traditionally “that guy” only got extracts of the database as IT had time, since it was often a time consuming process. PowerPivot addresses this through its linked tables feature. PowerPivot will remember where your data came from, and with one simple button click can refresh the spreadsheet with the latest information.

Because PowerPivot sits inside Microsoft Excel, it not only can create basic pivot tables but has the full-featured functionality of Excel at its disposal. It can format pivot tables in a wide array of styles, create pivot charts and graphs, and combine these together into useful dashboards. Additionally, PowerPivot has a rich set of mathematical functionality, combining the existing Excel functions with an additional set of functions called Data Analysis eXpressions, or DAX.

PowerPivot for SharePoint 2010

PowerPivot for Excel 2010 clearly solves several issues around analysis. It allows users to quickly create spreadsheets, pivot tables, charts, and more in a compact amount of space. If you recall, though, creation was only half of “that guy’s” problem. The other half was sharing his analysis with the rest of his organization. That’s where PowerPivot for SharePoint 2010 comes into play.

Placing a PowerPivot Excel workbook in SharePoint 2010 not only enables traditional file sharing, but also activates several additional features. First, the spreadsheet is hosted right in the web browser. Thus users who might not have made the transition to Excel 2010 can still use the PowerPivot created workbook, slicing and filtering the data to get the information they require.

Data can also be refreshed on an automated, scheduled basis. This ensures the data is always up to date when doing analysis. Dashboards can also be created from the contents of a worksheet and displayed in SharePoint. Finally these PowerPivot created worksheets can be used as data sources for such tools as SQL Server Reporting Services.

Limitations

First, let me preface this by saying as of this writing all of the components are either in CTP (Community Technology Preview, a pre-beta) or Beta state. Thus there could be some changes between now and their final release next year.

To use the PowerPivot for Excel 2010 components, all you need is Excel 2010 and the PowerPivot add-in. If you want to share the workbook and get all the rich functionality SharePoint has to offer, you’ll need SharePoint 2010 running Excel Services and PowerPivot 2010 Services. You’ll also need SQL Server 2008 R2 Analysis Services running on the SharePoint 2010 box. Since you’ll have a SQL Server instance installed to support SharePoint anyway, this is not a huge limitation, especially since SSAS comes with SQL Server at no extra cost.

One thing I wish to make clear, SharePoint 2010 itself can run using any version of SQL Server from SQL Server 2005 on. It is the PowerPivot service that requires 2008 R2 Analysis Services.

One other important item to note: at some point the load upon the SharePoint 2010 server may grow too large if especially complex analysis is being done. Fortunately SharePoint 2010 ships with several tools that allow administrators to monitor the load and plan accordingly. At the point where the load is too big, it is a clear indication it’s time to transition from a PowerPivot solution to a full BI solution using a data warehouse and SQL Server Analysis Services.

What does PowerPivot mean for business users?

For business users, and especially “that guy”, it means complex analysis tools can be created in a short amount of time. Rich functionality makes it easier to spot trends and produce meaningful charts and graphs. It also means this information can be shared with others in the organization easily, without imposing large burdens on the corporate e-mail system or local file sharing mechanisms.

No longer will users be dependent on IT for their analysis; they will have the power to create everything they need on their own, truly bringing “self-service BI” to fruition.

What does PowerPivot mean for Business Intelligence IT Pros?

The first reaction many BI developers have when hearing about PowerPivot is “oh no, this is going to put me out of a job!” Far from it, I firmly believe PowerPivot will create even more work for BI Professionals like myself.

As upper management grows to rely on the information provided by PowerPivot, they will also begin to understand the true value BI can bring to an organization. Selling a new BI solution into an organization where none currently exists can be difficult, as it can be hard to visualize how such a solution would work and the value it brings. PowerPivot allows BI functionality to be brought into an organization at a low development cost, proving the value of BI with minimal investment. Thus when there is a need to implement a larger, traditional BI project those same managers will be more forthcoming with the dollars.

Second, as users pull more and more data, they are going to want that data better organized than they will find in their current transactional business systems. This will in turn spur the need to create many new data warehouses. Likewise the IT department will also want data warehouses created, to reduce the load placed on those same transactional business systems.

I also foresee PowerPivot being used by BI Pros themselves to create solutions. The database structure of many transactional database systems can be difficult to understand even for experienced IT people, much less users. BI Pros can use PowerPivot to add a layer of abstraction between the database and the users, allowing business analysts to do their job without having to learn the complexity of a database system.

BI Pros can also use PowerPivot to implement quick-turnaround solutions for customers, bringing more value for the customer’s dollar. When a BI Pro can prove himself (or herself) by providing rich functionality in a short time frame, it’s almost always the case that they are brought back for multiple engagements.

PowerPivot also provides great value to BI Pros who are employed full time in an enterprise organization. They can create solutions much more quickly than before, freeing them up for other valuable tasks. In addition, PowerPivot solutions can serve as a “stop gap”, pushing back the date at which the organization needs to spend the dollars for a full-blown BI solution and allowing IT to plan better.

Finally I see great value in PowerPivot as a prototyping tool for larger BI projects. Now users can see their data, interact with it, analyze it, and ensure the required measures and dimensions are present before proceeding with the larger project.

I’ll reiterate, if anything I believe PowerPivot will create an explosion of work for the Business Intelligence Professional.

So where can you learn more about PowerPivot?

Well, right here for one. I have become quite interested in PowerPivot since seeing it at the SQL PASS 2009 Summit. I think it will be a valuable tool for both myself and my customers. This will be the first of many blog posts to come on PowerPivot. I am also beginning a series of presentations on PowerPivot for local user groups and code camp events. The first will be Saturday, November 21st 2009 at the SharePoint Saturday in Birmingham, Alabama, but there will be many more to come. (If you’d like me to come speak at your group, just shoot me an e-mail and we’ll see what we can arrange.)

There’s also the PowerPivot site itself:

I’ve also found a small handful of blogs on PowerPivot, listed in no particular order:

Summary

Thanks for sticking with me, I know this was a rather long blog post but PowerPivot has a lot of rich functionality to offer. While PowerPivot is still in the CTP/Beta stage as of this writing, I see more and more interest in the community, which will continue to grow as PowerPivot moves closer to release. I hope this post has set you off on the right step and you’ll continue to come back for more information.

## Populating a Kimball Date Dimension

I’m a big fan of the Kimball method of Data Warehousing. A common task most of us setting up a new Data Warehouse face is creating a Date Dimension. In their book, “The Microsoft Data Warehouse Toolkit With SQL Server 2005 and the Microsoft Business Intelligence Toolset”, they include an example of a good date dimension table in the book’s sample code. My complaint was not so much with the layout itself; I liked it and found it fairly complete. Instead it was the method they chose to load it: an Excel spreadsheet, read by a SQL Server Integration Services package that loads the date dimension table.

To me this approach has a couple of drawbacks. First, if you are doing all the loading on the server itself, you may not have Excel installed. Thus you may be faced with the headache of creating the sheet and then figuring out how to get it to a location the server can read. Second, when you go to add more dates in the future, you have to go into the spreadsheet and reset everything, removing what was there before. It can also be quite a headache to go back several years from now and find both the SSIS package and that Excel spreadsheet; by then, changes to Excel and SSIS may make the solution no longer workable. Finally, quite often it’s a DBA setting up the warehouse, and I’ve found there are still a few DBAs who are uncomfortable relying on SSIS, although I’m happy to say that number continues to shrink.

A T-SQL solution was clearly, to me anyway, the superior answer for both ease of use and long term stability. I assumed that as popular as the Kimball method is, someone would have already created a routine to load their style of date dimension, but some Binging and Googling around proved fruitless. I did find some code for loading some very simple date dimensions, but nothing as complete as the Kimball design. So, relishing a good coding challenge, I rolled up my sleeves and went to work. Below is the fruit of my labor, a script for loading a Kimball like date dimension. All you have to do is set the begin and end dates, indicate the offset for your fiscal year, and let ‘er rip. You can easily go back and add more dates by just adjusting the begin and end times.

A few things you should note. First, I did make a few slight modifications to the standard Kimball date dimension table as found in the previously mentioned book. They have a column titled “DateName” which holds the date as a string in YYYY/MM/DD format. As long as I was putting the date in, I decided to add string versions of the date for the US and Europe. These are in MM/DD/YYYY and DD/MM/YYYY formats and the columns are named “DateNameUS” and “DateNameEU” (for European Union) respectively.

Their table also had an audit key, used presumably by the SSIS package. I didn’t really see the need for an audit key for a date table, so I changed it to an identity column so I could have a secondary surrogate key if I needed it, just something to count the number of date rows easily and track the order they were inserted in.

One final, but very important, distinction. I was in a post-conference session taught by Erik Veerman at SQL PASS 2009. In it he mentioned using Dim and Fact schemas, so you’d have [Dim].[Date] instead of [dbo].[DimDate]. I liked the idea, as it was something I’d been considering myself, so that is what I did in this version. If you use the more traditional naming format of dbo.DimDate you’ll need to tweak the code.

Below is the code to load the Date Dimension table, which is my creation. Under it I placed my modified version of the Kimball Date Dimension table. Its core code came from the sample code mentioned in the first paragraph, then was modified by me. I include it for completeness.

Update: A few readers aptly pointed out I’d missed replacing a static date when I worked up the final version of the code. I’ve made the change to replace the static date with @DateCounter.

Code Sample 1 – Script to load a date dimension.

```sql
/*---------------------------------------------------------------------------*/
/* Loads a Date Dimension                                                    */
/*---------------------------------------------------------------------------*/

-- A few notes, this code does nothing to the existing table, no deletes
-- are triggered before hand. Because the DateKey is uniquely indexed,
-- it will simply produce errors if you attempt to insert duplicates.
-- You can however adjust the Begin/End dates and rerun to safely add
-- new dates to the table every year.
--
-- If the begin date is after the end date, no errors occur but nothing
-- happens as the while loop never executes.

SET NOCOUNT ON -- turn off all the 1 row inserted messages

-- Hold our dates
DECLARE @BeginDate DATETIME
DECLARE @EndDate DATETIME

-- Holds a flag so we can determine if the date is the last day of month
DECLARE @LastDayOfMon CHAR(1)

-- Number of months to add to the date to get the current Fiscal date
DECLARE @FiscalYearMonthsOffset INT

-- These two counters are used in our loop.
DECLARE @DateCounter DATETIME    --Current date in loop
DECLARE @FiscalCounter DATETIME  --Fiscal Year Date in loop

-- Set the date to start populating and end populating
SET @BeginDate = '01/01/2008'
SET @EndDate = '12/31/2010'

-- Set this to the number of months to add to the current date to get
-- the beginning of the Fiscal year. For example, if the Fiscal year
-- begins July 1, put a 6 there.
-- Negative values are also allowed, thus if your 2010 Fiscal year
-- begins in July of 2009, put a -6.
SET @FiscalYearMonthsOffset = 6

-- Start the counter at the begin date
SET @DateCounter = @BeginDate

WHILE @DateCounter <= @EndDate
BEGIN
-- Calculate the current Fiscal date as an offset of
-- the current date in the loop
SET @FiscalCounter = DATEADD(m, @FiscalYearMonthsOffset, @DateCounter)

-- Set value for IsLastDayOfMonth
IF MONTH(@DateCounter) = MONTH(DATEADD(d, 1, @DateCounter))
SET @LastDayOfMon = 'N'
ELSE
SET @LastDayOfMon = 'Y'

-- add a record into the date dimension table for this date
INSERT  INTO [Dim].[Date]
(
[DateKey]
, [FullDate]
, [DateName]
, [DateNameUS]
, [DateNameEU]
, [DayOfWeek]
, [DayNameOfWeek]
, [DayOfMonth]
, [DayOfYear]
, [WeekdayWeekend]
, [WeekOfYear]
, [MonthName]
, [MonthOfYear]
, [IsLastDayOfMonth]
, [CalendarQuarter]
, [CalendarYear]
, [CalendarYearMonth]
, [CalendarYearQtr]
, [FiscalMonthOfYear]
, [FiscalQuarter]
, [FiscalYear]
, [FiscalYearMonth]
, [FiscalYearQtr]
)
VALUES  (
( YEAR(@DateCounter) * 10000 ) + ( MONTH(@DateCounter)
* 100 )
+ DAY(@DateCounter)  --DateKey
, @DateCounter -- FullDate
, CAST(YEAR(@DateCounter) AS CHAR(4)) + '/'
+ RIGHT('00' + RTRIM(CAST(DATEPART(mm, @DateCounter) AS CHAR(2))), 2) + '/'
+ RIGHT('00' + RTRIM(CAST(DATEPART(dd, @DateCounter) AS CHAR(2))), 2) --DateName
, RIGHT('00' + RTRIM(CAST(DATEPART(mm, @DateCounter) AS CHAR(2))), 2) + '/'
+ RIGHT('00' + RTRIM(CAST(DATEPART(dd, @DateCounter) AS CHAR(2))), 2)  + '/'
+ CAST(YEAR(@DateCounter) AS CHAR(4)) --DateNameUS
, RIGHT('00' + RTRIM(CAST(DATEPART(dd, @DateCounter) AS CHAR(2))), 2) + '/'
+ RIGHT('00' + RTRIM(CAST(DATEPART(mm, @DateCounter) AS CHAR(2))), 2)  + '/'
+ CAST(YEAR(@DateCounter) AS CHAR(4)) --DateNameEU
, DATEPART(dw, @DateCounter) --DayOfWeek
, DATENAME(dw, @DateCounter) --DayNameOfWeek
, DATENAME(dd, @DateCounter) --DayOfMonth
, DATENAME(dy, @DateCounter) --DayOfYear
, CASE DATENAME(dw, @DateCounter)
WHEN 'Saturday' THEN 'Weekend'
WHEN 'Sunday' THEN 'Weekend'
ELSE 'Weekday'
END --WeekdayWeekend
, DATENAME(ww, @DateCounter) --WeekOfYear
, DATENAME(mm, @DateCounter) --MonthName
, MONTH(@DateCounter) --MonthOfYear
, @LastDayOfMon --IsLastDayOfMonth
, DATENAME(qq, @DateCounter) --CalendarQuarter
, YEAR(@DateCounter) --CalendarYear
, CAST(YEAR(@DateCounter) AS CHAR(4)) + '-'
+ RIGHT('00' + RTRIM(CAST(DATEPART(mm, @DateCounter) AS CHAR(2))), 2) --CalendarYearMonth
, CAST(YEAR(@DateCounter) AS CHAR(4)) + 'Q' + DATENAME(qq, @DateCounter) --CalendarYearQtr
, MONTH(@FiscalCounter) --[FiscalMonthOfYear]
, DATENAME(qq, @FiscalCounter) --[FiscalQuarter]
, YEAR(@FiscalCounter) --[FiscalYear]
, CAST(YEAR(@FiscalCounter) AS CHAR(4)) + '-'
+ RIGHT('00' + RTRIM(CAST(DATEPART(mm, @FiscalCounter) AS CHAR(2))), 2) --[FiscalYearMonth]
, CAST(YEAR(@FiscalCounter) AS CHAR(4)) + 'Q' + DATENAME(qq, @FiscalCounter) --[FiscalYearQtr]
)

-- Increment the date counter for next pass thru the loop
SET @DateCounter = DATEADD(d, 1, @DateCounter)
END

SET NOCOUNT OFF -- turn the row count messages back on

-- Select all rows inserted for the final year as a sanity check
SELECT  *
FROM    [Dim].[Date]
WHERE DateKey > (YEAR(@EndDate) * 10000)
```
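Two calculations in the script above deserve a closer look: the smart DateKey built in yyyymmdd form, and the last-day-of-month test, which checks whether adding one day rolls into the next month. Here is the same logic sketched in Python, purely to make the arithmetic explicit:

```python
from datetime import date, timedelta

def date_key(d: date) -> int:
    # Smart key in yyyymmdd form, e.g. 2004-11-23 -> 20041123,
    # matching YEAR * 10000 + MONTH * 100 + DAY in the T-SQL.
    return d.year * 10000 + d.month * 100 + d.day

def is_last_day_of_month(d: date) -> str:
    # 'Y' when adding one day changes the month, 'N' otherwise.
    return 'N' if (d + timedelta(days=1)).month == d.month else 'Y'

print(date_key(date(2004, 11, 23)))             # 20041123
print(is_last_day_of_month(date(2008, 2, 29)))  # Y
```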

Code Sample 2 – Modified Kimball code to create a Date dimension.

```sql
/* Make sure the Dim schema exists */
IF SCHEMA_ID('Dim') IS NULL
EXECUTE('CREATE SCHEMA [Dim] AUTHORIZATION [dbo]')
GO

/* Drop table Dim.Date if it exists */
IF EXISTS ( SELECT  *
FROM    dbo.sysobjects
WHERE   id = OBJECT_ID(N'[Dim].[Date]')
AND OBJECTPROPERTY(id, N'IsUserTable') = 1 )
DROP TABLE [Dim].[Date]
GO

/* Create table Dim.Date */
CREATE TABLE [Dim].[Date]
( [DateKey] BIGINT NOT NULL
, [FullDate] DATETIME NULL
, [DateName] CHAR(11) NULL
, [DateNameUS] CHAR(11) NULL   --US Date FORMAT, MM/DD/YYYY
, [DateNameEU] CHAR(11) NULL   --European Union Date Format DD/MM/YYYY
, [DayOfWeek] TINYINT NULL
, [DayNameOfWeek] CHAR(10) NULL
, [DayOfMonth] TINYINT NULL
, [DayOfYear] SMALLINT NULL
, [WeekdayWeekend] CHAR(7) NULL
, [WeekOfYear] TINYINT NULL
, [MonthName] CHAR(10) NULL
, [MonthOfYear] TINYINT NULL
, [IsLastDayOfMonth] CHAR(1) NULL
, [CalendarQuarter] TINYINT NULL
, [CalendarYear] SMALLINT NULL
, [CalendarYearMonth] CHAR(7) NULL
, [CalendarYearQtr] CHAR(7) NULL
, [FiscalMonthOfYear] TINYINT NULL
, [FiscalQuarter] TINYINT NULL
, [FiscalYear] INT NULL
, [FiscalYearMonth] CHAR(9) NULL
, [FiscalYearQtr] CHAR(8) NULL
, [AuditKey] BIGINT IDENTITY NOT NULL
, CONSTRAINT [PK_DimDate] PRIMARY KEY CLUSTERED ( [DateKey] )
)
ON     [PRIMARY]
GO

EXEC sys.sp_addextendedproperty @name = N'Table Type', @value = N'Dimension',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date'
EXEC sys.sp_addextendedproperty @name = N'View Name', @value = N'Date',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date'
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Date dimension contains one row for every day, beginning at 1/1/2000. There may also be rows for "hasn''t happened yet."',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date'
EXEC sys.sp_addextendedproperty @name = N'Used in schemas',
@value = N'Sales (3 roles); Finance; Currency Rates; Sales Quota (2 roles; one at Cal Qtr level)',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date'

GO

INSERT  INTO [Dim].[Date]
( DateKey
, FullDate
, [DateName]
, [DateNameUS]
, [DateNameEU]
, [DayOfWeek]
, DayNameOfWeek
, [DayOfMonth]
, [DayOfYear]
, WeekdayWeekend
, WeekOfYear
, [MonthName]
, MonthOfYear
, IsLastDayOfMonth
, CalendarQuarter
, CalendarYear
, CalendarYearMonth
, CalendarYearQtr
, FiscalMonthOfYear
, FiscalQuarter
, FiscalYear
, FiscalYearMonth
, FiscalYearQtr
)
VALUES  ( -1
, NULL
, 'Unknown'
, 'Unknown'
, 'Unknown'
, NULL
, 'Unknown'
, NULL
, NULL
, 'Unknown'
, NULL
, 'Unknown'
, NULL
, 'N'
, NULL
, NULL
, 'Unknown'
, 'Unknown'
, NULL
, NULL
, NULL
, 'Unknown'
, 'Unknown'
)
GO

EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Surrogate primary key', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'DateKey' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Full date as a SQL date (time=00:00:00)', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'FullDate' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Standard Date Format of YYYY/MM/DD', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'DateName' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Standard US Date Format of MM/DD/YYYY', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'DateNameUS' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Standard European Union Date Format of DD/MM/YYYY', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'DateNameEU' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Number of the day of week; Sunday = 1', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'DayOfWeek' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Day name of week', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'DayNameOfWeek' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Number of the day in the month', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'DayOfMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Number of the day in the year', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'DayOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Is today a weekday or a weekend', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'WeekdayWeekend' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Week of year', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'WeekOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Description', @value = N'Month name',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'MonthName' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Month of year', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'MonthOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Is this the last day of the calendar month?',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'IsLastDayOfMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Calendar quarter', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarQuarter' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Calendar year', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarYear' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Calendar year and month', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'CalendarYearMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Calendar year and quarter', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'CalendarYearQtr' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Fiscal month of year (1..12). FY starts in July',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalMonthOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Fiscal quarter', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalQuarter' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Fiscal year. Fiscal year begins in July.',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'FiscalYear' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Fiscal year and month', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'FiscalYearMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'Fiscal year and quarter', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'FiscalYearQtr' ;
EXEC sys.sp_addextendedproperty @name = N'Description',
@value = N'What process loaded this row?', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'AuditKey' ;
EXEC sys.sp_addextendedproperty @name = N'FK To',
@value = N'DimAudit.AuditKey', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'AuditKey' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'20041123', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'DateKey' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'11/23/2004', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FullDate' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'23-Nov-2004', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'DateName' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values', @value = N'1..7',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DayOfWeek' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values', @value = N'Sunday',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'DayNameOfWeek' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values', @value = N'1..31',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DayOfMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values', @value = N'1..365',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DayOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'Weekday, Weekend', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'WeekdayWeekend' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'1..52 or 53', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'WeekOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'November', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'MonthName' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'1, 2, …, 12', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'MonthOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values', @value = N'Y, N',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'IsLastDayOfMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'1, 2, 3, 4', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarQuarter' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values', @value = N'2004',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarYear' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values', @value = N'2004-01',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarYearMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values', @value = N'2004Q1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarYearQtr' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'1, 2, …, 12', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalMonthOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'1, 2, 3, 4', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalQuarter' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values', @value = N'2004',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'FiscalYear' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'FY2004-01', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalYearMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Example Values',
@value = N'FY2004Q1', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalYearQtr' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DateName' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DayOfWeek' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'DayNameOfWeek' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DayOfMonth' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DayOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'WeekdayWeekend' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'WeekOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'MonthName' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'MonthOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'IsLastDayOfMonth' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarQuarter' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarYear' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarYearMonth' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarYearQtr' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalMonthOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalQuarter' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'FiscalYear' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalYearMonth' ;
EXEC sys.sp_addextendedproperty @name = N'SCD  Type', @value = N'1',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalYearQtr' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DateKey' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'FullDate' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DateName' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DayOfWeek' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'DayNameOfWeek' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DayOfMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'DayOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'WeekdayWeekend' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'WeekOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'MonthName' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'MonthOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'IsLastDayOfMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarQuarter' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarYear' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarYearMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'CalendarYearQtr' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalMonthOfYear' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalQuarter' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN', @level2name = N'FiscalYear' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalYearMonth' ;
EXEC sys.sp_addextendedproperty @name = N'Source System', @value = N'Derived',
@level0type = N'SCHEMA', @level0name = N'Dim', @level1type = N'TABLE',
@level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'FiscalYearQtr' ;
EXEC sys.sp_addextendedproperty @name = N'Source System',
@value = N'Derived in ETL', @level0type = N'SCHEMA', @level0name = N'Dim',
@level1type = N'TABLE', @level1name = N'Date', @level2type = N'COLUMN',
@level2name = N'AuditKey' ;
EXEC sys.sp_addextendedproperty @name = N'Comments', -- property name assumed; the original EXEC line was lost
@value = N'In the form: yyyymmdd', @level0type = N'SCHEMA',
@level0name = N'Dim', @level1type = N'TABLE', @level1name = N'Date',
@level2type = N'COLUMN', @level2name = N'DateKey' ;
GO
```

## TechMixer University – SSIS for Developers

In addition to helping recruit speakers, I also had the privilege of speaking at TechMixer University 2009.

The slide deck and main demo can be found at my Code Gallery site:

https://code.msdn.microsoft.com/Release/ProjectReleases.aspx?ProjectName=SSISForDevs&ReleaseId=2883

The calling of SSIS from .Net demo can be found at:

http://code.msdn.microsoft.com/ssisfromnet

Thanks to everyone who attended TechMixer University. I look forward to seeing you next year!

## SQL Saturday Redmond – October 3 2009

I am fortunate enough to be able to give three presentations at Redmond WA’s SQL Saturday event. The first session is “Introduction to Data Warehousing / Business Intelligence”. Here is the PDF slide deck for that presentation. (Right click and save as if you want to save a copy for later reference.)

The second presentation is SQL Server Full Text Searching. You can find the slide deck in PDF format as well as sample code at http://code.msdn.microsoft.com/SqlServerFTS.

The final presentation of the day is Introduction to SQL Server Integration Services. The sample project, slide deck, and step by step instructions can be found at http://code.msdn.microsoft.com/introssis. In addition I will also show how to call SSIS from a .Net application. You can find that sample at http://code.msdn.microsoft.com/ssisfromnet.

## Calling SSIS from .Net

In a recent DotNetRocks show, episode 483, Kent Tegels was discussing SQL Server Integration Services and how it can be useful to both the BI developer and the traditional application developer. While today I am a SQL Server BI guy, I come from a long developer background and could not agree more. SSIS is a very powerful tool that could benefit many developers, even those not on Business Intelligence projects. It was a great episode, and I highly encourage everyone to listen.

There is one point though that was not made very clear, but I think is tremendously important. It is indeed possible to invoke an SSIS package from a .Net application if that SSIS package has been deployed to the SQL Server itself. This article will give an overview of how to do just that. All of the sample code here will also be made available in download form from the companion Code Gallery site, http://code.msdn.microsoft.com/ssisfromnet .

In this article, I do assume a few prerequisites. First, you have a SQL Server with SSIS installed, even if it’s just your local development box with SQL Server Developer Edition. Second, I don’t get into much detail on how SSIS works, since the package is very easy to understand; however, you may wish to have a reference handy. You may also need the assistance of your friendly neighborhood DBA in setting up the SQL job used in the process.

Summary

While the technique is straightforward, there are a fair number of detailed steps involved. For those of you just wanting the overview, we need to start with some tables (or other data) we want to work with. After that we’ll write the SSIS package to manipulate that data.

Once the package is created it must be deployed so SQL Server will know about it. This deployment can be to the file system or to SQL Server itself.

Once deployed, a SQL Server Job must be created that executes the deployed SSIS package.

Finally, you can execute the job from your .Net application via ADO.NET and a call to the sp_start_job stored procedure built into the msdb system database.
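The whole final step boils down to sending one T-SQL command through your data access layer. The demo in this article uses ADO.NET, but the command itself is trivial to build; here is a hedged sketch in Python (the job name LoadDNRShows is just a placeholder for whatever you name your job) that assembles it:

```python
def start_job_command(job_name: str) -> str:
    """Build the T-SQL that asks SQL Server Agent (via msdb) to start a job.
    Doubling single quotes guards against a quote in the job name."""
    safe = job_name.replace("'", "''")
    return f"EXEC msdb.dbo.sp_start_job @job_name = N'{safe}'"

# The resulting string is what you would execute through ADO.NET
# (or any other client) against the msdb database.
print(start_job_command("LoadDNRShows"))
# EXEC msdb.dbo.sp_start_job @job_name = N'LoadDNRShows'
```

Note that sp_start_job returns as soon as the job is queued; it does not wait for the package to finish.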

OK, let’s get to coding!

Setup the Tables

First we need some data to work with. What better than a listing of previous Dot Net Rocks episodes? I simply went to the Previous Shows page, highlighted the three columns of show number, show name, and date, and saved them to a text file. (Available on the Code Gallery site.)

Next we need a place to hold data so SSIS can work with it. I created a database and named it ArcaneCode, however any database should work. Next we’ll create a table to hold “staging” DNR Show data.

```sql
CREATE TABLE [dbo].[staging_DNRShows](
[ShowData] [varchar](250) NOT NULL
) ON [PRIMARY]
```

This table will hold the raw data from the text file, each line in the text file becoming one row here. Next we want a table to hold the final results.

```sql
CREATE TABLE [dbo].[DNRShows](
[ShowNumber] [int] NOT NULL,
[ShowName] [varchar](250) NULL,
[ShowDate] [datetime] NULL,
CONSTRAINT [PK_DNRShows] PRIMARY KEY CLUSTERED
(
[ShowNumber] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
```

The job of the SSIS package will be to read each row in the staging table and split it into 3 columns, the show’s number, name, and date, then place those three columns into the DNRShows table above.

The SSIS Package

The next step is to create the SSIS package itself. Open up Visual Studio / BIDS and create a new SQL Server Integration Services project from the Business Intelligence project types. First let’s set up a shared Data Source to the local server, using the ArcaneCode database as our source.

The default package name of “Package.dtsx” isn’t very informative, so let’s rename it “LoadDNRShows.dtsx”. Start by adding a reference to the shared data source in the Connection Managers area, taking the default. Then on the Control Flow surface add three tasks, as seen here:

The first task is an Execute SQL Task that simply runs a “DELETE FROM dbo.DNRShows” command to wipe out what was already there. Of course in a true application we’d be checking for existing records in the data flow and doing updates or inserts, but for simplicity in this example we’ll just wipe and reload each time.

The final task is also an Execute SQL Task; after we have processed the data we no longer need it in the staging table, so we’ll issue a “DELETE FROM dbo.staging_DNRShows” to remove it.

The middle item is our Data Flow Task. This is what does the heavy lifting of moving the staging data to the main table. Here is a snapshot of what it looks like:

The first item is our OLE DB Source; it references the staging_DNRShows table. Next is what’s called a Derived Column Transformation. This will allow you to add new calculated columns to the flow, or add columns from variables. In this case we want to add three new columns, based on the single column coming from the staging table.

As you can see under Columns in the upper left, we have one column in our source, ShowData. In the lower half we need to add three new columns: ShowNumber, ShowDate, and ShowName. Here are the expressions for each:

ShowNumber
(DT_I4)SUBSTRING(ShowData,1,FINDSTRING(ShowData,"\t",1) - 1)

ShowDate
(DT_DBDATE)SUBSTRING(ShowData,FINDSTRING(ShowData,"\t",2) + 1,LEN(ShowData) - FINDSTRING(ShowData,"\t",2))

ShowName
(DT_STR,250,1252)SUBSTRING(ShowData,FINDSTRING(ShowData,"\t",1) + 1,FINDSTRING(ShowData,"\t",2) - FINDSTRING(ShowData,"\t",1) - 1)

The syntax is an odd blend of VB and C#. Each expression starts with a “(DT_”; these are type casts, converting the result of the rest of the expression to the type we need. For example, (DT_I4) converts to a four byte integer, which we need because in our database the ShowNumber column was defined as an integer. You will see SUBSTRING and LEN, which work like their VB counterparts. FINDSTRING works like Pascal’s old POS function: it finds the location of the text and returns that position. The “\t” represents the tab character; here the C# fans win out, as the Expression editor uses C#-style escapes for special characters: \t for tab, \b for backspace, and so on.
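If the expressions above look cryptic, here is the same tab-splitting logic written as plain C#. This is purely illustrative (the `ParseShowData` function is my own name, not part of the package); in the package itself SSIS does this work inside the data flow:

```csharp
using System;

// Illustrative only: the same parsing the three Derived Column expressions perform.
(int ShowNumber, string ShowName, DateTime ShowDate) ParseShowData(string showData)
{
    int firstTab = showData.IndexOf('\t');                 // FINDSTRING(ShowData, "\t", 1)
    int secondTab = showData.IndexOf('\t', firstTab + 1);  // FINDSTRING(ShowData, "\t", 2)

    // Everything before the first tab, cast to an integer -- the (DT_I4) expression
    int showNumber = int.Parse(showData.Substring(0, firstTab));

    // The text between the two tabs -- the (DT_STR,250,1252) expression
    string showName = showData.Substring(firstTab + 1, secondTab - firstTab - 1);

    // Everything after the second tab, cast to a date -- the (DT_DBDATE) expression
    DateTime showDate = DateTime.Parse(showData.Substring(secondTab + 1));

    return (showNumber, showName, showDate);
}

var sample = ParseShowData("400\tSample Show Title\t1/1/2009");
Console.WriteLine($"{sample.ShowNumber} | {sample.ShowName} | {sample.ShowDate:d}");
```

One difference to keep in mind: FINDSTRING is 1-based while IndexOf is 0-based, which is why the SSIS expressions add and subtract 1 in slightly different places.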

Finally we need to write out the data. For this simply add an OLEDB Destination and set it to the target table of dbo.DNRShows. On the mappings tab make sure our three new columns map correctly to the columns in our target table.

Deploy the Package

This completes the coding for the package, but there is one final step we need to do. First, in the solution explorer right click on the project (not the solution, the project as highlighted below) and pick properties.

In the properties dialog, change the “CreateDeploymentUtility” option from false (the default) to True.

Now click the Build, Build Solution menu item. If all went well you should see the build was successful. It’s now time to deploy the package to the server. Navigate to the folder where your project is stored; under it you will find a bin folder, and in it a Deployment folder. In there you should find a file with a “.SSISDeploymentManifest” extension. Double click on this file to launch the Package Installation Wizard.

When the wizard appears there are two choices: File system deployment and SQL Server deployment. For our purposes we can use either one; there are pros and cons to each, and many companies generally pick one or the other. In this example we’ll pick SQL Server deployment, but know that I’ve tested this both ways and either method will work.

Once you pick SQL Server deployment, just click Next. Now it asks for the server name; I’ve left it at (local) since I’m working with this on a development box, and likewise I’ve left “Use Windows Authentication”. Finally I need the package path, which I can select by clicking the ellipsis (the …) to the right of the text box. This brings up a dialog where I can select where to install.

In a real world production scenario we’d likely have branches created for each of our projects, but for this simple demo we’ll just leave it in the root and click OK.

Once your form is filled out as below, click Next.

Next we are asked what our installation folder should be. This is where SSIS will cache package dependencies. Your DBA may have a special spot set up for these; if not, just click Next to continue.

Finally we are asked to confirm we know what we are doing. Just click Next. If all went well, the install wizard shows us it’s happy with a report, and we can click Finish to exit.

Setup the SQL Server Job

We’ve come a long way and we’re almost to the finish line, just one last major step. We will need to setup a SQL Server Job which will launch the SSIS package for us. In SQL Server Management Studio, navigate to the “SQL Server Agent” in your Object Explorer. If it’s not running, right click and pick “Start”. Once it’s started, navigate to the Jobs branch. Right click and pick “New Job”.

When the dialog opens, start by giving your job a name. As you can see below I used LoadDNRShows. I also entered a description.

Now click on the Steps page in the left “Select a page” menu. At the bottom click “New” to add a new job step.

In the job step properties dialog, let’s begin by naming the step “Run the SSIS package”. Change the Type to “SQL Server Integration Services Package”. When you do, the dialog will update to give options for SSIS. Note the Run As drop-down; this specifies the account to run under. For this demo we’ll leave it as the SQL Server Agent Service Account, but check with your DBA, as he or she may have other instructions.

In the tabbed area the General tab first allows us to pick the package source. Since we deployed to SQL Server we’ll leave it at the default, however if you had deployed to the file system this is where you’d need to change it to pick your package.

At the bottom we can use the ellipsis to pick our package from a list. That done, your screen should look something like:

For this demo that’s all we need to set, but I do want to take a second to encourage you to browse through the other tabs. Through these tabs you can set many options related to the package. For example, you could alter the data sources, allowing you to use one package with multiple databases.

Click OK to close the job step, then OK again to close the Job Properties window. Your job is now setup!

Calling from .Net

The finish line is in sight! Our last step is to call the job from .Net. To make it a useful example, I also wanted the .Net application to upload the data the SSIS package will manipulate. For simplicity I created a WinForms app, but this could easily be done in any environment. I also went with C#; again, the VB.Net code would be almost identical.

I started by creating a simple WinForm with two buttons and one label. (Again the full project will be on the Code Gallery site).

In the code, first be sure to add two using statements to the standard list:

using System.Data.SqlClient;
using System.IO;

Behind the top button we’ll put the code to copy the data from the text file we created from the DNR website to the staging table.

private void btnLoadToStaging_Click(object sender, EventArgs e)
{
  // This method takes the data in the DNRShows.txt file and uploads it to the staging table.
  // The routine is nothing magical: standard code to read a text file and insert each line.

  // Note, be sure to change to your correct path
  string filename = @"D:\Presentations\SQL Server\Calling SSIS From Stored Proc\DNRShows.txt";
  string line;

  // If you used a different db than ArcaneCode be sure to set it here
  string connect = "server=localhost;Initial Catalog=ArcaneCode;Integrated Security=SSPI;";
  SqlConnection connection = new SqlConnection(connect);
  connection.Open();
  SqlCommand cmd = connection.CreateCommand();

  // Wipe out previous data in case of a crash
  string sql = "DELETE FROM dbo.staging_DNRShows";
  cmd.CommandText = sql;
  cmd.ExecuteNonQuery();

  // Now setup for new inserts
  sql = "INSERT INTO dbo.staging_DNRShows (ShowData) VALUES (@myShowData)";
  cmd.CommandText = sql;
  cmd.Parameters.Add("@myShowData", SqlDbType.VarChar, 250);

  // Loop thru text file, insert each line to staging table
  StreamReader sr = null;
  try
  {
    sr = new StreamReader(filename);
    line = sr.ReadLine();
    while (line != null)
    {
      cmd.Parameters["@myShowData"].Value = line;
      cmd.ExecuteNonQuery();
      lblProgress.Text = line;
      line = sr.ReadLine();
    }
  }
  finally
  {
    if (sr != null)
      sr.Close();
    connection.Close();
    lblProgress.Text = "Data has been loaded";
  }
}

Before you ask, yes, I could have used any number of data access technologies, such as LINQ. I went with ADO.NET for simplicity, believing most developers are familiar with it due to its longevity. Do be sure to update the database name and the path to the file in both this and the next example when you run the code.

This code really does nothing special, just loops through the text file and uploads each line as a row in the staging table. It does however serve as a realistic example of something you’d do in this scenario, upload some data, then let SSIS manipulate it on the server.

Once the data is there, it’s finally time for the grand finale. The code behind the second button, Execute SSIS, does just what it says; it calls the job, which invokes our SSIS package.

private void btnRunSSIS_Click(object sender, EventArgs e)
{
  string connect = "server=localhost;Initial Catalog=ArcaneCode;Integrated Security=SSPI;";
  SqlConnection connection = new SqlConnection(connect);
  connection.Open();
  SqlCommand cmd = connection.CreateCommand();

  // Start the SQL Server job that runs our SSIS package
  string sql = "exec msdb.dbo.sp_start_job N'LoadDNRShows'";
  cmd.CommandText = sql;
  cmd.ExecuteNonQuery();

  connection.Close();
  lblProgress.Text = "SSIS Package has been executed";
}

The key is this SQL command:

exec msdb.dbo.sp_start_job N'LoadDNRShows'

“exec” is the T-SQL command to execute a stored procedure. “sp_start_job” is a stored procedure that ships with SQL Server in the msdb system database. It will start any job stored on the server; in this case it starts the job “LoadDNRShows”, which as we set up will run our SSIS package.
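One thing worth knowing about sp_start_job: it starts the job and returns immediately, so the ExecuteNonQuery call above does not wait for the SSIS package to finish. If you need to know when the job completed, one approach (a sketch only; your DBA may prefer sp_help_job) is to check the job activity tables in msdb:

```sql
-- Start the job; control returns as soon as the job is launched
EXEC msdb.dbo.sp_start_job N'LoadDNRShows';

-- Later, check the most recent run: a NULL stop_execution_date
-- means the job is still running
SELECT TOP 1 ja.start_execution_date, ja.stop_execution_date
FROM msdb.dbo.sysjobactivity ja
INNER JOIN msdb.dbo.sysjobs j ON j.job_id = ja.job_id
WHERE j.name = N'LoadDNRShows'
ORDER BY ja.start_execution_date DESC;
```

For our demo the fire-and-forget behavior is fine, since we check the results ourselves in Management Studio.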

Launch the application, and click the first button. Now jump over to SQL Server Management Studio and run this query:

select * from dbo.staging_DNRShows;

select * from dbo.DNRShows;

You should see the first query bring back rows, while the second has nothing. Now return to the app and click the “Execute SSIS” button. If all went well running the query again should now show no rows in our first query, but many nicely processed rows in the second. Success!

In researching this article I saw many references suggesting writing a stored procedure that uses xp_cmdshell to invoke dtexec. DTEXEC is the command line utility that you can use to launch SSIS Packages. Through it you can override many settings in the package, such as connection strings or variables.

xp_cmdshell is a utility built into SQL Server. Through it you can invoke any “DOS” command. Thus you could dynamically generate a dtexec command, and invoke it via xp_cmdshell.

The problem with xp_cmdshell is you can use it to invoke ANY “DOS” command. Any of them, including, say, “DEL *.*”. xp_cmdshell can be a security hole; for that reason it is turned off by default in SQL Server, and many DBAs leave it off and are not likely to turn it on.
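For the curious, you can see whether xp_cmdshell is enabled on a server with sp_configure (shown only for illustration; I am not suggesting you turn it on):

```sql
-- 'xp_cmdshell' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- A run_value of 0 means xp_cmdshell is disabled (the default)
EXEC sp_configure 'xp_cmdshell';
```

On most servers you will find the run_value is 0, which is exactly why the job-based approach in this article is the safer bet.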

The technique I’ve demonstrated here does not rely on xp_cmdshell. In fact, all of my testing was done on my server with xp_cmdshell turned off. Even though it involves a bit of extra work, setting up the job and so on, I still advise it over the xp_cmdshell method, both for security and for the ability to use it on any server regardless of that setting.

In Closing

That may have seemed like a lot of effort, but it can lead to some very powerful solutions. SSIS is a very powerful tool designed for processing and transforming large amounts of data. In addition, developing in SSIS can be very fast due to its declarative nature; the sample package from this article took me less than fifteen minutes to code and test.

When faced with a similar task, consider allowing SSIS to handle the bulk work and just having your .Net application invoke your SSIS package. Once you do, there is no end to the uses you’ll find for SQL Server Integration Services.

## Intro to DW/BI at the Steel City SQL Users Group

Tonight I’ll be presenting “Introduction to Data Warehousing / Business Intelligence” at the Steel City SQL users group, right here in Birmingham Alabama. If you attended my Huntsville presentation last week, I’ve already added some new slides and revised the deck, so it will be worth another look.

My slide deck is IntroToDataWarehouse.pdf. Come join us tonight at 6 pm at New Horizons; there will be pizza and fun for all.

UPDATE: Before the presentation I was showing a video of Sara Ford jumping off a tower to support CodePlex. Got tons of laughs so here’s a link to the video:

http://blogs.msdn.com/saraford/archive/2009/09/14/my-codeplex-jump-from-tallest-building-in-the-southern-hemisphere-the-full-video.aspx

## Intro to DW/BI at the Huntsville User Group meeting

Tonight at the Huntsville User group I am presenting “Introduction to Data Warehousing / Business Intelligence”. The slide deck for my presentation is now available in PDF format at the link below. If you have attended this presentation in the past you may wish to download the slides again as I have updated it with new information.

IntroToDataWarehousing

## Data Warehousing / BI at the next HuntUG Meeting!

Business Intelligence is one of the most in demand skill sets right now. Do you want to know more about it? Be guided through all the terminology and concepts? Do you live in the Huntsville Alabama area? Well here’s your golden opportunity!

On Tuesday, September 8th I will be presenting “Introduction to Data Warehousing / Business Intelligence” at the next meeting of the Huntsville User Group. I’ll demystify all the terms around DW/BI and give a demonstration of the Microsoft SQL Server tools used in the DW/BI process. See their site for meeting time and directions.

## Welcome to COMFRAME

I admit to being remiss lately, my poor blog has been neglected for these past few weeks. I can only plead mea culpa and explain.

A few weeks ago I had an opportunity placed before me that I simply could not refuse. I’d been happy at my old job and wasn’t looking, but a good friend of mine works for a great company called COMFRAME. They are a consulting firm that does a variety of things, including Enterprise Project Management, .Net and Java development projects, SOA, and most important to me, Business Intelligence.

To make a long story short, my friend took a lesson from the Godfather movies and “made me an offer I couldn’t refuse”. I am now a COMFRAME employee! The work is very exciting: I’ll be an architect on a BI project that is using Silverlight 3 for its front end. We are working with data from Microsoft Project; not only that, it’s the world’s biggest implementation of Project Server, so I’ll get to work with the fine folks at Microsoft even more closely. We’re also a Microsoft Partner, which will give me new avenues for relationships that will complement my MVP.

I got to meet the customer this week; though the meeting was brief, they seemed very easy to work with, and nice as well. I also got to meet the development team I’ll be joining; I’m impressed with the work they’ve done so far and can’t wait to roll up my sleeves and dive in.

I’ve had a crazy time wrapping up my old job and starting my new one; hopefully I can get back to regular blogging soon. I’ve been doing a lot with SSIS and SSAS, which will give me lots of good material to talk about, not to mention any Silverlight 3 work I get to explore.

## SQL Saturday 7 – Introduction to Data Warehousing and Business Intelligence

At the Birmingham SQL Saturday 2009 I am presenting “Introduction to Data Warehousing and Business Intelligence”.

You can download the slide deck for this presentation in PDF format.

Any sample code came from either my Intro to SSIS presentation or the book Programming SQL Server 2008.

## Introduction To Data Warehousing and Business Intelligence

At the Atlanta SQL Saturday 2009 one of the presentations I am doing is “Introduction to Data Warehousing and Business Intelligence”.

You can download the slide deck for this presentation in PDF format.

Any sample code came from either my Intro to SSIS presentation or the book Programming SQL Server 2008.