Two New PowerShell Courses for Developers on Pluralsight

I'm proud to announce I have not one but two, yes TWO, new PowerShell courses available on Pluralsight, targeted at developers. These two courses are designed to work in harmony to show developers how they can leverage PowerShell to automate and assist in their daily work.

The first is “PowerShell 7 Quick Start for Developers on Linux, macOS, and Windows”. It leverages your knowledge as a developer to bring you up to speed on the PowerShell language. It doesn’t waste time explaining concepts you already know, like variables and loops. It simply shows you how to do things in PowerShell.

The second course is "Everyday PowerShell for Developers on Linux, macOS, and Windows". It begins by showing how to combine PowerShell with Docker to create a PHP container and test a simple website. It then proceeds to create an Azure SQL database and load data into it.

Next, the course teaches you how to code your own classes and modules by creating a simple module that leverages a USPS web API to look up a ZIP code and return the city and state it belongs to.

In the final part of the course you are shown how to use the new DataFabricator module to generate realistic-looking but fake data for use in testing your applications.

While originally developed with PowerShell 7.0.3, all code in the course was tested and videoed using PowerShell 7.1.0.

Additionally, the code was tested on a variety of platforms, including Ubuntu 20.04 and 20.10, Windows 10 (20H1 and 20H2), macOS Catalina, and even macOS Big Sur.

If you don't have a Pluralsight subscription, just go to the Pluralsight page and click the Try for Free link to get a free 10-day trial.


Arcane Fun Fridays–Music to Code By

I love music; it's great to listen to when I read or program. I do find music with a lot of lyrics distracting when I'm trying to concentrate, thus action/adventure soundtracks are one of my favorite genres. Some surprisingly good music can be found in video game soundtracks. I thought I'd share a few of my favorites with you.

Call of Duty – Modern Warfare 2 – This is an awesome score. I love the game, and love the music even more. It's fast paced and will really keep your energy flowing.

Call of Duty – Black Ops – Another great game, and another great soundtrack.

Army of Two – After finding the Call of Duty soundtracks, I began to explore the soundtracks from other video games. This is the first one that I thought was in the same class as the Call of Duty games. Good stuff.

Calling SSIS from .Net

In a recent DotNetRocks show, episode 483, Kent Tegels was discussing SQL Server Integration Services and how it can be useful to both the BI developer and the traditional application developer. While today I am a SQL Server BI guy, I come from a long developer background and could not agree more. SSIS is a very powerful tool that could benefit many developers, even those not on Business Intelligence projects. It was a great episode, and I highly encourage everyone to listen.

There is one point though that was not made very clear, but I think is tremendously important. It is indeed possible to invoke an SSIS package from a .Net application if that SSIS package has been deployed to the SQL Server itself. This article will give an overview of how to do just that. All of the sample code here will also be made available in download form from the companion Code Gallery site, http://code.msdn.microsoft.com/ssisfromnet .

In this article, I do assume a few prerequisites. First, that you have a SQL Server with SSIS installed, even if it's just your local development box with SQL Server Developer Edition. Second, I don't go into much detail on how SSIS works; the package is very easy to understand, but you may wish to have a reference handy. You may also need the assistance of your friendly neighborhood DBA in setting up the SQL job used in the process.

Summary

While the technique is straightforward, there are a fair number of detailed steps involved. For those of you just wanting the overview, we need to start with some tables (or other data) we want to work with. After that we’ll write the SSIS package to manipulate that data.

Once the package is created it must be deployed to the server so SQL Server will know about it. This deployment can be to the file system or to SQL Server itself.

Once deployed, a SQL Server Job must be created that executes the deployed SSIS package.

Finally, you can execute the job from your .Net application via ADO.NET and a call to the sp_start_job stored procedure built into the msdb system database.

OK, let’s get to coding!

Set Up the Tables

First we need some data to work with. What better than a listing of previous Dot Net Rocks episodes? I simply went to the Previous Shows page, highlighted the three columns of show number, show name, and date, and saved them to a text file. (Available on the Code Gallery site.)

Next we need a place to hold data so SSIS can work with it. I created a database and named it ArcaneCode, however any database should work. Next we’ll create a table to hold “staging” DNR Show data.

CREATE TABLE [dbo].[staging_DNRShows](
  [ShowData] [varchar](250) NOT NULL
) ON [PRIMARY]

This table will hold the raw data from the text file, each line in the text file becoming one row here. Next we want a table to hold the final results.

CREATE TABLE [dbo].[DNRShows](
  [ShowNumber] [int] NOT NULL,
  [ShowName] [varchar](250) NULL,
  [ShowDate] [datetime] NULL,
  CONSTRAINT [PK_DNRShows] PRIMARY KEY CLUSTERED
  (
  [ShowNumber] ASC
  )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
  ) ON [PRIMARY]

The job of the SSIS package will be to read each row in the staging table and split it into 3 columns, the show’s number, name, and date, then place those three columns into the DNRShows table above.

The SSIS Package

The next step is to create the SSIS package itself. Opening up Visual Studio / BIDS, create a new Business Intelligence SQL Server Integration Services project. First let's set up a shared Data Source to the local server, using the ArcaneCode database as our source.

The default package name of "Package.dtsx" isn't very informative, so let's rename it "LoadDNRShows.dtsx". Start by adding a reference to the shared data source in the Connection Managers area, taking the default. Then on the Control Flow surface add three tasks, as seen here:

clip_image001

The first task is an Execute SQL Task that simply runs a “DELETE FROM dbo.DNRShows” command to wipe out what was already there. Of course in a true application we’d be checking for existing records in the data flow and doing updates or inserts, but for simplicity in this example we’ll just wipe and reload each time.

The final task is also an Execute SQL Task; after we have processed the data we no longer need it in the staging table, so we'll issue a "DELETE FROM dbo.staging_DNRShows" to remove it.

The middle item is our Data Flow Task. This is what does the heavy lifting of moving the staging data to the main table. Here is a snapshot of what it looks like:

clip_image002

The first task is our OLEDB Source, it references the staging_DNRShows table. Next is what’s called a Derived Column Transformation. This will allow you to add new calculated columns to the flow, or add columns from variables. In this case we want to add three new columns, based on the single column coming from the staging table.

clip_image003

As you can see under Columns in the upper left, we have one column in our source, ShowData. In the lower half we need to add three new columns: ShowNumber, ShowDate, and ShowName. Here are the expressions for each:

ShowNumber
    (DT_I4)SUBSTRING(ShowData,1,FINDSTRING(ShowData,"\t",1))

ShowDate
    (DT_DBDATE)SUBSTRING(ShowData,FINDSTRING(ShowData,"\t",2) + 1,LEN(ShowData) - FINDSTRING(ShowData,"\t",2))

ShowName
    (DT_STR,250,1252)SUBSTRING(ShowData,FINDSTRING(ShowData,"\t",1) + 1,FINDSTRING(ShowData,"\t",2) - FINDSTRING(ShowData,"\t",1) - 1)

The syntax is an odd blend of VB and C#. Each expression starts with "(DT_", a type cast that converts the result of the rest of the expression to the type we need. For example, (DT_I4) converts to a four byte integer, which we need because in our database the ShowNumber column was defined as an integer. You will see SUBSTRING and LEN, which work like their VB counterparts. FINDSTRING works like the old Pos function: it finds the location of the search text and returns that position. The "\t" represents the tab character; here the C# fans win out, as the Expression editor uses C#-style escapes for special characters: \t for tab, \b for backspace, etc.
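If it helps to see what those expressions are doing, here is a rough C# equivalent of the same parsing applied to a single row. The sample line is made up purely for illustration; the SSIS expressions above are what actually run in the package.

    using System;
    using System.Globalization;

    class DerivedColumnSketch
    {
        static void Main()
        {
            // A made-up line from the staging table: show number, name, and date
            // separated by tabs, like the text saved from the Previous Shows page.
            string showData = "100\tA Sample Show Title\t1/15/2006";

            int firstTab = showData.IndexOf('\t');                // FINDSTRING(ShowData,"\t",1)
            int secondTab = showData.IndexOf('\t', firstTab + 1); // FINDSTRING(ShowData,"\t",2)

            // ShowNumber: the text before the first tab, cast to an integer (DT_I4)
            int showNumber = int.Parse(showData.Substring(0, firstTab));

            // ShowName: the text between the two tabs (DT_STR,250,1252)
            string showName = showData.Substring(firstTab + 1, secondTab - firstTab - 1);

            // ShowDate: the text after the second tab, cast to a date (DT_DBDATE)
            DateTime showDate = DateTime.Parse(
                showData.Substring(secondTab + 1), CultureInfo.InvariantCulture);

            Console.WriteLine("{0} | {1} | {2:d}", showNumber, showName, showDate);
        }
    }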

Finally we need to write out the data. For this simply add an OLEDB Destination and set it to the target table of dbo.DNRShows. On the mappings tab make sure our three new columns map correctly to the columns in our target table.

Deploy the Package

This completes the coding for the package, but there is one final step we need to do. First, in the solution explorer right click on the project (not the solution, the project as highlighted below) and pick properties.

clip_image004

In the properties dialog, change the “CreateDeploymentUtility” option from false (the default) to True.

clip_image006

Now click the Build, Build Solution menu item. If all went well you should see the build was successful. It’s now time to deploy the package to the server. Navigate to the folder where your project is stored, under it you will find a bin folder, and in it a Deployment folder. In there you should find a file with a “.SSISDeploymentManifest” extension. Double click on this file to launch the Package Installation Wizard.

When the wizard appears there are two choices, File system deployment and SQL Server deployment. For our purposes we can use either one, there are pros and cons to each and many companies generally pick one or the other. In this example we’ll pick SQL Server deployment, but again know that I’ve tested this both ways and either method will work.

Once you pick SQL Server deployment, just click Next. Now it asks you for the server name; I've left it at (local) since I'm working with this on a development box, and likewise I've left "Use Windows Authentication". Finally I need the package path, which I can select by clicking the ellipsis (the …) to the right of the text box. This brings up a dialog where I can select where to install.

clip_image007

In a real world production scenario we’d likely have branches created for each of our projects, but for this simple demo we’ll just leave it in the root and click OK.

Once your form is filled out as below, click Next.

clip_image008

Next we are asked what our installation folder should be. This is where SSIS will cache package dependencies. Your DBA may have a special spot set up for these; if not, just click Next to continue.

Finally we are asked to confirm we know what we are doing. Just click Next. If all went well, the install wizard shows a report confirming it's happy, and we can click Finish to exit.

Set Up the SQL Server Job

We’ve come a long way and we’re almost to the finish line, just one last major step. We will need to setup a SQL Server Job which will launch the SSIS package for us. In SQL Server Management Studio, navigate to the “SQL Server Agent” in your Object Explorer. If it’s not running, right click and pick “Start”. Once it’s started, navigate to the Jobs branch. Right click and pick “New Job”.

When the dialog opens, start by giving your job a name. As you can see below I used LoadDNRShows. I also entered a description.

clip_image010

Now click on the Steps page over on the left "Select a page" menu. At the bottom click "New" to add a new job step.

In the job step properties dialog, let’s begin by naming the step “Run the SSIS package”. Change the Type to “SQL Server Integration Services Package”. When you do, the dialog will update to give options for SSIS. Note the Run As drop down, this specifies the account to run under. For this demo we’ll leave it as the SQL Server Agent Service Account, check with your DBA as he or she may have other instructions.

In the tabbed area the General tab first allows us to pick the package source. Since we deployed to SQL Server we’ll leave it at the default, however if you had deployed to the file system this is where you’d need to change it to pick your package.

At the bottom we can use the ellipsis to pick our package from a list. That done, your screen should look something like this:

clip_image011

For this demo that's all we need to set, but I do want to take a second to encourage you to browse through the other tabs. Through these tabs you can set many options related to the package. For example, you could alter the data sources, allowing you to use one package with multiple databases.

Click OK to close the job step, then OK again to close the Job Properties window. Your job is now setup!

Calling from .Net

The finish line is in sight! Our last step is to call the job from .Net. To make it a useful example, I also wanted the .Net application to upload the data the SSIS package will manipulate. For simplicity I created a WinForms app, but this could easily be done in any environment. I also went with C#; the VB.Net code would be almost identical.

I started by creating a simple WinForm with two buttons and one label. (Again the full project will be on the Code Gallery site).

clip_image012

In the code, first be sure to add two using statements to the standard list:

using System.Data.SqlClient;
using System.IO;

Behind the top button we’ll put the code to copy the data from the text file we created from the DNR website to the staging table.

    private void btnLoadToStaging_Click(object sender, EventArgs e)
    {
      /* This method takes the data in the DNRShows.txt file and uploads it to a staging */
      /* table. The routine is nothing magical, standard stuff to read a text file and   */
      /* upload it to a table via ADO.NET                                                */

      // Note, be sure to change to your correct path
      string filename = @"D:\Presentations\SQL Server\Calling SSIS From Stored Proc\DNRShows.txt";
      string line;

      // If you used a different db than ArcaneCode be sure to set it here
      string connect = "server=localhost;Initial Catalog=ArcaneCode;Integrated Security=SSPI;";
      SqlConnection connection = new SqlConnection(connect);
      connection.Open();

      SqlCommand cmd = connection.CreateCommand();

      // Wipe out previous data in case of a crash
      string sql = "DELETE FROM dbo.staging_DNRShows";
      cmd.CommandText = sql;
      cmd.ExecuteNonQuery();

      // Now set up for new inserts
      sql = "INSERT INTO dbo.staging_DNRShows (ShowData) VALUES (@myShowData)";

      cmd.CommandText = sql;
      cmd.Parameters.Add("@myShowData", SqlDbType.VarChar, 250);

      StreamReader sr = null;

      // Loop thru the text file, inserting each line into the staging table
      try
      {
        sr = new StreamReader(filename);
        line = sr.ReadLine();
        while (line != null)
        {
          cmd.Parameters["@myShowData"].Value = line;
          cmd.ExecuteNonQuery();
          lblProgress.Text = line;
          line = sr.ReadLine();
        }
      }
      finally
      {
        if (sr != null)
          sr.Close();
        connection.Close();
        lblProgress.Text = "Data has been loaded";
      }
    }

Before you ask, yes, I could have used any number of data access technologies, such as LINQ. I went with ADO.NET for simplicity, and believing most developers are familiar with it due to its longevity. Do be sure to update the database name and the path to the file in both this and the next example when you run the code.

This code really does nothing special, just loops through the text file and uploads each line as a row in the staging table. It does however serve as a realistic example of something you’d do in this scenario, upload some data, then let SSIS manipulate it on the server.

Once the data is there, it’s finally time for the grand finale. The code behind the second button, Execute SSIS, does just what it says; it calls the job, which invokes our SSIS package.

    private void btnRunSSIS_Click(object sender, EventArgs e)
    {
      string connect = "server=localhost;Initial Catalog=ArcaneCode;Integrated Security=SSPI;";
      SqlConnection connection = new SqlConnection(connect);
      connection.Open();

      SqlCommand cmd = connection.CreateCommand();

      // Start the SQL Server Agent job that runs our SSIS package
      string sql = "exec msdb.dbo.sp_start_job N'LoadDNRShows'";
      cmd.CommandText = sql;
      cmd.ExecuteNonQuery();
      connection.Close();

      lblProgress.Text = "SSIS Package has been executed";
    }

The key is this sql command:

exec msdb.dbo.sp_start_job N'LoadDNRShows'

"exec" is the T-SQL command to execute a stored procedure. "sp_start_job" is a stored procedure that ships with SQL Server in the msdb system database. This stored procedure will invoke any job stored on the server. In this case, it invokes the job "LoadDNRShows", which, as we set it up, will run our SSIS package.

Launch the application, and click the first button. Now jump over to SQL Server Management Studio and run this query:

select * from dbo.staging_DNRShows;

select * from dbo.DNRShows;

You should see the first query bring back rows, while the second has nothing. Now return to the app and click the “Execute SSIS” button. If all went well running the query again should now show no rows in our first query, but many nicely processed rows in the second. Success!

A few thoughts about xp_cmdshell

In researching this article I saw many references suggesting writing a stored procedure that uses xp_cmdshell to invoke dtexec. DTEXEC is the command line utility that you can use to launch SSIS Packages. Through it you can override many settings in the package, such as connection strings or variables.

xp_cmdshell is a utility built into SQL Server. Through it you can invoke any “DOS” command. Thus you could dynamically generate a dtexec command, and invoke it via xp_cmdshell.

The problem with xp_cmdshell is that you can use it to invoke ANY "DOS" command. Any of them, such as, oh, let's say "DEL *.*". xp_cmdshell can be a security hole; for that reason it is turned off by default in SQL Server, and many DBAs leave it turned off and are not likely to turn it on.

The techniques I've demonstrated here do not rely on xp_cmdshell. In fact, all of my testing has been done on my server with xp_cmdshell turned off. Even though it can be a bit of extra work setting up the job, I still advise this approach over the xp_cmdshell method, for security and for the ability to use it on any server regardless of that setting.

In Closing

That seemed like a lot of effort, but it can lead to some very powerful solutions. SSIS is a very powerful tool designed for processing and transforming large amounts of data. In addition, developing in SSIS can be very fast due to its declarative nature. The sample package from this article took me less than fifteen minutes to code and test.

When faced with a similar task, consider allowing SSIS to handle the bulk work and just having your .Net application invoke your SSIS package. Once you do, there is no end to the uses you'll find for SQL Server Integration Services.

SSIS For Developers at CodeStock 2009

At the 2009 CodeStock event I am presenting SQL Server Integration Services for Developers. This class will demonstrate tasks commonly done by VB.Net or C# developers within SQL Server Integration Services.

The sample project and documentation for the lab can be found on the code gallery site at http://code.msdn.microsoft.com/SSISForDevs .

Gentlemen, JumpstartTV Your Engines

Thought I’d spread a little link love today, and to start with I will point you to the http://jumpstarttv.com website. JumpstartTV hosts short training videos with one very specific, focused topic per video. When I say short, I mean short. Three to five minutes is the goal for each video. I was honored recently when asked to participate in the site, and have created a series for them on SQL Server Full Text Searching. The first video on installing was featured yesterday, but you don’t have to wait for the videos to be featured, you can see all of them by jumping to my JumpstartTV profile.

One thing to note: you will be asked to create an online profile. This is free, and it turns out to be very useful. You can use it to track all of the videos you've watched, which makes it very convenient to come back later and refresh yourself on something you learned. In addition, the site has a "watch it later" feature. You can go all over the site picking out videos you think would be interesting and clicking the "watch it later" link. Then when you go to your profile, you'll be able to watch the selected videos one after the other. JumpstartTV has videos on both SQL Server and .Net, as well as some interesting ones in the "Misc" category, including bartending, self defense, and more.

The second link for the day is an interesting article on the simple-talk website, "Taking Back Control of your IT Career". It was written by a friend of mine, Stephan Onisick, and chronicles his ordeal of getting laid off from his company of seven years, going through a period of retraining himself, and ultimately landing a new job that met the goals he set out. Even if your company is nice and stable, you will find good advice for keeping your skills up in this article. Disclaimer: he does mention a presentation I did in the article, but in spite of that it's still a good read. 😉

Next is a new SQL Server resource brought to us by the fine folks at Quest Software: the new SQLServerPedia. The site is both a wiki and a series of podcast-like videos you can subscribe to from your Zune or other music player. I have my Zune set up to automagically download new episodes as they come out. I believe it was @BrentO himself who clued me in on the site.

I’ve written in the past about CodeRush, the tool I refuse to code without. Well the wonderful folks at Devexpress have created a free version called CodeRush Xpress for Visual Studio. Now if you need to code on a budget, you can still enjoy CodeRushy goodness in your 2008 IDE! And it’s not even Christmas yet!

Many of you follow me on Twitter; if you don't, I'd love to invite you, I'm on as @arcanecode. Guy Kawasaki has a great article on How To Pick Up Followers on Twitter. It's a good article that shows some of the strengths of Twitter, and how to use them to everyone's advantage.

Speaking of Twitter, thanks to @theronkelso I found a new service called TweetLater. This service lets you schedule a tweet to be delivered to Twitter at a later time. For example, I would like to be able to tweet that our BSDA meeting is about to begin. But as the current President I’m usually up front introducing the guest speaker, and thus not at a keyboard. TweetLater to the rescue, I can set it to auto post the meeting is starting and be in two places at once.

It’s also great as a reminder tool, I can queue up meeting reminder tweets for the entire year ahead of time and forget all about it. Another feature, you can set it to auto reply with a message to new followers, and it can even be setup to automatically follow anyone who is following you. I believe this is a resource I’ll be using a lot.

The next-to-final link is really a reminder, to the Alabama Tech Events site. This is a community site for posting technical events of interest to folks in the state of Alabama. Please note that the event doesn't have to be in Alabama, just of reasonable interest to folks in the state. We've posted events in Tennessee, Mississippi, Florida, and Georgia. If you have a technical event, contact me or one of the other user group leaders to get it added.

I’ll wrap up today’s link lovefest with the site analogous to the Alabama Tech Event site, but for the entire country: Community Megaphone. This site lists events from all over the United States. You can filter by state or event type.

Devs and Data Dudes Oh My!

Microsoft has made a big announcement regarding the next version of Visual Studio, Visual Studio 2010. Among other welcome news is that the Developer Edition and Database Editions of Team System will be merging into a single product. This is great news for folks like me (and I suspect many developers) who do a lot of work on both the database side as well as the application side.

The really great news though is that we don’t have to wait for 2010 to take advantage of this. As part of the announcement Microsoft said that effective October 1st, 2008 people who are MSDN Licensed for the 2008 (or 2005) version of Visual Studio Team System Developer will now have access to the Database version, and vice versa.

Database Edition has some great features. One of the ones I use the most is the database comparison tool. It lets me compare data in one database with another and get them into sync. This is great for keeping my local development database that sits on my computer identical with our production system.

I'm sure Database Edition will be new to many developers, so I'd like to mention two books that will help you get up and running with "Data Dude" (as Database Edition is often called).

The first I have mentioned before: SQL MVP Andy Leonard's Mastering Visual Studio Team System Database Edition Volume 1. This is a great book that focuses exclusively on the Database Edition. It's a great resource and one of my favorite books on the subject; I can't wait for the next volume to come out.

The other book is from Apress, Pro Visual Studio Team System with Team Edition for Database Professionals. This book covers all aspects of VSTS, including the database tools. I think too often we make life harder on ourselves than we have to; if we took some time to learn the tools available to us, we could be much more productive. I've found this book to be a good aid to help me do just that.

My Dev Kit

There’s a new meme of sorts on the web, folks talking about the tools they use to develop with. I first saw it on Shawn Wildermuth’s blog. Shawn’s a great guy, he co-wrote most of those .Net MCTS/MCPD study guides from MS Press, and does a lot of training on Silverlight. So I thought I would keep the meme alive and talk about my own tools.

Hardware

I do a lot on the road, so a laptop is essential. Mine's getting up there in age; it's an HP Pavilion dv8000 with 2 gigs of RAM, two internal 160 gig hard disks, a 17 inch wide screen, and a single core 64 bit processor. It's OK, but will hopefully get replaced next year with something with more cores and horsepower. I don't care much for the keyboard, so I bought an external keyboard from Lenovo. It's got a trackpoint so I don't have to take my hands off the keyboard very often, and I use it with both my laptop and the Dell that work supplies me.

At home I use a larger wireless Microsoft mouse, on the road I use one of the smaller Microsoft travel mice. Also in my hardware list is an external Seagate 1TB drive. It hooks up via either firewire or USB, which is nice when my USB ports are all full.

Also in my list is my Zune. Yes, my Zune. Cubicle farms can get noisy at times, and some good tunes on my Zune really help me to zone out and ignore my surroundings, focusing on my code. It's also nice on my commute or daily walk, when I listen to podcasts to keep up my technical knowledge. At night I hook it to my TV via my Xbox 360 to watch video podcasts, or sometimes I lie in bed before going to sleep and watch.

My final piece of hardware is my iPaq, it helps keep my appointments in line and my contacts, plus I have lots of e-books loaded on it for reading. I also used to use it for podcasts prior to getting my Zune.

Operating System and Dev Tools

My laptop currently runs 32 bit Vista Ultimate with Service Pack 1. Since it maxes out at 2 gig, and some 64 bit drivers were not available when Vista first arrived, I saw no benefit to 64 bit and took the path of least resistance. I have quite a few virtual machines in a variety of OS (Server 2008, 2003, XP, Vista, and Ubuntu) for testing, development, and running Beta versions of programs. For a web browser, I bounce back and forth between FireFox and IE7. For a while I was using FF most of the time, but IE7 was a big improvement over 6, and I’m now using them about 50/50. I suspect when IE8 comes out I may be using it more, but will have to see.

Like Shawn I also use Outlook 2007 for my e-mail client. It's so much easier to organize my mail in Outlook than in the Gmail web interface. But I also use the other features, such as the calendar and task list, to help manage my life. I also use the rest of the Office suite for my daily tasks.

I use SnagIt for grabbing still screen captures (awesome tool), and Camtasia for video screen captures. I'm working on several video tutorials now, which is fun but time consuming (which also explains why my blog posts have been sparse of late). I use Paint.Net for basic photo / image editing. For creating my blog posts, I write them originally in Word 2007, then use Windows Live Writer to post them to my blog.

For quick access to my daily programs, I use one of two things. I really like Bayden Systems SlickRun. I also create a shortcut menu using a technique I blogged about in February.

Developer Tools

As you might expect I use both SQL Server Management Studio and Visual Studio 2008 Team System for day to day development. My top add-ins are Red Gate's SQL Prompt bundle for SSMS and CodeRush for Visual Studio. For a text editor, I absolutely love UltraEdit. Since I have blogged a lot about my dev tools in the past, I will keep this section short.

The Cloud

I’m on a couple of social networking sites, in addition to this blog:

· Twitter

· Posterous

· LinkedIn

· MSDN Code Gallery – One site for SQL Server Full Text Searching and one for SQL Server Compact Edition.

Passing the Baton

OK, your turn, let’s see your blog with your tools!

Ted Neward goes well with cheesy stuffed burritos

I had quite the adventure last week getting to DevLink. En route Thursday night, part of the electrical system in my old pickup decided to implode, leaving me stranded on the side of the interstate. I got it towed and had to wait for a friend to come get me (thanks Ben!). The closest place to wait was the Mexican Phone Company (aka Taco Bell). I claimed the booth with the wall outlet, set up my laptop, and settled in for 3 hours of waiting. Naturally there was no wi-fi to be found.

Always looking for opportunities to be productive, I worked on editing a Camtasia video I had recorded recently at a Bug.Net meeting. After a bit I decided to take a break and eat. While munching on a cheesy stuffed burrito I watched some videos I had downloaded from the InformIT site. In the first video Ted Neward talked about functional coding and touches on F#, in the second he dives deeper into F#. Once the videos and my cheesy stuffed burritos were done I returned to editing.

So the next day I got my wife's van out of the shop (the transmission had gone out leaving CodeStock – these conferences are getting expensive!), hopped in, and took off once again for DevLink, this time making it just in time to see the last session of the day – none other than a presentation by Ted Neward and Amanda Laucher on F#. And I'll be doggone if Ted wasn't wearing the exact same t-shirt he had on while filming the videos I'd watched. Made the experience that much more surreal. Either that or the cheesy stuffed burritos were haunting me, one of the two.

I did at least get one whole day in at DevLink, still well worth going to. And I'm not just saying that because I won a copy of Vista Ultimate and Master Chief's decapitated head. Oh, and if you are looking to learn a little more on F#, Ted's got a great article in the Sept/Oct 2008 issue of Code Magazine called F# 101. Good reading.

All in all, despite the vehicular issues DevLink was still a good value and I plan to make it an annual trip.

SQL Server Compact Edition Connection Strings

In my recent presentation I talked about an important but subtle difference with connection strings when using SQL Server Compact Edition. It was so important I thought I’d make a special blog post out of it.

There are two methods for programmatically accessing data in SQL Server Compact Edition (SSCE). The first method is using the System.Data.SqlServerCe library. When you create an instance of the SqlCeEngine, you need to pass a connection string formatted like so:

DataSource="mydatabasename.sdf"; Password='mypassword'

This method is valid, by the way, for version 3.1 or 3.5 of SSCE. The second method, available with Visual Studio 2008 and the 3.5 version of SSCE is to use LINQ to SQL. When creating the DataContext object, you also need a connection string formatted like so:

Data Source=mydatabasename.sdf; Password='mypassword'

Note very carefully the two differences. First, the name of the sdf file lacks the double quote marks in the LINQ to SQL version. Second, note the Data Source phrase has a space between the words in the LINQ version, where the SqlCeEngine version lacks the space.
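To make the difference concrete, here is a minimal sketch showing both connection strings in use. The database name and password are just the placeholders from above, and it assumes you have references to System.Data.SqlServerCe and System.Data.Linq.

    using System;
    using System.Data.Linq;          // LINQ to SQL DataContext
    using System.Data.SqlServerCe;   // SQL Server Compact Edition engine

    class ConnectionStringDemo
    {
        static void Main()
        {
            // SqlCeEngine style: no space in DataSource, sdf name wrapped in double quotes
            string ceConnect = "DataSource=\"mydatabasename.sdf\"; Password='mypassword'";
            using (SqlCeEngine engine = new SqlCeEngine(ceConnect))
            {
                engine.CreateDatabase();   // creates the .sdf file
            }

            // LINQ to SQL style: space in Data Source, no double quotes on the file name
            string linqConnect = "Data Source=mydatabasename.sdf; Password='mypassword'";
            using (DataContext db = new DataContext(linqConnect))
            {
                Console.WriteLine("DataContext created.");
            }
        }
    }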

It’s a small distinction, but it’ll drive you nuts if you don’t catch it. I drove myself nuts for quite a while because I didn’t notice the extra space in Data Source when I began experimenting with LINQ to SQL! Hopefully my pain will save others some hair pulling.

Presenting Getting Started with SQL Server Compact Edition 3.5 at BUG.NET Meeting

Just wanted to let everyone know I’ll be doing a presentation this coming Tuesday night, August the 12th for the Birmingham .Net Users Group (BUG.NET). My topic, as you may have guessed from the title, will be using SQL Server Compact Edition.

While I will be using Visual Studio 2008, I will point out which pieces are 2005 compatible. I will also cover the use of both traditional coding techniques as well as how to use LinqToSQL to talk to the Compact Edition.

The meeting takes place at 6:30 pm at New Horizons Training Center in Homewood.

I also plan a new series of blog posts to start later this week on the subject, and will be creating a new Code Gallery site to hold my examples.

Also, don’t forget the regular BSDA meeting this coming Thursday night, the 14th. Also starting at 6:30 pm at New Horizons, Shannon Brooks-Hamilton, a software usability expert, will be there to talk about user interface design. Lots of good thought material on how we can make better UIs for our users.

Arcane Review: Expert SQL Server 2005 Integration Services

If you recall my “Good Reads” post from June 25th, you will remember I am a big believer in books as a learning medium. I like to employ a lot of different ways to learn: user groups, blogs, podcasts, videocasts, and magazines to name a few. But for really in depth coverage, it’s hard to beat a nice book in your hands. I got some good feedback from my mention last week of Andy Leonard’s new e-book on Data Dude, so I thought that I would continue by adding book reviews to the blog every so often.

For this review I thought I'd cover a book that seems to constantly be on my desk lately: Expert SQL Server 2005 Integration Services, by Brian Knight and Erik Veerman. This book does a really good job and is specifically targeted toward the data warehousing professional. One entire chapter is devoted to ETL for dimension tables; another chapter focuses on fact tables. It was great to have coverage so focused on these topics.

Another favorite part of the book is the two chapters on deploying and managing SSIS packages. So often these topics are glossed over, especially the managing piece. The book does a great job in covering all the tools and practices around this subject. I’ll mention one more chapter, one that focuses on package reliability. They cover logging, auditing, event handling, checkpoint files, and even suggestions on testing error handling logic.

There are many more chapters in the book, such as migration from DTS (SQL Server 2000) and Scalability, for you to discover. The other thing I love about this book is the brevity. The authors cover an amazing amount of information in just 382 pages. As a busy, busy person I very much appreciate the conciseness they achieved without sacrificing any clarity.

I’ve met both authors, and have heard them speak. They are both very nice, knowledgeable individuals, and I highly encourage you to attend one of their presentations if you get the chance, or if not at least buy their book from your favorite retailer; you will find it a great investment.

Boy Howdy Those Deep Fried Bytes Are Yummy

Long time readers of my blog or Twitter posts will know I am a big fan of podcasts. There’s a new one worth taking a listen to:

Deep Fried Bytes

Deep Fried Bytes is a new podcast hosted by Mississippi MVP Keith Elder and Chris “Woody” Woodruff. I listened to their inaugural episode on the way to the office this morning and quite enjoyed it. While they will cover all aspects of technology, they will have a heavy focus on .Net development.

The audio quality was superb, it may have been a first episode but their production quality and format made it sound like they’d been podcasting for years. I’ve already added the show to my Zune as a subscription, and recommended it to the Zune Marketplace. I’m eagerly looking forward to the next episode!

Using FormsOf in SQL Server Full Text Searching

In the past I've talked about some advanced text searching techniques for SQL Server Full Text Searching. Another you can take advantage of is FormsOf. Let's say we've set up a full text search index on the Description field of the Adventure Works ProductDescription table. (For more info on how to do this, go to the Arcane Lessons page of this site and scroll down to the "Getting Started With SQL Server 2005 Full Text Searching" section.)

FormsOf has two ways to use it, the Inflectional mode and the Thesaurus mode. Let’s look at an example, and then I’ll explain the differences. For this example, we’ll combine data from several tables to get back some meaningful information from the AdventureWorks database.

-- Example 1 - FORMSOF INFLECTIONAL

select [Name], ProductNumber, [Description]
  from [Production].[Product] p
     , [Production].[ProductDescription] pd
     , [Production].[ProductModelProductDescriptionCulture] pmpdc
 where p.ProductModelID = pmpdc.ProductModelID
   and pmpdc.ProductDescriptionID = pd.ProductDescriptionID
   and CONTAINS(pd.[Description], 'FORMSOF(Inflectional, light)' )

-- Example 2 - FORMSOF THESAURUS

select [Name], ProductNumber, [Description]
  from [Production].[Product] p
     , [Production].[ProductDescription] pd
     , [Production].[ProductModelProductDescriptionCulture] pmpdc
 where p.ProductModelID = pmpdc.ProductModelID
   and pmpdc.ProductDescriptionID = pd.ProductDescriptionID
   and CONTAINS(pd.[Description], 'FORMSOF(Thesaurus, light)' )

Both examples use the same, I admit, rather bizarre syntax. Inside the string you pass to the CONTAINS clause, you put FORMSOF, then the word Inflectional or Thesaurus, a comma, then the word or phrase you want to search for. In this example I use CONTAINS, but FORMSOF also works with FreeText.

So what is the difference between Inflectional and Thesaurus? Inflectional finds all of the tenses of a word. For example, if you passed in Start, Inflectional will find Start, Started, and Starting. For nouns, Inflectional finds the singular, plural, and possessive forms.

Thesaurus works like you would expect a thesaurus to. It will find variations of a word, Start, Begin, etc. Essentially, words that have the same meaning.

How do these relate to Contains and FreeText? Normally Contains looks for an exact match. FreeText matches the meaning, but not the exact words in the query. FreeText is very similar to Thesaurus, and even uses the thesaurus in its work. FreeText will break the search string out into its individual words, if there is more than one word. For each word, it generates the inflectional forms of the word, then identifies a list of matches for each word based on the thesaurus. FormsOf(Thesaurus, …), on the other hand, just uses the thesaurus to do the search, without going through the inflectional step.
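If you want to drive these searches from application code, a minimal ADO.NET sketch might look something like the following. The table and column names come from the queries above; the connection string is just an example, so point it at your own AdventureWorks instance. Passing the entire FORMSOF condition as a parameter lets one query serve either mode.

    using System;
    using System.Data.SqlClient;

    class FullTextSearchDemo
    {
        static void Main()
        {
            // Example connection string; adjust for your AdventureWorks instance
            string connect = "server=localhost;Initial Catalog=AdventureWorks;Integrated Security=SSPI;";

            // The whole FORMSOF condition is passed as a parameter, so the same query
            // can run as Inflectional or Thesaurus depending on what the user picks.
            string searchCondition = "FORMSOF(Inflectional, light)";

            string sql =
                @"select [Name], ProductNumber, [Description]
                    from [Production].[Product] p
                       , [Production].[ProductDescription] pd
                       , [Production].[ProductModelProductDescriptionCulture] pmpdc
                   where p.ProductModelID = pmpdc.ProductModelID
                     and pmpdc.ProductDescriptionID = pd.ProductDescriptionID
                     and CONTAINS(pd.[Description], @search)";

            using (SqlConnection connection = new SqlConnection(connect))
            using (SqlCommand cmd = new SqlCommand(sql, connection))
            {
                cmd.Parameters.AddWithValue("@search", searchCondition);
                connection.Open();
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("{0} ({1})", reader["Name"], reader["ProductNumber"]);
                }
            }
        }
    }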

Using a combination of Contains, FreeText, and FormsOf you can give your users some real flexibility, ranging from exact matches to wide open searches.

SQL Server Migration Assistant

At work we have an Oracle based system we'll be retiring sometime next year. We want to keep the data around for reporting, but don't have the manpower or funds to properly flatten the database into a true data warehouse schema. We are leaning, then, toward copying the data into our SQL warehouse database in almost a direct copy of the legacy schema, with just a few minor tweaks.

In looking around for an efficient way to handle this, I found a tool called the SQL Server Migration Assistant for Oracle. This handy tool will copy everything, tables, views, stored procedures, triggers, just about everything you’d want.

In my case, all I really want is the table layouts. I will be making a minor tweak to the table layouts so I can combine four databases into a single one. In just a few hours, I was able to use SSMA to generate create table scripts for several hundred tables. I then brought the script into my favorite text editor, UltraEdit, and used its regular expression capability to add a source database field as the first field in every create table script. Saved 'em, ran it, and in short order had a complete schema ready to hold my data.
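UltraEdit handled that edit interactively, but the same transformation could easily be scripted. Here is a rough C# sketch of the idea; the file paths and the exact column definition are made up for illustration, not taken from the actual conversion.

    using System;
    using System.IO;
    using System.Text.RegularExpressions;

    class AddSourceDatabaseColumn
    {
        static void Main()
        {
            // Hypothetical input and output paths
            string script = File.ReadAllText(@"C:\Temp\OracleTables.sql");

            // After each "CREATE TABLE <name> (" insert a SourceDatabase column
            // as the first field, so several source databases can share one schema.
            string patched = Regex.Replace(
                script,
                @"(CREATE\s+TABLE\s+\S+\s*\()",
                "$1\r\n  [SourceDatabase] [varchar](30) NOT NULL,",
                RegexOptions.IgnoreCase);

            File.WriteAllText(@"C:\Temp\OracleTables_WithSourceDb.sql", patched);
            Console.WriteLine("SourceDatabase column added to every CREATE TABLE statement.");
        }
    }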

Even using the SQL Server Migration Assistant in such a limited fashion, I was still able to save myself weeks of coding and trying to figure out how to manually map Oracle datatypes to SQL Server. If you are looking at any kind of data conversion take a look at this tool. You can find more info on the SSMA site at:

http://www.microsoft.com/sql/solutions/migration/oracle/default.mspx

Birmingham Tech News and Events

There are some big doings going on in the event community over the next few weeks.

First off, this Thursday May 8th at 6:30 pm the Birmingham Software Developers Association (BSDA) will be having “The Variety Show”. Join us for a variety of 15 minute presentations by various club members on what they’ve been working on lately–there should be something for everyone and the floor is open for anyone who’d like to do a short presentation.

On Friday, May 9th the IPSA group will meet during lunch at the McWane center, the topic will be The Social Media Toolbox.

Next week, the Birmingham .Net User Group (Bug.Net) will be having its regular meeting on May 13th at 6:30 pm. Stay tuned to their website for speaker and details.

Then, on Wednesday, May 14th at 6:30 pm the BSDA and Bug.Net are pleased to co-present a special event. Regional speaker Michael Neel will be here to talk on DataSets:

DataSets are Evil. They will hog your CPU, steal your RAM, and rob your home. This is the story surrounding DataSets, but what is fact and what is myth? In this session we will look at DataSets and the tools that go with them to see how they can save you development time while not crashing the server. We’ll also dive into DataSets in 2008 with LINQ to DataSets and Unit Testing with DataSets.

Learn more about Michael at vinull.com/profile

Finally, beginning at 5:30 pm on May 20th the Steel City SQL Group will meet. MVP Kevin Bowles will be here to talk about SQL Server 2008 Development. Kevin is a great speaker, his sessions are always loaded with useful information.

With the exception of the IPSA meeting, all of the other events will be held at the New Horizons training center in Homewood. A special thanks to the folks at New Horizons for making their facilities open to the Birmingham user group community!