Fun With PowerShell Objects – Creating Objects from C#

Introduction

This is the next installment in my series on creating objects in PowerShell. In the first installment we showed how to create an object using the class type introduced in PowerShell 5. That covered the basics, and in the last installment of this series we’ll cover some advanced techniques.

In the last two installments of this series, I covered the various ways to create objects using the PSCustomObject. We saw how to create it using the New-Object cmdlet, then how to add your own custom properties to it using the Add-Member cmdlet. In the subsequent post we saw how to add new methods to it.

In this post, we’ll cover something new: creating an object based on C# code!

For all of the examples, we’ll display the code, then (when applicable) the result under it. In this article I’ll be using PowerShell Core 7.2.1 and VSCode. The examples should work in PowerShell 5.1 in the PowerShell ISE, although they’ve not been tested there.

Additionally, be on the lookout for the backtick ` , PowerShell’s line continuation character, at the end of many lines in the code samples. The blog formatting has a limited width, so using the line continuation character makes the examples much easier to read. My post Fun With PowerShell Pipelined Functions dedicates a section to the line continuation character if you want to learn more.
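
As a quick illustration, these two commands are identical; the second simply uses the backtick to continue the parameter list on a new line. (The path here is just a sample.)

Get-ChildItem -Path C:\Temp -Filter *.ps1

Get-ChildItem -Path C:\Temp `
              -Filter *.ps1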

To run a snippet of code, highlight the lines you want to execute, then press F8 in VSCode or F5 in the ISE. You can display the contents of any variable by highlighting it and using F8/F5.

Why C#

You may be asking yourself, why integrate C# code into PowerShell? Why not just define our objects using PowerShell? There are several reasons, especially if you are working with a team of C# developers, or are a C# developer yourself.

First, it can make testing your C# classes easy. You can code the class and save it in a .CS file. Then run some simple PowerShell to create a new object from your class and test its various properties and methods. This lets you make changes easily and test again, all without having to create a full-blown C# project just for testing.

It also allows for code reuse. Let’s say your C# developers have created a library to do some calculations, and you need to use one of the methods from that library, but that library expects you to pass in an object based on a class.

Let’s further the example: perhaps you have the task of reading in a CSV file, doing a calculation for each row, then outputting another CSV file, or maybe even a JSON file. This is a one-time task, so you don’t want to go to the effort of creating a full-blown C# project.

Using the techniques in this demo, you could simply access the C# file in which the class was defined, and generate an object from it in PowerShell. Then all you’d have to do is populate the object and pass it into the library to do the calculation, and output the result.

These are just a few simple examples; I’m sure you’ll come up with many more as the need arises. I’ll be honest, this isn’t something you will need to do a great deal, but when you do you’ll appreciate knowing how.

Embedding a C# Class in your PowerShell

In this first method, we’ll define a standard C# class within a here string then add it as a new data type in PowerShell. Once it exists as a type, we can then generate new objects from it. Let’s take a look at a very simple class definition.

$code = @"
using System;

public class SchemaTable
{
  public string DatabaseName;

  public string SchemaTableName(string pSchema, string pTable)
  {
    string retVal = "";  // Setup a return variable

    retVal = pSchema + "." + pTable;

    return retVal;

  } // public SchemaTableName

  public string FullName(string pSchema, string pTable)
  {
    string retVal = "";  // Setup a return variable

    retVal = this.DatabaseName + "." + pSchema + "." + pTable;

    return retVal;

  } // public FullName

} // class SchemaTable

"@

If you’ve read this far I’m going to assume you are familiar with C#, at least the basics of it, so I’ll keep this explanation at a high level.

We start by declaring a class named SchemaTable. The next line declares a variable (which will become a property) called DatabaseName.

I then create the first of two functions (which will become our methods). The first, SchemaTableName, simply takes the two passed-in values of pSchema and pTable and concatenates them together with a period between them.

The second, FullName, takes the value in DatabaseName and concatenates it with the schema and table name parameters, again using a period as a separator in the return string.

Next, we need to add this class as a new data type in PowerShell. You are already familiar with many data types, such as int, string, and more. We simply want to add a new data type to our PowerShell environment.

To do so, we will use the Add-Type cmdlet.

Add-Type -TypeDefinition $code `
         -Language CSharp

The first parameter, TypeDefinition, takes the C# code we defined in the here string. The second, Language, is pretty obvious. We simply need to let PowerShell know what language this code was written in.

As of this writing, C# is the only language PowerShell 7 supports here. Because of this, if you leave off the -Language CSharp parameter, it will default to C# as the language. I included it here for completeness, but will omit it for future examples in this post.

So now we’ve defined a class in C#, and have added it as a new data type in PowerShell. How then do we create an object from it?

We’ll turn to our old friend, New-Object.

$result = New-Object -TypeName SchemaTable

That’s it, that simple one line will create our new object based on the C# code we defined in the $code variable. You can even use Get-Member to display its properties and methods like you would with any other object.

$result | Get-Member

Result:

Name            MemberType Definition
----            ---------- ----------
Equals          Method     bool Equals(System.Object obj)
FullName        Method     string FullName(string pSchema, string pTable)
GetHashCode     Method     int GetHashCode()
GetType         Method     type GetType()
SchemaTableName Method     string SchemaTableName(string pSchema, string pTable)
ToString        Method     string ToString()
DatabaseName    Property   string DatabaseName {get;set;}

It contains the standard methods and properties built into all objects, but it also has the three we defined: FullName, SchemaTableName, and DatabaseName.

We can use these properties and methods just like ones in any other object. Let’s set the DatabaseName property, then display it.

$result.DatabaseName = 'MyDB'
$result.DatabaseName

Result:

MyDB

Likewise, we can access the methods we created. Here’s the SchemaTableName method.

$result.SchemaTableName('ASchema', 'ATable')

Result:

ASchema.ATable

And for completeness, the FullName method.

$result.FullName('ASchema', 'ATable')

Result:

MyDB.ASchema.ATable
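
As a side note, New-Object isn’t the only way to instantiate the type. In PowerShell 5 and later, every .NET type also exposes a static new method, so this sketch should produce an equivalent object:

$result = [SchemaTable]::new()
$result.DatabaseName = 'MyDB'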

C# Classes with Static Methods

In the original post in this series on basic PowerShell classes, I mentioned the concept of static methods and properties. A static method or property is simply one that can be called without having to generate a new object first.

While we’ll circle back around to discuss implementing static methods and properties in PowerShell classes in the final post in this series, it is likely you’ll encounter C# classes with static methods and properties. As such we’ll go ahead and cover them here, while we are talking C#.

In the code below, I’ve defined a class with one method, and labeled it as static. In reality you will encounter many classes that have a mix of static and non-static members, but for this post we’ll keep the example simple.

$code = @"
using System;

public class StaticSchemaTable
{
  public static string FullName(string pSchema, string pTable)
  {
    string retVal = "";

    retVal = pSchema + "." + pTable;

    return retVal;

  } // public static FullName
} // class StaticSchemaTable
"@

As you can see, I’ve simply used the static keyword as part of the FullName function declaration. I also changed the name of the class, otherwise the code is the same as the previous demo.

Now we need to add this as a new type in our current PowerShell session.

Add-Type -TypeDefinition $code

Calling our static method requires different syntax. First, it won’t be necessary to create an object from it. Second, we’ll need to use the full name of our class in brackets, followed by two colons. We then indicate the name of the static function to call, and pass in any parameters.

$result = [StaticSchemaTable]::FullName('MySchema', 'myTable')
$result

Result:

MySchema.myTable

Static methods and properties aren’t something you’ll use a great deal, but they are used, so you should know how to handle them.
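
If this double colon syntax looks familiar, it’s because the .NET classes built into PowerShell use it as well. A couple of quick examples you can try:

[Math]::Round(3.14159, 2)                      # Returns 3.14
[System.IO.Path]::GetExtension('MyReport.pdf') # Returns .pdf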

Creating an Object from a C# File

While it is certainly possible to embed C# code right in your PowerShell, this could lead to some issues. The moment a developer makes a change to their .cs file, you are now out of sync. You don’t want to have to keep cutting and pasting all the time.

It makes far more sense, then, to simply access the C# file with the class definition, and load it at run time.

First, you need to create a C# file with the sample code. Here is what I put in my .cs file:

using System;

public class StaticSchemaTableInFile
{
  public static string FullName(string pSchema, string pTable)
  {
    string retVal = "";

    retVal = pSchema + "." + pTable;

    return retVal;

  } // public static FullName
} // class StaticSchemaTableInFile

This is the same static class you just saw, with the minor change to the class name.

In the next sample, I simply build the path to the file name, then use Get-Content to read it.

$csPath = 'C:\Users\arcan\OneDrive\BlogPosts\Markdown\'
$file = "$($csPath)Fun-With-PowerShell-Objects-Part 3.cs"
$code = Get-Content $file | Out-String

Get-Content will read the contents of the file. By default Get-Content reads it as an array, with each line in the file being an element, so we have to pipe it through the Out-String cmdlet to convert it to a single string. This string is then stored in the $code variable. Of course, you’ll need to update the path and file variables to match where you saved the file on your computer.
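
As an aside, there are a couple of shortcuts here should you want to experiment. Get-Content has a -Raw switch that returns the whole file as a single string, removing the need for Out-String, and Add-Type has a -Path parameter that can read a source code file directly. A brief sketch:

# Read the file as one string instead of an array of lines
$code = Get-Content $file -Raw

# Or skip Get-Content entirely and let Add-Type read the file
Add-Type -Path $file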

Now we do just like the previous demo, call Add-Type then run it.

Add-Type -TypeDefinition $code
$result = [StaticSchemaTableInFile]::FullName('mySchema', 'myTable')
$result

Result:

mySchema.myTable

Again, I reused the example from the static demo but we could also have used the first example, or any standard C# file containing class definitions.

Other Ways To Add Types

While outside the scope of this post, I did want to mention there are two other ways you can add new data types to your PowerShell scripts. First, Add-Type will let you load types stored in an assembly, in other words a .dll file.

Second, if you are running on a Windows computer you can add types stored in native Windows APIs. If you want to learn more about these, I’ll refer you to the Add-Type Documentation at Microsoft.
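
To give you a feel for those, here is a brief sketch. The .dll path is hypothetical, and the Win32 portion simply illustrates the -MemberDefinition pattern by wrapping the MessageBox API:

# Load types from a compiled assembly (hypothetical path)
Add-Type -Path 'C:\MyLibs\MyCalculations.dll'

# Wrap a native Windows API call (Windows only)
$signature = @'
[DllImport("user32.dll", CharSet = CharSet.Unicode)]
public static extern int MessageBox(IntPtr hWnd, string text, string caption, uint type);
'@
Add-Type -MemberDefinition $signature -Name 'MsgBox' -Namespace 'Win32Demo'
[Win32Demo.MsgBox]::MessageBox([IntPtr]::Zero, 'Hello from PowerShell', 'Demo', 0)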

Persistence

One last thing you need to be aware of: when you add a type, it does not persist between sessions. The moment you close your command window or VSCode, that type goes away. You’ll need to recreate it the next time you run your script.

This is probably a good thing, as it’s doubtful you will need to use these types on a daily basis. Should you need to, though, you can edit your PowerShell profile and add them within it.
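
For example, if you saved the class to a .cs file, a line like this in your profile script would re-add the type each time PowerShell starts. The path, of course, is hypothetical:

# Somewhere in your $PROFILE script
Add-Type -Path 'C:\PowerShell\Classes\SchemaTable.cs'

One related gotcha: once a type has been added to a session it cannot be removed or replaced. If you change the C# code, you’ll need to start a fresh session before adding the type again.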

Conclusion

In this post we saw how to create custom objects from C# classes. They could be embedded in our PowerShell code, or stored in external files. The subject of static methods and properties was mentioned, along with a demo on how to use them.

In the next post we’ll see a very useful technique, and cover the ability to add our own custom methods and properties to existing objects that others created, including ones built right into PowerShell.

The demos in this series of blog posts were inspired by my Pluralsight course PowerShell 7 Quick Start for Developers on Linux, macOS and Windows, one of many PowerShell courses I have on Pluralsight. All of my courses are linked on my About Me page.

If you don’t have a Pluralsight subscription, just go to my list of courses on Pluralsight. At the top is a Try For Free button you can use to get a free 10-day subscription to Pluralsight, with which you can watch my courses, or any other course on the site.

Encoding Strings to Base64 in C# – The Movie

Way back in 2007 I wrote a blog post entitled “Encoding Strings to Base64 in C#”. This continues to be my #1 referenced post.

The guys over at Webucator have turned my post into an instructional video, and I couldn’t be happier with the results. You can find the video on YouTube at:

https://www.youtube.com/watch?v=r94gKb-NzLM&feature=youtu.be

Make sure to give ‘em a like for all their hard work!

Unmixing the MIX 10 Content

Every year Microsoft puts on a big conference called MIX, this year was MIX 10. It’s a big developers conference focused mostly on Web development, although this year also had a fair amount of information on the new Windows Phone 7 platform. Microsoft is also kind enough to make the sessions available for free download to anyone interested after the conference is over. As of this blog post the average quality WMV recordings are up, and the high quality ones should be up soon. You can download them at http://live.visitmix.com/Videos .

To make downloading easy I suggest using Firefox with an add-on called DownThemAll!. Yes, I know, you’re probably a die hard Microsoft guy and use IE all the time, but you probably have Firefox installed for testing purposes. Go grab the DownThemAll! add-on. Then go to the Mix website (link above), right click anywhere in the webpage, and pick “DownThemAll!”. In the dialog that pops up it will show all the downloadable content on the page. Set the “Save files in:” directory in the lower part of the dialog (for the average quality WMV files you’ll need about 12.4 gig), then under filters you can pick Videos to get all the WMVs. You could also go to the fast filtering area and set something like *.pptx to download all the slides. Click “Start!” and the downloads will begin. Then go get coffee. And lunch. And probably dinner as well, as this will take a while.

Once you’ve got it all downloaded you’re left with a lot of files with the session codes as the file names: CL01.wmv, CL02.wmv, etc. I find it much easier if the file names also contain the text of the session description. That way when I’m looking over the media inside my Zune library, or even just browsing in Windows Explorer, it’s easy to pick something to watch. So using the column mode capabilities plus a simple macro in a text editor called UltraEdit (easily the best software purchase I’ve ever made in my life) I quickly put together a batch file to rename all the files for me. Here are the contents:

rename CL01.wmv "CL01 – Changing our Game – an Introduction to Windows Phone 7 Series.wmv"
rename CL02.wmv "CL02 – Authoring for Windows Phone Silverlight 4 and WPF 4 with Expression Blend.wmv"
rename CL03.wmv "CL03 – Prototyping Rich Microsoft Silverlight Applications with Expression SketchFlow.wmv"
rename CL06.wmv "CL06 – Designing Bing – Heart and Science.wmv"
rename CL07.wmv "CL07 – Microsoft Silverlight 4 Overview – What’s in Store for Silverlight 4?.wmv"
rename CL08.wmv "CL08 – Microsoft Silverlight 4 Business Applications.wmv"
rename CL09.wmv "CL09 – Developing with WCF RIA Services Quickly and Effectively.wmv"
rename CL10.wmv "CL10 – Stepping Outside the Browser with Microsoft Silverlight 4.wmv"
rename CL13.wmv "CL13 – Overview of the Windows Phone 7 Series Application Platform.wmv"
rename CL14.wmv "CL14 – Windows Phone UI and Design Language.wmv"
rename CL15.wmv "CL15 – An Introduction to Developing Applications for Microsoft Silverlight.wmv"
rename CL16.wmv "CL16 – Building Windows Phone Applications with Silverlight Part 1.wmv"
rename CL17.wmv "CL17 – Building Windows Phone Applications with Silverlight Part 2.wmv"
rename CL18.wmv "CL18 – Windows Phone Application Platform Architecture.wmv"
rename CL19.wmv "CL19 – Development and Debugging Tools for Building XNA Games for Windows Phone.wmv"
rename CL20.wmv "CL20 – Distributing and Monetizing Windows Phone Applications and Games.wmv"
rename CL21.wmv "CL21 – Building Windows Phone Games.wmv"
rename CL22.wmv "CL22 – Building a High Performance 3D Game for Windows Phone.wmv"
rename CL23.wmv "CL23 – Designing and Developing for the Rich Mobile Web.wmv"
rename CL24.wmv "CL24 – The Microsoft Silverlight Analytics Framework.wmv"
rename CL25.wmv "CL25 – Microsoft Silverlight Media  – Moving at 60fps.wmv"
rename CL26.wmv "CL26 – Introducing the Silverlight Rough Cut Editor.wmv"
rename CL27.wmv "CL27 – HTML5 – Cross-Browser Best Practices.wmv"
rename CL28.wmv "CL28 – In-Depth Look at Internet Explorer 9.wmv"
rename CL29.wmv "CL29 – HTML5 – High-Performance Best Practices for Web Sites.wmv"
rename CL30.wmv "CL30 – Building Innovative Windows Client Software.wmv"
rename CL50.wmv "CL50 – Search Engine Optimization for Microsoft Silverlight.wmv"
rename CL51.wmv "CL51 – Building an Accessible Microsoft Silverlight Experience.wmv"
rename CL52.wmv "CL52 – Microsoft Silverlight Optimization and Extensibility with MEF.wmv"
rename CL53.wmv "CL53 – Flash Skills Applied to Microsoft Silverlight Design and Development.wmv"
rename CL54.wmv "CL54 – Software and Web Entrepreneurs – Go Big with BizSpark and WebsiteSpark.wmv"
rename CL55.wmv "CL55 – Dynamic Layout and Transitions for Microsoft Silverlight 4 with Microsoft Expression Blend.wmv"
rename CL56.wmv "CL56 – A Case Study – Rapid WordPress Design and Prototyping with Expression Web 3.wmv"
rename CL58.wmv "CL58 – Accessing Web Services in Microsoft Silverlight.wmv"
rename CL59.wmv "CL59 – Unit Testing Silverlight and Windows Phone Applications.wmv"
rename CL60.wmv "CL60 – Silverlight Performance on Windows Phone.wmv"
rename DS01.wmv "DS01 – The Laws of User Experience.wmv"
rename DS02.wmv "DS02 – Treat Your Content Right.wmv"
rename DS03.wmv "DS03 – Running with Wireframes – Taking Information Architecture (IA) into Design.wmv"
rename DS04.wmv "DS04 – Lifecycle of a Wireframe.wmv"
rename DS05.wmv "DS05 – Total Experience Design.wmv"
rename DS06.wmv "DS06 – Touch in Public – Multi-touch Interaction Design for Kiosks and Architectural Experiences.wmv"
rename DS07.wmv "DS07 – The Art Technology and Science of Reading.wmv"
rename DS08.wmv "DS08 – Creating Great Experiences through Collaboration.wmv"
rename DS09.wmv "DS09 – Peanut Butter and Jelly – Putting ‘Content Management’ Back into Context.wmv"
rename DS10.wmv "DS10 – Service Design Goes Social.wmv"
rename DS11.wmv "DS11 – Great User Experiences – Seamlessly Blending Technology and Design.wmv"
rename DS12.wmv "DS12 – Total Experience – A Design Methodology for Agencies.wmv"
rename DS13.wmv "DS13 – The Elephant in the Room.wmv"
rename DS14.wmv "DS14 – The Democratization of the Design Industry.wmv"
rename DS15.wmv "DS15 – The Type We Want.wmv"
rename DS16.wmv "DS16 – An Hour With Bill Buxton.wmv"
rename EX01.wmv "EX01 – Cloud Computing Economies of Scale.wmv"
rename EX02.wmv "EX02 – The Mono Project.wmv"
rename EX03.wmv "EX03 – Modern Web Form Design.wmv"
rename EX04.wmv "EX04 – Robots at MySpace – Massive Scaling a .NET Website with the Microsoft Robotic Studio.wmv"
rename EX06.wmv "EX06 – 10 Ways to Attack a Design Problem and Come Out Winning.wmv"
rename EX07.wmv "EX07 – Principles of Microsoft Silverlight Graphics and Animation.wmv"
rename EX10.wmv "EX10 – Building a Next-Generation Web Application with Microsoft ASP.NET MVC 2 and jQuery.wmv"
rename EX11.wmv "EX11 – Using Storage in the Windows Azure Platform.wmv"
rename EX12.wmv "EX12 – Creating Effective Info Viz in Microsoft Silverlight.wmv"
rename EX13.wmv "EX13 – Microsoft Surface Goes Social – Research Project at the University of Linz.wmv"
rename EX14.wmv "EX14 – Understanding the Model-View-ViewModel Pattern.wmv"
rename EX15.wmv "EX15 – Build Your Own MVVM Framework.wmv"
rename EX16.wmv "EX16 – Securing Microsoft Silverlight Applications.wmv"
rename EX17.wmv "EX17 – IronRuby for the .NET Developer.wmv"
rename EX18.wmv "EX18 – Developing Natural User Interfaces with Microsoft Silverlight and WPF 4 Touch.wmv"
rename EX19.wmv "EX19 – The OpenRasta Framework for Building RESTful Applications.wmv"
rename EX20.wmv "EX20 – Building Great Standards-Based Websites for the Big Wide World with Microsoft ASP.NET 4.wmv"
rename EX21.wmv "EX21 – Syncing Audio Video and Animations in Microsoft Silverlight Applications.wmv"
rename EX22.wmv "EX22 – Six Things Every jQuery Developer Must Know.wmv"
rename EX23.wmv "EX23 – Building Facebook Apps with Microsoft .NET and Deploying to Windows Azure.wmv"
rename EX25.wmv "EX25 – Design the Ordinary Like the Fixie.wmv"
rename EX26.wmv "EX26 – From Comp to Code – A Design Communion.wmv"
rename EX27.wmv "EX27 – Do You Speak My Language? Microsoft Translator and the Power of Collaboration.wmv"
rename EX28.wmv "EX28 – Building Rich and Interactive User Experiences in SharePoint.wmv"
rename EX29.wmv "EX29 – Building Platforms and Applications for the Real-Time Web.wmv"
rename EX30.wmv "EX30 – SVG – The Past Present and Future of Vector Graphics for the Web.wmv"
rename EX31.wmv "EX31 – Developing Multiplayer Games with Microsoft Silverlight 4.wmv"
rename EX32.wmv "EX32 – Smooth Streaming Live in HD – From Camera to Screen.wmv"
rename EX33.wmv "EX33 – Smooth Streaming Live in HD – 2010 Olympic Winter Games.wmv"
rename EX34.wmv "EX34 – Participating in the Web of Data with Open Standards.wmv"
rename EX35.wmv "EX35 – Opening Up Opportunity with Twitter.wmv"
rename EX36.wmv "EX36 – How jQuery Makes Hard Things Simple.wmv"
rename EX37.wmv "EX37 – Adding the Where to the When of Social Applications.wmv"
rename EX38.wmv "EX38 – Building Large-Scale Data-Centric Applications with Silverlight.wmv"
rename EX39.wmv "EX39 – The Tale of JavaScript. I Mean ECMAScript..wmv"
rename EX50.wmv "EX50 – Debugging Microsoft Silverlight Applications.wmv"
rename EX51.wmv "EX51 – Building Finance Applications with Microsoft Silverlight 4.wmv"
rename EX52.wmv "EX52 – Copyright – A Cloudy Subject.wmv"
rename EX53.wmv "EX53 – Storm Clouds – What to Consider About Privacy Before Writing a Line of Code.wmv"
rename EX55.wmv "EX55 – Building the eBay Simple Lister with Silverlight.wmv"
rename EX56.wmv "EX56 – Designing Rich Experiences for Data-Centric Applications.wmv"
rename FT03.wmv "FT03 – Tips and Tricks for Making Web Forms Shine with Microsoft ASP.NET 4.wmv"
rename FT04.wmv "FT04 – What’s New in Microsoft ASP.NET MVC 2.wmv"
rename FT05.wmv "FT05 – The HaaHa Show – Microsoft ASP.NET MVC Security with Haack and Hanselman.wmv"
rename FT06.wmv "FT06 – Deep Dive into Orchard Extensibility for CMS Developers.wmv"
rename FT07.wmv "FT07 – Beyond File – New Company – From Cheesy Sample to Social Platform.wmv"
rename FT08.wmv "FT08 – Improving Software Quality for the Modern Web.wmv"
rename FT09.wmv "FT09 – Pumping Iron on the Web – IronRuby and IronPython.wmv"
rename FT10.wmv "FT10 – Driving Experiences via Services Using the Microsoft .NET Framework.wmv"
rename FT11.wmv "FT11 – Designing and Delivering Scalable and Resilient Web Services.wmv"
rename FT12.wmv "FT12 – OData – There’s a Feed for That.wmv"
rename FT13.wmv "FT13 – Implementing OData – How to Create a Feed for That.wmv"
rename FT14.wmv "FT14 – Web Deployment Made Awesome – If You’re Using XCopy You’re Doing It Wrong.wmv"
rename FT15.wmv "FT15 – Accessing Data in a Microsoft .NET-Connected Web Application.wmv"
rename FT50.wmv "FT50 – Advanced Web Debugging with Fiddler.wmv"
rename FT51.wmv "FT51 – Internet Explorer Developer Tools.wmv"
rename FTL01.wmv "FTL01 – Reactive Extensions for JavaScript.wmv"
rename FTL02.wmv "FTL02 – Building Pivot Collections.wmv"
rename FTL03.wmv "FTL03 – Unlocking Audio-Video Content with Speech Recognition.wmv"
rename FTL50.wmv "FTL50 – Incarnate – Behind the Scenes.wmv"
rename FTL51.wmv "FTL51 – Quickly Implementing New Cross-Browser Features with Ruby and Python.wmv"
rename KEY01.wmv "KEY01 – Keynote Day 1.wmv"
rename KEY02.wmv "KEY02 – Keynote Day 2.wmv"
rename PR01.wmv "PR01 – Designing Corporate Web Sites using SharePoint 2010.wmv"
rename PR02.wmv "PR02 – Designing an Internet-Facing Web Site Using SharePoint 2010.wmv"
rename SVC01.wmv "SVC01 – Using Windows Identity Foundation for Creating Identity-Driven Experiences in Microsoft Silverlight.wmv"
rename SVC02.wmv "SVC02 – Microsoft Project Code Name Dallas – Data For Your Apps.wmv"
rename SVC03.wmv "SVC03 – Using Ruby on Rails to Build Windows Azure Applications.wmv"
rename SVC04.wmv "SVC04 – Lap around the Windows Azure Platform.wmv"
rename SVC05.wmv "SVC05 – Building Web Applications with Windows Azure Storage.wmv"
rename SVC06.wmv "SVC06 – Microsoft Silverlight and Windows Azure – A Match Made for the Web.wmv"
rename SVC07.wmv "SVC07 – Building Web Applications with Microsoft SQL Azure.wmv"
rename SVC08.wmv "SVC08 – Connecting Your Applications in the Cloud with Windows Azure AppFabric.wmv"
rename SVC09.wmv "SVC09 – Building and Deploying Windows Azure-Based Applications with Microsoft Visual Studio 2010.wmv"
rename SVC10.wmv "SVC10 – Building Offline Web Apps Using Microsoft Sync Framework.wmv"
rename SVC12.wmv "SVC12 – Building PHP Applications using the Windows Azure Platform.wmv"
rename SVC50.wmv "SVC50 – Improving the Usability and Security of OpenID.wmv"

Simply take the above content, copy into notepad and save it in the same folder as all the WMV files. (I saved it as WmvRename.bat). Then just run it and your Mix10 filenames will no longer be all MIXed up!

SSIS For Developers at Huntsville New Technology User Group

On Monday night January 18, 2010 I will be presenting “SSIS for Developers” at HUNTUG, The Huntsville New Technology Users Group. The meeting kicks off at 6 pm, and we’d love to see you there. Their website has all the location information.

We’ve all been there. We develop a new system to replace an older one long overdue for retirement, and the dread always comes at the data conversion step: having to figure out the quickest, easiest way to get the data from the old system to the new. Or perhaps the system has intense data processing that would be better handled by SQL Server, but is too complex for a T-SQL stored procedure; importing data into the database from an external source, or performing complex data processing over large datasets.

What most developers don’t realize is that they already have an extremely powerful tool to solve both these and other problems: SQL Server Integration Services. SSIS isn’t just for feeding your warehouse, it’s also a great tool for data migration and processing. Come get an introduction and learn how you can be using SSIS for your projects.

The slide deck and main demo can be found at my Code Gallery site:

https://code.msdn.microsoft.com/Release/ProjectReleases.aspx?ProjectName=SSISForDevs&ReleaseId=2883

The calling of SSIS from .Net demo can be found at:

http://code.msdn.microsoft.com/ssisfromnet

Calling SSIS from .Net

In a recent DotNetRocks show, episode 483, Kent Tegels was discussing SQL Server Integration Services and how it can be useful to both the BI Developer as well as the traditional application developer. While today I am a SQL Server BI guy, I come from a long developer background and could not agree more. SSIS is a very powerful tool that could benefit many developers, even those not on Business Intelligence projects. It was a great episode, and I highly encourage everyone to listen.

There is one point though that was not made very clear, but I think is tremendously important. It is indeed possible to invoke an SSIS package from a .Net application if that SSIS package has been deployed to the SQL Server itself. This article will give an overview of how to do just that. All of the sample code here will also be made available in download form from the companion Code Gallery site, http://code.msdn.microsoft.com/ssisfromnet .

In this article, I do assume a few prerequisites. First, you have a SQL Server with SSIS installed, even if it’s just your local development box with SQL Server Developer Edition installed. Second, I don’t get into much detail on how SSIS works; the package is very easy to understand. However you may wish to have a reference handy. You may also need the assistance of your friendly neighborhood DBA in setting up the SQL job used in the process.

Summary

While the technique is straightforward, there are a fair number of detailed steps involved. For those of you just wanting the overview, we need to start with some tables (or other data) we want to work with. After that we’ll write the SSIS package to manipulate that data.

Once the package is created it must be deployed to the SQL Server so it will know about it. This deployment can be to the file system or to SQL Server itself.

Once deployed, a SQL Server Job must be created that executes the deployed SSIS package.

Finally, you can execute the job from your .Net application via ADO.NET and a call to the sp_start_job stored procedure built into the msdb system database.

OK, let’s get to coding!

Setup the Tables

First we need some data to work with. What better than a listing of previous Dot Net Rocks episodes? I simply went to the Previous Shows page, highlighted the three columns of show number, show name, and date, and saved them to a text file. (Available on the Code Gallery site.)

Next we need a place to hold data so SSIS can work with it. I created a database and named it ArcaneCode, however any database should work. Next we’ll create a table to hold “staging” DNR Show data.

CREATE TABLE [dbo].[staging_DNRShows](
  [ShowData] [varchar](250) NOT NULL
) ON [PRIMARY]

This table will hold the raw data from the text file, each line in the text file becoming one row here. Next we want a table to hold the final results.

CREATE TABLE [dbo].[DNRShows](
  [ShowNumber] [int] NOT NULL,
  [ShowName] [varchar](250) NULL,
  [ShowDate] [datetime] NULL,
  CONSTRAINT [PK_DNRShows] PRIMARY KEY CLUSTERED
  (
  [ShowNumber] ASC
  )WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
  ) ON [PRIMARY]

The job of the SSIS package will be to read each row in the staging table and split it into 3 columns, the show’s number, name, and date, then place those three columns into the DNRShows table above.

The SSIS Package

The next step is to create the SSIS package itself. Opening up Visual Studio / BIDS, create a new Business Intelligence SQL Server Integration Services project. First let’s setup a shared Data Source to the local server, using the ArcaneCode database as our source.

The default package name of “Package.dtsx” isn’t very informative, so let’s rename it “LoadDNRShows.dtsx”. Start by adding a reference to the shared data source in the Connection Managers area, taking the default. Then in the Control Flow surface add 3 tasks, as seen here:

[Image: the Control Flow with three tasks]

The first task is an Execute SQL Task that simply runs a “DELETE FROM dbo.DNRShows” command to wipe out what was already there. Of course in a true application we’d be checking for existing records in the data flow and doing updates or inserts, but for simplicity in this example we’ll just wipe and reload each time.

The final task is also an Execute SQL Task, after we have processed the data we no longer need it in the staging table, so we’ll issue a “DELETE FROM dbo.staging_DNRShows” to remove it.

The middle item is our Data Flow Task. This is what does the heavy lifting of moving the staging data to the main table. Here is a snapshot of what it looks like:

[Image: the Data Flow layout]

The first task is our OLEDB Source, it references the staging_DNRShows table. Next is what’s called a Derived Column Transformation. This will allow you to add new calculated columns to the flow, or add columns from variables. In this case we want to add three new columns, based on the single column coming from the staging table.

[Image: the Derived Column Transformation editor]

As you can see under Columns in the upper left, we have one column in our source, ShowData. In the lower half we need to add three new columns, ShowNumber, ShowDate, and ShowName. Here are the expressions for each:

ShowNumber
    (DT_I4)SUBSTRING(ShowData,1,FINDSTRING(ShowData,"\t",1))

ShowDate
    (DT_DBDATE)SUBSTRING(ShowData,FINDSTRING(ShowData,"\t",2) + 1,LEN(ShowData) - FINDSTRING(ShowData,"\t",2))

ShowName
    (DT_STR,250,1252)SUBSTRING(ShowData,FINDSTRING(ShowData,"\t",1) + 1,FINDSTRING(ShowData,"\t",2) - FINDSTRING(ShowData,"\t",1) - 1)

The syntax is an odd blend of VB and C#. Each one starts with a “(DT_”; these are type casts, converting the result of the rest of the expression to what we need. For example, (DT_I4) converts to a four byte integer, which we need because in our database the ShowNumber column was defined as an integer. You will see SUBSTRING and LEN, which work like their VB counterparts. FINDSTRING works like the old POS statement; it finds the location of the text and returns that number. The “\t” represents the tab character; here the C# fans win out, as the Expression editor uses C#-like escapes for special characters: \t for tab, \b for backspace, etc.
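
To make that concrete, suppose a (hypothetical) staging row held the show number, then a tab, the show title, another tab, and finally the date. ShowNumber grabs the characters up to the first tab, ShowName takes the text between the two tabs, and ShowDate takes everything after the second tab, splitting the single ShowData column into the three columns our target table expects.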

Finally we need to write out the data. For this simply add an OLEDB Destination and set it to the target table of dbo.DNRShows. On the mappings tab make sure our three new columns map correctly to the columns in our target table.

Deploy the Package

This completes the coding for the package, but there is one final step we need to do. First, in the solution explorer right click on the project (not the solution, the project as highlighted below) and pick properties.

[Image: Solution Explorer, project right-click menu]

In the properties dialog, change the “CreateDeploymentUtility” option from false (the default) to True.

[Image: the project properties dialog]

Now click the Build, Build Solution menu item. If all went well you should see the build was successful. It’s now time to deploy the package to the server. Navigate to the folder where your project is stored, under it you will find a bin folder, and in it a Deployment folder. In there you should find a file with a “.SSISDeploymentManifest” extension. Double click on this file to launch the Package Installation Wizard.

When the wizard appears there are two choices, File system deployment and SQL Server deployment. For our purposes we can use either one, there are pros and cons to each and many companies generally pick one or the other. In this example we’ll pick SQL Server deployment, but again know that I’ve tested this both ways and either method will work.

Once you pick SQL Server deployment, just click Next. Now it asks you for the server name; I’ve left it at (local) since I’m working with this on a development box, and likewise I’ve left “Use Windows Authentication”. Finally I need the package path, which I can select by clicking the ellipsis (the …) to the right of the text box. This brings up a dialog where I can select where to install.

[Image: the package path selection dialog]

In a real world production scenario we’d likely have branches created for each of our projects, but for this simple demo we’ll just leave it in the root and click OK.

Once your form is filled out as below, click Next.

[Image: the completed Package Installation Wizard form]

We are next queried to what our installation folder should be. This is where SSIS will cache package dependencies. Your DBA may have a special spot setup for these, if not just click next to continue.

Finally we are asked to confirm we know what we are doing. Just click Next. If all went well, the install wizard shows us it’s happy with a report, and we can click Finish to exit.

Setup the SQL Server Job

We’ve come a long way and we’re almost to the finish line, just one last major step. We will need to setup a SQL Server Job which will launch the SSIS package for us. In SQL Server Management Studio, navigate to the “SQL Server Agent” in your Object Explorer. If it’s not running, right click and pick “Start”. Once it’s started, navigate to the Jobs branch. Right click and pick “New Job”.

When the dialog opens, start by giving your job a name. As you can see below I used LoadDNRShows. I also entered a description.

[Image: the New Job dialog]

Now click on the Steps page over on the left “Select a page” menu. At the bottom click “New” to add a new job step.

In the job step properties dialog, let’s begin by naming the step “Run the SSIS package”. Change the Type to “SQL Server Integration Services Package”. When you do, the dialog will update to give options for SSIS. Note the Run As drop down, this specifies the account to run under. For this demo we’ll leave it as the SQL Server Agent Service Account, check with your DBA as he or she may have other instructions.

In the tabbed area the General tab first allows us to pick the package source. Since we deployed to SQL Server we’ll leave it at the default, however if you had deployed to the file system this is where you’d need to change it to pick your package.

At the bottom we can use the ellipsis to pick our package from a list. That done, your screen should look something like:

[Image: the job step configured to run the SSIS package]

For this demo that’s all we need to set, but I do want to take a second to encourage you to browse through the other tabs. Through these tabs you can set many options related to the package. For example, you could alter the data sources, allowing you to use one package with multiple databases.

Click OK to close the job step, then OK again to close the Job Properties window. Your job is now setup!

Calling from .Net

The finish line is in sight! Our last step is to call the job from .Net. To make it a useful example, I also wanted the .Net application to upload the data the SSIS package will manipulate. For simplicity I created a WinForms app, but this could easily be done in any environment. I also went with C#; the VB.Net code would be almost identical.

I started by creating a simple WinForm with two buttons and one label. (Again the full project will be on the Code Gallery site).

[Image: the sample WinForms application]

In the code, first be sure to add two using statements to the standard list:

using System.Data.SqlClient;
using System.IO;

Behind the top button we’ll put the code to copy the data from the text file we created from the DNR website to the staging table.

    private void btnLoadToStaging_Click(object sender, EventArgs e)
    {
      /* This method takes the data in the DNRShows.txt file and uploads it to a staging table. */
      /* The routine is nothing magical, standard stuff to read a text file and upload it to a  */
      /* table via ADO.NET                                                                      */

      // Note, be sure to change to your correct path
      string filename = @"D:\Presentations\SQL Server\Calling SSIS From Stored Proc\DNRShows.txt";
      string line;

      // If you used a different db than ArcaneCode be sure to set it here
      string connect = "server=localhost;Initial Catalog=ArcaneCode;Integrated Security=SSPI;";
      SqlConnection connection = new SqlConnection(connect);
      connection.Open();

      SqlCommand cmd = connection.CreateCommand();

      // Wipe out previous data in case of a crash
      string sql = "DELETE FROM dbo.staging_DNRShows";
      cmd.CommandText = sql;
      cmd.ExecuteNonQuery();

      // Now setup for new inserts
      sql = "INSERT INTO dbo.staging_DNRShows (ShowData) VALUES (@myShowData)";
      cmd.CommandText = sql;
      cmd.Parameters.Add("@myShowData", SqlDbType.VarChar, 255);

      StreamReader sr = null;

      // Loop thru text file, insert each line to staging table
      try
      {
        sr = new StreamReader(filename);
        line = sr.ReadLine();
        while (line != null)
        {
          cmd.Parameters["@myShowData"].Value = line;
          cmd.ExecuteNonQuery();
          lblProgress.Text = line;
          line = sr.ReadLine();
        }
      }
      finally
      {
        // Close the reader and the connection even if the load fails
        if (sr != null)
          sr.Close();
        connection.Close();
        lblProgress.Text = "Data has been loaded";
      }
    }

Before you ask, yes, I could have used any number of data access technologies, such as LINQ. I went with ADO.NET for simplicity, believing most developers are familiar with it due to its longevity. Do be sure to update the database name and the path to the file in both this and the next example when you run the code.

This code really does nothing special, just loops through the text file and uploads each line as a row in the staging table. It does however serve as a realistic example of something you’d do in this scenario, upload some data, then let SSIS manipulate it on the server.

Once the data is there, it’s finally time for the grand finale. The code behind the second button, Execute SSIS, does just what it says; it calls the job, which invokes our SSIS package.

    private void btnRunSSIS_Click(object sender, EventArgs e)
    {
      string connect = "server=localhost;Initial Catalog=ArcaneCode;Integrated Security=SSPI;";
      SqlConnection connection = new SqlConnection(connect);
      connection.Open();

      SqlCommand cmd = connection.CreateCommand();

      // Start the SQL Server job that runs the SSIS package
      string sql = "exec msdb.dbo.sp_start_job N'LoadDNRShows'";
      cmd.CommandText = sql;
      cmd.ExecuteNonQuery();
      connection.Close();
      lblProgress.Text = "SSIS Package has been executed";
    }

The key is this sql command:

exec msdb.dbo.sp_start_job N'LoadDNRShows'

“exec” is the T-SQL command to execute a stored procedure. “sp_start_job” is a stored procedure that ships with SQL Server in the MSDB system database. This stored procedure will invoke any job stored on the server. In this case, it invokes the job “LoadDNRShows”, which, as we set it up, will run an SSIS package.
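
One thing to be aware of: sp_start_job is asynchronous. It returns as soon as the job has been successfully started, not when the job finishes, so give the package a moment to complete before you go looking for the results.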

Launch the application, and click the first button. Now jump over to SQL Server Management Studio and run this query:

select * from dbo.staging_DNRShows;

select * from dbo.DNRShows;

You should see the first query bring back rows, while the second has nothing. Now return to the app and click the “Execute SSIS” button. If all went well running the query again should now show no rows in our first query, but many nicely processed rows in the second. Success!

A few thoughts about xp_cmdshell

In researching this article I saw many references suggesting writing a stored procedure that uses xp_cmdshell to invoke dtexec. DTEXEC is the command line utility that you can use to launch SSIS Packages. Through it you can override many settings in the package, such as connection strings or variables.

xp_cmdshell is a utility built into SQL Server. Through it you can invoke any “DOS” command. Thus you could dynamically generate a dtexec command, and invoke it via xp_cmdshell.

The problem with xp_cmdshell is you can use it to invoke ANY “DOS” command. Any of them, including, oh, let’s say, “DEL *.*”. xp_cmdshell can be a security hole; for that reason it is turned off by default on SQL Server, and many DBAs leave it turned off and are not likely to turn it on.

The techniques I’ve demonstrated here do not rely on xp_cmdshell. In fact, all of my testing has been done on my server with the xp_cmdshell turned off. Even though it can be a bit of extra work, setting up the job, etc., I still advise it over the xp_cmdshell method for security and the ability to use it on any server regardless of its setting.

In Closing

That seemed like a lot of effort, but can lead to some very powerful solutions. SSIS is a very powerful tool designed for processing large amounts of data and transforming it. In addition developing under SSIS can be very fast due to its declarative nature. The sample package from this article took the author less than fifteen minutes to code and test.

When faced with a similar task, consider allowing SSIS to handle the bulk work and just having your .Net application invoke your SSIS package. Once you do, there is no end to the uses you’ll find for SQL Server Integration Services.

Welcome to COMFRAME

I admit to being remiss lately, my poor blog has been neglected for these past few weeks. I can only plead mea culpa and explain.

A few weeks ago I had an opportunity placed before me that I simply could not refuse. I’d been happy at my old job and wasn’t looking, but a good friend of mine works for a great company called COMFRAME. They are a consulting firm that does a variety of things, including Enterprise Project Management, .Net and Java development projects, SOA, and most important to me, Business Intelligence.

To make a long story short my friend took a lesson from the Godfather movies and “made me an offer I couldn’t refuse”. I am now a COMFRAME employee! The work is very exciting; I’ll be an architect on a BI project that is using Silverlight 3 for its front end. We are working with data from Microsoft Project, and not only that but it’s the world’s biggest implementation of Project Server, so I’ll get to work with the fine folks at Microsoft even more closely. We’re also a Microsoft Partner, which will give me new avenues for relationships that will complement my MVP.

I got to meet the customer this week; although the meeting was brief, they seemed very easy to work with, and nice as well. I also got to meet the development team I’ll be working with. I’m impressed with the work they’ve done so far and can’t wait to roll up my sleeves and dive in.

I’ve had a crazy time wrapping up my old job and starting my new one, hopefully I can get back to regular blogging soon. I’ve been doing a lot with SSIS and SSAS which will give me lots of good material to talk about, not to mention any Silverlight 3 work I get to explore.

Podcast Junkie Week – Dot Net Development

If you are a .Net Developer it’s real easy to feed your Podcast addiction and become a Podcast Junkie like me. Here’s my favorites for keeping up with the wonderful world of .Net development.

DotNetRocks – The grand daddy of .Net podcasts, Richard and Carl now have well over 450 shows behind them. One of the oldest around, some of the older shows are well worth listening to even now, as they talk about practices and principles we still use today, not to mention the current shows delivering up to date news on .Net.

DNRTV – Hand in hand with the audio podcast is the DotNetRocks TV show. When audio is not enough, DNRTV steps in to show what you heard about in DotNetRocks.

DeepFriedBytes – I love this show, Keith and Woody interview folks and do it with a real flare of Southern Hospitality. Not only do they have interviews with interesting people but you get good ideas for new deep fried foods! I also love the “rusty washers” segments.

Hanselminutes – Each week Scott Hanselman does an interesting, creative interview with people all over the coding spectrum. It’s typically half an hour in length, so easy for those with a short commute. Scott’s style is very relaxed and enjoyable.

The Thirsty Developer – While most of the episodes lean toward .Net occasionally they throw in a show about non Microsoft topics, such as the recent show on HAML. I think it makes a nice balance to see what folks are doing who may be using tools different from mine.

Herding Code – Similar to the previous podcast, this one primarily focuses on .Net but takes occasional forays into other subjects.

Misfit Geek – This is a relatively new podcast with only two episodes under its belt. Both have been interesting though, so I’m hopeful to see more in the near future.

Channel 9 – The site for Microsoft produced podcasts this is a firehose of information for all .Net developers. No matter what your specialty if you work with Microsoft tools there will be something here for you.

While these ought to keep you busy for a while, I know tomorrow you’ll be eager for more. While keeping up with programming is important, I also think it’s vital to keep up with industry trends. The rollout of new computers and OSes has a big effect on the life of a developer. In addition, I also think it’s valuable to remember there are a lot of folks who work with computers all day who are not coders. Thus tomorrow we’ll take a look at the list of podcasts useful for knowing what’s going on in the computer industry.

We interrupt this blog to get Ramped Up!

I promise to wrap up the FILESTREAM series shortly; I just want to ensure all of the code samples are complete and properly documented. Meanwhile I have a cool website I want to pass along.

Last week my friend Doug Turnure of Microsoft was on Dot Net Rocks! He and his co-worker Johanna White were talking about a new training site, Ramp Up! Available at http://myrampup.com, it is a very full-featured training site. There are a series of topics to train on, and each topic is actually a complete training course. Each course is a series of lessons in a variety of formats.

Doug and Johanna have taken a unique approach to this site. For example, there are three different courses available to learn ASP.NET. One is for people coming from an ASP background, another for experienced Java developers, and a third for people with no web experience. What a great idea!

In addition they have retained or recycled material for developers who may not be working on the cutting edge. There is a course for people coming from VS2002/2003 to VS2005, for example. This is great; I meet a lot of developers who are just now shifting to the .Net 2.0 platform and are looking for good training material.

Congrats to Doug, Johanna and everyone involved in creating this site. It’s a great idea, and totally free. Make sure to visit it frequently, as they will be adding more material as time goes by.

Generating a PDF file from a Reporting Services Report Viewer Control

In yesterday’s post, I demonstrated how to generate a SQL Server Reporting Services report without having to have SQL Server Reporting Services. The sample application used the Microsoft Report Viewer control to display the report. A common need in business is to generate documents, such as PDFs, that will later be archived. It turns out if you are using a report viewer control, this is easy to do programmatically.

First, you need to add one using statement to the top of your class, in addition to the others that were added yesterday.

using System.IO;

Next, we only need a few lines of code to generate the PDF.

      Warning[] warnings;
      string[] streamids;
      string mimeType;
      string encoding;
      string extension;

      // Render the report to a byte array in PDF format
      byte[] bytes = reportViewer1.LocalReport.Render(
        "PDF", null, out mimeType, out encoding, out extension,
        out streamids, out warnings);

      // Write the bytes out to a file
      FileStream fs = new FileStream(@"D:\ReportOutput.pdf", FileMode.Create);
      fs.Write(bytes, 0, bytes.Length);
      fs.Close();

This code snippet came right from the MSDN Books Online, and is pretty simple. I could have selected another format by changing the first value passed into the Render method; for example “EXCEL” would have rendered it as a Microsoft Excel document.

In the code samples I placed the above sample in its own button, but I could just as easily have placed it under one of the other demo buttons.

This ability brings up some interesting possibilities. For example, the report viewer control does not have to be visible to the user in order for this to work. Thus you could create an application that every night generated a series of reports and saved them as PDFs to some central location, such as a web server or document control server. All the user (assuming one was around) would have to see is a progress bar, the reports themselves never get displayed.

Using SQL Server Reporting Services in Client Mode

Recently I did a presentation at the March BSDA meeting. I showed how to use SQL Server Reporting Services without a SQL Server, or more specifically without a SQL Server running Reporting Services. It got an enthusiastic response, so I thought I’d follow up here with some reminder documentation, as much for myself as for all of you wonderful readers.

Using Reporting Services in Client, or Local, mode is a four-step process. First, you will need an XSD schema file to create the report on. Once you have the XSD you will be able to move to the second step, creating the report. Third, you will need to place a Report Viewer control on your Windows form, WPF form, or ASP.NET page. Finally, you will need to write some code that generates an ADO.NET dataset, loads the report in the report viewer control, then binds it all together. Let’s look at this step by step.

Normally when you create a report you connect to a database, then base it off of some object like a query, view, or stored procedure. The report is then uploaded to a Reporting Services server, which takes care of hosting it, displaying it, and generating the data for it. With client mode you have no server available, so we have to instead create a surrogate. That’s where our XSD file comes in.

Right click in Solution Explorer and “Add a new item”, and from the list of goodies select “XML Schema”. Name it something appropriate, letting the default extension be XSD. For this example I will be getting customer order data, so I’ve given it the name CustomerOrders.xsd. Visual Studio will think about it, then add it to the project, and even helpfully open it for you. I don’t know about you, but hand typing XML Schemas isn’t my idea of fun, so you should glance at it, go “that’s nice”, then close it.

Now right click on the XSD file in Solution Explorer, and pick “Open with….”. In the dialog that appears, select “Dataset Editor”. When you do, Visual Studio presents a big scary warning message letting you know that you could lose contents, and that this will forever be a dataset XSD file. We have nothing in the file, so we’re cool with this, just click OK.

You will now be presented with a big surface area. In the middle it tells you to drag items from the server explorer or right click. If you have a table, view, or stored procedure you are free to drag it in, but most of the time you’ll want to base this off of a SQL query. Right click on the surface, and select Add…, Table Adapter from the menu. The first screen asks you for the database connection. This is the only time you’ll actually need a connection; in this example I am using the good old Northwind database. I pointed at Northwind and clicked Next.

[Image: choosing the data connection]

Next we are asked how we are going to access the data. Since we have a SQL Statement just pick the default of “Use SQL statements” and click Next.

[Image: choosing the command type]

Now take your SQL Statement and paste it in, and click Next.

[Image: entering the SQL statement]

OK, click Finish to wrap up the addition of the XSD. By default the adapter has a generic name; we should give ours something more meaningful. Click in the top bar, then enter a new name. Since my example report is for customer order data, I’ll name it CustomerOrders. I then went to the bottom bar and renamed the TableAdapter1 to CustomerOrderTableAdapter. You should now see something like:

[Image: the completed CustomerOrders dataset]

Note that this will become your Data Source for the report. The data source will have the name of the XSD followed by the name of the source, in this case it will read CustomerOrders_CustomerOrders. OK, now it’s time to create the report.

Go back to Solution Explorer, right click and pick Add New Item. Navigate to the Reporting area and pick Report Wizard. Note the file extension should end in RDLC. If you have used Reporting Services before, you will know that reports typically end in RDL. However, client mode reports have just a slightly different syntax to them, thus the RDLC extension to differentiate the two. While you can modify an RDL to become an RDLC and vice versa, you have to do so by hacking the XML behind the report.

Note you can also choose just Report, but then you’ll have to setup everything manually. For this simple example though, we’ll just use the Report Wizard.

[Image: adding a new Report Wizard item]

Give your report a meaningful name and click Add. The report wizard then shows you a welcoming screen if you’ve never run it before, just click Next.

Now we need to pick the data source. In this example, you want the CustomerOrders branch, so select it and click Next.

[Image: selecting the data source]

The next screen asks if we want a Tabular or Matrix report. Select the one that fits your needs; in my example I picked Tabular and clicked Next. The next screen asks how we want to display the data. For my example, I opted to group by the customer’s company name and contact name, then the order data went into the details area. Fill out as appropriate for your report and click Next.

The next screen asks how we want things laid out. This affects the look and feel of the report. For my example I just took the default and clicked Next; however, you are free to play with this to experiment with the different looks and feels your reports might have.

Likewise the next screen is also a look and feel one, asking what coloring we want to apply. Pick one that makes you happy and click Next. You can always change it later; many times I pick the Generic one (which adds no colors) then fix it up afterward.

The final screen is the wrap up. Give your report a meaningful name and click Finish.

OK, you have a report, now you need a container. Open up the user interface you want to place the report viewer control on. In my example I went with a very simple Windows Forms application.

In my toolbox, I navigated to the Reporting section, where I only found one control, the MicrosoftReportViewer control. (Note I am using Visual Studio 2008 SP1; if you are on an earlier version your names may differ slightly.) Grab it and drop it onto your design surface. I also added a Button control to the form to kick off the report display process.

Now it’s time for the last step, adding some code. In this example I’ve used a Windows Form. Opening it, the first thing we find in the form load area is:

      this.reportViewer1.RefreshReport();

(Note I left my report viewer control named reportViewer1.) Delete it; we’ll refresh the report elsewhere.

Now we need to add some using statements to the top of our class.

//Add these to the standard list above
using System.Data.Sql;
using System.Data.SqlClient;
using Microsoft.Reporting.WinForms;

The first two will be used in accessing our Northwind database; you may need different libraries if you are going against another database. I’ve also included a reference to the Reporting.WinForms library so we can manipulate the report programmatically.

Now let’s go to the code for the button click event. First, we need to reset the report viewer in case we’d been using it to host another report.

      // Reset in case the report viewer was holding another report
      reportViewer1.Reset();

Next we need to set the report viewer to local mode. This tells it we’ll be supplying the report name from a local file, and binding the report to a local ADO.NET data source.

      // Set the processing mode for the ReportViewer to Local
      reportViewer1.ProcessingMode = ProcessingMode.Local;

Our third step is to create a local report variable, and set its reference to the report viewer’s local report. This will make it easier to work with. Then we’ll set the location of the report we want to use.

      LocalReport localReport = reportViewer1.LocalReport;
      localReport.ReportPath = @"D:\Presentations\SQL Server\SSRS RDLC\SSRS_RDLC\Report2.rdlc";

Now we need to create an ADO.NET dataset and populate it. I implemented most of that functionality in a method called GetCustomerOrders, which I’ll append at the bottom of these instructions. It’s very straightforward code.

      DataSet dataset = new DataSet("Northwind");

      // Get the sales order data
      GetCustomerOrders(ref dataset);

At this stage we have told it where our report is, and have created the dataset. Now we need to create a data source for the report itself, using the ReportDataSource object. For the name, we’ll use the same name as the XSD schema, CustomerOrders_CustomerOrders. For the value, we’ll give it the table from the dataset we created in code. It’s possible for a report to have multiple datasets; in the report we’d give each one its own name (based on the XSD), then here we’d bind each dataset table to the name we’d used in the report. Once done, we’ll add the new ReportDataSource to the local report’s DataSources collection. Finally, we’ll refresh the report viewer to make it generate the report.

      // Create a report data source for the sales order data.
      // The name must match the data source name in the report,
      // and the table name must match the one GetCustomerOrders fills.
      ReportDataSource dsCustomers = new ReportDataSource();
      dsCustomers.Name = "CustomerOrders_CustomerOrders";
      dsCustomers.Value = dataset.Tables["CustomerOrders"];

      localReport.DataSources.Add(dsCustomers);

      // Refresh the report
      reportViewer1.RefreshReport();
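
Putting the pieces together, the complete click event handler looks something like the sketch below. The name button1_Click is an assumption based on Visual Studio’s default button name; adjust the report path and names to match your own project.

      private void button1_Click(object sender, EventArgs e)
      {
        // Reset in case the report viewer was holding another report
        reportViewer1.Reset();

        // Local mode: report definition and data both come from this application
        reportViewer1.ProcessingMode = ProcessingMode.Local;

        // Point the local report at the RDLC file on disk
        LocalReport localReport = reportViewer1.LocalReport;
        localReport.ReportPath =
          @"D:\Presentations\SQL Server\SSRS RDLC\SSRS_RDLC\Report2.rdlc";

        // Build and populate the ADO.NET dataset
        DataSet dataset = new DataSet("Northwind");
        GetCustomerOrders(ref dataset);

        // Bind the dataset table to the data source name used in the report
        ReportDataSource dsCustomers = new ReportDataSource();
        dsCustomers.Name = "CustomerOrders_CustomerOrders";
        dsCustomers.Value = dataset.Tables["CustomerOrders"];
        localReport.DataSources.Add(dsCustomers);

        // Generate and display the report
        reportViewer1.RefreshReport();
      }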

You can download a copy of these instructions, along with the entire sample project including code and reports, at the Microsoft Code Gallery site http://code.msdn.microsoft.com/SqlServerRSClient. As promised, below is a copy of the GetCustomerOrders routine, for your reference.

    private void GetCustomerOrders(ref DataSet dsNorthwind)
    {
      // Join each customer to their orders in Northwind
      string sqlCustomerOrders = "SELECT c.[CustomerID]"
        + " ,c.[CompanyName]"
        + " ,c.[ContactName]"
        + " ,c.[ContactTitle]"
        + " ,c.[Address]"
        + " ,c.[City]"
        + " ,c.[Region]"
        + " ,c.[PostalCode]"
        + " ,c.[Country]"
        + " ,c.[Phone]"
        + " ,c.[Fax]"
        + " ,o.[OrderID]"
        + " ,o.[CustomerID]"
        + " ,o.[EmployeeID]"
        + " ,o.[OrderDate]"
        + " ,o.[RequiredDate]"
        + " ,o.[ShippedDate]"
        + " ,o.[ShipVia]"
        + " ,o.[Freight]"
        + " ,o.[ShipName]"
        + " ,o.[ShipAddress]"
        + " ,o.[ShipCity]"
        + " ,o.[ShipRegion]"
        + " ,o.[ShipPostalCode]"
        + " ,o.[ShipCountry]"
        + "  FROM [Northwind].[dbo].[Customers] c"
        + "  join [Northwind].[dbo].[Orders] o on c.CustomerID = o.CustomerID";

      SqlConnection connection = new
        SqlConnection("Data Source=(local); " +
                      "Initial Catalog=Northwind; " +
                      "Integrated Security=SSPI");

      SqlCommand command =
          new SqlCommand(sqlCustomerOrders, connection);

      SqlDataAdapter customerOrdersAdapter = new
          SqlDataAdapter(command);

      // Fill opens and closes the connection for us; the table name
      // "CustomerOrders" must match the table the report data source binds to
      customerOrdersAdapter.Fill(dsNorthwind, "CustomerOrders");
    }

Alabama Code Camp with Speaker Idol

Update: Due to a scheduling conflict, the date of the Alabama Code Camp has been pushed back to January 31st. We apologize for the inconvenience but hope the extra time will make it easier for everyone to plan to attend.

The next Alabama Code Camp will be Saturday January 31st, 2009 (originally January 17th) in Montgomery, Alabama. Call for speakers is open, so check back to the site soon for registration, session schedule, and more.

Also at this event we’ll be having the Alabama Speaker Idol contest. The object of this contest is to encourage new speakers to step up to the Code Camp level. During lunch we’ll be giving each new speaker five minutes to do a presentation. The detailed rules are below, but the big news is the grand prize: a 1 year MSDN Premium Subscription!

Update: Registration for the event is now open, you can register at http://www.clicktoattend.com/?id=134437

Alabama Speaker Idol Rules

1. Presentations are targeted to be five minutes in length; presentations can vary from 4:50 to 5:10 without penalty. Presentations under 4:50 or over 5:10 will be penalized.

2. As the object of the contest is to encourage new speakers, professional speakers, MVPs, and speakers at previous code camps are not eligible. However, someone may participate and also be a speaker at the Jan. 31st, 2009 code camp as long as this is their first time speaking at a code camp.

3. In addition to length, speakers will also be judged on clarity, technical accuracy and content.

4. In addition to the judges, code camp attendees and other members of the public may be in the audience during presentations.

5. Judges will be composed of attending MVPs and other notable attendees. As with most of these things, scoring is largely subjective. The judges’ decision is final. No whining, crying, or complaining.

6. To participate in the contest you must e-mail your name, the title of your presentation, and a contact cell number (just in case) to arcanecode@gmail.com with Alabama Speaker Idol in the subject header. All submissions must be received by noon, central standard time, Friday January 30th (edited to reflect the date change).

7. Due to time restrictions, the contest will be limited to 15 participants. In the event more than 15 submissions are received, 15 contestants will be randomly drawn from the pool of submissions. All others will be placed on the standby list. In the event of a no show on the day of the contest, we will randomly draw a replacement speaker from the standby list.

8. The order of the speakers will be drawn at random just before the presentations begin. All presentations will take place during lunch at the code camp.

9. The winner will be announced at the end of the code camp day.

10. These rules may be amended as the Code Camp Committee deems fit in order to facilitate a more orderly, better code camp.

Oslo – Not just for Norwegians any more

I’ve been looking heavily into Oslo, the new technology announced at PDC 2008. So what exactly is Oslo? Well I couldn’t find a simple explanation, so after digging into it all weekend let me see if I can take a stab at it.

If you are familiar with SharePoint, you know that it provides you a bunch of web templates. You can take these and create certain types of lists: documents, lists, forums, etc. What many don’t realize is that all of this gets stored in a “repository” that is in SQL Server.

Oslo takes this concept to the next level. It allows you to create your own “lists”, if you will, of fairly complex data types. These are stored in a repository in SQL Server. Along with your data is a lot of metadata. Oslo also provides a query tool to easily get data back out of the repository, along with runtime components you can use with your favorite programming language. Or, because it’s all in SQL Server, you can bypass the Oslo runtimes and go directly into the SQL Server repository using traditional tools like ADO.NET or LINQ to SQL.

So how does Oslo accomplish this? By providing several tools to us: M, the programming language; Quadrant, the graphical tool; and the Repository itself. Let’s take a brief look at each one.

M is a new programming language that has three components: MSchema, MGrammar, and MGraph. MSchema is used to define a new chunk of data; it is a representation of how you want the data stored. The product of an MSchema definition is directly translated into T-SQL as a Create Table statement and stored in the Repository.

MGrammar is used to create a translation between one layout of information and the schema created with MSchema. Let’s say you had created an MSchema definition for album names, artists, and ratings. Then let’s say you had an input file that looked something like:

The Thirteenth Hour by Midnight Syndicate rates 5 stars.

Greatest Tuba Hits of 1973 by The Tuba Dudes rates 1 star.

You could create a language template in MGrammar that looks for the words “by” and “rates” and divides up the input into the appropriate fields in your schema. Then run the input file through the MGrammar layout and you’ve now got all that data in a format known as MGraph.

MGraph is a tree-like structure that represents the transformed data. If I understand it correctly, you take your data, run it through the DSL you set up with MGrammar, and it produces an MGraph. This MGraph can then be loaded into a database schema created with MSchema, passed off to a calling routine, and more.
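
MGrammar syntax is still in flux in the CTP, so rather than guess at it, here is a rough C# sketch (not M code) of the same kind of transformation: turning those free-text album lines into structured records. The Album class and the regular expression are purely illustrative assumptions, not part of Oslo.

      using System;
      using System.Text.RegularExpressions;

      public class Album
      {
        public string Name;
        public string Artist;
        public int Rating;
      }

      public static class AlbumParser
      {
        // Matches lines like "The Thirteenth Hour by Midnight Syndicate rates 5 stars."
        static readonly Regex Pattern = new Regex(
          @"^(?<name>.+) by (?<artist>.+) rates (?<rating>\d+) stars?\.$");

        public static Album Parse(string line)
        {
          Match m = Pattern.Match(line);
          if (!m.Success) return null;

          return new Album
          {
            Name = m.Groups["name"].Value,
            Artist = m.Groups["artist"].Value,
            Rating = int.Parse(m.Groups["rating"].Value)
          };
        }
      }

Run against the two sample lines above, Parse would yield (“The Thirteenth Hour”, “Midnight Syndicate”, 5) and (“Greatest Tuba Hits of 1973”, “The Tuba Dudes”, 1), which is the same structured result MGrammar would hand off as an MGraph.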

Quadrant is the tool used to look at data once it’s in the Repository. You can browse data, and create different representations of the data in a tool similar to what you see with Office. For example, you can render a table created by MSchema as a tree, as a grid, as a list, or even as a graph. You can use it to show relationships between MSchemas, and write queries with it. Quadrant could be used by developers or advanced users to create a template representation of the data that could be given to other users to do their data analysis.

Quadrant is also highly extensible and customizable. You can write your own modules to add to it, although to do so you have to write them in Python, which I have to admit leaves me scratching my head. I don’t have anything against Python, but I have to imagine most developers who work with Microsoft tools are much more familiar with VB.Net or C#. I have to wonder why they picked a language most Microsoft developers are unfamiliar with and would have to learn in order to extend the Quadrant tool.

The final piece of the puzzle I have mentioned several times, it’s the Repository. The Repository is a database that holds everything about your schemas and data. Currently Oslo only supports SQL Server as the database for a Repository. Interesting thing though, Microsoft will be distributing Oslo under the OSP (Open Specifications Promise). This means a third party vendor could develop a back-end Repository engine so that an Oslo Repository could be stored in something like MySQL or Oracle.

Finally I will mention that Oslo will be callable from your favorite .NET language; indeed the runtime components, as they are called, are a critical piece of Oslo. There are .NET APIs which can be used to store and retrieve data in the Repository.

Microsoft is serious about Oslo. In a Channel 9 interview about M, I believe it was Chris Anderson who said there were 180 folks working on the Oslo team. Even though it’s early in its development, I get the strong impression Oslo will be a key factor in the future of Microsoft development technologies, which is why I intend to invest time now to get up and running with it.

For more information about Oslo, and to download the current Oslo SDK CTP, see the site at http://msdn.microsoft.com/oslo .

SQL Server Compact Edition Connection Strings

In my recent presentation I talked about an important but subtle difference with connection strings when using SQL Server Compact Edition. It was so important I thought I’d make a special blog post out of it.

There are two methods for programmatically accessing data in SQL Server Compact Edition (SSCE). The first method is using the System.Data.SqlServerCe library. When you create an instance of the SqlCeEngine, you need to pass a connection string formatted like so:

DataSource="mydatabasename.sdf"; Password='mypassword'

This method is valid, by the way, for versions 3.1 and 3.5 of SSCE. The second method, available with Visual Studio 2008 and the 3.5 version of SSCE, is to use LINQ to SQL. When creating the DataContext object, you also need a connection string formatted like so:

Data Source=mydatabasename.sdf; Password=’mypassword’

Note very carefully the two differences. First, the name of the sdf file lacks the double quote marks in the LINQ to SQL version. Second, note the Data Source phrase has a space between the words in the LINQ version, where the SqlCeEngine version lacks the space.

It’s a small distinction, but it’ll drive you nuts if you don’t catch it. I drove myself nuts for quite a while because I didn’t notice the extra space in Data Source when I began experimenting with LINQ to SQL! Hopefully my pain will save others some hair pulling.
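
To see the two side by side in code, here is a minimal sketch. It assumes a database file named mydatabasename.sdf; the SqlCeEngine comes from System.Data.SqlServerCe, and the DataContext from System.Data.Linq.

      using System.Data.Linq;          // LINQ to SQL DataContext
      using System.Data.SqlServerCe;   // SqlCeEngine

      // SqlCeEngine: double quotes around the file name, no space in DataSource
      SqlCeEngine engine = new SqlCeEngine(
          "DataSource=\"mydatabasename.sdf\"; Password='mypassword'");
      engine.CreateDatabase();  // for example, to create a new database file

      // LINQ to SQL: no quotes around the file name, a space in Data Source
      DataContext context = new DataContext(
          "Data Source=mydatabasename.sdf; Password='mypassword'");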