Monday 12 October 2009

TSQL: Programmatically Create DDL scripts for Foreign Keys

I'll admit it, I'm lazy... I hate typing exercises. I don't need the practice, and I'm sure the person paying my somewhat exorbitant fee doesn't want to fork out for something so mundane either. That means I'm always on the hunt for a shortcut or two, I mean, process improvement.

These two snippets will generate the DDL to create all the foreign keys in the database.
The first checks whether each key already exists before creating it; the second drops the key first. Use whichever suits - just run the query and copy the generated statements out of the result set into a new query window to execute them.

SELECT 'IF NOT EXISTS (SELECT * FROM sys.foreign_keys WHERE name = ''' + fk.name + ''' AND parent_object_id = OBJECT_ID(''[' + s.name + '].[' + t.name + ']'')) BEGIN '
    + CHAR(13) + CHAR(9) + 'ALTER TABLE [' + s.name + '].[' + t.name + '] WITH CHECK '
    + CHAR(13) + CHAR(9) + 'ADD CONSTRAINT [' + fk.name + '] '
    + CHAR(13) + CHAR(9) + 'FOREIGN KEY ([' + c.name + ']) REFERENCES [' + fks.name + '].[' + fkt.name + '] ([' + fkc.name + ']) '
    + CHAR(13) + 'END'
FROM sys.foreign_keys fk
INNER JOIN sys.foreign_key_columns fkcm ON fkcm.constraint_object_id = fk.object_id
INNER JOIN sys.tables t ON t.object_id = fkcm.parent_object_id
INNER JOIN sys.tables fkt ON fkt.object_id = fkcm.referenced_object_id
INNER JOIN sys.columns c ON c.object_id = t.object_id AND c.column_id = fkcm.parent_column_id
INNER JOIN sys.columns fkc ON fkc.object_id = fkt.object_id AND fkc.column_id = fkcm.referenced_column_id
INNER JOIN sys.schemas s ON s.schema_id = t.schema_id
INNER JOIN sys.schemas fks ON fks.schema_id = fkt.schema_id
ORDER BY 1
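
The output for each key looks something like this (the table and constraint names here are invented for illustration):

IF NOT EXISTS (SELECT * FROM sys.foreign_keys WHERE name = 'FK_Order_Customer' AND parent_object_id = OBJECT_ID('[dbo].[Order]')) BEGIN
    ALTER TABLE [dbo].[Order] WITH CHECK
    ADD CONSTRAINT [FK_Order_Customer]
    FOREIGN KEY ([CustomerId]) REFERENCES [dbo].[Customer] ([CustomerId])
END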


And the second one (ie does a drop first)... Note that both queries emit one row per foreign-key column, so a composite (multi-column) key will come out as several single-column statements that you'd need to merge by hand.

SELECT DISTINCT 'IF EXISTS (SELECT * FROM sys.foreign_keys WHERE name = ''' + fk.name + ''' AND parent_object_id = OBJECT_ID(''[' + s.name + '].[' + t.name + ']'')) BEGIN '
    + CHAR(13) + CHAR(9) + 'ALTER TABLE [' + s.name + '].[' + t.name + '] DROP CONSTRAINT [' + fk.name + ']'
    + CHAR(13) + 'END'
    + CHAR(13) + 'ALTER TABLE [' + s.name + '].[' + t.name + ']'
    + CHAR(13) + CHAR(9) + 'WITH CHECK ADD CONSTRAINT [' + fk.name + '] '
    + CHAR(13) + CHAR(9) + 'FOREIGN KEY ([' + c.name + ']) REFERENCES [' + fks.name + '].[' + fkt.name + '] ([' + fkc.name + ']) '
    + CHAR(13)
FROM sys.foreign_keys fk
INNER JOIN sys.foreign_key_columns fkcm ON fkcm.constraint_object_id = fk.object_id
INNER JOIN sys.tables t ON t.object_id = fkcm.parent_object_id
INNER JOIN sys.tables fkt ON fkt.object_id = fkcm.referenced_object_id
INNER JOIN sys.columns c ON c.object_id = t.object_id AND c.column_id = fkcm.parent_column_id
INNER JOIN sys.columns fkc ON fkc.object_id = fkt.object_id AND fkc.column_id = fkcm.referenced_column_id
INNER JOIN sys.schemas s ON s.schema_id = t.schema_id
INNER JOIN sys.schemas fks ON fks.schema_id = fkt.schema_id
ORDER BY 1

Integration Services Gotcha #6631984: The Table Variable Data Flow Affair

I encountered a teeny little SSIS glitch today that completely did my head in for a good 30 minutes. Ok, I'll admit it, more like an hour.

In my data flow, I had an OLEDB Source whose query declared and populated a table variable, then returned some data via a SELECT that joined onto the table variable.

The source query worked perfectly, returning plenty of data when tested through SSMS. But as soon as I ran the data flow, it would complete successfully yet return zero rows.

That freakin' table variable was messin' with me. Took it out, replaced it with a subquery and whammo! Data.
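
For the record, if you'd rather keep the table variable, the usual suspect is the extra 'rows affected' chatter the batch produces before the final SELECT. A sketch of that workaround - the table and column names here are entirely invented:

-- SET NOCOUNT ON suppresses the rowcount messages from the table-variable
-- INSERT, which can otherwise leave the OLE DB Source seeing no data.
-- dbo.SalesOrder and its columns are hypothetical.
SET NOCOUNT ON;

DECLARE @TopCustomers TABLE (CustomerId int PRIMARY KEY);

INSERT INTO @TopCustomers (CustomerId)
SELECT TOP (100) CustomerId
FROM dbo.SalesOrder
GROUP BY CustomerId
ORDER BY SUM(OrderTotal) DESC;

SELECT so.*
FROM dbo.SalesOrder so
INNER JOIN @TopCustomers tc ON tc.CustomerId = so.CustomerId;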

Oh SSIS, you pernickety old man, you!

Monday 20 July 2009

SQL 2008: Using MERGE to do a Type 2 Change

I'm going to do a brief overview of the SQL 2008 MERGE statement. Primarily for my own purposes - to document the usage for type 2 changes so I don't forget it.

First let's begin with the basic type 1 usage.


All fairly self-explanatory. We have a source and a target. We join the two tables, identify new records, changed records and deleted records. And deal with them appropriately.

So let's move on to the type 2 changes. Here we need to:

  • Insert new records
  • Disable changed records
  • Add a new record for each changed record

That's 2 inserts and an update. A single MERGE won't allow that, so we have to nest our MERGE inside an INSERT. Luckily this works because MERGE allows OUTPUT!


Ignore the outer INSERT for the moment. We have a MERGE statement which is very similar to our type 1 MERGE: compare the source and target, INSERT new records as required. The difference is that when a matched record contains a change, instead of updating the changed field, we set the IsCurrent flag and the End_Date so as to disable the record.


Of course that means we still don't have a record which contains our updated values. So now let's turn our attention to the OUTPUT clause.

This will be familiar to anyone who used it in SQL 2005 in much the same context - to identify rows which have been modified/added/deleted. Really, the origins of MERGE seem to be strongly rooted in the SQL 2005 OUTPUT clause. On the upside, unlike the SQL 2005 version, an UPDATE is reported as an UPDATE and not as a DELETE plus an INSERT. Small but handy! Anyway, where were we? We return the columns we need to create a new record, plus a special $action column - this tells us what actually happened, ie UPDATE, INSERT or DELETE. We need it because the OUTPUT will return everything that occurred in the MERGE.


Now we have a set of rows that have been actioned by the MERGE. Still, we haven't actually done anything with them.

To do that, we enclose the entire MERGE statement, including its OUTPUT clause of course, in brackets so we can make it the FROM clause for an INSERT. Our OUTPUT clause is returning the columns we need for the INSERT, remember? So now we just add our INSERT INTO at the front, add an alias after the closing bracket of the FROM clause AND, drumroll... whack a WHERE clause on the end. This WHERE clause filters the OUTPUT from the MERGE so we only insert new records for the rows that were updated, ignoring any that were inserted by the MERGE itself.

Easy!


-------------------------------------------------------------------------------------------------

For the purposes of copy/paste-ability, here is a text version of the TSQL. Images have been used above for improved readability.

--A simple type 1 change
MERGE INTO DimCustomer c
USING StageCustomer stg ON c.CustomerId = stg.CustomerId
WHEN NOT MATCHED BY TARGET THEN --ie new
    INSERT (CustomerId, CustomerName, IsActive) VALUES (
        stg.CustomerId
        ,stg.CustomerName
        ,stg.IsActive
    )
WHEN MATCHED AND c.CustomerName <> stg.CustomerName THEN --ie changed
    UPDATE SET c.CustomerName = stg.CustomerName
WHEN NOT MATCHED BY SOURCE THEN --ie deleted
    DELETE; --note: a MERGE must be terminated with a semicolon


--a type 2 change
INSERT INTO dbo.DimCustomer
SELECT CustomerId, CustomerName, IsActive, IsCurrent, Eff_Date, End_Date
FROM
( MERGE DimCustomer c
  USING StageCustomer stg ON c.CustomerId = stg.CustomerId
  WHEN NOT MATCHED THEN INSERT VALUES (
    stg.CustomerId
    ,stg.CustomerName
    ,stg.IsActive
    ,1 --> IsCurrent
    ,GETDATE() --> Eff_Date
    ,NULL --> End_Date
  )
  WHEN MATCHED AND c.IsCurrent = 1 AND (c.CustomerName <> stg.CustomerName) THEN UPDATE SET
    c.IsCurrent = 0
    ,c.End_Date = GETDATE()
  OUTPUT $action AS RowAction
    ,stg.CustomerId
    ,stg.CustomerName
    ,stg.IsActive --needed by the outer SELECT
    ,1 AS IsCurrent
    ,GETDATE() AS Eff_Date
    ,NULL AS End_Date
) m
WHERE m.RowAction = 'UPDATE';

Thursday 18 June 2009

SharePoint Saturday

Register now for SharePoint Saturday in Sydney on August 8th.

http://www.sharepointsaturday.org/sydney/default.aspx

Can Dynamics CPM pick up where PerformancePoint left off?

Can Dynamics CPM pick up where PerformancePoint left off? And how does all of this fit in with SSRS Report Builder and Project Gemini, a key part of Microsoft's self-service BI drive? When Microsoft says things like "Microsoft Forecaster is a budgeting and planning application that allows companies to build budgets based on specific objectives", one has to wonder!

And "Enterprise Reporting is designed for sophisticated group reporting and consolidation needs, and includes advanced multi-dimensional consolidations, eliminations and multicurrency capabilities. Enterprise Reporting’s planning environment is easily customizable to fit diverse budgeting and forecasting needs across all industries and organizations. Enterprise Reporting provides strong analytical modules to make it extremely easy to create new ad-hoc reports on the fly"

Seriously? That's quite a bundle of tasks. Enquiring minds want to know more!

Thursday 11 June 2009

PerformancePoint Planning: Bug in the bsp_DI_ValidateLabelTable Proc

Somewhat irrelevant due to the shutdown of PerformancePoint Planning, but it's an issue I encountered so I will record it for posterity and future generations (*snigger*).


There is a bug in the bsp_DI_ValidateLabelTable proc which causes it to fail when you are using a specific hierarchy in the model rather than the default 'all members' set. The problem arises when you pass in an incorrect model name.

The proc does not validate that the model name is actually valid before continuing (naughty naughty), which means you can actually cause errors if it just so happens that the value you pass in is a valid object - just not the name of the model itself. Let me show you what I mean.

First let's just mock up some object labels (please note these are NOT real. I would never be so generic but for the purpose of examination...).
Application: PPS
RootModelSite: PPS_Planning
Model: PPS
When calling PPSCMD you often refer to PPS:PPS_Planning.
When executing the bsp_DI_ValidateLabelTable proc, it asks for modelname as a parameter. In my example, I will have a scope in FK_relationships for PPS:PPS_Planning to identify the modelsite but that's not the one I need. Nope. I need PPS_Planning:PPS, which identifies the model.

But I'm half asleep and I've typed PPS:PPS_Planning so many times in the past 24 hours that I just don't think. Uh oh, all of a sudden the proc fails - though not before having made a few updates already.

Silly mistake to make, no question. Still, it does seem a bit dodgy to take a string input from a user, which is also a unique label used to identify a specific record, and not validate that it actually exists before making updates using that value!
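
For illustration, this is the sort of up-front check that's missing. A minimal sketch only - dbo.ModelLabels is a hypothetical label table, not the real PerformancePoint schema:

-- Bail out before touching anything if the label passed in isn't a model.
IF NOT EXISTS (SELECT 1 FROM dbo.ModelLabels WHERE Label = @ModelName)
BEGIN
    RAISERROR('''%s'' is not a valid model name. No changes made.', 16, 1, @ModelName);
    RETURN;
END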

SSIS: Write to the Output Window from a Script Task

Want to write to the Output window during execution of a script task?

'informationCode, subComponent, description, helpFile, helpContext, fireAgain
Dts.Events.FireInformation(1, "YOUR TITLE", "Your Message", "", 0, False)
Really handy for debugging!

SSIS: Error when executing a data flow using the OLEDB SSAS 9.0 Provider

I found an interesting problem today.


I had an SSIS package which contained, amongst other things, a data flow with 3 OLEDB sources that later get merged into one for insert into the destination. Each source points to the same OLEDB connection, which uses the Analysis Services 9.0 provider to connect to a mining structure, and each uses the SQL Command access mode with a simple DMX statement as the command text.

When previewing the results of each query, all worked fine and the results were as expected. But come execution time on the server, I kept receiving the most frustratingly vague error.
"Error code = 0x80040E05, External Code = 0x00000000:."
Mmmm. Yes... I know it well...

So I presumed it was somehow a permissions issue as the previews under my profile all looked fine. Only, the perms were the same. I made sure the connection property "RetainSameConnection" was set to FALSE. No difference.

So then I deleted 2 of the data flow sources and tried again. This time success. Somehow the connections were locking each other out. I set the RetainSameConnection to TRUE. This time even the previews failed.

So my workaround? I used a ForEach Loop to do them one at a time using the "SQL Command From Variable" data access mode on the data flow source.

Problem solved!


Wednesday 10 June 2009

SQL SSIS: Vote for data viewer suggestion

Vote for a feature request on Microsoft Connect for greater flexibility over when and where an SSIS data viewer is shown.

https://connect.microsoft.com/SQLServer/feedback/ViewFeedback.aspx?FeedbackID=465835

Tuesday 9 June 2009

SQL: Retrieving the Metadata for a Query

Sometimes you need to know what the metadata for a query looks like before runtime. This is particularly relevant with SSIS.


I use it rarely but every time I do have a need for it, I can never remember the name of the option and it takes me an hour at least to track it down again. So I'm hereby cementing it in stone so I don't forget it again!

SET FMTONLY ON

That's the one. It returns just the column names and data types for the query. Note: it does not actually run the query or return any rows - it's purely there to expose the query's metadata.
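
A quick sketch of the usage (dbo.Orders is an invented table):

SET FMTONLY ON;
-- The SELECT below is not executed; only its column names and types come back.
SELECT OrderId, CustomerId, OrderDate
FROM dbo.Orders
WHERE OrderDate >= '20090101';
SET FMTONLY OFF;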

It's surprisingly handy from time to time.

Wednesday 3 June 2009

SharePoint Magazine - Part 3 Out Now!

My next article for SharePoint Magazine is available now.

http://sharepointmagazine.net/technical/administration/kpis-–-who-am-i-and-where-do-i-come-from-part-3-of-6

SQL 2005: Scalar Resultsets and SSIS Variables

Found an interesting SSIS tidbit today which I've amazingly never encountered before.  Not quite sure how I've managed to avoid it until now - it seems surprising to me, but there you go, I guess it happens from time to time.

In an Execute SQL Task, I was returning a scalar value to a variable.  The value returned was a bigint.  My SSIS variable was an Int64.  The step failed with an invalid type error.  What the?  I changed the variable type to Object.  Success this time, but not much help to me further down the control flow because I can't read it without a ForEach - which I really don't wanna do for a scalar value.  I mean, that would just be silly.

According to Technet, "With an OLE DB or Excel connection manager, the connection manager explicitly converts values of the following types, DBTYPE_NUMERIC, DBTYPE_GUID, and DBTYPE_BYTES, to strings".  Seems crazy to me!  I changed the SSIS variable to a String and what do you know, success.
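
Alternatively - and this is just a sketch of the idea, with invented names - you can sidestep the guessing game by doing the conversion explicitly in the query, then mapping the result to a String variable:

-- Cast the scalar explicitly so the provider hands back a string;
-- map this single-row result to an SSIS variable of type String.
-- dbo.SalesOrder and SalesOrderId are hypothetical.
SELECT CAST(MAX(SalesOrderId) AS varchar(20)) AS MaxSalesOrderId
FROM dbo.SalesOrder;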

Nuts!

Friday 24 April 2009

SQL 2008 SP1 - a must have!

Some time ago I installed SQL 2008 on a VPC which was also running MOSS and PerformancePoint.  Soon enough the VPC started to run out of memory.  I tore my hair out for a few days because nothing I did made a good dent in the problem.  


Till I found the silver bullet.

Consistently, SSAS was high on the list of memory-hungry services - plateauing around the 500MB mark.  Yeah, that's a lot.  With only a couple of cubes and no queries running, it did seem hugely excessive.

What to do?  Install SP1.

That's it.

Usage dropped down to somewhere around 30-80MB.  Much more memory friendly!

Interestingly, of the many bug fixes included in SP1 (full list), a memory leak in SSAS is not one of them!

Thursday 9 April 2009

TSQL: Truncating Datetime

It crops up all the time.  You have a datetime column.  In your select, you don't care about the time, you only care about the date because you want to group all records for a particular date, for example.  So what's the quickest way to go about it?


Well let's start with the options...
  1. CONVERT
  2. FLOOR
  3. DATEADD/DATEDIFF
If you are using SQL 2008, you could add a fourth option and CAST to DATE.  Perfect if the recipient of the data will accept it.  I'm going to assume however, as is most likely at this point in time, that you aren't using SQL 2008 yet, as that does still seem to be the case in most organisations.  I'm also going to assume that you need to return a datetime.
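
For completeness though, the SQL 2008 form is trivial:

-- SQL 2008 only: DATE has no time portion at all
SELECT CAST(GETDATE() AS DATE);
-- and if the consumer insists on getting a datetime back:
SELECT CAST(CAST(GETDATE() AS DATE) AS DATETIME);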

So, to option 1: CONVERT.

A popular choice because it's easily readable.  Let me give you an example.

CAST((CONVERT(varchar(11),getdate(),120)) as DATETIME)
This will return 2009-04-09 00:00:00.000

Option 2:  FLOOR

CAST(FLOOR(CAST(Getdate() as FLOAT)) as DATETIME)
Returns 2009-04-09 00:00:00.000

Obviously faster than convert because we aren't working with any strings.

Option 3: DATEADD

DATEADD(dd, DATEDIFF(dd, 0, getdate()), 0)
Returns 2009-04-09 00:00:00.000 (shock horror!)

Guess what?  This version is even faster again because we don't have any conversion to do at all.

To what degree are these options faster?

Well, a quick test by plonking a million records into a temp table which contains only a datetime column showed the execution times to be fairly similar, but the CPU times got progressively quicker down the list.

CONVERT - 1591
FLOOR - 434
DATEADD - 382
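
For the curious, a rough sketch of that kind of harness - the row generator below is illustrative, not my exact test:

-- Plonk a million datetimes into a temp table...
SELECT TOP (1000000)
    DATEADD(ss, CAST(ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS int), '20090101') AS dt
INTO #t
FROM sys.all_objects a CROSS JOIN sys.all_objects b;

-- ...then compare CPU times for the three options.
SET STATISTICS TIME ON;
SELECT CAST(CONVERT(varchar(11), dt, 120) AS DATETIME) FROM #t; -- CONVERT
SELECT CAST(FLOOR(CAST(dt AS FLOAT)) AS DATETIME) FROM #t;      -- FLOOR
SELECT DATEADD(dd, DATEDIFF(dd, 0, dt), 0) FROM #t;             -- DATEADD
SET STATISTICS TIME OFF;

DROP TABLE #t;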

The CONVERT is horribly slow and, readability aside, shouldn't really be considered.  Clearly there's not a vast difference between FLOOR and DATEADD, but I like the DATEADD method for its lack of conversion as well as its performance gain, however small.  It just seems cleaner to me.

Friday 6 February 2009

SQL 2005 SSIS: Unpivot Transformation - Destination Column

Came across an interesting little bug today in SSIS...

When using the Unpivot transformation, if you attempt to see the results of the Destination Column in a data viewer, well, you can't... the column just isn't there. However, when the output is sent to an actual data destination, the column is created as expected. You just have no way of knowing that until you look at the destination itself.

Wasted quite a bit of time on that one!

Thursday 29 January 2009

No more PPS Planning...

The news, as I'm sure everyone has now heard, is that PerformancePoint Planning is being retired. See the official press release at:

http://www.microsoft.com/presspass/features/2009/jan09/01-27KurtDelbeneQA.mspx

PerformancePoint Monitoring will be bundled into SharePoint as part of Office 14 - and be known as PerformancePoint Services for SharePoint. Wacko.

Friday 23 January 2009

PerformancePoint Planning: Data Integration

Quick Overview of the steps required in an ETL package to load a PerformancePoint Planning application:

  1. Sync Dimensions from App to Staging (call ppscmd)
  2. Sync Models from App to Staging (call ppscmd)
  3. Stage Dimensions into Label table (data flow)
  4. Load Dimensions (call ppscmd)
  5. Stage Hierarchies into Label table (data flow)
  6. Load Dimensions (call ppscmd)
  7. Reindex Dimension tables (sql command)
  8. Stage Fact into Label table (data flow)
  9. Load Fact (call ppscmd)
  10. Reindex MG tables (sql command; see the sketch after this list)
  11. Deploy Model (call ppscmd)
This presumes you are using the 'Label' table approach.
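
The two reindex steps (7 and 10) are just plain T-SQL run as a sql command. A hedged sketch of what they might look like - these staging table names are made up, not the real PerformancePoint naming convention:

-- Rebuild all indexes on the dimension and measure-group staging tables
-- (dbo.D_Customer and dbo.MG_Budget are hypothetical names).
ALTER INDEX ALL ON dbo.D_Customer REBUILD;
ALTER INDEX ALL ON dbo.MG_Budget REBUILD;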