SSAS – Negative total value


We encountered a strange issue when connecting to Analysis Services through Excel: all of the row-level sums were correct, but the total value was negative, as per below:

[Image: IntBarrier3.PNG - Excel pivot with positive row-level sums but a negative grand total]

As you can see, the row-level sums are all positive, but the grand total is negative. Why?

The underlying data is all positive, and the column's data type in the database is a smallint, but there are 6 billion rows in the table.

The issue is that when SSAS sums those 6 billion smallint values, the total breaks the INT barrier (2,147,483,647) and wraps around to a negative number, which is what Excel displays.
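
To see the wraparound on its own, here's a minimal C# sketch (purely illustrative, nothing to do with the cube itself) that sums small positive values into a 32-bit accumulator and into a 64-bit one; the row count and the value of 3 are made up for the demo:

// Illustrative only: a 32-bit total wraps past the INT barrier and goes negative,
// while a 64-bit (BigInt-sized) total does not. Row count and values are made up.
using System;

class IntBarrierDemo
{
    static void Main()
    {
        int intTotal = 0;       // behaves like a measure left as Integer
        long bigIntTotal = 0;   // behaves like a measure changed to BigInt

        // 1 billion rows of smallint value 3 sums to 3,000,000,000,
        // which is past the INT barrier of 2,147,483,647
        for (long row = 0; row < 1000000000; row++)
        {
            short value = 3;
            intTotal = unchecked(intTotal + value);  // wraps around instead of throwing
            bigIntTotal += value;
        }

        Console.WriteLine(intTotal);     // prints -1294967296, the wrapped negative total
        Console.WriteLine(bigIntTotal);  // prints 3000000000, the correct total
    }
}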

The measure group will inherit its data type from the underlying view and attempt to use that:

[Image: IntBarrier1.PNG - the measure's Source data type inherited from the underlying view]

To fix this issue, alter the data type in the Source, and then either leave the DataType under Advanced as Inherit or change it to BigInt:

[Image: IntBarrier.PNG - Source data type changed, with the Advanced DataType left as Inherit or set to BigInt]

Now when you deploy and process your cube, the issue should be resolved!
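
If you prefer scripting to clicking through SSDT, the same change can be made against the deployed cube with AMO. This is only a rough sketch under assumptions: the connection string and the database, cube, measure group and measure names below are placeholders, and you'd still want to make the matching change in the SSDT project so a later deploy doesn't put the old data type back:

// Hedged AMO sketch (Microsoft.AnalysisServices assembly): switch a measure's
// source data type to BigInt on the deployed cube. All names are placeholders.
using System.Data.OleDb;
using Microsoft.AnalysisServices;

class FixMeasureDataType
{
    static void Main()
    {
        Server server = new Server();
        server.Connect("Data Source=localhost");   // placeholder connection string

        Database db = server.Databases.FindByName("MyOlapDb");             // placeholder
        Cube cube = db.Cubes.FindByName("MyCube");                         // placeholder
        MeasureGroup mg = cube.MeasureGroups.FindByName("MyMeasureGroup"); // placeholder
        Measure measure = mg.Measures.FindByName("My Measure");            // placeholder

        measure.Source.DataType = OleDbType.BigInt;  // was inheriting SmallInt/Integer

        mg.Update(UpdateOptions.ExpandFull);         // push the change to the server
        mg.Process(ProcessType.ProcessFull);         // reprocess so totals are recalculated

        server.Disconnect();
    }
}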

SSMS – Auto Save is finally here!!


If you haven't already upgraded to SSMS 2016, you should do so immediately. The guys at Microsoft have decoupled SSMS from the SQL Server release cycle, and one of the best features by far is the auto-save/recover functionality.

Remember all those times you were putting the finishing touches on the best code you’ve ever written and SSMS crashed? Well worry no more. I did post previously about how SSMSBoost can help you to recover recent sessions, but now SSMS does it out of the box.

When SSMS crashes, you'll be asked to recover recent work and will see something like the below:

[Image: AutoSave.PNG - the SSMS recovered-files prompt shown after a crash]

All you need to do is connect them back up and voilà, you're away!

Of course you could always (and you should always) save your work as soon as you create a new window, but who am I to tell you how to live your life….

SSIS – Unicode data is odd byte size for column 4. Should be even byte size.


This one was interesting, and if it wasn't for a quite obscure article on page 3 of a Google search (who even goes to page 3 anymore?) it would have taken me a lot longer to figure out the problem.

Basically we’re hitting a Kafka queue in SSIS to get and process messages as they appear. There’s some info here on how to connect SSIS to Kafka.

The messages that are coming down from the queue are in UTF8 format and I wanted to store them in an NVARCHAR(MAX) column, in case we ever need to process multilingual messages.

As the messages come down, I'm decoding them into a string using UTF8, which as per the below isn't Unicode:

string text = Encoding.UTF8.GetString(msg.Payload, 0, msg.Payload.Length);

And then when we push them to the output buffer, we're encoding them with UTF8 again, still not Unicode:

JSonOutputBuffer.AddRow();

JSonOutputBuffer.JSonRaw.AddBlobData(System.Text.Encoding.UTF8.GetBytes(text));

The problem came when I tried to push the field into an NVARCHAR(MAX) column in the database, which IS Unicode. So we end up getting an error when attempting to push the value from the buffer into the table:

“Unicode data is odd byte size for column 4. Should be even byte size.”

It was an easy fix: just use Unicode as the encoding when you're pushing the value to the output buffer.

JSonOutputBuffer.AddRow();

JSonOutputBuffer.JSonRaw.AddBlobData(System.Text.Encoding.Unicode.GetBytes(text));

This is caused by SQL Server expecting an even number of bytes in an NVARCHAR column as per the below:

“Notice that compressed Unicode strings are always an odd number of bytes. This is how SQL Server determines that the string has actually been compressed, because an uncompressed Unicode string—which needs 2 bytes for each character—will always be an even number of bytes” Source (you'll need to search for 'odd' to find this section).
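
To see the difference in byte counts for yourself, here's a tiny console sketch (not the SSIS script itself) comparing the two encodings for a made-up three-character payload:

// Tiny sketch: UTF8 can produce an odd number of bytes, while Unicode (UTF-16)
// always produces 2 bytes per character, which is what NVARCHAR expects.
using System;
using System.Text;

class ByteSizeDemo
{
    static void Main()
    {
        string text = "abc";   // made-up message payload

        byte[] utf8Bytes = Encoding.UTF8.GetBytes(text);        // 1 byte per ASCII character
        byte[] unicodeBytes = Encoding.Unicode.GetBytes(text);  // 2 bytes per character

        Console.WriteLine(utf8Bytes.Length);     // 3, an odd byte size, triggers the error
        Console.WriteLine(unicodeBytes.Length);  // 6, an even byte size, what NVARCHAR wants
    }
}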

If nothing else this was an interesting error that I couldn't find any more info on! So hopefully this helps someone.