INR number to words with ARRAYFORMULA in Google Sheets


Amit Agarwal has put together a great little script for Google Sheets that turns INR (Indian Rupees) amounts into words, the kind of thing you'd need when writing a cheque.

You can find the original code here: INR Amount to Words

However, in its current state it won't work if you want to use it in an ARRAYFORMULA. Only a small change is needed, and the code below shows how.

We just need to check whether the input is an array, and then either process each row in the array or process the single incoming value as before.

function INR(input) {
  // A range passed from ARRAYFORMULA arrives as a 2D array of cells,
  // so process each cell; otherwise fall back to the original function.
  if (input.map) {
    return input.map(function (row) {
      return row.map(function (cell) {
        return INR_(cell);
      });
    });
  }
  return INR_(input);
}

function INR_(input) {........


As you can see, I've renamed the original function to INR_ so that we can add the new functionality without breaking the original.

The same logic can be used to turn any custom function into one that works in an ARRAYFORMULA. Because if you're still copying VLOOKUPs all the way to the bottom of your sheet, you're wasting your time!
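As a sketch of that general pattern, here's the same wrapper applied to a hypothetical single-value function (DOUBLE and DOUBLE_ are stand-ins of my own, not part of Amit's script):

```javascript
// The original single-value custom function (a stand-in for INR_).
function DOUBLE_(value) {
  return value * 2;
}

// The array-aware wrapper. A range passed from ARRAYFORMULA arrives
// as a 2D array (rows of cells); a single cell arrives as a plain value.
function DOUBLE(input) {
  if (Array.isArray(input)) {
    return input.map(function (row) {
      return row.map(function (cell) {
        return DOUBLE_(cell);
      });
    });
  }
  return DOUBLE_(input);
}
```

With this in place, =ARRAYFORMULA(DOUBLE(A2:A)) works the same as dragging =DOUBLE(A2) down the column.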


Tableau + Google Sheets – Line chart with missing dates


Recently I was tasked with creating a line chart showing the movement of animals in an animal sanctuary over time. The data came in the form of survey counts taken each day for each area of the sanctuary, like this:

Area Date Count
Kennel 1 03/01/2019 100
Kennel 1 04/01/2019 102
Kennel 1 05/01/2019 101
Kennel 1 08/01/2019 120
Kennel 1 09/01/2019 121

You'll notice that there's a gap in the data between the 5th of January and the 8th of January. That's not a big problem when we're looking at a single area, as Tableau will auto-fill our missing dates:


It's pretty obvious we're missing a few days' worth of data, and that's fine here; the line rightfully continues across the gap. But what happens when one set of records has every date recorded and another doesn't? We take a big hit on the days where data is missing:


How can we solve this problem? There are a bunch of different scenarios:

  • Areas that have all days
  • Areas that have all days up to a certain point x days ago
  • Areas that are missing some dates up to the current day
  • Areas that are missing some dates and the last record is y days ago

As you can see our data set becomes rapidly complex. Not particularly difficult to solve with SQL Server and a Tally table, but what happens when we’re restricted to the use of a Google Sheet that’s driven from a Google Form?

First create a calendar table with all of the dates in our data set:


Then we need to deal with any islands. Each record in our data set can be sorted by Area Name and Date, which lets us find the date of the record previous to this one. We do this by sorting our data by Area Name ascending and Date descending:

=sort('Form responses 1'!C2:K,1,true,2,false)

Then we need to add an ARRAYFORMULA to figure out the previous date:


This needs to go in the first row, runs for every row in the column, and simply does the following:

If the Area Name for the current record matches the Area Name from the previous date’s row (this is why we sorted descending), we use the previous date, else we use the current row’s date. So we get output that looks like this:

Area Date Count Previous Date
Kennel 1 09/01/2019 121 08/01/2019
Kennel 1 08/01/2019 120 05/01/2019
Kennel 1 05/01/2019 101 04/01/2019
Kennel 1 04/01/2019 102 03/01/2019
Kennel 1 03/01/2019 100 03/01/2019
Kennel 2 09/01/2019 76 08/01/2019
Kennel 2 08/01/2019 79 07/01/2019
Kennel 2 07/01/2019 72 06/01/2019
Kennel 2 06/01/2019 74 05/01/2019
Kennel 2 05/01/2019 74 04/01/2019
Kennel 2 04/01/2019 76 03/01/2019
Kennel 2 03/01/2019 76 03/01/2019
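Outside of Sheets, the previous-date logic above is easy to sketch in plain JavaScript. The rows mirror the Kennel 1 slice of the table, and the field names are my own:

```javascript
// Rows pre-sorted by area ascending, then date descending, as the
// SORT formula above produces (this is the Kennel 1 slice).
const rows = [
  { area: 'Kennel 1', date: '2019-01-09', count: 121 },
  { area: 'Kennel 1', date: '2019-01-08', count: 120 },
  { area: 'Kennel 1', date: '2019-01-05', count: 101 },
  { area: 'Kennel 1', date: '2019-01-04', count: 102 },
  { area: 'Kennel 1', date: '2019-01-03', count: 100 },
];

// The "previous date" is the next row's date if it belongs to the
// same area (remember: sorted descending), else the row's own date.
const withPrev = rows.map(function (row, i) {
  const next = rows[i + 1];
  const prevDate = next && next.area === row.area ? next.date : row.date;
  return { area: row.area, date: row.date, count: row.count, prevDate: prevDate };
});
```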

Now we have a date range that can be used to join to our calendar table in Tableau. We join to our calendar table (in which we have every single date) where the calendar date is after the previous date and less than or equal to the date of the record, like below:


We now get a result that looks like this:


As you can see the Calendar Date fills in the gaps.

Dealing with data that's not up to date uses the exact same technique; we just need to reverse the dates. The Previous Date becomes the date of the record, and the Date becomes TODAY()-1, i.e. yesterday. There are a couple of steps to get this done in Sheets:

1: Get all of the records where the last record we have is before yesterday:

=query(query('Form responses 1'!C:D,"select C,max(D) where C <> '' group by C"),"where Col2 < date '"&TEXT(DATEVALUE(today()-1),"yyyy-mm-dd")&"'")
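In JavaScript terms, what this nested QUERY computes is roughly the following (field names are my own):

```javascript
// The inner QUERY: latest survey date per area.
// The outer QUERY: keep only the areas whose latest date is before yesterday.
const records = [
  { area: 'Kennel 1', date: '2019-01-05' },
  { area: 'Kennel 1', date: '2019-01-09' },
  { area: 'Kennel 2', date: '2019-01-03' },
];

function staleAreas(records, yesterday) {
  const maxByArea = {};
  for (const r of records) {
    // ISO yyyy-mm-dd strings compare correctly as plain strings
    if (!maxByArea[r.area] || r.date > maxByArea[r.area]) {
      maxByArea[r.area] = r.date;
    }
  }
  return Object.keys(maxByArea)
    .filter(function (area) { return maxByArea[area] < yesterday; })
    .map(function (area) { return { area: area, maxDate: maxByArea[area] }; });
}
```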

So here we're making a sub-QUERY to get the max date for each Area Name, and then an outer QUERY to keep just the records where that date is before TODAY()-1, i.e. yesterday. (In SQL Server this would be easy with a HAVING clause.) We then need to get the count value for each of these records:

={"Heading";ArrayFormula(iferror(vlookup(A2:A&B2:B, {'Form responses 1'!C2:C&'Form responses 1'!D2:D, 'Form responses 1'!C2:L}, 9, 0 ), 0))}
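The concatenated-key VLOOKUP can likewise be sketched as a plain lookup map (field names are my own; the 0 fallback mirrors the IFERROR):

```javascript
// Build a map keyed on Area Name + Date (the A2:A&B2:B trick), then
// look up the count for each stale record.
const responses = [
  { area: 'Kennel 2', date: '2019-01-03', count: 76 },
  { area: 'Kennel 2', date: '2019-01-04', count: 74 },
];

const countByKey = {};
responses.forEach(function (r) {
  countByKey[r.area + r.date] = r.count;
});

function lookupCount(area, date) {
  const hit = countByKey[area + date];
  return hit === undefined ? 0 : hit; // the IFERROR(..., 0) equivalent
}
```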

I understand it’s difficult without an example, so when I get a chance I’ll put one together and provide a link. If you need one in the meantime, leave a comment and I’ll put something together.

The result is a line chart that has the latest count for each day and can show what would have been the count on any particular day chosen.



Embedded Google Maps not working with Google App Engine


This one was actually pretty easy to solve, but was interesting anyway.

Developing locally I was fine using HTTP requests for my script files, as my local server has no issues with either HTTP or HTTPS. App Engine, however, has other ideas. In my scripts file I was including the below reference:

This worked no problem on my machine, but when I published to Google App Engine I saw the below:


Once I changed the script to use HTTPS:
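For reference, the change amounts to swapping the protocol on the script include. This is a generic Maps JavaScript API tag with a placeholder key, not my exact line:

```html
<!-- Before: fails on App Engine -->
<script src="http://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY"></script>

<!-- After: works -->
<script src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY"></script>
```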

I also needed to add my new site to the list of allowed referrers, by going to the Google Cloud Dashboard > APIs and Services > selecting my project from the dropdown > clicking Maps JavaScript API > Credentials > editing the credential and adding a record for my site.

Then I was presented with what I wanted to see:


GCloud SQL – Got an error reading communication packets


I’m writing an app at the moment and to enable users to test the beta version I’ve decided to host the whole shebang on GCloud, Google’s cloud offering.

The whole process is really straightforward to set up; however, there doesn't seem to be a lot of support from the blogging community yet.

My app is written in Node.js with Express and a MySQL back end (to make things easier when it comes to reading the data out with Tableau for my reporting developer who won’t be able to understand NoSQL structures).

A couple of things I’ve noticed:

ALL of your require statements need to be the same case as the names of the physical files. So when your app is looking for routes\admin, your folder best be called routes and your file best be called admin, because if they’re called Routes\Admin (note the upper case first letters) your app hosted in Google’s App Engine won’t start. The same goes for your database schema name and all of your stored procedures. So best to use STRICT mode when you’re developing.

The specific error this manifested itself as in Google SQL was:


I’m using PaperChase for logging and saw the real error there:


Because my database is called AAU not aau, I was getting access denied errors. These should probably be "can't find routine" errors, but in any case I was able to figure out the problem.

Hopefully it helps you too.

SQL Server – Monitoring a Rollback


I recently bumped into an issue where we needed to kill a large insert, which obviously caused SQL Server to perform a rollback. I needed to find out what was going on, so I looked to Pinal Dave's blog post about finding out which queries are running, which you can find here; it uses the sys.dm_exec_requests and sys.dm_exec_sql_text DMVs.

The below script adds a few little things to tell you, for a given SPID, what SQL text is being executed and, most importantly, the expected finish time.

--Set how long you want the script to run for in HH:MM:SS format.
--The longer you leave it running the more accurate it will be
DECLARE @Delay VARCHAR(20)	= '00:00:10';

--Add your SPID, which you can get from sp_who2
DECLARE @SPID INT = 206;

DECLARE @StartPercent DECIMAL(18,9), @Statement VARCHAR(MAX), @EndPercent DECIMAL(18,9), @DelaySeconds INT, @PercentToGo DECIMAL(18,9), @Increase DECIMAL(18,9);

--Set the starting percentage complete
SELECT @StartPercent = req.percent_complete, @Statement = sqltext.TEXT
FROM sys.dm_exec_requests req
CROSS APPLY sys.dm_exec_sql_text(sql_handle) AS sqltext
WHERE session_id = @SPID

--Wait for the specified delay time
WAITFOR DELAY @Delay;
--Get the end percentage complete
SELECT @EndPercent = req.percent_complete
FROM sys.dm_exec_requests req
WHERE session_id = @SPID

SET @PercentToGo = 100 - @EndPercent;
SET @Increase = @EndPercent - @StartPercent;
SET @DelaySeconds = DATEDIFF(SECOND, 0, @Delay);

--Return the output including the expected finish time
SELECT @SPID [SPID], @Statement [SQLText], @DelaySeconds [DelaySeconds], @StartPercent [StartPercent], @EndPercent [EndPercent], @Increase [Increase], DATEADD(SECOND, (@PercentToGo / NULLIF(@Increase, 0)) * @DelaySeconds, GETDATE()) [ExpectedFinishTime]
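To sanity-check the arithmetic, the script extrapolates the observed rate of progress linearly. A quick sketch (my own helper, not part of the script above):

```javascript
// If the rollback moved from startPercent to endPercent over
// delaySeconds, the time left is (remaining %) / (observed increase %)
// multiplied by the delay window.
function secondsRemaining(startPercent, endPercent, delaySeconds) {
  const increase = endPercent - startPercent;
  return ((100 - endPercent) / increase) * delaySeconds;
}
// e.g. 40% -> 45% in 10 seconds leaves 55 / 5 * 10 = 110 seconds
```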


You can use the same query to monitor any operation that populates percent_complete in sys.dm_exec_requests (source); per the Microsoft documentation these include:

  • ALTER INDEX REORGANIZE
  • AUTO_SHRINK
  • BACKUP DATABASE
  • DBCC CHECKDB / CHECKFILEGROUP / CHECKTABLE
  • DBCC INDEXDEFRAG
  • DBCC SHRINKDATABASE / SHRINKFILE
  • KILL WITH STATUSONLY
  • RESTORE DATABASE
  • ROLLBACK
  • TDE ENCRYPTION


SSAS – Negative total value


We encountered a strange issue when connecting to analysis services through Excel where all of the row level sums were correct, however the total value was a negative, as per below:


As you can see the row level sums are all positives, but the grand total is a negative, why?

The underlying data is all positive, but the datatype in the database is a smallint, and there are 6 billion rows in the table.

The issue is that when SSAS sums those 6 billion smallint values, the total ends up overflowing the INT limit (2,147,483,647) and wrapping around, which Excel displays as a negative.
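You can see the wrap-around with a quick sketch: forcing a value past the 32-bit limit back into a signed 32-bit integer flips it negative.

```javascript
// 2,147,483,647 is the largest signed 32-bit integer. Adding 1 and
// forcing the result back into 32 bits (via |0) wraps to the minimum.
const INT_MAX = 2147483647;
const wrapped = (INT_MAX + 1) | 0;
console.log(wrapped); // -2147483648
```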

The measure group will inherit its data type from the underlying view and attempt to use that:


To fix this issue you need to alter the data type in the Source and then either leave the DataType in Advanced as Inherit, or change it to BigInt:


Now when you deploy and process your cube, the issue should be resolved!