T-SQL Tuesday #22: Data Presentation

[Image: T-SQL Tuesday logo. Robert Pearl hosts No. 22]

It seems like it hasn’t been that long since last month’s T-SQL Tuesday post; I suppose time flies when you’re having fun and trying to finish up the same ETL project you’ve been working on since March.

This month’s SQL blog party is being hosted by Robert Pearl (blog | @PearlKnows), on the topic of “Data Presentation.” This is a good topic for me at this point, as I’ve all but finished my transition from DBA to BI Monkey (that’s something else I need to write about…). I think Robert is looking for specific examples of ways to present data, but since, as usual, I don’t have anything specific that I can actually publish, I’m left to speak generally about the topic.

Data Presentation: Just as Important as the Data Itself

In a previous life, I was responsible for almost everything data-related for the systems that we ran. As a result, I would get a lot of requests for data. One of my favorite requests would come in the form of, “can you give me some numbers for <X system>?” I would try to keep my response at least marginally non-snarky, but it would generally include two questions:

  1. What exact “numbers” do you want? (this is especially where I would have snark problems)
  2. What do you want the data to look like?

Of course, the first one is an important question: if the requestor cannot articulate what it is they actually want (or even what question they’re trying to answer), little else is going to matter. I’ll not dwell on this particular item too much, but suffice it to say, sometimes getting a good answer to this seemingly easy question is anything but. I’ve basically come to the conclusion that this is normal.

Once over that hurdle, the conversation can move on to the presentation of whatever data/”numbers” it is the requestor wants. There are almost as many options for presenting data as there are ways to write the T-SQL to retrieve it. Just as care should be taken to write SQL that is performance- and resource-conscious, care should be taken with the presentation design. It is imperative that the data be presented in a way that is understandable and digestible by its intended audience.

Notice I didn’t say “digestible by the party asking for it.” Don’t forget that the request originator may not be the party who is ultimately going to be parsing the provided data. If the audience is not clear in the original request, add a third question to the two that I have listed above: “Who is going to be acting on this data?”

Options for What Happens Next

When the “What do you want it to look like” question is asked, chances are decent that you already have an idea of what the answer is going to be. If this is a one-off, ad-hoc request, Excel is a popular option. Alternatively, if a robust reporting system is in place, or this request will be a recurring one, developing a report to present the data might be a stronger choice. There are, of course, other options: the data could be destined for a statistical analysis application, where a CSV file would be more suitable. I would consider this an outlier, though; most of the time, data is prepared for direct human consumption.

Excel is such a popular option that you could almost call it Data’s Universal Distribution Engine (DUDE). Sending data over in Excel is less about the “make it pretty” side of good presentation than it is about the “make it useful” side. I’ve found that Excel is chosen a lot of the time because the requestor wants to do more manipulation of the data once they get it. I’ll leave whether or not that is a good thing to the side; the truth is, such activity happens all the time. As a result, when preparing data for an Excel sheet, I like to have an idea of what the user is going to do with it. This sometimes helps to determine what data the user is looking for (if they don’t have a clear idea), but it can also help with some formatting or “extras” to include. These “extras” could take the form of running subtotals, year-over-year percent changes, or anything else that is easier to add via SQL instead of someone having to putz around in Excel.
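To make the “extras” idea concrete, here’s a rough sketch of the year-over-year piece. The dbo.MonthlySales table and its columns are made up for illustration; the point is that a simple self-join adds the percent change for the user instead of them building it in Excel:

SELECT
    cur.SalesYear,
    cur.SalesMonth,
    cur.TotalSales,
    -- change versus the same month last year, as a percentage
    (cur.TotalSales - prior.TotalSales) * 100.0 / NULLIF(prior.TotalSales, 0) AS YoYPctChange
FROM dbo.MonthlySales AS cur
LEFT JOIN dbo.MonthlySales AS prior
    ON prior.SalesYear = cur.SalesYear - 1
   AND prior.SalesMonth = cur.SalesMonth
ORDER BY cur.SalesYear, cur.SalesMonth;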

Writing a report to present data has a different set of opportunities than pasting data into Excel. One of the things that I like to see in a solid reporting environment is a set of standards that apply to the reports themselves: things like common header contents (report name, date/time stamp, name of the data source/DB the data came from, etc.), standard text formatting, and a common set of descriptors. In addition to making individual reports easier to read & feel more familiar, it can make it easier to compare data between reports the hard way (one on each monitor), if one has to.

[Image: the pie chart referenced below. Caption: It's only worth 1,000 words if the first ones that come to mind are work safe]

One thing each of these two tools gives you is the ability to present data in the form of pretty pictures. There’s a time and a place for everything, but the old cliché, “a picture is worth a thousand words” can/does apply. Sometimes it’s just flat-out hard to beat a good trendline. I have a much easier time seeing even the simplest of trends when data’s plotted out in a histogram. Conversely, one of my coworkers can look at a pile of numbers, not even sorted chronologically, and tell you what is going on in about three seconds.

Knowing where to put your effort goes back to knowing who your intended audience is. Likewise, knowing when to say “no” to visualization is a terribly useful skill. Every data element on the chart should be discernible, or else it doesn’t convey the information it is supposed to, and the visualization starts working against itself. The pie chart to the right? Don’t do that.

Summary

That’s about all I’ve got. In short: Presentation is important. Unfortunately, it can also be complicated. It’s important to ask questions early on in the process and to know your audience. Standardize if you can; help out a little with the complicated work if it can be done in SQL. Also, add visual representations without going overboard. I’ve always found turning “data” into “information” for people to be fun; if it can make someone else’s job easier/more fun, too, then all for the better.

Excel’s CONCATENATE Function

If you’re not already familiar with this function, I may be about to become your hero. Or, you’ll keep using the Import/Export wizard, some T-SQL, or straight-up SSIS to do this sort of thing and this post won’t matter. Either way, this was new to me a while back, and its simplicity and usefulness (at least in our environment) blew my mind at first.

The CONCATENATE function in Excel itself is pretty straightforward; it’s a one-trick pony that does exactly what you think it does: concatenates strings together. In fact, Excel’s description of the function isn’t that much more complicated: “Joins several text strings into one text string.”

Couple notes about CONCATENATE…

The first one has to do with its behavior. CONCATENATE will combine anything in Excel cells, even values that Excel classifies as “numbers” and that can be used in arithmetic-based functions. Translated, this means that if you have “12” in one cell and “78” in another, using CONCATENATE with those two cells will produce “1278.”

The second point is about its source data. Obviously this can be used to concatenate cell contents together into another cell, but strings can be put directly in the function definition, too. This means text can be inserted into the result directly via the function call—it doesn’t need to already be in the Sheet somewhere.
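A quick illustration of both points (the cells and values here are hypothetical): if A1 contains 12 and B1 contains 78, then:

=CONCATENATE(A1, B1)        returns 1278, not 90
=CONCATENATE("ID: ", A1)    returns ID: 12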

What’s it good for?

The primary use case for this function in my life is when I’m on a project that involves either adding new or updating existing data based on some Excel sheet of Business data that gets handed my way. I’ll use this to build UPDATEs or an INSERT statement to get this data into the DB quickly and easily.

I’m going to build a quick example using the Sales.SalesPerson table in AdventureWorks.

For the sake of argument, let’s say that everyone is getting their commission rate adjusted and it’s going to be easier to do it directly in the table instead of through the UI (this would be more plausible if the company were larger and had more than 17 rows in this table, but that’s my story and I’m sticking to it). Or, maybe there isn’t a UI, because “when or why would we ever need to change the commission rates? Don’t waste time on that interface.” But that never happens.

Anyway, the VP of Sales sends you the following Workbook that contains new commission percentages for all of the sales folk:

[Image: RawBusinessData. Screenshot of the workbook containing the new commission rates]

I’m making this somewhat easy by including the BusinessEntityID column, which is the Natural Key for this data. Having that key makes this exercise straightforward, and it’s usually possible to get the business to include this important piece of data as long as you know this activity is coming and can talk to them ahead of time.

When building statements with CONCATENATE, I usually start out in SSMS and manually type out what the first statement would be:

UPDATE sales.SalesPerson
SET CommissionPct = 0.021
WHERE BusinessEntityID = 274

Most of the time, I type out the Excel function call manually based on this initial statement, but it can be copied in as well, after stripping off some of its formatting (line breaks & tabs). The values for the SET & WHERE clauses are replaced by references to the cells that contain those particular pieces of data. Due to the way the CONCATENATE function works, the “pieces” to concatenate are separated by commas, and any plain text to include in the final result gets wrapped in double quotes. The end result should look something like this:

CONCATENATE("UPDATE sales.SalesPerson set CommissionPct = ", C2, " where BusinessEntityID = ",A2)

Put an equals sign at the beginning & paste that into Excel. Some magic happens and, bam, UPDATE statement:

[Image: First UPDATE. Screenshot of the formula's result in the Excel sheet]
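Rendered as text (assuming the first data row holds the same 0.021 and 274 used in the hand-written statement), the cell should contain:

UPDATE sales.SalesPerson set CommissionPct = 0.021 where BusinessEntityID = 274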

Copy that out and paste it in SSMS underneath the one written earlier & compare the two. If they match, the formula is good. The rest is simply a matter of filling the series down. All of the necessary UPDATE statements are done! Don’t skip checking those over to make sure nothing crazy is going on.
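One way to do that checking (a habit of mine, not part of the technique itself) is to paste the generated statements into a transaction and look at the results before committing. The commission rates after the first row here are made up:

BEGIN TRANSACTION;

UPDATE sales.SalesPerson SET CommissionPct = 0.021 WHERE BusinessEntityID = 274;
UPDATE sales.SalesPerson SET CommissionPct = 0.018 WHERE BusinessEntityID = 275;   -- made-up rate
-- ...the rest of the generated statements...

-- eyeball the new rates before making them permanent
SELECT BusinessEntityID, CommissionPct FROM sales.SalesPerson;

-- COMMIT;   -- or ROLLBACK if something looks off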

The same thing can be done with INSERT statements if a big block of new data is going into a table. Even if you prefer a single INSERT statement instead of discrete ones, CONCATENATE is still useful for building the SELECT ... UNION ALL rows that feed into that one INSERT:

CONCATENATE("SELECT '", B3, "', getdate(), ", A3, " UNION ALL")

Useful?

I think it is. Even if you usually use SSIS to do this type of work, I bet there are still times when using Excel to cook up some UPDATE statements in a hurry is both quicker and easier than writing out a package to do it, or maybe even than the Import wizard.

I use this on a regular basis and believe it’s one of the more useful non-BI or Expense Report things you can do with Excel 😉