SharePoint Integrated Mode Pesky Images Rendering in SSRS

April 2nd, 2012 / No Comments » / by admin

Recently I have been busy creating a series of dashboard reports for one of my clients. As the reports are to be integrated into a custom, in-house developed web portal, the report server needed to be installed in SharePoint Integrated Mode (SharePoint 2010). As the application backend is based on SQL Server 2008 R2 and the core functionality has been developed in .NET, SSRS was easy to integrate using the ReportViewer web part. The majority of the reports use stored procedures to fetch data from tables which already contain pre-aggregated data (between 300 and 1,200 rows and approximately 10 columns).

Most of the reports perform very well, with execution times ranging from 900 milliseconds to 2 seconds. The exception to this rule is a batch of reports which contain visually rich elements, e.g. charts and gauges embedded within a matrix. A quick, back-of-the-napkin calculation of the number of images to display (8 x 3 x 7 = 168), multiplied by two matrix objects, gives us 336 images to render each time the report is run.

Also, since the images are created dynamically by SSRS, they are given a unique identifier and are not cached – so the end result may be the same image being rendered and transferred multiple times. Given the fact that all of these are relatively small in size – under 1 KB – and that rendering takes advantage of pretty gutsy backend hardware, you can imagine how unimpressed I was when the report took 48 seconds to fully display. Also, the majority of these reports will be run by the client's customers over a WAN spread out all over the country, so there is an extra cost associated with traversing the Internet, not just our LAN. Below is a short screen capture footage displaying the report's unacceptable execution speed.

According to Microsoft, in SharePoint Integrated Mode there are many more WFE API calls, as well as WFE-to-SSRS API calls. These contribute to the overall rendering time, which we later confirmed by running a trace using Wireshark as well as in-browser Google Developer Tools. Below are the screen captures from GDT run in Chrome, depicting the size of the images loaded as well as the final image call after nearly 44 seconds of blocking due to the synchronous nature of the transfer.

Running the same reports in Native Mode, performance was respectable and easily aligned with the client's KPIs and SLAs. As far as I'm aware, no fixes or improvements in this realm have been released by Microsoft, even though they are aware of the issue. SQL Server 2012 (codename Denali) is supposed to bring major improvements in this area, but if you have installed or recently upgraded to version 2008 (R2), you're out of luck. There are even some BI departments I'm aware of that withheld their upgrades from version 2005 due to the impact this issue would have on their reporting standards.


Generating Files through bcp Utility in SQL Server

April 2nd, 2012 / 2 Comments » / by admin

In this post I would like to show how anyone can easily generate files (CSV, XML, TXT and more) through the use of the bcp utility on Microsoft SQL Server. This technique can be easily incorporated into an SSIS package as part of its workflow, and we can build on this solution to further post-process the data. The bcp utility bulk copies data between an instance of Microsoft SQL Server and a data file in a user-specified format. It can be used to import large numbers of new rows into SQL Server tables or to export data out of tables into data files. Except when used with the 'queryout' option, the utility requires no knowledge of Transact-SQL.
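For context, the general shape of a bcp invocation is shown below. This is a simplified sketch of the documented syntax, not an exhaustive option list, and the database name YourDatabase is a placeholder for your own:

```sql
-- BASIC bcp SYNTAX (SIMPLIFIED):
--   bcp {table | view | "query"} {in | out | queryout} data_file [options]
-- EXAMPLE: EXPORT AN ENTIRE TABLE TO A COMMA-DELIMITED FILE
-- (-c = CHARACTER MODE, -t, = COMMA FIELD TERMINATOR, -T = TRUSTED CONNECTION)
EXEC master..xp_cmdshell 'bcp YourDatabase.dbo.tbl_Test out c:\tbl_Test.csv -c -t, -T'
```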

As always, let’s create a sample data set to work with. First, run these queries to create a dummy table and populate it with some test data.

CREATE TABLE tbl_Test
(
First_Name nvarchar (50) NOT NULL,
Middle_Name nvarchar (50) NULL,
Last_Name nvarchar (50) NOT NULL,
Address1 nvarchar (100) NOT NULL,
Address2 nvarchar (100) NULL,
City nvarchar (50) NOT NULL,
State nvarchar (50) NOT NULL
)

INSERT INTO tbl_Test
VALUES ('Scott', NULL, 'Johnson', '1 Parade Drive', NULL, 'Sydney', 'NSW')
INSERT INTO tbl_Test
VALUES ('Mary', 'Jane', 'Frederik', '33 Spalding Court', NULL, 'Sydney' ,'NSW')
INSERT INTO tbl_Test
VALUES ('Jessie', 'Kane', 'Smith', '14 Melrose Place', 'Unit 4a', 'Melbourne', 'VIC')
INSERT INTO tbl_Test
VALUES ('Radash', 'Kumar', 'Mahid', '55 Sunshine Avenue', NULL, 'Cairns', 'QLD')

Once we have the data in place, let's assume that we would like to export it into a file (CSV in this instance) on the local drive or a network share. This is where the simplicity of bcp comes into play, although it can also be a bit limiting and cumbersome to work with, e.g. having to put all attribute names on one line. As this tutorial deals with bcp executed from either Management Studio or BIDS, I will use some common SQL syntax and expressions to declare a couple of variables and finally execute a SQL command to spit out the file.

--DECLARE VARIABLE WHICH STORES THE OUTPUT FILE LOCATION
DECLARE @Path         varchar (100)

--DECLARE A VARIABLE WHERE THE QUERY WILL BE STORED
DECLARE @bcpCommand   varchar (8000)

--ASSIGN THE PATH TO THE VARIABLE. IN THIS CASE THE FILE WILL BE STORED ON THE LOCAL c:\ DRIVE
SET @Path =  'c:\somefile.csv'

--EXPORT THE FILE USING bcp (FIRST LINE - CREATE COLUMN HEADINGS, SECOND LINE - CREATE COLUMN DATA)
--NOTE: bcp CONNECTS TO THE DEFAULT DATABASE, SO FULLY QUALIFY tbl_Test OR ADD THE -d SWITCH IF NEEDED
--THE -T SWITCH USES A TRUSTED (WINDOWS) CONNECTION; REPLACE IT WITH -U"login" -P"password" FOR SQL AUTHENTICATION
SET @bcpCommand = 'bcp "SELECT ''First_Name'' as H1, ''Middle_Name'' as H2, ''Last_Name'' as H3, ''Address1'' as H4, ''Address2'' as H5, ''City'' as H6, ''State'' as H7 '
SET @bcpCommand = @bcpCommand + 'UNION ALL SELECT First_Name, Middle_Name, Last_Name, Address1, Address2, City, State FROM tbl_Test" queryout ' + @Path + ' -c -T -t,'

--SPAWNS A WINDOWS COMMAND SHELL AND PASSES IN A STRING FOR EXECUTION
EXEC master..xp_cmdshell @bcpCommand

Also, if further post-processing is required on the newly output file, e.g. an FTP task, it is best to include a 'WAITFOR DELAY' command to ensure that the file ends up in the specified folder. Otherwise, due to a short delay between the SQL execution and file creation, the package may fail.
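As a minimal sketch, the pause can be placed straight after the xp_cmdshell call; the 5-second delay below is an arbitrary value you would tune to your own environment:

```sql
-- EXPORT THE FILE, THEN PAUSE BEFORE ANY POST-PROCESSING (E.G. AN FTP TASK)
EXEC master..xp_cmdshell @bcpCommand

-- WAIT 5 SECONDS (FORMAT hh:mm:ss) TO GIVE THE FILE SYSTEM TIME TO MATERIALISE THE FILE
WAITFOR DELAY '00:00:05'
```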

One of the common situations which can take advantage of this process is creating a file whose name reflects certain rules. Usually this involves formatting the file name to include the date when it was created, e.g. yyyyMMdd_File_Name.csv. In such a case we can easily adjust the above bcp command by adding a few variables and changing the SQL code to take them into account. To achieve this, we add @FileName and @CurrentDate variables and alter the script to output the file with the current date included in its name, as per below.

DECLARE @Path         varchar (100)
DECLARE @bcpCommand   varchar (8000)
DECLARE @CurrentDate varchar (20)
DECLARE @FileName varchar (50)
SET @CurrentDate = (SELECT CONVERT (varchar (20), GetDate(), 112) as [yyyyMMdd])
SET @FileName = @CurrentDate + '_File_Name.csv'
SET @Path = REPLACE('c:\somefile.csv', 'somefile.csv', @FileName)
SET @bcpCommand = 'bcp "SELECT ''First_Name'' as H1, ''Middle_Name'' as H2, ''Last_Name'' as H3, ''Address1'' as H4, ''Address2'' as H5, ''City'' as H6, ''State'' as H7 '
SET @bcpCommand = @bcpCommand + 'UNION ALL SELECT First_Name, Middle_Name, Last_Name, Address1, Address2, City, State FROM tbl_Test" queryout ' + @Path + ' -c -T -t,'
EXEC master..xp_cmdshell @bcpCommand

The inclusion of the @FileName and @CurrentDate variables, coupled with a simple REPLACE statement, will output a 20120415_File_Name.csv file to our c:\ drive (with the date portion reflecting the execution date).

In the next few posts I will explain how to programmatically check whether the file has been created. This could be nicely incorporated into this process when developing a larger or more complex solution.
