{"id":3121,"date":"2017-10-27T02:26:51","date_gmt":"2017-10-27T02:26:51","guid":{"rendered":"http:\/\/bicortex.com\/?p=3121"},"modified":"2018-02-28T12:14:59","modified_gmt":"2018-02-28T12:14:59","slug":"automating-tableau-workbook-exports-using-python-and-tabcmd-command-tool","status":"publish","type":"post","link":"http:\/\/bicortex.com\/bicortex\/automating-tableau-workbook-exports-using-python-and-tabcmd-command-tool\/","title":{"rendered":"Automating Tableau Workbook Exports Using Python and tabcmd Command Tool"},"content":{"rendered":"<p class=\"Standard\" style=\"text-align: justify;\">Tableau comes with a plethora of functionality out of the box but sometimes the following will inevitably be asked\/proposed to meet business reporting requirements:<\/p>\n<ul>\n<li>End users will ask for features which do not exist (at least until version X is released or comes out of preview)<\/li>\n<li>In order to meet those bespoke demands, some level of custom development i.e. &#8216;hacking the status quo&#8217; will be required<\/li>\n<\/ul>\n<p class=\"Standard\" style=\"text-align: justify;\">In the case of automated Tableau report generation, where pagination and more comprehensive controls are required e.g. producing the same report as a series of PDF documents based on a report filter value, the software falls short of providing this level of control. However, given that I am not aware of any other application which provides such a fine level of functionality (it\u2019s always a balance between feature selection and usability) and that Tableau allows users to dive \u2018under the hood\u2019 to bridge such gaps, for technically savvy analysts it\u2019s fairly easy to accommodate those demands with a little bit of reverse-engineering.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">Tableau server allows end users to query its internal workgroup database sitting atop a PostgreSQL DBMS. 
I have previously written about PostgreSQL to Microsoft SQL Server schema and data synchronisation <a href=\"http:\/\/bicortex.com\/tableau-server-workgroup-postgresql-database-schema-and-data-synchronization\/\" target=\"_blank\" rel=\"noopener\">HERE<\/a> so I will not go into details of how Tableau stores its metadata, but it\u2019s fair to say that anyone who\u2019s ever worked with relational data stores should have no trouble cobbling together a simple query to acquire the details on workbooks, views, jobs etc. that the Tableau database stores and manages.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">Tableau server also allows users to administer some tasks through its built-in tabcmd command-line utility e.g. creating and deleting users, exporting views or workbooks, publishing data sources etc. It\u2019s a handy tool which can simplify administrative efforts and in this specific example we will be using it for creating report exports in an automated fashion.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">In order to demo this solution on typical business-like data, let\u2019s create a sample data set and a very rudimentary Tableau report. 
The following SQL builds a single-table database running on a local Microsoft SQL Server platform and populates it with dummy sales data.<\/p>\n<pre class=\"brush: sql; title: ; notranslate\" title=\"\">\r\n\r\n\/*==============================================================================\r\nSTEP 1\r\nCreate SampleDB database on the local instance\r\n==============================================================================*\/\r\nUSE &#x5B;master];\r\nGO\r\nIF EXISTS (SELECT name FROM sys.databases WHERE name = N'SampleDB')\r\nBEGIN\r\n    -- Close connections to the SampleDB database\r\n    ALTER DATABASE SampleDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE;\r\n    DROP DATABASE SampleDB;\r\nEND;\r\nGO\r\n-- Create SampleDB database and log files\r\nCREATE DATABASE SampleDB\r\nON PRIMARY\r\n       (\r\n           NAME = N'SampleDB',\r\n           FILENAME = N'C:\\Program Files\\Microsoft SQL Server\\MSSQL13.MSSQL2016DEV\\MSSQL\\DATA\\SampleDB.mdf',\t\t   \r\n           SIZE = 10MB,\r\n           MAXSIZE = 1GB,\r\n           FILEGROWTH = 10MB\r\n       )\r\nLOG ON\r\n    (\r\n        NAME = N'SampleDB_log',\r\n        FILENAME = N'C:\\Program Files\\Microsoft SQL Server\\MSSQL13.MSSQL2016DEV\\MSSQL\\DATA\\SampleDB_log.LDF',\r\n        SIZE = 1MB,\r\n        MAXSIZE = 1GB,\r\n        FILEGROWTH = 10MB\r\n    );\r\nGO\r\n--Assign database ownership to login SA\r\nEXEC SampleDB.dbo.sp_changedbowner @loginame = N'SA', @map = false;\r\nGO\r\n--Change the recovery model to BULK_LOGGED\r\nALTER DATABASE SampleDB SET RECOVERY BULK_LOGGED;\r\nGO\r\n\r\n\/*======================================================================================\r\nSTEP 2\r\nCreate sample_data table on SampleDB database and insert 10000 dummy records\r\nimitating mocked-up sales transactions across random dates, postcodes and SKU numbers \r\n======================================================================================*\/\r\nUSE SampleDB;\r\nGO\r\nSET NOCOUNT ON;\r\nIF 
OBJECT_ID('sample_data') IS NOT NULL\r\nBEGIN\r\n    DROP TABLE sample_data;\r\nEND;\r\n-- Create target object \r\nCREATE TABLE sample_data\r\n(\r\n    id INT IDENTITY(1, 1) NOT NULL,\r\n    product_sku VARCHAR(512) NOT NULL,\r\n    quantity INT NOT NULL,\r\n    sales_amount MONEY NOT NULL,\r\n    postcode VARCHAR(4) NOT NULL,\r\n    state VARCHAR(128) NOT NULL,\r\n    order_date DATETIME NOT NULL\r\n);\r\n-- Populate target object with dummy data\r\nDECLARE @postcodestate TABLE\r\n(\r\n    id INT IDENTITY(1, 1),\r\n    postcode VARCHAR(4),\r\n    state VARCHAR(128)\r\n);\r\nINSERT INTO @postcodestate\r\n(\r\n    postcode,\r\n    state\r\n)\r\nSELECT '2580', 'NSW' UNION ALL SELECT '2618', 'NSW' UNION ALL SELECT '2618', 'NSW' UNION ALL SELECT '2581', 'NSW' UNION ALL\r\nSELECT '2582', 'NSW' UNION ALL SELECT '2580', 'NSW' UNION ALL SELECT '2550', 'NSW' UNION ALL SELECT '2550', 'NSW' UNION ALL\r\nSELECT '2450', 'NSW' UNION ALL SELECT '3350', 'VIC' UNION ALL SELECT '3350', 'VIC' UNION ALL SELECT '3212', 'VIC' UNION ALL\r\nSELECT '3215', 'VIC' UNION ALL SELECT '3880', 'VIC' UNION ALL SELECT '3759', 'VIC' UNION ALL SELECT '3037', 'VIC' UNION ALL\r\nSELECT '3631', 'VIC' UNION ALL SELECT '4006', 'QLD' UNION ALL SELECT '4069', 'QLD' UNION ALL SELECT '4171', 'QLD' UNION ALL\r\nSELECT '4852', 'QLD' UNION ALL SELECT '4852', 'QLD' UNION ALL SELECT '4352', 'QLD' UNION ALL SELECT '4701', 'QLD' UNION ALL\r\nSELECT '4218', 'QLD' UNION ALL SELECT '5095', 'SA'\tUNION ALL SELECT '5097', 'SA'  UNION ALL SELECT '5573', 'SA'  UNION ALL\r\nSELECT '5700', 'SA'  UNION ALL SELECT '5214', 'SA'\tUNION ALL SELECT '6209', 'WA'  UNION ALL SELECT '6054', 'WA'  UNION ALL\r\nSELECT '6068', 'WA'  UNION ALL SELECT '6430', 'WA'  UNION ALL SELECT '6770', 'WA'  UNION ALL SELECT '7054', 'TAS' UNION ALL\r\nSELECT '7253', 'TAS' UNION ALL SELECT '7140', 'TAS' UNION ALL SELECT '7179', 'TAS' UNION ALL SELECT '7109', 'TAS' UNION ALL\r\nSELECT '0870', 'NT'  UNION ALL SELECT '0872', 'NT'  UNION ALL SELECT '0852', 
'NT'  UNION ALL SELECT '2617', 'ACT' UNION ALL\r\nSELECT '2617', 'ACT' UNION ALL SELECT '2913', 'ACT' UNION ALL SELECT '2905', 'ACT' UNION ALL SELECT '2903', 'ACT';\r\nDECLARE @allchars VARCHAR(100) = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789';\r\nDECLARE @max_row_number INT;\r\nSET @max_row_number = 10000;\r\nDECLARE @count INT = 1;\r\nWHILE @count &lt;= @max_row_number\r\nBEGIN\r\n    INSERT INTO SampleDB.dbo.sample_data\r\n    (\r\n        product_sku,\r\n        quantity,\r\n        sales_amount,\r\n        postcode,\r\n        state,\r\n        order_date\r\n    )\r\n    SELECT RIGHT(LEFT(@allchars, ABS(BINARY_CHECKSUM(NEWID()) % 35) + 1), 1)\r\n           + RIGHT(LEFT(@allchars, ABS(BINARY_CHECKSUM(NEWID()) % 35) + 1), 1)\r\n           + RIGHT(LEFT(@allchars, ABS(BINARY_CHECKSUM(NEWID()) % 35) + 1), 1)\r\n           + RIGHT(LEFT(@allchars, ABS(BINARY_CHECKSUM(NEWID()) % 35) + 1), 1)\r\n           + RIGHT(LEFT(@allchars, ABS(BINARY_CHECKSUM(NEWID()) % 35) + 1), 1),\r\n           CAST(RAND() * 5 + 3 AS INT),\r\n           CAST(RAND() * 50 AS INT) \/ 2.5,\r\n           (\r\n               SELECT TOP 1 postcode FROM @postcodestate ORDER BY NEWID()\r\n           ),\r\n           (\r\n               SELECT TOP 1 state FROM @postcodestate ORDER BY NEWID()\r\n           ),\r\n           GETDATE() - (365 * 3 * RAND() - 365);\r\n    SET @count = @count + 1;\r\nEND;\r\n<\/pre>\n<p class=\"Standard\" style=\"text-align: justify;\">This table will become the basis for the following Tableau dashboard depicting random sales data aggregated by state, postcode, date and product SKU. The sample report looks as per the image below and is located on a Tableau server instance from which we will be generating the extracts. You can also notice that the dashboard contains a filter called &#8216;state&#8217; (as highlighted) with a number of values assigned to it. 
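<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">As an aside, the dummy data does not have to come from T-SQL. For readers who prefer to stay in Python, the sketch below mocks up rows of the same shape (the postcode\/state list is abbreviated and the amount logic only approximates the SQL expressions). Note that the T-SQL above picks postcode and state independently, so the pairs may not match; this version keeps them paired:<\/p>\n

```python
import random
from datetime import datetime, timedelta

# Postcode/state pairs -- a small subset of the list used in the SQL above.
postcode_state = [('2580', 'NSW'), ('3350', 'VIC'), ('4006', 'QLD'),
                  ('5095', 'SA'), ('6209', 'WA'), ('7054', 'TAS'),
                  ('0870', 'NT'), ('2617', 'ACT')]
allchars = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789'

def make_row():
    # unlike the T-SQL, pick postcode and state as a pair so they always match
    postcode, state = random.choice(postcode_state)
    return {
        'product_sku': ''.join(random.choice(allchars) for _ in range(5)),
        'quantity': random.randint(3, 7),                 # ~ RAND() * 5 + 3
        'sales_amount': round(random.uniform(0, 20), 2),  # ~ CAST(RAND()*50 AS INT) / 2.5
        'postcode': postcode,
        'state': state,
        # any date within roughly a year either side of today
        'order_date': datetime.now() + timedelta(days=random.uniform(-365, 365)),
    }

sample_data = [make_row() for _ in range(10000)]
```

\n<p class=\"Standard\" style=\"text-align: justify;\">Such a list could then be bulk-inserted via e.g. pyodbc or dumped to CSV, although the rest of this post assumes the T-SQL route.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">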
This filter, along with the filter values representing individual states, will become the basis for producing separate workbook extracts for each of the filter values, e.g. NT.pdf, VIC.pdf, NSW.pdf etc.<\/p>\n<p><a href=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2018\/02\/Tableau_PDF_Extracts_Sample_Dashboard.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-3132\" src=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2018\/02\/Tableau_PDF_Extracts_Sample_Dashboard.png\" alt=\"\" width=\"580\" height=\"317\" srcset=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2018\/02\/Tableau_PDF_Extracts_Sample_Dashboard.png 1351w, http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2018\/02\/Tableau_PDF_Extracts_Sample_Dashboard-300x164.png 300w, http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2018\/02\/Tableau_PDF_Extracts_Sample_Dashboard-768x420.png 768w, http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2018\/02\/Tableau_PDF_Extracts_Sample_Dashboard-1024x559.png 1024w\" sizes=\"auto, (max-width: 580px) 100vw, 580px\" \/><\/a><\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">Next, let\u2019s create a very simple database which will hold the necessary metadata used for subsequent report generation. SQLite is more than adequate for such low volumes of data. The database schema is an amalgamation of Tableau\u2019s own workbook and view tables and other objects defined to store tabcmd options, attributes and filters we\u2019d like to use in this example. The schema is generated and populated by the following Python script. 
The actual SQL used to define the DDL and DML statements can be found on my OneDrive folder <a href=\"https:\/\/1drv.ms\/f\/s!AuEyKKgH71pxg9lPhPtIBD6weguYvw\" target=\"_blank\" rel=\"noopener\">HERE<\/a> and is required as part of the overall solution.<\/p>\n<pre class=\"brush: python; title: ; notranslate\" title=\"\">\r\nimport configparser\r\nimport sqlite3\r\nimport os\r\nimport sys\r\nimport argparse\r\nimport psycopg2\r\n\r\nconfig = configparser.ConfigParser()\r\nconfig.read('params.cfg')\r\n\r\n#pg tables selection\r\npg_tblsNames = &#x5B;'workbooks', 'views']\r\n\r\n#sqlite args\r\ndbsqlite_location = config.get('Sqlite','dbsqlite_location')\r\ndbsqlite_fileName = config.get('Sqlite','dbsqlite_fileName')\r\ndbsqlite_sql = config.get('Sqlite','dbsqlite_sql')\r\n\r\nparser = argparse.ArgumentParser(description='Tableau data extraction solution by bicortex.com')\r\n\r\n#tableau server args\r\nparser.add_argument('-n','--hostname', help='Tableau postgresql server name', required=True)\r\nparser.add_argument('-d','--dbname', help='Tableau postgresql database name', required=True)\r\nparser.add_argument('-u','--username', help='Tableau postgresql user name', required=True)\r\nparser.add_argument('-p','--passwd', help='Tableau postgresql password', required=True)\r\nargs = parser.parse_args()\r\n\r\nif not args.hostname or not args.dbname or not args.username or not args.passwd:\r\n    parser.print_help()\r\n\r\ndef run_DB_built(dbsqlite_sql, dbsqlite_location, dbsqlite_fileName, dbname, username, hostname, passwd, *args):\r\n    with sqlite3.connect(os.path.join(dbsqlite_location, dbsqlite_fileName)) as sqlLiteConn:\r\n        #decode bytes coming back from the database, ignoring invalid characters\r\n        sqlLiteConn.text_factory = lambda x: x.decode('utf-8', 'ignore')              \r\n\r\n        operations=&#x5B;]\r\n        commands=&#x5B;]\r\n        sql={}\r\n\r\n        #get all SQL operation types as defined by the '----SQL' prefix in tableau_export.sql file\r\n        print('Reading tableau_export.sql 
file...')\r\n        with open(dbsqlite_sql, 'r') as f:        \r\n            for i in f:\r\n                if i.startswith('----'):\r\n                    i=i.replace('----','')\r\n                    operations.append(i.rstrip('\\n'))\r\n\r\n        #get all SQL DML &amp; DDL statements from tableau_export.sql file\r\n        tempCommands=&#x5B;]\r\n        f = open(dbsqlite_sql, 'r').readlines()\r\n        for i in f:\r\n            tempCommands.append(i)\r\n        l = &#x5B;i for i, s in enumerate(tempCommands) if '----' in s]\r\n        l.append((len(tempCommands)))    \r\n        for first, second in zip(l, l&#x5B;1:]):\r\n            commands.append(''.join(tempCommands&#x5B;first:second]))       \r\n        sql=dict(zip(operations, commands))\r\n\r\n        #run database CREATE SQL\r\n        print('Building TableauEX.db database schema... ')\r\n        sqlCommands = sql.get('SQL 1: Create DB').split(';')\r\n        for c in sqlCommands:\r\n            try:\r\n                sqlLiteConn.execute(c)\r\n            except sqlite3.OperationalError as e:\r\n                print (e)\r\n                sqlLiteConn.rollback()\r\n                sys.exit(1)\r\n            else:\r\n                sqlLiteConn.commit()\r\n\r\n        #acquire PostgreSQL workgroup database data and populate SQLite database schema with that data  \r\n        print('Acquiring Tableau PostgreSQL data and populating SQLite database for the following tables: {0}...'.format(', '.join(map(str, pg_tblsNames)))) \r\n        pgConn = &quot;dbname={0} user={1} host={2} password={3} port=8060&quot;.format(dbname, username, hostname, passwd)       \r\n        pgConn = psycopg2.connect(pgConn)         \r\n        pgCursor = pgConn.cursor()\r\n        for tbl in pg_tblsNames:\r\n            try:\r\n                tbl_cols={}\r\n                pgCursor.execute(&quot;&quot;&quot;SELECT ordinal_position, column_name \r\n                                FROM 
information_schema.columns \r\n                                WHERE table_name = '{}' \r\n                                AND table_schema = 'public'\r\n                                AND column_name != 'index'\r\n                                ORDER BY 1 ASC&quot;&quot;&quot;.format(tbl)) \r\n                rows = pgCursor.fetchall()\r\n                for row in rows:                \r\n                    tbl_cols.update({row&#x5B;0]:row&#x5B;1]})\r\n                sortd = &#x5B;tbl_cols&#x5B;key] for key in sorted(tbl_cols.keys())]\r\n                cols = &quot;,&quot;.join(sortd)   \r\n                pgCursor.execute(&quot;SELECT {} FROM {}&quot;.format(cols,tbl)) \r\n                rows = pgCursor.fetchall()\r\n                num_columns = max(len(t) for t in rows)            \r\n                pgsql=&quot;INSERT INTO {} ({}) VALUES({})&quot;.format(tbl&#x5B;:-1],cols,&quot;,&quot;.join('?' * num_columns))\r\n                sqlLiteConn.executemany(pgsql,rows)\r\n            \r\n            except psycopg2.Error as e:\r\n                print(e)\r\n                sys.exit(1)  \r\n            else:               \r\n                sqlLiteConn.commit()\r\n\r\n        #update SQLite bridging tables based on DML statements under 'SQL 2: Update DB' operation type header                  \r\n        print('Updating SQLite database bridging tables...')\r\n        sqlCommands = sql.get('SQL 2: Update DB').split(';')\r\n        for c in sqlCommands:\r\n            try:\r\n                sqlLiteConn.execute(c)\r\n            except sqlite3.OperationalError as e:\r\n                print (e)\r\n                sqlLiteConn.rollback()\r\n                sys.exit(1)\r\n            else:\r\n                sqlLiteConn.commit()\r\n    \r\n    sqlLiteConn.close()\r\n    pgConn.close()    \r\n\r\nif __name__ == &quot;__main__&quot;:  \r\n    run_DB_built(dbsqlite_sql, dbsqlite_location, dbsqlite_fileName, args.dbname, args.username, args.hostname, 
args.passwd, pg_tblsNames) \r\n<\/pre>\n<p class=\"Standard\" style=\"text-align: justify;\">I have run this code against Tableau version 10.5 so there is a slight chance that some SQL statements will require adjustments depending on Tableau\u2019s PostgreSQL database changes in subsequent releases. The script uses the aforementioned <a href=\"https:\/\/1drv.ms\/f\/s!AuEyKKgH71pxg9lPhPtIBD6weguYvw\" target=\"_blank\" rel=\"noopener\">SQL file<\/a> as well as one configuration file where some of the required parameters are stored as per the image below. Please also note that this SQL file contains data for the filter table comprised of Australian states (corresponding to Tableau workbook filter and its values as indicated above) so your actual data may contain other filter names and values or may even not use them at all. Also, Tableau workgroup database schema does not store filter names and\/or values so in order to have them referenced in the script they need to be added manually.<\/p>\n<p><a href=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2017\/10\/Tableau_PDF_Extracts_Config_File_Content.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-3152\" src=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2017\/10\/Tableau_PDF_Extracts_Config_File_Content.png\" alt=\"\" width=\"580\" height=\"182\" srcset=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2017\/10\/Tableau_PDF_Extracts_Config_File_Content.png 914w, http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2017\/10\/Tableau_PDF_Extracts_Config_File_Content-300x94.png 300w, http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2017\/10\/Tableau_PDF_Extracts_Config_File_Content-768x241.png 768w\" sizes=\"auto, (max-width: 580px) 100vw, 580px\" \/><\/a><\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">Once executed, the following schema should be built and populated.<\/p>\n<p><a 
href=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2018\/02\/Tableau_PDF_Extracts_TableauEXDB_SQLite_Schema_ERD.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-3137\" src=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2018\/02\/Tableau_PDF_Extracts_TableauEXDB_SQLite_Schema_ERD.png\" alt=\"\" width=\"580\" height=\"1184\" srcset=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2018\/02\/Tableau_PDF_Extracts_TableauEXDB_SQLite_Schema_ERD.png 767w, http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2018\/02\/Tableau_PDF_Extracts_TableauEXDB_SQLite_Schema_ERD-147x300.png 147w, http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2018\/02\/Tableau_PDF_Extracts_TableauEXDB_SQLite_Schema_ERD-502x1024.png 502w\" sizes=\"auto, (max-width: 580px) 100vw, 580px\" \/><\/a><\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">Now that we have laid down the foundations for this solution, with the SQLite database created and populated and a sample Tableau dashboard deployed on the Tableau server pointing to our dummy data stored in SQL Server, we can run the Tableau report generation script. For this example, let\u2019s assume that we need to generate a separate PDF copy of the dashboard for each of the states defined in the filter table. 
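<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">To make the metadata lookup concrete, the sketch below recreates a miniature, hypothetical slice of the TableauEX.db schema in an in-memory SQLite database with one dashboard and two filter values; the join returns one row per filter value, i.e. one PDF export each:<\/p>\n

```python
import sqlite3

# Miniature stand-in for the TableauEX.db schema shown in the ERD above
# (one workbook, one dashboard view, two values of the 'state' filter).
conn = sqlite3.connect(':memory:')
conn.executescript('''
CREATE TABLE workbook (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE view (id INTEGER PRIMARY KEY, fk_workbook_id INTEGER,
                   name TEXT, sheettype TEXT, state TEXT);
CREATE TABLE filter (id INTEGER PRIMARY KEY, name TEXT, value TEXT);
CREATE TABLE view_filter_bridge (fk_view_id INTEGER, fk_filter_id INTEGER);
INSERT INTO workbook VALUES (1, 'Sample_Tableau_Report');
INSERT INTO view VALUES (1, 1, 'Sales_Overview_Dashboard', 'dashboard', 'active');
INSERT INTO filter VALUES (1, 'state', 'NSW'), (2, 'state', 'VIC');
INSERT INTO view_filter_bridge VALUES (1, 1), (1, 2);
''')
# the same join shape the solution uses: workbook -> view -> bridge -> filter
rows = conn.execute('''
SELECT w.name, v.name, f.name, f.value
FROM workbook w
JOIN view v ON w.id = v.fk_workbook_id
JOIN view_filter_bridge vfb ON v.id = vfb.fk_view_id
JOIN filter f ON vfb.fk_filter_id = f.id
WHERE w.name = 'Sample_Tableau_Report'
  AND v.sheettype = 'dashboard' AND v.state = 'active'
ORDER BY f.value
''').fetchall()
```

\n<p class=\"Standard\" style=\"text-align: justify;\">Each returned tuple carries everything a single tabcmd export call needs: workbook name, view name, filter name and filter value.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">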
The query will look as per the following.<\/p>\n<pre class=\"brush: sql; title: ; notranslate\" title=\"\">\r\nSELECT\r\n  w.name  AS workbook_name,\r\n  v.name  AS view_name,\r\n  f.name  AS filter_name,\r\n  f.value AS filter_value\r\nFROM workbook w\r\n  JOIN view v ON w.id = v.fk_workbook_id\r\n  JOIN view_filter_bridge vfb ON v.id = vfb.fk_view_id\r\n  JOIN filter f ON vfb.fk_filter_id = f.id\r\nWHERE w.name = 'Sample_Tableau_Report'\r\n      AND v.sheettype = 'dashboard'\r\n      AND v.state = 'active';\r\n<\/pre>\n<p class=\"Standard\" style=\"text-align: justify;\">Entering the same query into a Python script, which then further breaks down the returned attributes and their values into tabcmd command arguments in a looped fashion, produces a series of statements which in turn produce a PDF version of the nominated dashboard. Below is the Python script which saves a copy of the Sales_Overview_Dashboard report in a PDF format for each state to the specified location.<\/p>\n<pre class=\"brush: python; title: ; notranslate\" title=\"\">\r\nimport configparser\r\nimport sqlite3\r\nimport os\r\nimport sys\r\nimport argparse\r\nimport subprocess\r\nimport pandas as pd\r\n\r\nconfig = configparser.ConfigParser()\r\nconfig.read('params.cfg')\r\n\r\n#sqlite &amp; tabcmd args\r\ndbsqlite_location = config.get('Sqlite','dbsqlite_location')\r\ndbsqlite_fileName = config.get('Sqlite','dbsqlite_fileName')\r\ntabcmd_location = config.get('Tableau_CMD','tableau_cmd_tool_path')\r\ntabcmd_url = config.get('Tableau_CMD','tableau_svr_url')\r\npdfs_location = config.get('PDFs', 'save_path')\r\n\r\nparser = argparse.ArgumentParser(description='Tableau report(s) generation script by bicortex.com')\r\n\r\n#tableau server args\r\nparser.add_argument('-u', '--username', help='Tableau server user name', required=True)\r\nparser.add_argument('-p', '--passwd', help='Tableau server password', required=True)\r\nparser.add_argument('-o', '--option', help='Other options and 
arguments provided by tabcmd', required=False)\r\nargs = parser.parse_args()\r\n\r\nif not args.username or not args.passwd:\r\n    parser.print_help()\r\n\r\n#tableau login function\r\ndef tab_login(tabcmd_location, tabcmd_url, username, passwd): \r\n    try:    \r\n        p=subprocess.run('{0} login -s {1} -u {2} -p {3} --no-certcheck'\\\r\n        .format(os.path.join(os.path.normpath(tabcmd_location),'tabcmd'),\\\r\n        tabcmd_url, username, passwd ),shell=True)         \r\n        r=p.returncode\r\n        return r\r\n    except subprocess.SubprocessError as e:\r\n            print(e)\r\n            sys.exit(1)\r\n\r\n#tableau logout function\r\ndef tab_logout(tabcmd_location):\r\n    try:    \r\n        p=subprocess.run('{0} logout'.format(os.path.join(os.path.normpath(tabcmd_location),'tabcmd')),shell=True)         \r\n    except subprocess.SubprocessError as e:\r\n            print(e)\r\n            sys.exit(1)\r\n\r\n#tabcmd report export function\r\ndef run_extracts(pdfs_location, tabcmd_location, username=args.username, passwd=args.passwd, option=args.option):\r\n    standard_export_options = '--pdf --pagelayout landscape --no-certcheck --timeout 500'\r\n    login_ok = tab_login(tabcmd_location, tabcmd_url, username, passwd)  \r\n    if login_ok==0:\r\n        with sqlite3.connect(os.path.join(dbsqlite_location, dbsqlite_fileName)) as sqlLiteConn:\r\n            sqliteCursor = sqlLiteConn.cursor()\r\n            sqliteCursor.execute( &quot;&quot;&quot;\r\n                                    SELECT\r\n                                    w.name                     AS workbook_name,\r\n                                    v.name                     AS view_name,\r\n                                    f.name                     AS filter_name,\r\n                                    f.value                    AS filter_value\r\n                                    FROM workbook w\r\n                                    JOIN view v ON w.id = 
v.fk_workbook_id\r\n                                    JOIN view_filter_bridge vfb ON v.id = vfb.fk_view_id\r\n                                    JOIN filter f ON vfb.fk_filter_id = f.id                            \r\n                                    WHERE w.name = 'Sample_Tableau_Report'\r\n                                    AND v.sheettype = 'dashboard'\r\n                                    AND v.state = 'active';\r\n                                &quot;&quot;&quot;)\r\n            result_set = sqliteCursor.fetchall()\r\n            if result_set:                \r\n                df = pd.DataFrame(result_set)\r\n                col_name_list = &#x5B;tuple&#x5B;0] for tuple in sqliteCursor.description]\r\n                df.columns = col_name_list\r\n                print('\\nThe following attributes and values were returned from the SQL query:')                \r\n                print(df)\r\n                for row in result_set:\r\n                    workbook_name           = row&#x5B;0]\r\n                    view_name               = row&#x5B;1]\r\n                    filter_name             = row&#x5B;2]\r\n                    filter_value            = row&#x5B;3]\r\n\r\n                    if filter_name:\r\n                        #URL-encode spaces in the filter name and value\r\n                        if ' ' in filter_name:\r\n                            filter_name = filter_name.replace(' ', '%20')\r\n                        if ' ' in filter_value:\r\n                            filter_value = filter_value.replace(' ', '%20')\r\n                        if not option:\r\n                            option_value = standard_export_options\r\n                            command = '{0} export &quot;{1}?{2}={3}&quot; -f &quot;{4}{5}.pdf&quot; {6} '\\\r\n                            .format(os.path.join(os.path.normpath(tabcmd_location),'tabcmd'),\\\r\n                            workbook_name + '\/' + view_name, filter_name, filter_value, pdfs_location, filter_value, option_value)     \r\n                            try:    \r\n                                p=subprocess.run(command, shell=True)                                 \r\n                            except subprocess.SubprocessError as e:\r\n                                print(e)\r\n                                sys.exit(1)                                        \r\n                    else:  \r\n                        if not option:\r\n                            option_value = standard_export_options\r\n                            command = '{0} export &quot;{1}&quot; -f &quot;{2}{3}.pdf&quot; {4} '\\\r\n                            .format(os.path.join(os.path.normpath(tabcmd_location),'tabcmd'),\\\r\n                            workbook_name + '\/' + view_name, pdfs_location, view_name, option_value)     \r\n                            try:    \r\n                                p=subprocess.run(command, shell=True)                                 \r\n                            except subprocess.SubprocessError as e:\r\n                                print(e)\r\n                                sys.exit(1)\r\n        tab_logout(tabcmd_location)\r\n\r\nif __name__ == &quot;__main__&quot;:  \r\n    if ' ' in tabcmd_location:\r\n        tabcmd_location = '&quot;'+ os.path.normpath(tabcmd_location) + '&quot;' \r\n    else:\r\n        tabcmd_location = os.path.normpath(tabcmd_location)    \r\n    run_extracts(pdfs_location, tabcmd_location, args.username, args.passwd)         \r\n<\/pre>\n<p class=\"Standard\" style=\"text-align: justify;\">This script assumes that the tabcmd utility is installed on the local machine if not run from the Tableau server environment (which has it installed by default) and produces the following output when run from the command prompt (click on image to expand).<\/p>\n<p><a href=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2017\/10\/Tableau_PDF_Extracts_Python_Extract_Console_Output.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-3153\" 
src=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2017\/10\/Tableau_PDF_Extracts_Python_Extract_Console_Output.png\" alt=\"\" width=\"580\" height=\"774\" srcset=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2017\/10\/Tableau_PDF_Extracts_Python_Extract_Console_Output.png 854w, http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2017\/10\/Tableau_PDF_Extracts_Python_Extract_Console_Output-225x300.png 225w, http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2017\/10\/Tableau_PDF_Extracts_Python_Extract_Console_Output-768x1024.png 768w\" sizes=\"auto, (max-width: 580px) 100vw, 580px\" \/><\/a><\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">The following is the folder view of where the individual reports have been saved, as well as a partial view of the South Australia state report (SA.pdf) opened, depicting SA data only with the remaining states filtered out of the context.<\/p>\n<p><a href=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2017\/10\/Tableau_PDF_Extracts_Folder_and_PDF_View.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter size-full wp-image-3146\" src=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2017\/10\/Tableau_PDF_Extracts_Folder_and_PDF_View.png\" alt=\"\" width=\"580\" height=\"749\" srcset=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2017\/10\/Tableau_PDF_Extracts_Folder_and_PDF_View.png 580w, http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2017\/10\/Tableau_PDF_Extracts_Folder_and_PDF_View-232x300.png 232w\" sizes=\"auto, (max-width: 580px) 100vw, 580px\" \/><\/a><\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">Similar scripts can be run for all states, i.e. with no filter assigned. The only difference is the SQL statement we pass in the Python script. 
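<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">The heavy lifting in the export function is really just string assembly. The sketch below condenses that logic into a hypothetical helper (the tabcmd location, folders and option string are illustrative only); note how spaces in the view name, filter name and filter value are percent-encoded before being handed to tabcmd:<\/p>\n

```python
import os

# Hypothetical, condensed re-implementation of the command assembly inside
# run_extracts(); paths and the options string are illustrative assumptions.
def build_export_command(tabcmd_dir, workbook, view, pdf_dir,
                         filter_name=None, filter_value=None,
                         options='--pdf --pagelayout landscape --no-certcheck --timeout 500'):
    tabcmd = os.path.join(tabcmd_dir, 'tabcmd')
    # tabcmd addresses a view as workbook/view; spaces must be URL-encoded
    target = '{0}/{1}'.format(workbook, view).replace(' ', '%20')
    if filter_name and filter_value:
        # appending ?filter=value narrows the export to a single filter value
        target += '?{0}={1}'.format(filter_name.replace(' ', '%20'),
                                    filter_value.replace(' ', '%20'))
        pdf_name = filter_value  # one PDF per filter value, e.g. NSW.pdf
    else:
        pdf_name = view          # unfiltered export, named after the view
    return '{0} export "{1}" -f "{2}.pdf" {3}'.format(
        tabcmd, target, os.path.join(pdf_dir, pdf_name), options)
```

\n<p class=\"Standard\" style=\"text-align: justify;\">Calling the helper with and without a filter name\/value pair yields the two command shapes produced by the script above.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">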
For example, if we would like to run the Sales Overview Dashboard report across all states on a single PDF sheet, we could run the following SQL and embed it into the script.<\/p>\n<pre class=\"brush: sql; title: ; notranslate\" title=\"\">\r\nSELECT\r\n  w.name AS workbook_name,\r\n  v.name AS view_name,\r\n  NULL   AS filter_name,\r\n  NULL   AS filter_value\r\nFROM workbook w\r\n  JOIN view v ON w.id = v.fk_workbook_id\r\nWHERE w.name = 'Sample_Tableau_Report'\r\n      AND v.sheettype = 'dashboard'\r\n      AND v.state = 'active';\r\n<\/pre>\n<p class=\"Standard\" style=\"text-align: justify;\">This technique can be used across a variety of different combinations and permutations of workbooks, sheets, filters, options etc. and is just a small example of how one can put together a simple solution for automated workbook extracts. For the full scope of tabcmd commands and associated functionality please refer to the Tableau online documentation.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Tableau comes with a plethora of functionality out of the box but sometimes the following will inevitably be asked\/proposed to meet business reporting requirements: End users will ask for features which do not exist (at least until version X is released or comes out of preview) In order to meet those bespoke demands, some level 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[56,63,53],"tags":[41,49,29],"class_list":["post-3121","post","type-post","status-publish","format-standard","hentry","category-programming","category-tableau","category-visualisation","tag-python","tag-sql","tag-tableau"],"aioseo_notices":[],"jetpack_featured_media_url":"","_links":{"self":[{"href":"http:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/posts\/3121","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/comments?post=3121"}],"version-history":[{"count":24,"href":"http:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/posts\/3121\/revisions"}],"predecessor-version":[{"id":3157,"href":"http:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/posts\/3121\/revisions\/3157"}],"wp:attachment":[{"href":"http:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/media?parent=3121"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/categories?post=3121"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/tags?post=3121"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}