{"id":4806,"date":"2023-06-08T11:09:13","date_gmt":"2023-06-08T01:09:13","guid":{"rendered":"http:\/\/bicortex.com\/?p=4806"},"modified":"2024-06-13T20:36:11","modified_gmt":"2024-06-13T10:36:11","slug":"kicking-the-tires-on-azure-sql-database-external-rest-endpoints-sample-integration-solution-architecture","status":"publish","type":"post","link":"https:\/\/bicortex.com\/bicortex\/kicking-the-tires-on-azure-sql-database-external-rest-endpoints-sample-integration-solution-architecture\/","title":{"rendered":"Kicking the Tires on Azure SQL Database External REST Endpoints &#8211; Sample Integration Solution Architecture"},"content":{"rendered":"<h3 style=\"text-align: center;\">Introduction<\/h3>\n<p class=\"Standard\" style=\"text-align: justify;\">SQL Server in-database REST API integration was always roll-your-own, bubble gum and duct tape type of affair \u2013 it was possible but never easy. Some may argue it\u2019s for all the right reasons as imposing strict distinction between database and application layers created a well-defined separation of concerns and delegated the former to do one thing and one thing only \u2013 data storage and management. However, as vendors&#8217; competition increased, more innovative features were added to and around these platforms to expand their capabilities and accessibility &#8211; new formats e.g. Apache Arrow, new Machine Learning features e.g. vector support or even new ways of merging applications and data storage paradigms e.g. WebAssembly-compiled (in-browser) RDBMS. 
As such, the word database, though synonymous with its primary function of data storage and management, has taken on a new meaning and, with that, a set of new capabilities, as partly discussed in this post.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">Azure SQL Database external REST endpoint integration recently came out of Public Preview and represents an improved way to natively (within the Azure ecosystem) query REST API endpoints with little fuss. External REST Endpoint Invocation makes it possible for developers to call REST\/GraphQL endpoints in other Azure services from right within the Azure SQL Database. With a quick call to the sp_invoke_external_rest_endpoint system stored procedure, you can have data processed via an Azure Function, update a Power BI dashboard, or even talk to Cognitive Services or OpenAI.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">For a full list of supported services, you can peruse the Microsoft documentation, but in order to explore a real-world application of this functionality, let\u2019s build a simple solution and see how easy or difficult it is to put it to work.<\/p>\n<h3 style=\"text-align: center;\">Example Architecture and Use Case Scenario<\/h3>\n<p class=\"Standard\" style=\"text-align: justify;\">Let\u2019s assume that we have telephone conversation data arriving in Azure Blob Storage as a JSON file. Next, we\u2019d like to persist it in our SQL database in near real time and enrich it with sentiment analysis data using Azure Cognitive Services. 
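To make the scenario concrete, here is a minimal Python sketch of what an incoming conversation file might look like. The file content itself is invented for illustration, but the field names (a conversation array of speaker\/message pairs) mirror the JSON paths the solution parses later with OPENJSON:

```python
import json

# Hypothetical content of an incoming file such as 'customer12345.json';
# the 'conversation', 'speaker' and 'message' names match the OPENJSON
# parsing used later, while the actual dialog lines are made up.
sample = {
    "conversation": [
        {"speaker": "Agent", "message": "Thank you for calling, how can I help?"},
        {"speaker": "Customer", "message": "My invoice is wrong and nobody returns my calls."},
    ]
}

# Concatenate the customer's utterances, just as the sentiment stage
# later does in T-SQL with STRING_AGG over OPENJSON('$.conversation')
customer_text = " ".join(
    turn["message"] for turn in sample["conversation"] if turn["speaker"] == "Customer"
)
print(json.dumps(sample, indent=2))
print(customer_text)
```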
Additionally, if the sentiment is negative, perhaps indicating a customer complaint or dissatisfaction, we would like an email sent to a member of the customer service team to triage and follow up on.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">The following diagram (click on the image to enlarge) represents a proposed solution architecture behind this requirement, with emphasis on activities 3, 6 and 9, as these correspond to using the SQL Server sp_invoke_external_rest_endpoint system stored procedure to communicate with external services. The idea here is that the SQL Server engine can act as connective tissue for most of the integration work, allowing simple workflows to be built and executed directly from the underlying database. And, as you will see, most of this functionality can be achieved using vanilla T-SQL with a combination of stored procedures and triggers, something which was very difficult to achieve before this feature was made available.<\/p>\n<p><a href=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2024\/04\/Azure_SQLDB_HTTP_REST_Sample_Solution_Architecture.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-4836\" src=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2024\/04\/Azure_SQLDB_HTTP_REST_Sample_Solution_Architecture.png\" alt=\"\" width=\"580\" height=\"825\" srcset=\"https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/04\/Azure_SQLDB_HTTP_REST_Sample_Solution_Architecture.png 840w, https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/04\/Azure_SQLDB_HTTP_REST_Sample_Solution_Architecture-211x300.png 211w, https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/04\/Azure_SQLDB_HTTP_REST_Sample_Solution_Architecture-720x1024.png 720w, https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/04\/Azure_SQLDB_HTTP_REST_Sample_Solution_Architecture-768x1093.png 768w\" sizes=\"auto, (max-width: 580px) 100vw, 580px\" \/><\/a><\/p>\n<p 
class=\"Standard\" style=\"text-align: justify;\">Also, please note that I do not advocate for building high-volume, high-velocity, real-time pipelines using database triggers and SQL Server system stored procedures. Microsoft clearly outlines the throttling limits on the number of concurrent connections to external endpoints, as well as limitations on supported media types, HTTP request and response payload size, URL length, header size, etc., so it\u2019s clearly not a panacea for all your integration needs. However, for sporadic and limited use cases \u2013 think in-database Zapier &#8211; this can significantly cut development time and allow DBAs and database devs to reach into other corners of the Azure ecosystem with little fuss.<\/p>\n<h3 style=\"text-align: center;\">Solution Implementation<\/h3>\n<p class=\"Standard\" style=\"text-align: justify;\">To start with, we need an Azure Storage Account with an input container to store our incoming JSON files. Once we have one created, we can develop a small Azure Function which runs whenever a blob is persisted in the target location and executes the Azure SQL Database stored procedure responsible for data acquisition. The following is a small Python script calling our first stored procedure \u2013 usp_load_from_azure_blob \u2013 every time a new blob is created. For simplicity\u2019s sake, the code does not do any file schema validation or pre-processing and its sole role is to execute the SQL Server stored procedure.<\/p>\n<pre class=\"brush: python; title: ; notranslate\" title=\"\">\r\nimport logging\r\nimport pyodbc\r\nimport azure.functions as func\r\nfrom os import path\r\n\r\n# For demo only - any credentials or cryptographic keys should be stored in a secrets store e.g. 
Azure Key Vault!\r\n_SQL_SERVER = 'Your_Azure_Server_Name'\r\n_SQL_DB = 'Your_DB_Name'\r\n_USERNAME = 'Your_DB_User_Name'\r\n_PASSWORD = 'Your_DB_User_Name_Password'\r\n_DRIVER = '{ODBC Driver 18 for SQL Server}'\r\n_TARGET_TABLE_NAME = 'customer_interactions'\r\n_TARGET_SCHEMA_NAME = 'dbo'\r\n_TARGET_STORED_PROC_NAME = 'usp_load_from_azure_blob'\r\n\r\n\r\ndef main(inputblob: func.InputStream):\r\n    logging.info('Python blob trigger function processed blob {blob_name}'.format(blob_name=inputblob.name))\r\n    cnxn = None  # ensure the name exists even if the connection attempt fails\r\n    try:\r\n        cnxn = pyodbc.connect('DRIVER='+_DRIVER+';SERVER='+_SQL_SERVER +\r\n                              ';PORT=1433;DATABASE='+_SQL_DB+';UID='+_USERNAME+';PWD='+_PASSWORD)\r\n        if cnxn:\r\n            logging.info('Connection to {mssql} SQL Server succeeded!'.format(mssql=_SQL_SERVER))\r\n    except pyodbc.Error as e:\r\n        sqlstate = e.args&#x5B;1]\r\n        logging.error(sqlstate)\r\n    if cnxn:\r\n        logging.info('Executing {stored_proc} stored procedure...'.format(stored_proc=_TARGET_STORED_PROC_NAME))\r\n        cursor = cnxn.cursor()\r\n        sql = '''\\\r\n                DECLARE @Return_Code INT;\r\n                EXEC @Return_Code = {stored_proc} ?,?,?;\r\n                SELECT @Return_Code AS rc;'''.format(stored_proc=_TARGET_STORED_PROC_NAME)\r\n        values = (path.basename(inputblob.name), _TARGET_SCHEMA_NAME, _TARGET_TABLE_NAME)\r\n        cursor.execute(sql, values)\r\n        rc = cursor.fetchval()\r\n        if rc == 0:\r\n            logging.info('Stored procedure {stored_proc} executed successfully!'.format(stored_proc=_TARGET_STORED_PROC_NAME))\r\n        cursor.commit()\r\n<\/pre>\n<p class=\"Standard\" style=\"text-align: justify;\">Now that we have our function, let\u2019s create a small JSON file called \u2018customer12345.json\u2019 (I used ChatGPT for this), the target table and the stored procedure used in our Python script. 
Also, given that some REST endpoints require authentication in order to be properly invoked, we will need to create a Database Scoped Credential (DSC) to securely store authentication data (like a Bearer token, for example) used to call a protected endpoint. The following code creates the scoped credential \u2018azblobstore\u2019 with a SAS access token, a table called customer_interactions where unparsed JSON data will be stored, and the main stored procedure used for data acquisition. Notice that in line 42, there is also a reference to a table-valued function called tvf_compare_json_docs which is there to allow JSON payload comparison in the odd case the same file (with the same file name) is submitted more than once and we\u2019d like to update the original version and populate the update_datetime field in the target table (the code behind this tvf and the JSON file can be found in my OneDrive folder <a href=\"https:\/\/1drv.ms\/f\/s!AuEyKKgH71pxibdjC3Oa6z-s57Y7zw?e=MdR14C\" target=\"_blank\" rel=\"noopener\">HERE<\/a>).<\/p>\n<pre class=\"brush: sql; highlight: [100,101,102,103,104]; title: ; notranslate\" title=\"\">\r\n-- create encryption key\r\nIF NOT EXISTS\r\n(\r\n    SELECT *\r\n    FROM sys.symmetric_keys\r\n    WHERE &#x5B;name] = '##MS_DatabaseMasterKey##'\r\n)\r\nBEGIN\r\n    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '$trong_Pa$$word';\r\nEND;\r\n\r\n-- drop the credential if it already exists\r\nIF EXISTS\r\n(\r\n    SELECT TOP (1)\r\n           1\r\n    FROM sys.database_scoped_credentials\r\n    WHERE name = 'azblobstore'\r\n)\r\nBEGIN\r\n    DROP DATABASE SCOPED CREDENTIAL azblobstore;\r\nEND;\r\n\r\n-- create database scoped credential\r\nCREATE DATABASE SCOPED CREDENTIAL &#x5B;azblobstore]\r\nWITH IDENTITY = 'SHARED ACCESS SIGNATURE',\r\n     SECRET = 'Your_Azure_Blob_Storage_SAS_Secret_Value';\r\nGO\r\n\r\n-- create target table\r\nDROP TABLE IF EXISTS &#x5B;dbo].&#x5B;customer_interactions];\r\nCREATE TABLE &#x5B;dbo].&#x5B;customer_interactions](\r\n\t&#x5B;file_id] 
&#x5B;UNIQUEIDENTIFIER] NOT NULL,\r\n\t&#x5B;file_name] &#x5B;NVARCHAR](1024) NULL,\r\n\t&#x5B;payload] &#x5B;NVARCHAR](MAX) NULL,\r\n\t&#x5B;sentiment] &#x5B;VARCHAR](20) NULL,\r\n\t&#x5B;insert_datetime] &#x5B;DATETIME2](7) NULL,\r\n\t&#x5B;update_datetime] &#x5B;DATETIME2](7) NULL,\r\n CONSTRAINT &#x5B;file_name] PRIMARY KEY CLUSTERED \r\n(\r\n\t&#x5B;file_id] ASC\r\n)WITH (STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, OPTIMIZE_FOR_SEQUENTIAL_KEY = OFF) ON &#x5B;PRIMARY]\r\n) ON &#x5B;PRIMARY] TEXTIMAGE_ON &#x5B;PRIMARY]\r\nGO\r\nALTER TABLE &#x5B;dbo].&#x5B;customer_interactions] ADD  CONSTRAINT &#x5B;df_file_id]  DEFAULT (NEWSEQUENTIALID()) FOR &#x5B;file_id]\r\nGO\r\nALTER TABLE &#x5B;dbo].&#x5B;customer_interactions] ADD  DEFAULT (NULL) FOR &#x5B;update_datetime]\r\nGO\r\n\r\n-- create usp_load_from_azure_blob stored procedure\r\nCREATE OR ALTER PROCEDURE &#x5B;dbo].&#x5B;usp_load_from_azure_blob]\r\n(\r\n    @file_name VARCHAR(1024),\r\n    @schema_name sysname,\r\n    @table_name sysname,\r\n    @table_spec sysname = NULL\r\n)\r\nAS\r\nBEGIN\r\n    SET NOCOUNT ON;\r\n    DECLARE @has_identity_column INT;\r\n    DECLARE @new_json NVARCHAR(MAX);\r\n    DECLARE @old_json NVARCHAR(MAX);\r\n\tDECLARE @new_json_file_name NVARCHAR(1024);\r\n    DECLARE @old_json_file_name NVARCHAR(1024);\r\n    DECLARE @error_message VARCHAR(MAX);\r\n    DECLARE @url NVARCHAR(MAX) = CONCAT('https:\/\/your_storage_account_name.blob.core.windows.net\/input-json\/', @file_name);\r\n    DECLARE @response NVARCHAR(MAX);\r\n\tDECLARE @time_zone VARCHAR (128)\r\n\r\n    IF @table_name IS NULL\r\n        SELECT @table_name = PARSENAME(@table_spec, 1);\r\n    IF @schema_name IS NULL\r\n        SELECT @schema_name = PARSENAME(@table_spec, 2);\r\n    IF @table_name IS NULL\r\n       OR @schema_name IS NULL\r\n    BEGIN\r\n        SET @error_message = 'Target DB, schema or table name was not provided. 
Bailing out!';\r\n        RAISERROR(   @error_message, \r\n                     16,             \r\n                     1          \r\n                 );\r\n        RETURN;\r\n    END;\r\n\r\n\tIF NOT EXISTS\r\n\t(\r\n\t    SELECT current_utc_offset\r\n\t    FROM sys.time_zone_info\r\n\t    WHERE name = 'AUS Eastern Standard Time'\r\n\t)\r\n\tBEGIN\r\n\t    SET @time_zone = 'UTC';\r\n\tEND\r\n\tELSE\r\n\tBEGIN\r\n\t    SET @time_zone = 'AUS Eastern Standard Time';\r\n\tEND;\r\n\r\n    EXEC sp_invoke_external_rest_endpoint @url = @url,\r\n                                          @method = 'GET',\r\n                                          @headers = '{&quot;Accept&quot;:&quot;application\/json&quot;}',\r\n                                          @credential = azblobstore,\r\n                                          @response = @response OUTPUT;\r\n\r\n    IF TRIM(JSON_VALUE(@response, '$.response.status.http.code')) &lt;&gt; '200'\r\n       OR TRIM(JSON_VALUE(@response, '$.response.status.http.description')) &lt;&gt; 'OK'\r\n    BEGIN\r\n        SET @error_message = 'Rest call response was unsuccessful. Bailing out!';\r\n        RAISERROR(   @error_message, \r\n                     16,           \r\n                     1              \r\n                 );\r\n        RETURN;\r\n    END;\r\n\r\n\tSET @new_json =\r\n\t(\r\n\t    SELECT JSON_QUERY(@response, '$.result')\r\n\t);\r\n\tSET @old_json =\r\n\t(\r\n\t    SELECT payload FROM dbo.customer_interactions WHERE file_name = @file_name\r\n\t);\r\n\tSET @new_json_file_name = @file_name;\r\n\tSET @old_json_file_name =\r\n\t(\r\n\t    SELECT file_name FROM dbo.customer_interactions WHERE file_name = @file_name\r\n\t);\r\n\r\n\r\n    IF (ISJSON(@new_json) &lt; 1)\r\n    BEGIN\r\n        SET @error_message\r\n            = 'Provided source JSON payload is not properly formatted or the file does not exist. 
Bailing out!';\r\n        RAISERROR(   @error_message, \r\n                     16,             \r\n                     1               \r\n                 );\r\n        RETURN;\r\n    END;\r\n\r\n    DROP TABLE IF EXISTS #returntable;\r\n    SELECT *\r\n    INTO #returntable\r\n    FROM dbo.tvf_compare_json_docs(@new_json, @old_json);\r\n\r\n    DECLARE @select_sql NVARCHAR(200) =\r\n            (\r\n                SELECT 'SELECT * FROM ' + QUOTENAME(@schema_name) + '.' + QUOTENAME(@table_name)\r\n            );\r\n\r\n    SELECT @has_identity_column = MAX(CONVERT(INT, is_identity_column))\r\n    FROM sys.dm_exec_describe_first_result_set(@select_sql, NULL, 1) AS f;\r\n\r\n\tDECLARE @delete_cmd VARCHAR(MAX)\r\n\t    = 'DELETE FROM ' + QUOTENAME(@schema_name) + '.' + QUOTENAME(@table_name) + ' WHERE file_name = ''' + @file_name\r\n\t      + ''';';\r\n\tDECLARE @update_cmd VARCHAR(MAX)\r\n\t    = 'UPDATE ' + QUOTENAME(@schema_name) + '.' + QUOTENAME(@table_name) + ' SET payload  = ''' + @new_json\r\n\t      + ''', sentiment = NULL, update_datetime = SYSDATETIME() AT TIME ZONE ''UTC'' AT TIME ZONE '''+@time_zone+''' WHERE file_name = ''' + @file_name + ''';';\r\n\tDECLARE @insert_cmd VARCHAR(MAX)\r\n\t    = 'INSERT INTO ' + QUOTENAME(@schema_name) + '.' 
+ QUOTENAME(@table_name) + ' (file_name, payload, insert_datetime) \r\n\t\t\tSELECT ''' + @file_name + ''', ''' + @new_json + ''',  SYSDATETIME() AT TIME ZONE ''UTC'' AT TIME ZONE '''+@time_zone+''';';\r\n\r\n\r\n    DECLARE @command NVARCHAR(MAX)\r\n        =\r\n            (\r\n                SELECT CASE\r\n                           WHEN @old_json IS NOT NULL AND @old_json_file_name IS NOT NULL AND @old_json_file_name = @new_json_file_name\r\n                                AND EXISTS\r\n                                    (\r\n                                        SELECT TOP (1) 1 FROM #returntable WHERE SideIndicator = '&lt;&gt;'\r\n                                    ) THEN\r\n                               @update_cmd\r\n\t\t\t\t\t\t\tWHEN @old_json IS NOT NULL AND @old_json_file_name IS NOT NULL AND @old_json_file_name = @new_json_file_name\r\n                                AND NOT EXISTS\r\n                                    (\r\n                                        SELECT TOP (1) 1 FROM #returntable WHERE SideIndicator = '&lt;&gt;'\r\n                                    ) THEN ''\r\n                           ELSE \r\n                               CASE\r\n                                   WHEN @old_json IS NOT NULL AND @old_json_file_name IS NOT NULL AND @old_json_file_name = @new_json_file_name THEN\r\n                                       @delete_cmd\r\n                                   ELSE\r\n                                       ''\r\n                               END\r\n                               -- SET IDENTITY_INSERT must be switched ON before the insert and OFF afterwards\r\n                               + CASE\r\n                                     WHEN @has_identity_column &gt; 0 THEN\r\n                                         ' SET IDENTITY_INSERT ' + QUOTENAME(@schema_name) + '.'\r\n                                         + QUOTENAME(@table_name) + ' ON; '\r\n                                     ELSE\r\n                                         ''\r\n                                 END + @insert_cmd\r\n                               + CASE\r\n                                     WHEN @has_identity_column &gt; 0 THEN\r\n                                         ' SET IDENTITY_INSERT ' + QUOTENAME(@schema_name) + '.'\r\n                                         + QUOTENAME(@table_name) + ' OFF; '\r\n                                     ELSE\r\n                                         ''\r\n                                 END\r\n                       END\r\n            );\r\n    EXEC (@command);\r\nEND;\r\n<\/pre>\n<p class=\"Standard\" style=\"text-align: justify;\">The main part is as per lines 100-104, where the SQL Server sp_invoke_external_rest_endpoint system stored procedure is used for data acquisition. We\u2019re using the GET HTTP method (@method must be one of the following values: GET, POST, PUT, PATCH, DELETE, HEAD), passing the previously created Database Scoped Credential in the @credential parameter and using the concatenated blob URL and file name as the @url parameter. All going well, execution returns 0 if the HTTPS call was made, the HTTP code received is a 2xx (Success) status, and the returned JSON in the @response parameter can be further parsed (if required) using SQL Server JSON-specific syntax.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">We now have our JSON file content in the target table but, as per the original requirement, we also need to ascertain the client\u2019s conversation sentiment, which can help us gauge how our customer cohort is tracking with respect to service satisfaction. Again, previously, that would have been a laborious, if not outright challenging, task for someone who doesn\u2019t have a lot of experience in application integration and the Azure ecosystem of services. 
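To visualise what the stored procedure is checking, the following Python sketch mocks the @response envelope. Only the JSON paths ($.response.status.http.code, $.response.status.http.description, $.result) are taken from the T-SQL above; the concrete values in the mock are illustrative:

```python
import json

# Hand-written mock of the envelope that sp_invoke_external_rest_endpoint
# places into @response; the path layout matches the JSON_VALUE/JSON_QUERY
# calls in the stored procedure, the values themselves are made up.
response = json.loads("""
{
  "response": {"status": {"http": {"code": 200, "description": "OK"}}},
  "result": {"conversation": []}
}
""")

# Equivalent of JSON_VALUE(@response, '$.response.status.http.code')
http_code = response["response"]["status"]["http"]["code"]
# Equivalent of JSON_QUERY(@response, '$.result') - the actual payload
result = response["result"]

if http_code != 200:
    raise RuntimeError("Rest call response was unsuccessful. Bailing out!")
```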
However, now it\u2019s just a matter of provisioning an Azure Cognitive Services account (something that can easily be done from the Azure portal) and creating a database trigger used to execute a Cognitive Services API call using the same system stored procedure we used before.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">Let\u2019s go ahead and save our Azure Cognitive Services authentication key as a DSC, and wrap the sp_invoke_external_rest_endpoint call in a separate stored procedure which also parses the JSON payload to extract the sentiment value. We will also create a database trigger to automate procedure execution and invoke it every time a record is inserted or updated.<\/p>\n<pre class=\"brush: sql; highlight: [49,50,51,52,53,54]; title: ; notranslate\" title=\"\">\r\n-- create database scoped credential\r\nIF EXISTS\r\n(\r\n    SELECT TOP (1)\r\n           1\r\n    FROM sys.database_scoped_credentials\r\n    WHERE name = 'Your_Cognitive_Services_Endpoint_URL'\r\n)\r\nBEGIN\r\n    DROP DATABASE SCOPED CREDENTIAL &#x5B;Your_Cognitive_Services_Endpoint_URL];\r\nEND;\r\n\r\nCREATE DATABASE SCOPED CREDENTIAL &#x5B;Your_Cognitive_Services_Endpoint_URL]\r\nWITH IDENTITY = 'HTTPEndpointHeaders',\r\n     SECRET = '{&quot;Ocp-Apim-Subscription-Key&quot;:&quot;Your_Key_Value&quot;}';\r\nGO\r\n\r\n-- create usp_run_sentiment_analysis stored procedure\r\nCREATE OR ALTER PROCEDURE &#x5B;dbo].&#x5B;usp_run_sentiment_analysis]\r\n(@file_id UNIQUEIDENTIFIER)\r\nAS\r\nBEGIN\r\n    DECLARE @error_message VARCHAR(MAX);\r\n    DECLARE @url NVARCHAR(2000) = N'https:\/\/Your_Cognitive_Services_Endpoint_URL\/text\/analytics\/v3.0\/sentiment';\r\n    DECLARE @response NVARCHAR(MAX);\r\n    DECLARE @json NVARCHAR(MAX) =\r\n            (\r\n                SELECT payload FROM &#x5B;dbo].&#x5B;customer_interactions] WHERE file_id = @file_id\r\n            );\r\n\r\n    DECLARE @customer_text NVARCHAR(MAX) =\r\n            (\r\n                SELECT STRING_AGG(message, ' ') AS 
customer_text\r\n                FROM\r\n                    OPENJSON(@json, '$.conversation')\r\n                    WITH\r\n                    (\r\n                        speaker NVARCHAR(100),\r\n                        message NVARCHAR(MAX) '$.message'\r\n                    )\r\n                WHERE speaker = 'Customer'\r\n            );\r\n\r\n    DECLARE @payload NVARCHAR(MAX)\r\n        = N'{&quot;documents&quot;: &#x5B;{&quot;id&quot;: &quot;1&quot;, &quot;language&quot;: &quot;en&quot;, &quot;text&quot;: &quot;' + @customer_text + N'&quot;}]}';\r\n    DECLARE @headers NVARCHAR(102) = N'{&quot;Content-Type&quot;: &quot;application\/json&quot;}';\r\n\r\n\r\n    EXEC sp_invoke_external_rest_endpoint @url = @url,\r\n                                          @method = 'POST',\r\n                                          @headers = @headers,\r\n                                          @payload = @payload,\r\n                                          @credential = &#x5B;Your_Cognitive_Services_Endpoint_URL],\r\n                                          @response = @response OUTPUT;\r\n\r\n    IF TRIM(JSON_VALUE(@response, '$.response.status.http.code')) &lt;&gt; '200'\r\n    BEGIN\r\n        SET @error_message = 'Rest call response was unsuccessful. 
Bailing out!';\r\n        RAISERROR(   @error_message, \r\n                     16,             \r\n                     1          \r\n                 );\r\n        RETURN;\r\n    END\r\n    ELSE\r\n    BEGIN\r\n        UPDATE &#x5B;dbo].&#x5B;customer_interactions]\r\n        SET sentiment =\r\n            (\r\n                SELECT TOP (1) JSON_VALUE(@response, '$.result.documents&#x5B;0].sentiment')\r\n            )\r\n        WHERE file_id = @file_id;\r\n    END;\r\nEND;\r\nGO\r\n\r\n-- create trigger_sentiment_analysis database trigger\r\n-- NB: for simplicity this trigger assumes single-row inserts and updates\r\nCREATE OR ALTER TRIGGER &#x5B;dbo].&#x5B;trigger_sentiment_analysis]\r\nON &#x5B;dbo].&#x5B;customer_interactions]\r\nAFTER INSERT, UPDATE\r\nAS\r\nBEGIN\r\n    SET NOCOUNT ON;\r\n    DECLARE @file_id UNIQUEIDENTIFIER;\r\n    SELECT @file_id = inserted.file_id\r\n    FROM inserted;\r\n    EXEC usp_run_sentiment_analysis @file_id = @file_id;\r\nEND;\r\nGO\r\n\r\nALTER TABLE &#x5B;dbo].&#x5B;customer_interactions] ENABLE TRIGGER &#x5B;trigger_sentiment_analysis];\r\nGO\r\n<\/pre>\n<p class=\"Standard\" style=\"text-align: justify;\">The 3-stage logic in the above stored procedure dictates that we extract the customer\u2019s text from our JSON entry, omitting everything in the dialog not spoken by the customer, call our sentiment analysis API with this data to determine the sentiment value and, finally, persist it in the target table against the file_id in question. 
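As a quick illustration of the payload shapes involved, the Python sketch below mirrors the request document built by usp_run_sentiment_analysis and extracts the sentiment from a hand-written mock of the wrapped sentiment response. The JSON paths come from the T-SQL above; the mock values are assumptions:

```python
# Request body shape sent to the Text Analytics v3.0 sentiment endpoint,
# mirroring the @payload string built in usp_run_sentiment_analysis
request_payload = {
    "documents": [{"id": "1", "language": "en", "text": "My invoice is wrong."}]
}

# Mock of the sp_invoke_external_rest_endpoint envelope wrapping the
# service reply; only the path layout is taken from the stored procedure
mock_response = {
    "result": {"documents": [{"id": "1", "sentiment": "negative"}]}
}

# Equivalent of JSON_VALUE(@response, '$.result.documents[0].sentiment')
sentiment = mock_response["result"]["documents"][0]["sentiment"]
print(sentiment)  # the value persisted into customer_interactions.sentiment
```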
All that\u2019s left to do is create another database trigger which fires only if the sentiment value is negative and, you guessed it, calls a stored procedure responsible for running an Azure Logic App.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">This is our third Azure service integration using a REST endpoint in SQL DB and it just goes to show how versatile this functionality is and how it opens a world of possibilities, all within the confines of the database and with little to no development required outside of T-SQL.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">For this part, let\u2019s create a small Logic App which triggers a &#8216;Send an email (V2)&#8217; task when an HTTP request is received, the final stored procedure calling this workflow, and a database trigger to automate the execution process. Also, to make it more interesting, we\u2019ll pass the customer&#8217;s feedback text and the date\/time the file was created into our email content so that whoever receives this correspondence does not have to wonder what text triggered this workflow.<\/p>\n<p class=\"Standard\" style=\"text-align: justify;\">Our Logic App and the final piece of SQL code will look like this:<\/p>\n<p><a href=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2024\/03\/Azure_SQLDB_HTTP_REST_Endpoint_Email_Logic_App.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-4818\" src=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2024\/03\/Azure_SQLDB_HTTP_REST_Endpoint_Email_Logic_App.png\" alt=\"\" width=\"580\" height=\"773\" srcset=\"https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/03\/Azure_SQLDB_HTTP_REST_Endpoint_Email_Logic_App.png 762w, https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/03\/Azure_SQLDB_HTTP_REST_Endpoint_Email_Logic_App-225x300.png 225w\" sizes=\"auto, (max-width: 580px) 100vw, 580px\" \/><\/a><\/p>\n<pre class=\"brush: sql; highlight: [19,20,21,22,23]; 
title: ; notranslate\" title=\"\">\r\n-- create usp_send_email_on_negative_sentiment stored procedure\r\nCREATE OR ALTER PROCEDURE &#x5B;dbo].&#x5B;usp_send_email_on_negative_sentiment]\r\n(\r\n    @insert_date DATETIME2,\r\n    @customer_feedback NVARCHAR(MAX)\r\n)\r\nAS\r\nBEGIN\r\n    DECLARE @url NVARCHAR(MAX)\r\n        = N'Your_Logic_App_URL';\r\n    DECLARE @response NVARCHAR(MAX);\r\n    DECLARE @payload NVARCHAR(MAX) = N'{\r\n        &quot;feedback&quot;:  &quot;' + @customer_feedback + N'&quot;,\r\n        &quot;date&quot;: &quot;' + CONVERT(VARCHAR, @insert_date, 0) + N'&quot;\r\n\t\t}';\r\n\r\n    DECLARE @headers NVARCHAR(102) = N'{&quot;Content-Type&quot;: &quot;application\/json&quot;}';\r\n\r\n    EXEC sp_invoke_external_rest_endpoint @url = @url,\r\n                                          @method = 'POST',\r\n                                          @headers = @headers,\r\n                                          @payload = @payload,\r\n                                          @response = @response OUTPUT;\r\nEND;\r\nGO\r\n\r\n-- create trigger_send_email_on_negative_sentiment database trigger\r\nCREATE OR ALTER TRIGGER &#x5B;dbo].&#x5B;trigger_send_email_on_negative_sentiment]\r\nON &#x5B;dbo].&#x5B;customer_interactions]\r\nAFTER UPDATE\r\nAS\r\nBEGIN\r\n    SET NOCOUNT ON;\r\n    DECLARE @JSON NVARCHAR(MAX);\r\n    SELECT @JSON = inserted.payload\r\n    FROM Inserted;\r\n    DECLARE @customer_feedback NVARCHAR(MAX);\r\n    SET @customer_feedback =\r\n    (\r\n        SELECT STRING_AGG(message, ' ') AS customer_text\r\n        FROM\r\n            OPENJSON(@JSON, '$.conversation')\r\n            WITH\r\n            (\r\n                speaker NVARCHAR(100),\r\n                message NVARCHAR(MAX) '$.message'\r\n            )\r\n        WHERE speaker = 'Customer'\r\n    );\r\n    DECLARE @insert_date DATETIME2;\r\n    SELECT @insert_date = inserted.insert_datetime\r\n    FROM inserted;\r\n    DECLARE @sentiment VARCHAR(20);\r\n    SELECT 
@sentiment = inserted.sentiment\r\n    FROM inserted;\r\n    IF @sentiment = 'negative'\r\n    BEGIN\r\n        EXEC usp_send_email_on_negative_sentiment @insert_date = @insert_date,\r\n                                                  @customer_feedback = @customer_feedback;\r\n    END;\r\nEND;\r\nGO\r\n\r\nALTER TABLE &#x5B;dbo].&#x5B;customer_interactions] ENABLE TRIGGER &#x5B;trigger_send_email_on_negative_sentiment];\r\nGO\r\n<\/pre>\n<p class=\"Standard\" style=\"text-align: justify;\">We can run this workflow end-to-end by uploading our sample JSON conversation file into the Azure storage container and, provided we have the Logic App and Azure Function running (either in Azure or locally with Azure Functions Core Tools), the whole process should only take a few seconds to complete (you can confirm this by looking at the timestamps) and result in an email notification being received &#8211; see the screenshots below (click to enlarge).<\/p>\n<p><a href=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2024\/04\/Azure_SQLDB_HTTP_REST_Endpoint_Workflow_Completion.png\"><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-4840\" src=\"http:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/\/2024\/04\/Azure_SQLDB_HTTP_REST_Endpoint_Workflow_Completion.png\" alt=\"\" width=\"580\" height=\"1017\" srcset=\"https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/04\/Azure_SQLDB_HTTP_REST_Endpoint_Workflow_Completion.png 1210w, https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/04\/Azure_SQLDB_HTTP_REST_Endpoint_Workflow_Completion-171x300.png 171w, https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/04\/Azure_SQLDB_HTTP_REST_Endpoint_Workflow_Completion-584x1024.png 584w, https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/04\/Azure_SQLDB_HTTP_REST_Endpoint_Workflow_Completion-768x1346.png 768w, 
https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/04\/Azure_SQLDB_HTTP_REST_Endpoint_Workflow_Completion-876x1536.png 876w, https:\/\/bicortex.com\/bicortex\/wp-content\/post_content\/2024\/04\/Azure_SQLDB_HTTP_REST_Endpoint_Workflow_Completion-1168x2048.png 1168w\" sizes=\"auto, (max-width: 580px) 100vw, 580px\" \/><\/a><\/p>\n<h3 style=\"text-align: center;\">Conclusion<\/h3>\n<p class=\"Standard\" style=\"text-align: justify;\">Using Azure SQL DB REST endpoint integration, a large number of Azure services can be interfaced with Azure SQL DB, further expanding and extending the platform\u2019s capability. These workflows allow the SQL database to act as connective tissue for data interoperability across API-enabled interfaces. In addition to workflow activation, e.g. triggering Logic Apps or Azure Functions as demonstrated above, additional use cases include further integration with event-based architectures, e.g. Azure Event Hubs, creating data streams for fraud detection via Stream Analytics, website updates by broadcasting SignalR messages, or cache invalidation using Azure Functions. As long as you don\u2019t think of the feature as a MuleSoft or Boomi replacement and understand the limitations of this approach, querying REST endpoints with Azure SQL Database opens up a lot of possibilities.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Introduction SQL Server in-database REST API integration was always roll-your-own, bubble gum and duct tape type of affair \u2013 it was possible but never easy. 
Some may argue it\u2019s for all the right reasons as imposing strict distinction between database and application layers created a well-defined separation of concerns and delegated the former to do [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[64,56,5,46],"tags":[65,85,93,41,49,19],"class_list":["post-4806","post","type-post","status-publish","format-standard","hentry","category-azure","category-programming","category-sql","category-sql-server","tag-azure","tag-azure-functions","tag-azure-logic-app","tag-python","tag-sql","tag-sql-server"],"aioseo_notices":[],"jetpack_featured_media_url":"","_links":{"self":[{"href":"https:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/posts\/4806","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/comments?post=4806"}],"version-history":[{"count":35,"href":"https:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/posts\/4806\/revisions"}],"predecessor-version":[{"id":4868,"href":"https:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/posts\/4806\/revisions\/4868"}],"wp:attachment":[{"href":"https:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/media?parent=4806"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/categories?post=4806"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/bicortex.com\/bicortex\/wp-json\/wp\/v2\/tags?post=4806"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}