recursive.codes



The Personal Blog of Todd Sharp

Reliably Requesting REST Results Right From Your SQL Scripts

Posted By: Todd Sharp on 8/19/2020 12:00 GMT
Tagged: Cloud, Database

I recently blogged a comprehensive guide to invoking serverless functions in the Oracle Cloud. I was quite sure that I had included every possible method when I published that article, but a few days later it dawned on me that I had missed one. Granted, it may be more of an edge case than a common way to invoke your serverless function, but I want to highlight it in this blog post because the same technique can be used to invoke other REST APIs (some of which might come as a surprise to you).

Calling Oracle Cloud REST APIs

Create Credentials

Note! Your credentials are stored in an encrypted format in the database.

Before we can make our call to the OCI REST API to invoke our serverless function, we must first store the credentials that will be used to authenticate our REST call. To do this, we'll use the DBMS_CLOUD.CREATE_CREDENTIAL procedure. We only have to do this once; later on, we'll refer to the stored credentials by credential_name, which the script uses to securely retrieve and decrypt them.

Not Set Up Yet? The credentials we're using here are the same credentials that you use on your local machine for the OCI CLI. If you haven't yet set up the OCI CLI, see the instructions here. Note that DBMS_CLOUD.CREATE_CREDENTIAL does not support PEM keys with passphrases.

To create our credentials, we can run the following:
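A minimal sketch (the credential name OCI_CRED is arbitrary, and the OCID, key, and fingerprint values are placeholders for your own):

```sql
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'OCI_CRED',
    user_ocid       => 'ocid1.user.oc1..[your-user-ocid]',
    tenancy_ocid    => 'ocid1.tenancy.oc1..[your-tenancy-ocid]',
    private_key     => 'MIIEvQIBADANB...',  -- key content only, see note below
    fingerprint     => 'a1:b2:c3:d4:...'
  );
END;
/
```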

The private_key value above is just the key content, without the header/footer and without any line wraps. The rest of the data is just as it appears in your ~/.oci/config file. If for some reason you need to delete a credential, you can drop it by name:
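Assuming the credential above was stored as OCI_CRED:

```sql
BEGIN
  DBMS_CLOUD.DROP_CREDENTIAL(credential_name => 'OCI_CRED');
END;
/
```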

Invoke Serverless Function

To make our REST call from our script, we'll use the DBMS_CLOUD.SEND_REQUEST function. That looks like so:
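A sketch, assuming the OCI_CRED credential from above; the function's invoke endpoint hostname and OCID are placeholders for your own:

```sql
DECLARE
  resp DBMS_CLOUD_TYPES.resp;
BEGIN
  resp := DBMS_CLOUD.SEND_REQUEST(
    credential_name => 'OCI_CRED',
    uri    => 'https://[your-invoke-endpoint].us-phoenix-1.functions.oci.oraclecloud.com' ||
              '/20181201/functions/[your-function-ocid]/actions/invoke',
    method => DBMS_CLOUD.METHOD_POST,
    body   => UTL_RAW.cast_to_raw('{"name": "Todd"}')
  );
  -- requires SET SERVEROUTPUT ON to display
  DBMS_OUTPUT.put_line(DBMS_CLOUD.get_response_text(resp));
END;
/
```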

The SEND_REQUEST function expects a credential_name, the API URI, an HTTP request method and (optionally) a headers parameter (a JSON document) and a body parameter (a BLOB), and it returns an object of type DBMS_CLOUD_TYPES.RESP. The RESP object contains the response headers, text, and status code, and the DBMS_CLOUD package contains functions to help us retrieve each of those values. We'll use DBMS_CLOUD.GET_RESPONSE_TEXT to get the response body, which will be a JSON string. For convenience, let's wrap this up in a function.
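Something like this, with invoke_fn as a hypothetical name and the same endpoint placeholders as above:

```sql
CREATE OR REPLACE FUNCTION invoke_fn(p_body IN VARCHAR2) RETURN CLOB IS
  resp DBMS_CLOUD_TYPES.resp;
BEGIN
  resp := DBMS_CLOUD.SEND_REQUEST(
    credential_name => 'OCI_CRED',
    uri    => 'https://[your-invoke-endpoint].us-phoenix-1.functions.oci.oraclecloud.com' ||
              '/20181201/functions/[your-function-ocid]/actions/invoke',
    method => DBMS_CLOUD.METHOD_POST,
    body   => UTL_RAW.cast_to_raw(p_body)
  );
  RETURN DBMS_CLOUD.get_response_text(resp);
END;
/
```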

This serverless function is a simple "hello world" that expects a JSON object containing a single key - the name of the person to say hello to. We can call our SQL function like so:
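Assuming the wrapper was created as invoke_fn:

```sql
SELECT invoke_fn('{"name": "Todd"}') AS result FROM dual;
```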

Which gives us the JSON response from the serverless invocation.

Parse Function Result JSON

Having the JSON string is nice, but what if we want to do something further with the response? Say we want to join the data returned to us with another table, or return the data as a result set? We can take advantage of the JSON_TABLE function to help us out with that.
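A sketch, assuming the wrapper function is named invoke_fn and the serverless function returns a JSON object with a message key:

```sql
SELECT jt.message
FROM   JSON_TABLE(
         invoke_fn('{"name": "Todd"}'),
         '$'
         COLUMNS ( message VARCHAR2(200) PATH '$.message' )
       ) jt;
```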

Which gives us:

Nice!

Endless Possibilities

Now that we've seen how to invoke a serverless function with SEND_REQUEST, it's easy to see that this method can be used to call any REST endpoint in the Oracle Cloud.

List OCI Instances

Here's an example of getting a list of compute instances in a given compartment.
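A sketch (list_instances is a hypothetical name; the region in the ListInstances endpoint is a placeholder for your own):

```sql
CREATE OR REPLACE FUNCTION list_instances(p_compartment_id IN VARCHAR2) RETURN CLOB IS
  resp DBMS_CLOUD_TYPES.resp;
BEGIN
  resp := DBMS_CLOUD.SEND_REQUEST(
    credential_name => 'OCI_CRED',
    uri    => 'https://iaas.us-phoenix-1.oraclecloud.com/20160918/instances' ||
              '?compartmentId=' || p_compartment_id,
    method => DBMS_CLOUD.METHOD_GET
  );
  RETURN DBMS_CLOUD.get_response_text(resp);
END;
/
```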

Called like so:
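Assuming the list_instances function above, with a placeholder compartment OCID:

```sql
SELECT list_instances('ocid1.compartment.oc1..[your-compartment-ocid]') FROM dual;
```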

Parsed with JSON_TABLE:   
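A sketch pulling a few of the fields that the ListInstances response includes:

```sql
SELECT jt.*
FROM   JSON_TABLE(
         list_instances('ocid1.compartment.oc1..[your-compartment-ocid]'),
         '$[*]'
         COLUMNS (
           display_name    VARCHAR2(100) PATH '$.displayName',
           lifecycle_state VARCHAR2(30)  PATH '$.lifecycleState',
           shape           VARCHAR2(50)  PATH '$.shape',
           time_created    VARCHAR2(40)  PATH '$.timeCreated'
         )
       ) jt;
```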

Which produces:

As I mentioned above, this method can be used to invoke literally any OCI REST API.


Calling AWS REST APIs

Yes, you read that right. Not only can you use the DBMS_CLOUD package to store OCI credentials and invoke OCI REST APIs, you can also use it to store AWS credentials and invoke AWS REST APIs. Plenty of clients pursue a "multi-cloud" strategy, so it makes perfect sense to support invoking REST APIs for some of the more popular cloud providers in the industry. The process is mostly identical to the method we used above, with some slight but notable differences.

Create AWS Credentials

The first difference is the way we create our credentials for AWS. This is outlined in the documentation, but the key here is that there is an overloaded version of CREATE_CREDENTIAL that accepts a credential_name, username and password. For AWS, the username is your access key ID and the password is your secret access key (refer to the AWS documentation if you are not familiar with these two items). This looks something like so:
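A sketch, with AWS_CRED as an arbitrary credential name and placeholders for your keys:

```sql
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'AWS_CRED',
    username        => '[your-access-key-id]',
    password        => '[your-secret-access-key]'
  );
END;
/
```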

List S3 Buckets

The function that we create follows the same format as above, but points at the appropriate AWS REST endpoint. Here we have a function to list all of the S3 buckets for an account:
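A sketch (list_buckets is a hypothetical name; S3's ListBuckets operation is a plain GET against the service root):

```sql
CREATE OR REPLACE FUNCTION list_buckets RETURN CLOB IS
  resp DBMS_CLOUD_TYPES.resp;
BEGIN
  resp := DBMS_CLOUD.SEND_REQUEST(
    credential_name => 'AWS_CRED',
    uri    => 'https://s3.us-east-1.amazonaws.com/',
    method => DBMS_CLOUD.METHOD_GET
  );
  RETURN DBMS_CLOUD.get_response_text(resp);
END;
/
```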

Which we can call like so:
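Assuming the list_buckets function above:

```sql
SELECT list_buckets() FROM dual;
```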

Since Amazon thinks we are still in 1999, we can only receive XML back from the S3 REST API.

But that's perfectly OK because we can use XMLTABLE to work with XML:
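A sketch; note the S3 namespace declaration, without which the XPath expressions won't match:

```sql
SELECT xt.*
FROM   XMLTABLE(
         XMLNAMESPACES (DEFAULT 'http://s3.amazonaws.com/doc/2006-03-01/'),
         '/ListAllMyBucketsResult/Buckets/Bucket'
         PASSING XMLTYPE(list_buckets())
         COLUMNS
           bucket_name   VARCHAR2(100) PATH 'Name',
           creation_date VARCHAR2(40)  PATH 'CreationDate'
       ) xt;
```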

Which gives us:

Excellent!

Calling Azure REST APIs

We're not done yet! We can also work with Azure to list "containers" (buckets).

Create Azure Credentials

For Azure, the username is your Azure storage account name and the password is an Azure storage account access key.
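The same overloaded CREATE_CREDENTIAL that we used for AWS, with placeholder values:

```sql
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'AZURE_CRED',
    username        => '[your-storage-account-name]',
    password        => '[your-storage-account-access-key]'
  );
END;
/
```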

List Azure Storage Containers

The function:
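A sketch (list_containers is a hypothetical name; the List Containers operation is a GET against the Blob service root with comp=list):

```sql
CREATE OR REPLACE FUNCTION list_containers RETURN CLOB IS
  resp DBMS_CLOUD_TYPES.resp;
BEGIN
  resp := DBMS_CLOUD.SEND_REQUEST(
    credential_name => 'AZURE_CRED',
    uri    => 'https://[your-storage-account-name].blob.core.windows.net/?comp=list',
    method => DBMS_CLOUD.METHOD_GET
  );
  RETURN DBMS_CLOUD.get_response_text(resp);
END;
/
```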

The call:
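Assuming the list_containers function above:

```sql
SELECT list_containers() FROM dual;
```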

The proof that Microsoft also believes we are still stuck in the previous millennium:

The query to parse the XML into a usable format:
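A sketch against the EnumerationResults document that the List Containers operation returns:

```sql
SELECT xt.*
FROM   XMLTABLE(
         '/EnumerationResults/Containers/Container'
         PASSING XMLTYPE(list_containers())
         COLUMNS
           container_name VARCHAR2(100) PATH 'Name',
           last_modified  VARCHAR2(50)  PATH 'Properties/Last-Modified'
       ) xt;
```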

Summary

In this post, we looked at how to request REST results for an OCI tenancy with DBMS_CLOUD.SEND_REQUEST in your PL/SQL scripts. We also saw how that method can be utilized to call the REST APIs for your external cloud providers such as Microsoft Azure and Amazon Web Services.

Photo by Nathan Anderson on Unsplash



Related Posts

Querying Autonomous Database from an Oracle Function (The Quick, Easy & Completely Secure Way)


I've written many blog posts about connecting to an Autonomous DB instance in the past. Best practices evolve as tools, services, and frameworks become...

Brain to the Cloud - Part III - Examining the Relationship Between Brain Activity and Video Game Performance


In my last post, we looked at the technical aspects of my Brain to the Cloud project including much of the code that was used to collect and analyze the...

Brain to the Cloud - Part II - How I Uploaded My Brain to the Cloud


In my last post, we went over the inspiration, objectives, and architecture for my Brain to the Cloud project. In this post, we'll look in-depth at the...

Note: Comments are currently closed on this blog. Disqus is simply too bloated to justify its use with the low volume of comments on this blog. Please visit my contact page if you have something to say!