With the recent announcement of a new partnership between Microsoft and Oracle for Oracle database services, I wanted to look further into setting up log collection from Oracle Cloud to Microsoft Sentinel.
When I started digging, there wasn’t much information available (except some minor blog posts from the Oracle side), and I wanted to use something simpler based on message streaming.
Microsoft has created an integration available in the Content Hub; this solution deploys a set of Analytics Rules, Hunting Queries, and Workbooks.
Once you’ve deployed the Content Hub solution, you will see a new data connector, which we will then use to set up the integration. (And yes, this integration uses Azure Functions to pull log data from the message streaming service in OCI.)
When you go into the connector, you will see instructions for deploying the ARM template that sets up the Azure Function. I’ll get back to that, but first, let us set up the necessary components in Oracle Cloud.
Firstly, you need to have an Oracle Cloud tenant and access to the root compartment; from there we need to generate an API key. Click on the profile picture and select My Profile.
Click on API Keys, then click Add API Key. From there, click Generate API Key Pair, download the private key, and then click Add. (We need the private key for authenticating from the Function later.)
You will then get a Configuration File Preview; copy all the content here, since we will need it later.
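The preview follows the standard OCI SDK configuration file layout and will look roughly like this (the OCID values and key path below are placeholders, not real identifiers):

```ini
[DEFAULT]
user=ocid1.user.oc1..<unique_user_ID>
fingerprint=<your_api_key_fingerprint>
tenancy=ocid1.tenancy.oc1..<unique_tenancy_ID>
region=<your_home_region>
key_file=<path to the private key you downloaded>
```

Keep the fingerprint, user OCID, and tenancy OCID handy; they go into the ARM template parameters later.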
Now we need to configure the streaming of the logs. First, go into the Streaming service and create a stream pool (make sure it is publicly available). Next, go into Streams and create a new stream that uses the stream pool you just created. Once the creation is done, make sure that you copy the OCID and Message Endpoint of the stream.
(DO NOT CLICK PRODUCE TEST MESSAGE)
Next, go into Service Connectors and click “Create Service Connector”. Here we define the source as Logging and the target as Streaming.
You can leave the Logging source at the defaults if you want to collect everything; the target is the stream we just created.
Now that we have the integration in place, you can go back to the stream and check that audit logs are being sent to the message bus. This can be done by clicking Load Messages.
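Note that the Streaming API returns message values base64-encoded, so what you fetch programmatically needs decoding before it can be parsed as JSON. A minimal sketch (the payload below is an invented illustration, not a real OCI audit record):

```python
import base64
import json

# A message shaped like what a streaming GetMessages call returns;
# the "value" field carries the log entry as base64-encoded JSON.
raw_message = {
    "key": None,
    "value": base64.b64encode(
        json.dumps({
            "type": "com.oraclecloud.audit.event",
            "source": "ConsoleSignIn",
        }).encode("utf-8")
    ).decode("ascii"),
}

# Decode the value back into a dictionary before processing.
decoded = json.loads(base64.b64decode(raw_message["value"]))
print(decoded["source"])
```

This is essentially what the connector’s Azure Function has to do for every message it pulls before forwarding the entry to Sentinel.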
Now that logs are being sent, we can go back to the Sentinel configuration. The parameters for the ARM template will look something like this (based upon the configuration content you copied earlier, plus the Workspace ID for your Sentinel workspace).
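As an illustration, the values you will be asked for map roughly to the settings below (the parameter names here are indicative; the exact names are shown on the connector’s deployment page, and all values are placeholders):

```ini
user = ocid1.user.oc1..<unique_user_ID>
fingerprint = <your_api_key_fingerprint>
tenancy = ocid1.tenancy.oc1..<unique_tenancy_ID>
region = <your_home_region>
Message Endpoint = <the stream's Message Endpoint URL>
StreamOcid = ocid1.stream.oc1..<unique_stream_ID>
WorkspaceID = <your Sentinel workspace ID>
WorkspaceKey = <your Sentinel workspace primary key>
```

Everything except the last two values comes from the Configuration File Preview and the stream details you copied in OCI.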
The template then deploys the Azure Function, which runs a set of Python functions triggered every 5 minutes to collect and push data into the Log Analytics workspace.
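The push side uses the Log Analytics HTTP Data Collector API, which authenticates each POST with an HMAC-SHA256 signature built from the workspace key. A self-contained sketch of how that authorization header is constructed (the workspace ID and key below are dummies):

```python
import base64
import hashlib
import hmac

def build_signature(workspace_id: str, shared_key: str, date: str,
                    content_length: int, method: str = "POST",
                    content_type: str = "application/json",
                    resource: str = "/api/logs") -> str:
    """Build the SharedKey authorization header for the
    Azure Log Analytics HTTP Data Collector API."""
    # Canonical string: method, body length, content type,
    # x-ms-date header, and the resource path, newline-separated.
    string_to_hash = (f"{method}\n{content_length}\n{content_type}\n"
                      f"x-ms-date:{date}\n{resource}")
    # The workspace key is base64; decode it to get the HMAC key.
    decoded_key = base64.b64decode(shared_key)
    hashed = hmac.new(decoded_key, string_to_hash.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(hashed).decode()}"

# Placeholder credentials for illustration only:
sig = build_signature(
    workspace_id="00000000-0000-0000-0000-000000000000",
    shared_key=base64.b64encode(b"dummy-workspace-key").decode(),
    date="Mon, 01 Jan 2024 00:00:00 GMT",
    content_length=100,
)
print(sig)
```

The resulting header goes into the `Authorization` field of the POST, alongside the matching `x-ms-date` value and a `Log-Type` header that determines the custom table name.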
Once the function is deployed, you can see that all the configuration, including the credentials, is stored directly in the application settings of the Function App. Yikes! I’ll come back with an Azure Key Vault integration post later. But if you entered the credentials correctly, you should be able to see this when the function runs for the first time.
You can then view the logs in the custom log table called OCI_Logs_CL.