recursive.codes

The Personal Blog of Todd Sharp

Found 21 posts matching "streaming".

Complete Developers Guide To The Oracle Notification Service
https://recursive.codes/blog/post/80
When you compile a list of your favorite features in the cloud, I'd be willing to bet that notifications aren't typically cracking the "top 10" of many developers' lists. It's more of a "utility" feature. Kind of like the tires on your car - you need to...
Getting Started With RabbitMQ In The Oracle Cloud
https://recursive.codes/blog/post/88
In this post I'm going to walk you through the steps that it takes to get RabbitMQ up and running for lightweight messaging in the Oracle Cloud. You're probably asking yourself why you might consider using RabbitMQ for messaging when we already have the...
Publishing To Object Storage From Oracle Streaming Service
https://recursive.codes/blog/post/92
...used to configure Connect. We'll need some of the values that we collected earlier, so keep those handy. We'll also need our streaming username and our auth token. Create a file called /projects/object-storage-demo/connect-distributed.properties and...
Fighting Diabetes With Technology - How I Built An App To Count Carbs And Calculate Insulin Doses
https://recursive.codes/blog/post/94
I'd like to tell you a story about my daughter Ava. She’s a typical 13-year-old girl for the most part – she’s smart, very mature and organized - involved in honor chorus, beta club and activities like archery. She can be quiet and shy around new people...
Using Kafka Connect With Oracle Streaming Service And Autonomous DB
https://recursive.codes/blog/post/96
...of the console by clicking 'Analytics' -> 'Streaming' from the console burger menu: Next, in the left hand menu of the streaming landing page, select 'Stream Pools'. Then click 'Create Stream Pool'. Give it a name and check 'Auto Create To...
Migrate Your Kafka Workloads To Oracle Cloud Streaming
https://recursive.codes/blog/post/114
...less painful. I want to talk about one of those "wins" today in this post. Kafka is undoubtedly popular for data streaming (and more) because it works well, is reliable and there are a number of SDK implementations that make working ...
Creating An ATP Instance With The OCI Service Broker
https://recursive.codes/blog/post/1155
We recently announced the release of the OCI Service Broker for Kubernetes, an implementation of the Open Service Broker API that streamlines the process of provisioning and binding to services that your cloud native applications depend on. ...
Getting Your Feet Wet With OCI Streams
https://recursive.codes/blog/post/1163
Back in December we announced the development of a new service on Oracle Cloud Infrastructure called Streaming.  The announcement, product page and documentation have a ton of use cases and information on why you might use Streaming...
Back To The Database - Part 3: Publishing Database Changes To A Stream
https://recursive.codes/blog/post/1360
In the last post, we talked at great length about consuming a stream in your Autonomous DB instance and using the messages in that stream to insert, update and delete records in a table in your DB. I highly suggest you read that post first if you haven't...
Back To The Database - Part 1: Preparing To Persist Data From A Stream
https://recursive.codes/blog/post/1364
...roll your own with Kafka. You should use OSS, but you don’t have to and aren’t locked in to using OSS. If you’re new to OSS (or streaming as a concept), check out some of my posts here on the developer blog: https://blogs.oracle.com/developers/gettin...
Back To The Database - Prologue: What's Old Is New Again
https://recursive.codes/blog/post/1366
We’re starting to see some influencers and larger organizations scale back from an “all in” and “by the book” stance on microservices and advocate for a more sensible and logical approach in 2020.  There’s nothing wrong with the microservice pattern...
Back To The Database - Part 2: Persisting Data From A Stream
https://recursive.codes/blog/post/1362
...We’ve covered a ton so far - we’ve created cloud credentials in our DB, learned about streams and how to get started with streaming in the Oracle Cloud, learned about stored procedures and implemented a procedure to read from a stream and insert data...
Easy Messaging With Micronaut's Kafka Support And Oracle Streaming Service
https://recursive.codes/blog/post/102
...Copy the stream pool OCID and keep it handy for later: Create A Streams User Next up, let's create a dedicated user for the streaming service. Click on 'Users' under 'Identity' in the console sidebar menu: Click 'Create User' and populate the dial...
Tracking & Analyzing Water Usage Data in the Cloud with a Flow Sensor, Microcontroller & Autonomous DB
https://recursive.codes/blog/post/1610
This past summer, I was lucky enough to get to spend some virtual time with some of the awesome interns here at Oracle. It makes me happy to see so many college students with such a passion for engineering and technology. One of the interns that I got to...
Archiving Stream Data To Object Storage With Service Connectors
https://recursive.codes/blog/post/1886
I could use an existing Object Storage bucket to archive the stream data, but for this demo, I’ll create a new bucket called streaming-archive-demo-0 that will contain all of the archived data.  Create Service Connector For simple archiving ...
Can You Invoke OCI REST APIs Directly from an Arduino (ESP-32)?
https://recursive.codes/blog/post/1906
...required variables for the tenancy and instantiate the library as we did above. Then we’ll set a few variables necessary for streaming. Next, create a postMessage() function. In this function, we’ll create a JSON object and Base64 encode the value...
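The excerpt above describes building a postMessage() helper that wraps a reading in a JSON object and Base64 encodes the value before publishing it to the stream (the OCI Streaming REST API expects message values as Base64-encoded strings). The post itself does this in Arduino C++ on an ESP-32; purely as a sketch of the payload shape, here is a hypothetical Java equivalent. The field names and the encodeReading() helper are invented for illustration.

import java.nio.charset.StandardCharsets;
import java.util.Base64;
import java.util.Locale;

public class MessageEncoder {

    // Build a tiny JSON document and Base64 encode it so it can be sent as the
    // value of a stream message. The field names are made up for this sketch.
    public static String encodeReading(String sensor, double reading) {
        String json = String.format(Locale.US, "{\"sensor\":\"%s\",\"reading\":%.2f}", sensor, reading);
        return Base64.getEncoder().encodeToString(json.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        // Example: encode a hypothetical flow-sensor reading.
        System.out.println(encodeReading("flow-sensor-1", 42.50));
    }
}

On the consuming side the steps simply reverse: Base64 decode the message value, then parse the JSON.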
Message Driven Microservices & Monoliths with Micronaut - Part 4: Using RabbitMQ for Messaging
https://recursive.codes/blog/post/1646
Previously, we looked in-depth at messaging for your microservice and monoliths by using an e-commerce example. We first set up a Kafka broker and created an order microservice that published new orders to an order topic. In the next post, we created ...
Message Driven Microservices & Monoliths with Micronaut - Part 3: Switching to Oracle Streaming Service
https://recursive.codes/blog/post/1648
So far in this series (part 1, part 2), we’ve covered both the “how” and the “why” as it relates to messaging in your modern web applications. We used an e-commerce example to illustrate the need for messaging and we looked in depth at launching a local...
Message Driven Microservices & Monoliths with Micronaut - Part 1: Installing Kafka & Sending Your First Message
https://recursive.codes/blog/post/1652
...unfamiliar or uncomfortable with said tool or service. If that’s you, no worries - let's quickly discuss. Kafka is an open source streaming tool that lets you produce (sometimes called publish) key/value based messages to a queue and later on...
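Since the excerpt above sums up Kafka's core model (publishing key/value messages to a topic for later consumption), here is a minimal producer sketch in Java to make that concrete. It is not taken from the post; the broker address localhost:9092 and the topic name test-topic are placeholder assumptions, and it assumes the kafka-clients library is on the classpath.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class QuickProducer {
    public static void main(String[] args) {
        // Minimal producer configuration; localhost:9092 is a placeholder broker address.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        // Publish a single key/value message to a (hypothetical) topic named test-topic.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("test-topic", "greeting", "Hello, Kafka!"));
        }
    }
}

A consumer subscribed to the same topic can read the message back later, which is the other half of the produce/consume model the excerpt introduces.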
Bridging MQTT and Oracle Streaming Service (OSS) with Node.js
https://recursive.codes/blog/post/2054
...on this blog, so you are hopefully already familiar with it, but if not you can think of it as a real-time, serverless event streaming platform that just happens to be compatible with Apache Kafka. When it comes to messaging, I've personally found tha...
Building Messaging Bridges with Node-RED
https://recursive.codes/blog/post/2056
Over the last few posts, we've looked in detail at Oracle Advanced Queuing. Most recently we looked at "bridges" - or applications that helped us broker messages between normally incompatible protocols like MQTT, AQ, and Oracle Streaming Service (OSS)...