Monday, October 16, 2023

Using Dead Letter Queues with SQL Stream Builder


What is a dead letter queue (DLQ)?

Cloudera SQL Stream Builder (SSB) gives non-technical users the power of a unified stream processing engine, so they can integrate, aggregate, query, and analyze both streaming and batch data sources in a single SQL interface. This allows business users to define events of interest that they need to continuously monitor and respond to quickly. A dead letter queue (DLQ) can be used when there are deserialization errors as events are consumed from a Kafka topic. The DLQ is useful for spotting failures caused by invalid input in the source Kafka topic, and it makes it possible to record and debug problems related to those invalid inputs.
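To build intuition before touching SSB, the DLQ pattern itself can be sketched in a few lines of Python. This is a minimal, self-contained sketch: the in-memory lists stand in for the source topic, the downstream consumer, and the DLQ topic (an assumption for illustration; in SSB the routing happens inside the generated Flink job).

```python
import json

def route_records(raw_records):
    """Try to deserialize each record; valid records go downstream,
    records that fail deserialization are routed to the DLQ."""
    downstream, dlq = [], []
    for raw in raw_records:
        try:
            downstream.append(json.loads(raw))  # the deserialization step
        except json.JSONDecodeError:
            dlq.append(raw)  # keep the original payload for debugging
    return downstream, dlq

# One valid sensor reading and one malformed record:
good, bad = route_records(['{"name": "sensor-1", "temp": 32}', 'invalid data'])
print(good)  # [{'name': 'sensor-1', 'temp': 32}]
print(bad)   # ['invalid data']
```

The key point is that the bad record is preserved verbatim rather than dropped, so it can be inspected later.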

Creating a DLQ

We will use the example schema definition provided by SSB to demonstrate this feature. The schema has two properties, “name” and “temp” (for temperature), to capture sensor data in JSON format. The first step is to create two Kafka topics, “sensor_data” and “sensor_data_dlq”, which can be done as follows:

kafka-topics.sh --bootstrap-server <bootstrap-server> --create --topic sensor_data --replication-factor 1 --partitions 1

kafka-topics.sh --bootstrap-server <bootstrap-server> --create --topic sensor_data_dlq --replication-factor 1 --partitions 1

Once the Kafka topics are created, we can set up a Kafka source in SSB. SSB offers a convenient way to work with Kafka, as the whole setup can be done from the UI. In the Project Explorer, open the Data Sources folder. Right-clicking “Kafka” brings up the context menu where we can open the creation modal window.

We need to provide a unique name for this new data source, the list of brokers, and the protocol in use:

After the new Kafka source is successfully registered, the next step is to create a new virtual table. We can do this from the Project Explorer by right-clicking “Virtual Tables” and choosing “New Kafka Table” from the context menu. Let’s fill out the form with the following values:

  • Table Name: Any unique name; we’ll use “sensors” in this example
  • Kafka Cluster: Choose the Kafka source registered in the previous step
  • Data Format: JSON
  • Topic Name: “sensor_data”, which we created earlier

 

Under the “Schema Definition” tab, we can see that the provided example has the two fields, “name” and “temp,” as discussed earlier. The last step is to set up the DLQ functionality, which we can do by going to the “Deserialization” tab. The “Deserialization Failure Handler Policy” drop-down has the following options:

  • “Fail”: Let the job crash, after which the auto-restart setting dictates what happens next
  • “Ignore”: Ignores the message that could not be deserialized and moves on to the next one
  • “Ignore and Log”: Same as Ignore, but logs each deserialization failure it encounters
  • “Save to DLQ”: Sends the invalid message to the configured Kafka topic
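For intuition, the four policies above can be modeled as plain Python behaviors. This is only a sketch of the semantics (the real handling happens inside the Flink job that SSB generates; the function and policy strings here are illustrative assumptions):

```python
import json
import logging

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("deserializer")

def consume(raw_records, policy="Save to DLQ"):
    """Apply a deserialization-failure policy modeled on SSB's options."""
    results, dlq = [], []
    for raw in raw_records:
        try:
            results.append(json.loads(raw))
        except json.JSONDecodeError:
            if policy == "Fail":
                raise                      # job crashes; restart setting takes over
            elif policy == "Ignore":
                continue                   # silently skip the bad record
            elif policy == "Ignore and Log":
                log.warning("could not deserialize: %r", raw)
            elif policy == "Save to DLQ":
                dlq.append(raw)            # forward to the DLQ topic
    return results, dlq

records = ['{"name": "sensor-1", "temp": 32}', 'invalid data']
ok, dead = consume(records)                # default: Save to DLQ
print(len(ok), len(dead))                  # 1 1
```

“Save to DLQ” is usually the most useful choice in production, since it neither halts the pipeline nor discards the evidence needed for debugging.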

Let’s select “Save to DLQ” and choose the previously created “sensor_data_dlq” topic from the “DLQ Topic Name” drop-down. We can click “Create and Review” to create the new virtual table.

Testing the DLQ

First, create a new SSB job from the Project Explorer. We can run the following SQL query to consume the data from the Kafka topic:

SELECT * FROM sensors;

In the next step we’ll use the console producer and consumer command line tools to interact with Kafka. Let’s send a valid input to the “sensor_data” topic and check whether it is consumed by our running job:

kafka-console-producer.sh --broker-list <broker> --topic sensor_data

>{"title":"sensor-1", "temp": 32}

Checking back on the SSB UI, we can see that the new message has been processed:

Now, send an invalid input to the source Kafka topic:

kafka-console-producer.sh --broker-list <broker> --topic sensor_data

>invalid data

We won’t see any new messages in SSB, since the invalid input cannot be deserialized. Let’s check the DLQ topic we set up earlier to see whether the invalid message was captured:

kafka-console-consumer.sh --bootstrap-server <server> --topic sensor_data_dlq --from-beginning

invalid data

The invalid input is there, which verifies that the DLQ functionality is working correctly and allows us to investigate any deserialization errors further.
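The same sanity check can be scripted: every record captured in the DLQ should fail to parse as JSON. This is a sketch; the hard-coded record matches what we produced above, and feeding it real consumer output is an assumption left to the reader.

```python
import json

def is_deserializable(record):
    """Return True if the record parses as JSON, False otherwise."""
    try:
        json.loads(record)
        return True
    except json.JSONDecodeError:
        return False

# Records read back from the DLQ topic should all be invalid JSON:
dlq_records = ["invalid data"]
assert not any(is_deserializable(r) for r in dlq_records)
print("all DLQ records are invalid JSON")
```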

Conclusion

In this blog post, we covered the capabilities of the DLQ feature in Flink and SSB. This feature is very useful for gracefully handling failures in a data pipeline caused by invalid data. Using this capability, it is very easy and quick to find out whether there are any bad records in the pipeline and where the root cause of those bad records lies.

Anybody can try out SSB using the Stream Processing Community Edition (CSP-CE). CE makes developing stream processors easy, as it can be done right from your desktop or any other development node. Analysts, data scientists, and developers can now evaluate new features, develop SQL-based stream processors locally using SQL Stream Builder powered by Flink, and develop Kafka Consumers/Producers and Kafka Connect Connectors, all locally before moving to production in CDP.


