What is Kafka Connect?
Kafka Connect is the integration API for Apache Kafka. Check out this video for an overview of what Kafka Connect enables you to do, and how to do it.
There’s ways, and then there’s ways, to count the number of records/events/messages in a Kafka topic. Most of them are potentially inaccurate, or inefficient, or both. Here’s one that falls into the potentially inefficient category: using kafkacat to read all the messages and pipe them to wc, which with the -l flag will tell you how many lines there are; and since each message is a line, how many messages you have in the Kafka topic:
$ kafkacat -b broker:29092 -t mytestopic -C -e -q | wc -l
3
Google Chrome automagically adds sites that you visit which support searching to a list of custom search engines. For each one you can set a keyword which activates it, so based on the above list, if I want to search Amazon I can just type a, then <tab>, and then my search term.
I had the pleasure of presenting at DataEngBytes recently, and am delighted to share with you the 🗒️ slides, 👾 code, and 🎥 recording of my ✨brand new talk✨:
A tiny snippet since I wasted 10 minutes going around the houses on this one…
tl;dr: If you try to create a command that is not in lower case (e.g. Alert, not alert) then the setMyCommands API will return BOT_COMMAND_INVALID.
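For illustration, here’s a minimal sketch in Go of calling setMyCommands over HTTP (the bot token is a placeholder, and the command description is made up):

package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Command names must be lower case: sending "Alert" instead of
	// "alert" would get BOT_COMMAND_INVALID back from the API.
	payload := []byte(`{"commands":[{"command":"alert","description":"Set an alert"}]}`)

	// <TOKEN> is a placeholder for your bot’s API token.
	resp, err := http.Post(
		"https://api.telegram.org/bot<TOKEN>/setMyCommands",
		"application/json",
		bytes.NewReader(payload),
	)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(string(body)) // {"ok":true,"result":true} on success
}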
The server sends Transfer-Encoding: chunked data, and you want to work with the data as you get it, instead of waiting for the server to finish and the EOF to fire before you process the data?
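A minimal sketch of the approach in Go (the URL is a placeholder for a server streaming a chunked response):

package main

import (
	"bufio"
	"fmt"
	"net/http"
)

func main() {
	// Placeholder endpoint that streams a chunked response.
	resp, err := http.Get("http://localhost:8080/stream")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// resp.Body is an io.Reader, so each chunk can be consumed as it
	// arrives rather than buffering the whole response until EOF.
	scanner := bufio.NewScanner(resp.Body)
	for scanner.Scan() {
		fmt.Println("got:", scanner.Text())
	}
	if err := scanner.Err(); err != nil {
		panic(err)
	}
}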
When I set out to learn Go one of the aims I had in mind was to write a version of this little Python utility which accompanies a blog I wrote recently about understanding and diagnosing problems with Kafka advertised listeners. Having successfully got Producer, Consumer, and AdminClient API examples working, it is now time to turn to that task.
Having written my first Kafka producer in Go, and even added error handling to it, the next step was to write a consumer. It follows closely the pattern of Producer code I finished up with previously, using the channel-based approach for the Consumer:
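In outline, a channel-based consumer with the confluent-kafka-go client looks something like this (broker address, topic, and group id are placeholders; a sketch rather than the post’s full code):

package main

import (
	"fmt"

	"github.com/confluentinc/confluent-kafka-go/kafka"
)

func main() {
	// go.events.channel.enable switches the consumer to the
	// channel-based API, delivering events on c.Events().
	c, err := kafka.NewConsumer(&kafka.ConfigMap{
		"bootstrap.servers":        "localhost:9092",
		"group.id":                 "my-group",
		"go.events.channel.enable": true,
		"auto.offset.reset":        "earliest",
	})
	if err != nil {
		panic(err)
	}
	defer c.Close()

	if err := c.Subscribe("mytestopic", nil); err != nil {
		panic(err)
	}

	// Consume by ranging over the Events channel.
	for ev := range c.Events() {
		switch e := ev.(type) {
		case *kafka.Message:
			fmt.Printf("Message on %s: %s\n", e.TopicPartition, string(e.Value))
		case kafka.Error:
			fmt.Printf("Error: %v\n", e)
		}
	}
}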
I looked last time at the very bare basics of writing a Kafka producer using Go. It worked, but only with everything lined up and pointing the right way. There was no error handling of any sorts. Let’s see about fixing this now.
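As a sketch of the idea, the producer can check the delivery report for each message instead of assuming success (broker and topic are placeholders, not the post’s exact code):

package main

import (
	"fmt"

	"github.com/confluentinc/confluent-kafka-go/kafka"
)

func main() {
	p, err := kafka.NewProducer(&kafka.ConfigMap{
		"bootstrap.servers": "localhost:9092",
	})
	if err != nil {
		panic(err)
	}
	defer p.Close()

	topic := "mytestopic"
	if err := p.Produce(&kafka.Message{
		TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
		Value:          []byte("Hello world"),
	}, nil); err != nil {
		fmt.Printf("Failed to enqueue message: %v\n", err)
	}

	// Delivery reports arrive on the Events channel; check each one
	// for a per-message error rather than assuming it was delivered.
	e := <-p.Events()
	if m, ok := e.(*kafka.Message); ok {
		if m.TopicPartition.Error != nil {
			fmt.Printf("Delivery failed: %v\n", m.TopicPartition.Error)
		} else {
			fmt.Printf("Delivered to %v\n", m.TopicPartition)
		}
	}

	p.Flush(5000)
}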
With the first leg of my journey with Go done (starting from a very rudimentary base), the next step for me was to bring it into my current area of interest and work - Apache Kafka.
In the previous exercise I felt the absence of a formal CompSci background with the introduction of Binary Sorted Trees, and now I am conscious of it again learning about mutexes. I’d heard of them before, mostly when Oracle performance folk were talking about wait types - TIL it stands for mutual exclusion!
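A minimal illustration of mutual exclusion in Go, in the spirit of the Tour’s sync.Mutex example:

package main

import (
	"fmt"
	"sync"
)

// A counter guarded by a mutex: only one goroutine at a time
// (mutual exclusion) may enter the critical section.
type Counter struct {
	mu sync.Mutex
	n  int
}

func (c *Counter) Inc() {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.n++
}

func main() {
	var c Counter
	var wg sync.WaitGroup
	for i := 0; i < 100; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			c.Inc()
		}()
	}
	wg.Wait()
	fmt.Println(c.n) // always 100, thanks to the mutex
}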
I’ve been playing around with the new SerDes (serialisers/deserialisers) that shipped with Confluent Platform 5.5 - Protobuf, and JSON Schema (these were added to the existing support for Avro). The serialisers (and associated Kafka Connect converters) take a payload and serialise it into bytes for sending to Kafka, and I was interested in what those bytes look like. For that I used my favourite Kafka swiss-army knife: kafkacat.
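For context, the Confluent serialisers prefix the payload with a magic byte and a four-byte big-endian schema ID. Here’s a sketch in Go of pulling those leading bytes apart (the payload bytes are made up for illustration):

package main

import (
	"encoding/binary"
	"fmt"
)

func main() {
	// Illustrative bytes as they might be read from a Kafka message
	// produced with a Confluent serialiser: magic byte 0x00, then a
	// four-byte big-endian schema ID, then the serialised data itself.
	payload := []byte{0x00, 0x00, 0x00, 0x00, 0x01, 0x0a, 0x48, 0x65, 0x6c, 0x6c, 0x6f}

	magic := payload[0]
	schemaID := binary.BigEndian.Uint32(payload[1:5])
	fmt.Printf("magic byte: 0x%02x, schema ID: %d, data: % x\n",
		magic, schemaID, payload[5:])
}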
A Tour of Go : Goroutines was OK but as with some previous material I headed over to Go by example for clearer explanations.
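Roughly, Go by Example’s goroutines demo looks like this:

package main

import (
	"fmt"
	"time"
)

func f(from string) {
	for i := 0; i < 3; i++ {
		fmt.Println(from, ":", i)
	}
}

func main() {
	// Run f synchronously, then concurrently in a goroutine.
	f("direct")
	go f("goroutine")

	// An anonymous function can also be launched as a goroutine.
	go func(msg string) {
		fmt.Println(msg)
	}("going")

	// Crude wait so the goroutines get a chance to run before exit.
	time.Sleep(time.Second)
	fmt.Println("done")
}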
This is based on the Picture generator from the Slices exercise.
I’m not intending to pick holes in the Tour…but it’s not helping itself ;-)
For an introductory text, it makes a ton of assumptions about the user. Here it introduces Readers, and the explanation is good—but the example code looks like this:
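From memory, the Tour’s Reader example is essentially this:

package main

import (
	"fmt"
	"io"
	"strings"
)

func main() {
	r := strings.NewReader("Hello, Reader!")

	// Read 8 bytes at a time until the Reader signals io.EOF.
	b := make([]byte, 8)
	for {
		n, err := r.Read(b)
		fmt.Printf("n = %v err = %v b = %v\n", n, err, b)
		fmt.Printf("b[:n] = %q\n", b[:n])
		if err == io.EOF {
			break
		}
	}
}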