2021-10-14

Accessing Red Hat OpenShift Streams for Apache Kafka from Python

Recently Red Hat launched a way to get a managed Kafka instance, and you can get one for 2 days for free. There is a limit of 1 MB per second. So far I had only been using Kafka without any auth and without any encryption, so here is what I had to do to make it work - typing it here so I do not need to reinvent it once I forget it :-) I'm using kafka-python.

I have created a cluster and under its "Connection" menu item I got the bootstrap server jhutar--c-jc--gksg-rukm-fu-a.bf2.kafka-stage.rhcloud.com:443. It also advised me to create a service account, so I created one and it generated a "Client ID" like srvc-acct-00000000-0000-0000-0000-000000000000 and a "Client secret" like 00000000-0000-0000-0000-000000000000. Although the "SASL/OAUTHBEARER" authentication method is recommended, as of now it is too complicated for my poor head, so I used "SASL/PLAIN", where you just use the "Client ID" as the username and the "Client secret" as the password. To create a topic, there is a UI as well.
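
If you would rather create the topic from Python than click through the UI, kafka-python also ships a KafkaAdminClient that takes the same SASL/PLAIN settings. A minimal sketch - the topic name my-topic and its partition/replication numbers are made-up placeholders, and the service account may need extra permissions for this to be allowed:

from kafka.admin import KafkaAdminClient, NewTopic

# Admin client configured with the same bootstrap server and SASL/PLAIN credentials
admin = KafkaAdminClient(
    bootstrap_servers='jhutar--c-jc--gksg-rukm-fu-a.bf2.kafka-stage.rhcloud.com:443',
    sasl_plain_username='srvc-acct-00000000-0000-0000-0000-000000000000',
    sasl_plain_password='00000000-0000-0000-0000-000000000000',
    security_protocol='SASL_SSL',
    sasl_mechanism='PLAIN',
)

# 'my-topic' with 1 partition and replication factor 3 is just an example
admin.create_topics([NewTopic(name='my-topic', num_partitions=1, replication_factor=3)])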

To create a producer and a consumer:

from kafka import KafkaProducer

# Producer authenticating over SASL/PLAIN with the service account credentials
producer = KafkaProducer(
    bootstrap_servers='jhutar--c-jc--gksg-rukm-fu-a.bf2.kafka-stage.rhcloud.com:443',
    sasl_plain_username='srvc-acct-00000000-0000-0000-0000-000000000000',
    sasl_plain_password='00000000-0000-0000-0000-000000000000',
    security_protocol='SASL_SSL',
    sasl_mechanism='PLAIN',
)
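
Sending a message then works as with any other kafka-python producer; something like this should do (the topic name is still a placeholder):

# Messages are plain bytes unless a value_serializer is configured
producer.send('<topic>', b'hello from OpenShift Streams')
producer.flush()  # block until the message is actually delivered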

And the consumer needs the same parameters:

from kafka import KafkaConsumer

# Consumer for the given topic, using the same SASL/PLAIN settings
consumer = KafkaConsumer(
    '<topic>',
    bootstrap_servers='jhutar--c-jc--gksg-rukm-fu-a.bf2.kafka-stage.rhcloud.com:443',
    sasl_plain_username='srvc-acct-00000000-0000-0000-0000-000000000000',
    sasl_plain_password='00000000-0000-0000-0000-000000000000',
    security_protocol='SASL_SSL',
    sasl_mechanism='PLAIN',
)
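
Reading the messages back is then just a matter of iterating over the consumer; a quick check could look like this:

# Blocks and yields ConsumerRecord objects as messages arrive
for message in consumer:
    print(message.topic, message.partition, message.offset, message.value)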