Difference between Kafka and NiFi
NiFi and Kafka complement each other, in the sense that NiFi is not a message queue like Apache Kafka. Apache NiFi is, rather, a data-flow management (a.k.a. data logistics) tool.
Let's assume this scenario: you have messages (in JSON format) being streamed through Kafka, and you want to validate each message, checking that it contains all the required fields and that they are valid; messages that pass should land in HBase.
Here NiFi can help you with the following approach:
- NiFi has a `ConsumeKafka` processor, which you can configure with your Kafka broker and consumer group name.
- Use the `ValidateRecord` processor to check whether the received messages are all valid.
- If they are valid, connect the output to `PutHBaseRecord`.
Summarizing, NiFi saves you from writing a lot of boilerplate code: in this case, custom logic for schema validation and for writing to HBase.
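To make the boilerplate point concrete, here is a minimal Java sketch of what the equivalent hand-written pipeline might look like. The broker address, topic name (`events-topic`), table name (`events`), column family (`cf`), and required fields are all made up for illustration, and error handling is omitted:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ValidateAndStore {
    // Hypothetical required fields; adjust to your schema.
    private static final List<String> REQUIRED = List.of("id", "timestamp", "payload");

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092");   // assumed broker address
        props.put("group.id", "validator-group");        // assumed consumer group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        ObjectMapper mapper = new ObjectMapper();
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             Connection hbase = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Table table = hbase.getTable(TableName.valueOf("events"))) {

            consumer.subscribe(List.of("events-topic"));
            while (true) {  // runs until the process is killed; a sketch only
                for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofSeconds(1))) {
                    JsonNode msg = mapper.readTree(rec.value());
                    // Schema validation: every required field must be present.
                    boolean valid = REQUIRED.stream().allMatch(msg::has);
                    if (!valid) continue;  // in practice, route invalid records elsewhere

                    // Write the raw JSON to HBase, keyed by the message id.
                    Put put = new Put(Bytes.toBytes(msg.get("id").asText()));
                    put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("payload"),
                            Bytes.toBytes(rec.value()));
                    table.put(put);
                }
            }
        }
    }
}
```

Everything NiFi gives you out of the box (back pressure, retries, provenance, a separate route for invalid records) would have to be layered on top of this by hand.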
Found an interesting answer on the Hortonworks Community; I'm sharing it here for the sake of completeness:
Apache NiFi and Apache Kafka are two different tools with different use-cases that may slightly overlap. Here is my understanding of the purpose of the two projects.
NiFi is "An easy to use, powerful, and reliable system to process and distribute data."
It is a visual tool (with a REST API) that implements flow-based programming to enable the user to craft flows that take data from a wide variety of sources, perform enrichment, routing, etc. on the data as it is being processed, and output the result to a wide variety of destinations. During this process, it captures metadata (provenance) on what has happened to each piece of data (FlowFile) as it made its way through the flow, for audit logging and troubleshooting purposes.
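Because the UI is backed by that REST API, flows can also be inspected programmatically. As a small illustration (assuming an unsecured NiFi instance on `localhost:8080`; secured installations need a token or client certificate instead), a status query could look like:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class NifiStatus {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        // GET /nifi-api/flow/status returns controller-level counters
        // (queued FlowFiles, active threads, ...) as JSON.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8080/nifi-api/flow/status"))
                .GET()
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```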
"Apache Kafka is publish-subscribe messaging rethought as a distributed commit log"
It is a distributed implementation of the publish-subscribe pattern that lets developers connect programs, written in different languages and running across a large number of machines, to each other. It is more of a building block for distributed computing than an all-in-one solution for processing data.
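To illustrate the building-block framing: the publishing side of the pattern is only a few lines, and any number of consumers, in any language, can subscribe to the topic without the producer knowing about them. (The broker address and topic are the same assumed values as in the earlier sketch.)

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class Publish {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "broker:9092");  // assumed broker address
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // The producer appends to the topic's log; subscribers consume
            // independently, at their own pace, from that same log.
            producer.send(new ProducerRecord<>("events-topic", "key-1",
                    "{\"id\":\"key-1\",\"timestamp\":1700000000,\"payload\":\"...\"}"));
        }
    }
}
```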