Kafka MCQs – Kafka Integration with Tools

11.) Which tool helps ingest data from Kafka into Hadoop Distributed File System (HDFS)?

A) Flume
B) Hive
C) HBase
D) Sqoop

Answer: Option A

Explanation: Apache Flume can pull data from Kafka (using its Kafka source) and push it to HDFS (using its HDFS sink).

12.) Kafka Connect REST API can be used to:

A) Manage ZooKeeper nodes
B) List topics
C) Compress Kafka messages
D) Create connectors dynamically

Answer: Option D

Explanation: The Kafka Connect REST API lets connectors be created, reconfigured, paused, resumed, and deleted at runtime, without restarting the Connect workers.
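
For a concrete picture, here is a minimal Java sketch that registers a connector by POSTing its definition to a Connect worker's REST endpoint. The worker URL (http://localhost:8083), connector name, file path, and topic are illustrative placeholders; the FileStreamSource connector used here ships with Kafka.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class CreateConnector {
        public static void main(String[] args) throws Exception {
            // Connector definition: a "name" plus a "config" map, sent as JSON.
            // File and topic values are placeholders.
            String body = """
                {
                  "name": "demo-file-source",
                  "config": {
                    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
                    "tasks.max": "1",
                    "file": "/tmp/demo.txt",
                    "topic": "demo-topic"
                  }
                }
                """;

            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create("http://localhost:8083/connectors"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(body))
                    .build();

            // A 201 Created response means the worker registered the connector.
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + " " + response.body());
        }
    }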

13.) Kafka integrates with Apache NiFi through:

A) MiNiFi
B) NiFi Kafka Processor
C) Zookeeper Plugin
D) Flume Agent

Answer: Option B

Explanation: NiFi ships with built-in Kafka processors (such as PublishKafka and ConsumeKafka) for publishing data to and consuming data from Kafka topics.

14.) To load a large number of Kafka messages into a relational database, which connector is best?

A) JDBC Sink Connector
B) HDFS Connector
C) MQTT Source Connector
D) Twitter Source Connector

Answer: Option A

Explanation: The JDBC Sink Connector streams Kafka messages directly into relational databases such as MySQL and PostgreSQL.
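
As a hedged sketch, this is roughly what such a connector definition might look like, expressed as the JSON payload you would submit to the Connect REST API. The connector class and option names follow Confluent's kafka-connect-jdbc plugin (which must be installed on the worker); the database URL, credentials, topic, and key settings are placeholders.

    public class JdbcSinkConfig {
        // Hypothetical JDBC Sink definition for streaming an "orders" topic into PostgreSQL.
        static final String JDBC_SINK_JSON = """
            {
              "name": "orders-jdbc-sink",
              "config": {
                "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
                "tasks.max": "2",
                "topics": "orders",
                "connection.url": "jdbc:postgresql://localhost:5432/shop",
                "connection.user": "kafka",
                "connection.password": "secret",
                "insert.mode": "upsert",
                "pk.mode": "record_key",
                "pk.fields": "order_id",
                "auto.create": "true"
              }
            }
            """;

        public static void main(String[] args) {
            // Submit this payload via POST to the Connect worker's /connectors endpoint,
            // as in the REST sketch after question 12.
            System.out.println(JDBC_SINK_JSON);
        }
    }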

15.) Kafka Connect provides connectors for which of the following systems?

A) S3
B) Cassandra
C) Elasticsearch
D) All of the above

Answer: Option D

Explanation: Kafka Connect has official or community connectors for all of these systems, including S3, Cassandra, and Elasticsearch.

16.) In Kafka Connect, configuration files are written in:

A) YAML
B) XML
C) JSON or Properties format
D) Markdown

Answer: Option C

Explanation: Connector configurations can be written either as .properties files (standalone mode) or as JSON submitted to the REST API (distributed mode).
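
To make the two formats concrete, the sketch below holds the same hypothetical FileStreamSource definition twice: once in .properties form (standalone mode) and once as the JSON body used with the REST API (distributed mode). The Java strings are only a container for the two snippets.

    public class ConnectorConfigFormats {
        // Properties format: passed to standalone mode, e.g.
        //   connect-standalone.sh worker.properties file-source.properties
        static final String PROPERTIES_FORMAT = """
            name=demo-file-source
            connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
            tasks.max=1
            file=/tmp/demo.txt
            topic=demo-topic
            """;

        // JSON format: the same definition as submitted to the REST API in distributed mode.
        static final String JSON_FORMAT = """
            {
              "name": "demo-file-source",
              "config": {
                "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
                "tasks.max": "1",
                "file": "/tmp/demo.txt",
                "topic": "demo-topic"
              }
            }
            """;

        public static void main(String[] args) {
            System.out.println(PROPERTIES_FORMAT);
            System.out.println(JSON_FORMAT);
        }
    }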

17.) Which framework provides exactly-once semantics with Kafka?

A) Spark Core
B) Kafka Streams
C) Flink
D) Both B and C

Answer: Option D

Explanation: Both Kafka Streams and Apache Flink support end-to-end exactly-once semantics when reading from and writing to Kafka, built on Kafka transactions.
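
As an illustration on the Kafka Streams side, the minimal application below enables exactly-once processing by setting processing.guarantee to exactly_once_v2 (the newer mode; older releases use the "exactly_once" value and need correspondingly older client settings). The application id, broker address, and topic names are placeholders.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;

    public class ExactlyOnceStreamsApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "eos-demo");           // placeholder app id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // placeholder broker
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            // Exactly-once processing: reads, state updates, and writes are committed
            // atomically using Kafka transactions.
            props.put(StreamsConfig.PROCESSING_GUARANTEE_CONFIG, StreamsConfig.EXACTLY_ONCE_V2);

            // Trivial topology: copy records from one topic to another.
            StreamsBuilder builder = new StreamsBuilder();
            builder.stream("input-topic").to("output-topic");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }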

18.) Kafka integrates with cloud storage (e.g., AWS S3) using:

A) Kafka Storage Manager
B) S3 Sink Connector
C) AWS Kafka Reader
D) Confluent Watchdog

Answer: Option B

Explanation: The S3 Sink Connector exports Kafka data into S3 buckets.
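
For illustration, a hypothetical S3 sink definition is sketched below; the connector class and option names follow Confluent's kafka-connect-s3 plugin (which must be installed on the worker), and the bucket, region, topic, and flush size are placeholders.

    public class S3SinkConfig {
        // Hypothetical S3 sink definition archiving an "events" topic into a bucket.
        static final String S3_SINK_JSON = """
            {
              "name": "events-s3-sink",
              "config": {
                "connector.class": "io.confluent.connect.s3.S3SinkConnector",
                "tasks.max": "2",
                "topics": "events",
                "s3.bucket.name": "my-kafka-archive",
                "s3.region": "us-east-1",
                "storage.class": "io.confluent.connect.s3.storage.S3Storage",
                "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
                "flush.size": "1000"
              }
            }
            """;

        public static void main(String[] args) {
            // Submit via POST to the Connect REST API, as in the earlier sketches.
            System.out.println(S3_SINK_JSON);
        }
    }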

19.) Which of the following is NOT a valid Kafka connector type?

A) Source
B) Sink
C) Processor
D) Mirror

Answer: Option C

Explanation: Kafka Connect defines only Source and Sink connector types; "Processor" is a Kafka Streams concept, not a Connect connector type.
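
The two connector types correspond to the two extension points of the Connect API, SourceConnector and SinkConnector. Below is a bare-bones, do-nothing Source connector sketch (class names and version strings are made up) showing the shape of that API; a Sink connector would extend SinkConnector/SinkTask analogously.

    import java.util.Collections;
    import java.util.List;
    import java.util.Map;
    import org.apache.kafka.common.config.ConfigDef;
    import org.apache.kafka.connect.connector.Task;
    import org.apache.kafka.connect.source.SourceConnector;
    import org.apache.kafka.connect.source.SourceRecord;
    import org.apache.kafka.connect.source.SourceTask;

    public class ExampleSourceConnector extends SourceConnector {
        private Map<String, String> props;

        @Override public void start(Map<String, String> props) { this.props = props; }
        @Override public Class<? extends Task> taskClass() { return ExampleSourceTask.class; }
        @Override public List<Map<String, String>> taskConfigs(int maxTasks) {
            // A real connector would split work across up to maxTasks task configs.
            return Collections.singletonList(props);
        }
        @Override public void stop() { }
        @Override public ConfigDef config() { return new ConfigDef(); }
        @Override public String version() { return "0.1.0"; }

        // The connector only describes configuration and task layout;
        // the matching SourceTask is what actually produces records.
        public static class ExampleSourceTask extends SourceTask {
            @Override public void start(Map<String, String> props) { }
            @Override public List<SourceRecord> poll() { return null; } // null = no data right now
            @Override public void stop() { }
            @Override public String version() { return "0.1.0"; }
        }
    }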

20.) What is the key purpose of integrating Kafka with Grafana?

A) Visualize Kafka metrics
B) Stream data
C) Store logs
D) Serialize messages

Answer: Option A

Explanation: Grafana, fed by Prometheus scraping a JMX exporter on the Kafka brokers, can visualize Kafka performance metrics on dashboards.
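
The metrics Grafana ends up charting originate as JMX MBeans inside the brokers; Prometheus typically scrapes them via the JMX exporter agent. The sketch below reads one such MBean directly, assuming a broker started with JMX remote enabled on port 9999 (e.g. by exporting JMX_PORT=9999 before starting the broker); host and port are placeholders.

    import javax.management.MBeanServerConnection;
    import javax.management.ObjectName;
    import javax.management.remote.JMXConnector;
    import javax.management.remote.JMXConnectorFactory;
    import javax.management.remote.JMXServiceURL;

    public class ReadKafkaJmxMetric {
        public static void main(String[] args) throws Exception {
            // Assumes JMX remote access is enabled on the broker at localhost:9999.
            JMXServiceURL url =
                    new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:9999/jmxrmi");
            JMXConnector connector = JMXConnectorFactory.connect(url);
            try {
                MBeanServerConnection mbs = connector.getMBeanServerConnection();

                // A standard broker metric: incoming message rate across all topics.
                ObjectName messagesIn =
                        new ObjectName("kafka.server:type=BrokerTopicMetrics,name=MessagesInPerSec");
                Object rate = mbs.getAttribute(messagesIn, "OneMinuteRate");
                System.out.println("MessagesInPerSec (1-min rate): " + rate);
            } finally {
                connector.close();
            }
        }
    }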
