Kafka MCQs – Kafka Schema Registry (Avro Support)

Apache Kafka becomes even more powerful and flexible when integrated with Confluent Schema Registry and Apache Avro. Avro allows serialization of structured data, while Schema Registry helps enforce schemas for Kafka messages, enabling forward and backward compatibility, validation, and evolution. These concepts are critical in building robust, schema-safe Kafka pipelines.

The following multiple-choice questions (MCQs) are tailored to help developers and architects prepare for real-world scenarios and interviews involving Kafka data serialization, Avro schemas, and Schema Registry integration.

1.) What is Apache Avro?

A) A database engine
B) A logging tool
C) A messaging broker
D) A data serialization framework

Answer: Option D

Explanation: Avro is a compact, row-based data serialization framework commonly used with Kafka to give messages a well-defined structure.

2.) What is the primary purpose of Schema Registry?

A) Manage Kafka topics
B) Store consumer offsets
C) Manage and store Avro schemas
D) Manage broker metadata

Answer: Option C

Explanation: Schema Registry stores, retrieves, and validates Avro schemas for Kafka data.

3.) Which company maintains the Confluent Schema Registry?

A) Confluent
B) LinkedIn
C) Apache
D) Cloudera

Answer: Option A

Explanation: Confluent developed and maintains Schema Registry as part of its Kafka platform.

4.) What format is commonly used with Schema Registry for Kafka message serialization?

A) JSON
B) YAML
C) Avro
D) Protobuf

Answer: Option C

Explanation: Schema Registry also supports Protobuf and JSON Schema, but Avro remains the most common serialization format used with it.

5.) In Avro, what is the schema format?

A) XML
B) YAML
C) JSON
D) Binary

Answer: Option C

Explanation: Avro schemas are defined in JSON for human readability and interoperability.
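Because Avro schemas are plain JSON, they can be written and parsed with standard tooling. Here is a minimal sketch of a record schema; the `User` type and its fields are illustrative, not taken from the quiz:

```python
import json

# A minimal Avro record schema, written as JSON. The record name, namespace,
# and fields are hypothetical examples.
user_schema_json = """
{
  "type": "record",
  "name": "User",
  "namespace": "com.example",
  "fields": [
    {"name": "id",    "type": "int"},
    {"name": "email", "type": "string"}
  ]
}
"""

# Since the schema is JSON, any JSON parser can load and inspect it.
schema = json.loads(user_schema_json)
print(schema["name"], [f["name"] for f in schema["fields"]])  # User ['id', 'email']
```

This human-readable JSON definition is what gets registered in Schema Registry; the message payloads themselves are encoded in Avro's compact binary format.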

6.) What format is used to send Avro messages in Kafka?

A) Plain JSON
B) XML
C) Avro Thrift Format
D) Confluent Wire Format

Answer: Option D

Explanation: Confluent's wire format prefixes the binary Avro payload with a magic byte (0x00) and a 4-byte schema ID.
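The wire format's header can be packed and unpacked with a few lines of standard-library code. This is a sketch of the framing only; the payload bytes below are placeholders, not a real Avro-encoded record:

```python
import struct

MAGIC_BYTE = 0  # Confluent wire format: 1-byte magic, then a 4-byte big-endian schema ID

def frame(schema_id: int, avro_payload: bytes) -> bytes:
    """Prefix an Avro-encoded payload with the Confluent wire-format header."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + avro_payload

def unframe(message: bytes) -> tuple:
    """Split a wire-format message back into (schema_id, payload)."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != MAGIC_BYTE:
        raise ValueError("not Confluent wire format")
    return schema_id, message[5:]

msg = frame(42, b"\x02\x06foo")  # placeholder payload bytes
print(unframe(msg))              # (42, b'\x02\x06foo')
```

A consumer uses the extracted schema ID to fetch the matching schema from Schema Registry before decoding the payload.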

7.) Which of the following is a valid Avro data type?

A) character
B) int
C) varchar
D) number

Answer: Option B

Explanation: Avro's primitive types are null, boolean, int, long, float, double, bytes, and string; character, varchar, and number are not Avro types.

8.) Which Avro schema compatibility type allows new fields with defaults?

A) None
B) Backward
C) Forward
D) Full

Answer: Option B

Explanation: Backward compatibility lets a consumer using the new schema read data written with the old schema; a newly added field must declare a default so the reader can fill it in for old records.
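The role of defaults can be sketched with a toy reader. This mimics the idea behind Avro's schema-resolution rule (it is not the real Avro library), and the field names are hypothetical:

```python
# A "new" schema that added a 'country' field with a default after old
# records were already written.
new_schema_fields = [
    {"name": "id", "type": "int"},
    {"name": "country", "type": "string", "default": "unknown"},  # added later
]

def read_with_new_schema(old_record: dict) -> dict:
    """Decode an old record using the new schema, filling defaults for missing fields."""
    decoded = {}
    for field in new_schema_fields:
        if field["name"] in old_record:
            decoded[field["name"]] = old_record[field["name"]]
        elif "default" in field:
            # The default is what makes the change backward compatible.
            decoded[field["name"]] = field["default"]
        else:
            raise ValueError(f"field {field['name']!r} missing and has no default")
    return decoded

old_record = {"id": 7}                   # written before 'country' existed
print(read_with_new_schema(old_record))  # {'id': 7, 'country': 'unknown'}
```

Without the default, the new reader would fail on old records, and Schema Registry would reject the schema change under BACKWARD compatibility.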

9.) What does the magic byte in Avro-encoded Kafka messages signify?

A) Schema ID in Schema Registry
B) Compression format
C) Producer IP
D) Kafka partition info

Answer: Option A

Explanation: The magic byte (0x00) marks the start of the Confluent wire format and is immediately followed by the 4-byte ID of the schema stored in Schema Registry.

10.) What port does Schema Registry typically run on by default?

A) 8081
B) 2181
C) 9092
D) 8080

Answer: Option A

Explanation: Schema Registry defaults to port 8081 for REST API communication.
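The REST API on that port exposes subject- and version-based endpoints. The sketch below only builds the URLs, so it runs without a server; the base URL assumes a local default setup and the subject name is hypothetical:

```python
# Common Schema Registry REST endpoints on the default port 8081.
base = "http://localhost:8081"
subject = "orders-value"  # hypothetical subject name

list_subjects   = f"{base}/subjects"                            # GET all subjects
list_versions   = f"{base}/subjects/{subject}/versions"         # GET versions of one subject
register_schema = f"{base}/subjects/{subject}/versions"         # POST a new schema version
latest_schema   = f"{base}/subjects/{subject}/versions/latest"  # GET the latest schema

print(latest_schema)  # http://localhost:8081/subjects/orders-value/versions/latest
```

Producers and consumers using Confluent serializers call these endpoints automatically; direct use is mostly for tooling and debugging.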
