JSON MCQs – Performance and Security

21.) What is the most efficient way to handle large JSON datasets in memory-constrained environments?

A) Load the entire JSON at once
B) Use streaming APIs to process JSON incrementally
C) Convert JSON to XML first
D) Remove all nested objects

Answer: Option B

Explanation: Streaming APIs process JSON incrementally, avoiding high memory usage for large datasets.
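The idea can be sketched with newline-delimited JSON (JSON Lines), where each record is parsed on its own so only one record is in memory at a time; for a single large JSON document, an incremental parser (e.g. the third-party `ijson` package in Python) plays the same role. The filenames and field names here are illustrative.

```python
import io
import json

def stream_records(fp):
    """Yield one parsed record at a time from a newline-delimited
    JSON (JSON Lines) source, so a single record is held in memory
    rather than the whole dataset."""
    for line in fp:
        line = line.strip()
        if line:
            yield json.loads(line)

# Simulate a large dataset with an in-memory file object.
data = io.StringIO('{"id": 1}\n{"id": 2}\n{"id": 3}\n')
total = sum(rec["id"] for rec in stream_records(data))
```

Because `stream_records` is a generator, it can be chained with filters and aggregations without ever materializing the full dataset.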

22.) What is the best approach when logging JSON responses that may contain sensitive information?

A) Use a secure logging framework and mask sensitive data
B) Log the raw JSON data as is
C) Use JSON.stringify()
D) Avoid logging altogether

Answer: Option A

Explanation: Secure logging frameworks with masking ensure sensitive data is not exposed in logs.
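A minimal sketch of the masking step: sensitive values are redacted recursively before the record is serialized for the log. The set of sensitive key names (`password`, `ssn`, `token`) is an assumption for illustration; a real deployment would drive this from policy.

```python
import json

SENSITIVE_KEYS = {"password", "ssn", "token"}  # assumed key names

def mask(obj):
    """Recursively replace values of sensitive keys before logging."""
    if isinstance(obj, dict):
        return {k: ("***" if k in SENSITIVE_KEYS else mask(v))
                for k, v in obj.items()}
    if isinstance(obj, list):
        return [mask(v) for v in obj]
    return obj

record = {"user": "alice", "password": "s3cret", "meta": {"token": "abc"}}
safe = json.dumps(mask(record))
```

The masked string `safe` can then be handed to whatever logging framework is in use; the raw values never reach the log sink.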

23.) Why is it important to validate JSON data received from external sources?

A) To improve parsing speed
B) To convert JSON into JavaScript objects
C) To compress JSON
D) To prevent unexpected errors and security vulnerabilities

Answer: Option D

Explanation: Validating JSON data from external sources ensures it adheres to expected structures and prevents potential security risks.
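Validation has two layers: reject input that is not well-formed JSON at all, then reject input that parses but does not match the expected shape. A minimal sketch, assuming a record that must carry an integer `id` field:

```python
import json

def parse_user(raw):
    """Parse and validate untrusted JSON: reject malformed input and
    payloads that do not match the expected shape."""
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        return None  # syntactically invalid JSON
    if not isinstance(data, dict) or not isinstance(data.get("id"), int):
        return None  # valid JSON, wrong shape
    return data

good = parse_user('{"id": 7, "name": "alice"}')
bad_syntax = parse_user('{"id": 7,')
bad_shape = parse_user('{"id": "seven"}')
```

For anything beyond a couple of fields, a declarative validator such as JSON Schema expresses the same checks more maintainably.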

24.) What does the Content-Security-Policy header do for JSON responses?

A) Specifies how JSON is cached
B) Defines the format of JSON data
C) Ensures JSON is minified
D) Restricts where JSON can be loaded from to prevent cross-site scripting (XSS) attacks

Answer: Option D

Explanation: The Content-Security-Policy header restricts where resources, including JSON, can be loaded from, helping to prevent XSS attacks.
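An illustrative response header (the `api.example.com` origin is hypothetical), restricting the page to load resources from its own origin and fetch data only from a named API host:

```http
Content-Security-Policy: default-src 'self'; connect-src 'self' https://api.example.com
```

With this policy in place, scripts on the page cannot exfiltrate or fetch JSON from origins outside the allow-list, which limits the blast radius of an injected script.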

25.) Why is it important to remove unused fields from JSON responses?

A) To avoid key duplication
B) To reduce payload size and improve performance
C) To enhance JSON readability
D) To prevent JSON schema validation errors

Answer: Option B

Explanation: Removing unused fields minimizes the payload size, improving network performance and reducing latency.
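One common way to do this is to serialize only an explicit allow-list of fields. The field names below (`id`, `name`, `internal_notes`) are made up for the sketch:

```python
import json

PUBLIC_FIELDS = {"id", "name"}  # assumed fields the client actually needs

def trim(record):
    """Keep only allow-listed fields, shrinking the response payload."""
    return {k: v for k, v in record.items() if k in PUBLIC_FIELDS}

full = {"id": 1, "name": "alice", "internal_notes": "x" * 1000}
payload = json.dumps(trim(full))
```

Besides the performance win, an explicit allow-list also prevents internal fields from leaking to clients by accident.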
