Prerequisites

Before you install the Kafka OCF connector, ensure that you have performed the following:

Enable Network Connectivity

Open outbound TCP port 443 to the Confluent Kafka server.

Create a Service Account

Create a service account for Kafka. Refer to Service Accounts for Confluent Cloud.

Permissions

Make sure that the service account has the following permissions:

  • Cluster resource permissions:

    • Create

    • Describe

    • IdempotentWrite: For producers in Idempotent mode

    • InitProducerId (idempotent): To initialize the producer (optional)

  • Topics resource permissions:

    • Alter

    • Create

    • Describe

    • Read

    • Write

Authentication Schemes

This section describes prerequisites for the authentication schemes that the Kafka OCF connector supports in Alation. The supported schemes are a subset of the available Kafka authentication methods and exclude Azure AD OAuth 2.0 code-grant flows that require interactive token persistence. For additional authentication scheme configurations, see Appendix - Authentication Schemes.

Azure Service Principal

To use Azure Service Principal authentication with the Kafka OCF connector, configure a Service Principal that is authorized to access Kafka instances hosted on Azure Event Hubs through the Kafka interface. This method is not a generic Azure-platform authentication mechanism for arbitrary Kafka clusters running on Azure VMs or other services.

When you use Azure Service Principal authentication, you must also set the Azure Resource property, which is required for Kafka instances hosted on Azure Event Hubs.

Create or reuse a Service Principal with the appropriate role assignments on your Azure Event Hubs namespace. You will need its application (client) ID, tenant ID, secret or certificate, and the Azure Resource (the resource URI for Event Hubs).

For advanced Kafka connection properties (including authentication mechanisms not supported by the Kafka OCF connector), refer to the JDBC driver documentation used by your Alation deployment.

Prepare the JDBC URI

Use the following JDBC URI format to connect to the Kafka data source and extract topics from the schema registry:

Important

If you cannot include the username and password in the JDBC URI for security reasons, enter them in the User and Password fields in the Authentication section of the data source settings.

apachekafka://AuthScheme=Plain;User=<User>;Password=<Password>;BootstrapServers=<Server-Broker>;UseSSL=true;RegistryUrl=<RegistryURL>;RegistryUser=<Registry-User>;RegistryPassword=<Registry-Password>;TypeDetectionScheme=SchemaRegistry;
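The URI is a semicolon-delimited list of key=value pairs after the apachekafka:// prefix. As a sketch, it can be assembled programmatically; all values below are the same placeholders as in the format above, not real credentials:

```python
# Sketch: assemble the Kafka OCF JDBC URI from a parameter map.
# All values are placeholders copied from the documented format.
params = {
    "AuthScheme": "Plain",
    "User": "<User>",
    "Password": "<Password>",
    "BootstrapServers": "<Server-Broker>",
    "UseSSL": "true",
    "RegistryUrl": "<RegistryURL>",
    "RegistryUser": "<Registry-User>",
    "RegistryPassword": "<Registry-Password>",
    "TypeDetectionScheme": "SchemaRegistry",
}

# Dicts preserve insertion order (Python 3.7+), so the parameters
# appear in the same order as the documented format.
uri = "apachekafka://" + "".join(f"{k}={v};" for k, v in params.items())
print(uri)
```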

Note

From version 24.0.9375 onwards, the Kafka OCF connector supports excluding queue metadata columns from metadata extraction (MDE).

In the JDBC URI, set the ExposeQueueMetadataColumns parameter to false to prevent the connector from extracting queue metadata columns.
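For example, the parameter can be appended to an existing URI with a small helper; the helper name and parsing logic here are illustrative, not part of the connector:

```python
def set_uri_param(uri: str, key: str, value: str) -> str:
    """Append or replace key=value in a semicolon-delimited JDBC URI.

    Illustrative helper, not part of the connector.
    """
    prefix, _, query = uri.partition("://")
    # Drop empty fragments and any existing occurrence of the key.
    pairs = [p for p in query.split(";") if p and not p.startswith(key + "=")]
    pairs.append(f"{key}={value}")
    return f"{prefix}://" + ";".join(pairs) + ";"

uri = "apachekafka://AuthScheme=Plain;UseSSL=true;"
uri = set_uri_param(uri, "ExposeQueueMetadataColumns", "false")
print(uri)  # apachekafka://AuthScheme=Plain;UseSSL=true;ExposeQueueMetadataColumns=false;
```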

Note

From version 24.0.9320 onwards, the Kafka OCF connector supports extraction of topics that have key-only schemas, enabled through the AllowKeyOnlyRegistryTopics parameter.

In the JDBC URI, set the AllowKeyOnlyRegistryTopics parameter to true to extract metadata for Kafka topics that have only a key schema and no associated value schema in Schema Registry. Topics with both key and value schemas continue to be extracted as before. Set this parameter to true only when you have valid key-only schemas that you want to extract.

Setting the AllowKeyOnlyRegistryTopics parameter to false (the default) limits extraction to topics that have both key and value schemas.

To extract columns for key-only schemas, set the MessageKeyColumn and MessageKeyType parameters in the JDBC URI.
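Put together, a key-only extraction fragment for the JDBC URI might look like this sketch; the column name and type values are placeholders, so consult the connector documentation for valid values:

```python
# Sketch: URI fragment enabling key-only topic extraction (placeholder values).
key_only_params = {
    "AllowKeyOnlyRegistryTopics": "true",  # extract topics that have only a key schema
    "MessageKeyColumn": "<Key-Column>",    # placeholder: column name for the message key
    "MessageKeyType": "<Key-Type>",        # placeholder: data type of the message key
}
fragment = "".join(f"{k}={v};" for k, v in key_only_params.items())
print(fragment)
```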

Note

You can connect directly to the schema registry without the driver contacting the broker; in that case, you do not have to provide broker credentials. To enable this, use the following configuration in the JDBC URI: SchemaRegistryOnly=true;ValidateRegistryTopics=false;
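A registry-only URI could then look like the following sketch; values are placeholders, and note that no broker address or broker credentials appear in it:

```python
# Sketch: registry-only JDBC URI; the driver never contacts the broker,
# so BootstrapServers, User, and Password are omitted (placeholder values).
uri = (
    "apachekafka://"
    "RegistryUrl=<RegistryURL>;"
    "RegistryUser=<Registry-User>;"
    "RegistryPassword=<Registry-Password>;"
    "TypeDetectionScheme=SchemaRegistry;"
    "SchemaRegistryOnly=true;"
    "ValidateRegistryTopics=false;"
)
print(uri)
```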

Note

You can provide the parameter values in the individual fields of the Connector Settings section (preferred). However, a value provided in the JDBC URI field takes precedence over the same parameter configured in the corresponding Connector Settings field.
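The precedence rule can be illustrated with a small sketch; the function name and parsing are illustrative, not the connector's actual implementation:

```python
# Sketch: JDBC URI values override Connector Settings fields for the same parameter.
def effective_settings(field_values: dict, uri: str) -> dict:
    """Merge Connector Settings fields with URI parameters; the URI wins."""
    _, _, query = uri.partition("://")
    uri_params = dict(p.split("=", 1) for p in query.split(";") if p)
    merged = dict(field_values)
    merged.update(uri_params)  # URI parameters take precedence on conflicts
    return merged

fields = {"UseSSL": "false", "AuthScheme": "Plain"}
uri = "apachekafka://UseSSL=true;"
print(effective_settings(fields, uri))  # {'UseSSL': 'true', 'AuthScheme': 'Plain'}
```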

Permission Models

The Kafka OCF connector supports two permission models: Standard Mode and Schema Registry–Only Mode.

Schema Registry–Only Mode (Reduced Scope)

This mode restricts the connector to Schema Registry access and disables topic validation. It is supported, but it intentionally reduces metadata coverage and validation safeguards.

Use this mode when topic-level permissions cannot be granted.

Append the following parameters to the JDBC URI:

SchemaRegistryOnly=true;ValidateRegistryTopics=false;

Limitations

  • Topic-to-schema validation is disabled.

  • Metadata extraction relies entirely on registry contents.

  • Metadata may be incomplete if Schema Registry subjects and Kafka topics are not aligned.

  • Some features may be limited compared to Standard Mode.

Choose the Permission Model

  Requirement                          Permission model
  Complete and validated metadata      Standard Mode
  Minimal Kafka topic permissions      Schema Registry–Only Mode