Kafka consumer related auto-config takes effect even if we are overdefining it #40174
Labels

- for: team-meeting (an issue we'd like to discuss as a team to make progress)
- status: waiting-for-triage (an issue we've not yet triaged)
Summary

Defining our own `ConsumerFactory` instance in our configuration, as described in the spring-kafka docs, does not take effect; the default instance is created and used instead. With a minor modification to the bean definition the intended behavior can be achieved, but it isn't straightforward to do so. To make it work the way the documentation states, some `spring-boot-autoconfigure` changes are required, or, if that is not possible, the spring-kafka documentation should be modified accordingly.

Details
I'm working with Spring Boot-based microservices connected by Kafka. The message values are in JSON format, the serialized form of a common DTO, let's call it `Document`. We are:

- using a custom deserializer based on the spring-kafka `JsonDeserializer` for the `Document` type
- defining our own `ConsumerFactory` and applying our custom deserializer to it

Expected behavior
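Our factory definition follows the pattern shown in the spring-kafka docs, roughly like this sketch (the `Document` DTO, the bootstrap address, and the group id are assumptions from our setup, not the exact code):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;

@Configuration
public class KafkaConsumerConfig {

    // Concrete type arguments, as the spring-kafka docs show.
    @Bean
    public ConsumerFactory<String, Document> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumption
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "document-service");        // assumption
        return new DefaultKafkaConsumerFactory<>(props,
                new StringDeserializer(),
                new JsonDeserializer<>(Document.class));
    }
}
```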
Based on the spring-kafka documentation, we should be able to override the default factory this way with our custom instance using the custom deserializer.

What happens instead

The service gets runtime errors about failed deserialization instead: it deserializes the message with the default deserializer into a `byte[]` and then tries to cast it to a `Document`. On the other hand, if the property-based config is provided, the service runs without issues.
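The property-based configuration that makes the service work is along these lines (the exact values, including the `com.example.Document` type name, are assumptions):

```properties
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.value.default.type=com.example.Document
```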
Based on these symptoms, I'd say that it is not our `ConsumerFactory` instance that is used when creating the `Consumer`, but the default one. It picks up the property-based configuration, but if the property is not provided it falls back to the default deserializer. (I verified this by putting a breakpoint into the `KafkaListenerContainerFactory` creation in `KafkaAnnotationDrivenConfiguration`; the injected factory is in fact null.) Our programmatically configured `@Bean` should override the default `ConsumerFactory`, so we should be able to deserialize the messages without setting the property.

Reason
I saw that in `KafkaAnnotationDrivenConfiguration` the `ConsumerFactory` is injected with the concrete type `ConsumerFactory<Object, Object>`. The problem with this is that if we define a `ConsumerFactory` instance the way the spring-kafka docs state, it won't be injected here, because a `ConsumerFactory<String, Document>` is not an instance of a `ConsumerFactory<Object, Object>`
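As far as I can tell from the 2.7.x sources, the injection parameter there is typed with `ConsumerFactory<Object, Object>`, and Java generics are invariant, so our typed bean cannot satisfy it. A minimal, Spring-free model of the mismatch (all names here are illustrative, not Spring's):

```java
public class GenericsDemo {
    // Model of the injection mismatch: generics are invariant, so a
    // Factory<String, Document> is NOT a Factory<Object, Object>.
    interface Factory<K, V> { }
    static class Document { }
    static class DocumentFactory implements Factory<String, Document> { }

    // Mimics the auto-configuration injection point: concrete Object, Object.
    static String injectExact(Factory<Object, Object> factory) {
        return "exact";
    }

    // Mimics a wildcard-typed injection point: accepts any Factory.
    static String injectWildcard(Factory<?, ?> factory) {
        return "wildcard";
    }

    public static void main(String[] args) {
        Factory<String, Document> custom = new DocumentFactory();
        // injectExact(custom);  // does not compile: incompatible types
        System.out.println(injectWildcard(custom)); // prints "wildcard"
    }
}
```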
.

Workarounds

We found a few workarounds:

- leaving the concrete type arguments out of the `@Bean` definition, which makes it work as expected
- defining our own `ConcurrentKafkaListenerContainerFactory` in our configuration
- defining the `ConsumerFactory` with wildcards

Proposed solution
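The wildcard variant of the factory bean might look like this sketch (to be placed in a `@Configuration` class; the property values and the `Document` type are assumptions):

```java
// Declaring the bean with wildcard type arguments lets it match the
// ConsumerFactory<Object, Object> injection point in the auto-configuration.
@Bean
public ConsumerFactory<?, ?> consumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumption
    return new DefaultKafkaConsumerFactory<>(props,
            new StringDeserializer(),
            new JsonDeserializer<>(Document.class));
}
```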
IMO using wildcards when injecting the `ConsumerFactory` into the `kafkaListenerContainerFactory` would make the behavior comply with the spring-kafka documentation: the `ConsumerFactory` could then be defined with concrete type arguments instead of wildcards. If that is not possible for some reason I'm not aware of, then the spring-kafka docs should be modified to show the correct usage in the example.

Versions
- spring-boot-autoconfigure: 2.7.10
- spring-kafka: 2.8.11
This specific part of the `spring-boot-autoconfigure` implementation is the same in the latest version too, as far as I could see, so I think the same issue would persist after a version upgrade as well, but I didn't verify that.