
Rename headless service port when TLS is enabled #2107

Closed
NotAndD opened this issue May 9, 2024 · 0 comments · Fixed by #2135

NotAndD commented May 9, 2024

Is your feature request related to a problem? Please describe.

When adopting a tenant with Istio, proxy sidecars are injected into each running Pod. Those proxies use the Service names and Service port names to build their configuration so that traffic is redirected correctly. One of Istio's Pod requirements is that:

If a pod belongs to multiple Kubernetes services, the services cannot use the same port number for different protocols, for instance HTTP and TCP.

But Istio also uses the names of the Service ports to try to understand the protocol spoken on each port. The operator creates both a ClusterIP and a headless Service for the tenant:

  • The headless one always names its port http-minio; I assume the traffic is not plain-text when TLS is enabled, even though the name does not change (but I'm not sure?)
  • The ClusterIP one renames its port to https-minio when TLS is enabled
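
For concreteness, a minimal sketch of how I understand the two Services end up when TLS is enabled (tenant name, namespace, and selector are illustrative, not copied from the operator):

```yaml
# ClusterIP Service: the port is renamed to https-minio when TLS is on
apiVersion: v1
kind: Service
metadata:
  name: minio
  namespace: minio
spec:
  ports:
    - name: https-minio   # Istio infers TLS from this name
      port: 443
      targetPort: 9000
  selector:
    v1.min.io/tenant: mytenant
---
# Headless Service: the port keeps the name http-minio even with TLS on
apiVersion: v1
kind: Service
metadata:
  name: mytenant-hl
  namespace: minio
spec:
  clusterIP: None
  ports:
    - name: http-minio    # Istio infers plain-text HTTP from this name
      port: 9000
  selector:
    v1.min.io/tenant: mytenant
```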

And this seems to break Istio's handling of the incoming traffic, at least in my tests:

```
2024-05-09T06:35:50.917101Z    debug    envoy filter external/envoy/source/common/tcp_proxy/tcp_proxy.cc:430    [Tags: "ConnectionId":"262"] Creating connection to cluster outbound|443||minio.minio.svc.cluster.local    thread=21
2024-05-09T06:35:50.917148Z    debug    envoy pool external/envoy/source/common/conn_pool/conn_pool_base.cc:291    trying to create new connection    thread=21
2024-05-09T06:35:50.917157Z    debug    envoy pool external/envoy/source/common/conn_pool/conn_pool_base.cc:145    creating a new connection (connecting=0)    thread=21
2024-05-09T06:35:50.917223Z    debug    envoy connection external/envoy/source/common/network/connection_impl.cc:1009    [Tags: "ConnectionId":"263"] connecting to 10.236.21.144:9000    thread=21
2024-05-09T06:35:50.917280Z    debug    envoy connection external/envoy/source/common/network/connection_impl.cc:1028    [Tags: "ConnectionId":"263"] connection in progress    thread=21
2024-05-09T06:35:50.917296Z    debug    envoy conn_handler external/envoy/source/extensions/listener_managers/listener_manager/active_tcp_listener.cc:159    [Tags: "ConnectionId":"262"] new connection from 10.236.21.144:50942    thread=21
2024-05-09T06:35:50.917317Z    debug    envoy connection external/envoy/source/common/network/connection_impl.cc:746    [Tags: "ConnectionId":"263"] connected    thread=21
2024-05-09T06:35:50.917430Z    debug    envoy filter external/envoy/source/extensions/filters/listener/original_dst/original_dst.cc:69    original_dst: set destination to 10.236.21.144:9000    thread=21
2024-05-09T06:35:50.917476Z    debug    envoy filter external/envoy/source/extensions/filters/listener/tls_inspector/tls_inspector.cc:137    tls:onServerName(), requestedServerName: outbound_.443_._.minio.minio.svc.cluster.local    thread=21
2024-05-09T06:35:50.917536Z    debug    envoy conn_handler external/envoy/source/extensions/listener_managers/listener_manager/active_tcp_listener.cc:159    [Tags: "ConnectionId":"264"] new connection from 10.236.21.144:45194    thread=21
2024-05-09T06:35:50.917920Z    debug    envoy pool external/envoy/source/common/conn_pool/conn_pool_base.cc:328    [Tags: "ConnectionId":"263"] attaching to next stream    thread=21
2024-05-09T06:35:50.917930Z    debug    envoy pool external/envoy/source/common/conn_pool/conn_pool_base.cc:182    [Tags: "ConnectionId":"263"] creating stream    thread=21
2024-05-09T06:35:50.917937Z    debug    envoy router external/envoy/source/common/tcp_proxy/upstream.cc:207    Attached upstream connection [C263] to downstream connection [C262]    thread=21
2024-05-09T06:35:50.917945Z    debug    envoy filter external/envoy/source/common/tcp_proxy/tcp_proxy.cc:817    [Tags: "ConnectionId":"262"] TCP:onUpstreamEvent(), requestedServerName:     thread=21
2024-05-09T06:35:50.918055Z    debug    envoy http external/envoy/source/common/http/conn_manager_impl.cc:391    [Tags: "ConnectionId":"264"] new stream    thread=21
2024-05-09T06:35:50.918087Z    debug    envoy http external/envoy/source/common/http/filter_manager.cc:1035    [Tags: "ConnectionId":"264","StreamId":"829618103751368357"] Sending local reply with details http1.codec_error    thread=21
2024-05-09T06:35:50.918147Z    debug    envoy http external/envoy/source/common/http/conn_manager_impl.cc:1794    [Tags: "ConnectionId":"264","StreamId":"829618103751368357"] closing connection due to connection close header    thread=21
2024-05-09T06:35:50.918163Z    debug    envoy http external/envoy/source/common/http/conn_manager_impl.cc:1863    [Tags: "ConnectionId":"264","StreamId":"829618103751368357"] encoding headers via codec (end_stream=false):
':status', '400'
'content-length', '11'
'content-type', 'text/plain'
'date', 'Thu, 09 May 2024 06:35:50 GMT'
'server', 'istio-envoy'
'connection', 'close'
    thread=21
2024-05-09T06:35:50.918176Z    debug    envoy http external/envoy/source/common/http/conn_manager_impl.cc:1968    [Tags: "ConnectionId":"264","StreamId":"829618103751368357"] Codec completed encoding stream.    thread=21
2024-05-09T06:35:50.918182Z    debug    envoy http external/envoy/source/common/http/conn_manager_impl.cc:243    [Tags: "ConnectionId":"264","StreamId":"829618103751368357"] doEndStream() resetting stream  thread=21
2024-05-09T06:35:50.918186Z    debug    envoy http external/envoy/source/common/http/conn_manager_impl.cc:1932    [Tags: "ConnectionId":"264","StreamId":"829618103751368357"] stream reset: reset reason: local reset, response details: http1.codec_error    thread=21
2024-05-09T06:35:50.918243Z    debug    envoy connection external/envoy/source/common/network/connection_impl.cc:146    [Tags: "ConnectionId":"264"] closing data_to_write=162 type=2    thread=21
2024-05-09T06:35:50.918251Z    debug    envoy connection external/envoy/source/common/network/connection_impl_base.cc:47    [Tags: "ConnectionId":"264"] setting delayed close timer with timeout 1000 ms    thread=21
2024-05-09T06:35:50.918264Z    debug    envoy http external/envoy/source/common/http/conn_manager_impl.cc:439    [Tags: "ConnectionId":"264"] dispatch error: http/1.1 protocol error: HPE_INVALID_METHOD    thread=21
2024-05-09T06:35:50.918266Z    debug    envoy connection external/envoy/source/common/network/connection_impl.cc:146    [Tags: "ConnectionId":"264"] closing data_to_write=162 type=2    thread=21
2024-05-09T06:35:50.918295Z    debug    envoy connection external/envoy/source/common/network/connection_impl.cc:788    [Tags: "ConnectionId":"264"] write flush complete    thread=21
2024-05-09T06:35:50.918466Z    debug    envoy connection external/envoy/source/extensions/transport_sockets/tls/ssl_socket.cc:329    [Tags: "ConnectionId":"263"] SSL shutdown: rc=0    thread=21
2024-05-09T06:35:51.065839Z    debug    envoy filter external/envoy/source/extensions/filters/listener/original_dst/original_dst.cc:69    original_dst: set destination to 10.236.15.158:443    thread=21
2024-05-09T06:35:51.065921Z    debug    envoy filter external/envoy/source/common/tcp_proxy/tcp_proxy.cc:246    [Tags: "ConnectionId":"265"] new tcp proxy session    thread=21
2024-05-09T06:35:51.065944Z    debug    envoy filter external/envoy/source/common/tcp_proxy/tcp_proxy.cc:430    [Tags: "ConnectionId":"265"] Creating connection to cluster outbound|443||minio.minio.svc.cluster.local    thread=21
2024-05-09T06:35:51.065986Z    debug    envoy pool external/envoy/source/common/conn_pool/conn_pool_base.cc:291    trying to create new connection    thread=21
2024-05-09T06:35:51.065994Z    debug    envoy pool external/envoy/source/common/conn_pool/conn_pool_base.cc:145    creating a new connection (connecting=0)    thread=21
```

Describe the solution you'd like

A very small change: rename the headless Service port to match the other Service's port name when TLS is enabled. I've tested this change manually (scaling the operator down, then editing the headless Service port name) and communication with the tenant starts working again. I couldn't test it with a tenant with multiple Pods, though; I only had a simple one-Pod tenant available.
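
For reference, the manual edit amounted to something like this on the headless Service (names are illustrative):

```yaml
# Hypothetical headless Service after the manual rename. With the
# operator scaled to zero so it does not revert the edit, this was
# enough for traffic to the tenant to start flowing again in my test.
apiVersion: v1
kind: Service
metadata:
  name: mytenant-hl
  namespace: minio
spec:
  clusterIP: None
  ports:
    - name: https-minio   # was http-minio; now matches the ClusterIP Service
      port: 9000
  selector:
    v1.min.io/tenant: mytenant
```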

I'm happy to make the change if the above analysis sounds correct, but since I know very little (about both Golang and the operator code) I fear this change, which looks small to me, might break something else that expects the headless Service's port name to remain stable, or similar.

Describe alternatives you've considered

I'm not sure of other alternatives, except maybe some configuration on the Istio side that I am not aware of. Istio protocol selection suggests either setting the port name or specifying the appProtocol field; it always talks about the Service itself, never about other solutions like custom annotations.
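
If renaming the port is undesirable, declaring the protocol explicitly via appProtocol might work too, since Istio gives that field precedence over the name-based convention; a sketch (untested):

```yaml
# Keep the port name stable but tell Istio the protocol explicitly
spec:
  clusterIP: None
  ports:
    - name: http-minio     # name unchanged
      appProtocol: https   # explicit protocol selection for Istio
      port: 9000
```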
